<p>I recommend this package:
<a href="https://pypi.org/project/Scrapy-UserAgents/" rel="nofollow noreferrer">Scrapy-UserAgents</a></p>
<pre><code>pip install scrapy-useragents
</code></pre>
<p>In your settings.py file, disable the built-in user-agent middleware and enable the one from the package:</p>
<pre><code>DOWNLOADER_MIDDLEWARES = {
    'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': None,
    'scrapy_useragents.downloadermiddlewares.useragents.UserAgentsMiddleware': 500,
}
</code></pre>
<h2>Example list of user agents to rotate</h2>
<p><a href="https://developers.whatismybrowser.com/useragents/explore/" rel="nofollow noreferrer">More User Agents</a></p>
<pre><code>USER_AGENTS = [
    ('Mozilla/5.0 (X11; Linux x86_64) '
     'AppleWebKit/537.36 (KHTML, like Gecko) '
     'Chrome/57.0.2987.110 '
     'Safari/537.36'),  # Chrome
    ('Mozilla/5.0 (X11; Linux x86_64) '
     'AppleWebKit/537.36 (KHTML, like Gecko) '
     'Chrome/61.0.3163.79 '
     'Safari/537.36'),  # Chrome
    ('Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:55.0) '
     'Gecko/20100101 '
     'Firefox/55.0'),  # Firefox
    ('Mozilla/5.0 (X11; Linux x86_64) '
     'AppleWebKit/537.36 (KHTML, like Gecko) '
     'Chrome/61.0.3163.91 '
     'Safari/537.36'),  # Chrome
    ('Mozilla/5.0 (X11; Linux x86_64) '
     'AppleWebKit/537.36 (KHTML, like Gecko) '
     'Chrome/62.0.3202.89 '
     'Safari/537.36'),  # Chrome
    ('Mozilla/5.0 (X11; Linux x86_64) '
     'AppleWebKit/537.36 (KHTML, like Gecko) '
     'Chrome/63.0.3239.108 '
     'Safari/537.36'),  # Chrome
]
</code></pre>
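<p>If you would rather not add a dependency, the same rotation can be sketched as a small custom downloader middleware. Note this is a minimal illustration, not the scrapy-useragents implementation; the class name <code>RandomUserAgentMiddleware</code> is hypothetical:</p>
<pre><code>import random

class RandomUserAgentMiddleware:
    """Hypothetical middleware: set a random User-Agent on each request."""

    def __init__(self, user_agents):
        self.user_agents = user_agents

    @classmethod
    def from_crawler(cls, crawler):
        # Scrapy calls this hook at startup; USER_AGENTS is read from settings.py.
        return cls(crawler.settings.getlist('USER_AGENTS'))

    def process_request(self, request, spider):
        # Called for every outgoing request; only the header is changed.
        request.headers['User-Agent'] = random.choice(self.user_agents)
</code></pre>
<p>You would register it in <code>DOWNLOADER_MIDDLEWARES</code> the same way as the packaged middleware.</p>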
<p>Be careful: this middleware cannot handle the case where COOKIES_ENABLED is True and the website ties its cookies to the User-Agent. That combination may cause unpredictable results for the spider.</p>
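<p>If that applies to your target sites and your spider does not depend on session cookies, one option is to turn cookies off in settings.py:</p>
<pre><code># settings.py: disable Scrapy's cookie handling so cookies are never
# associated with a rotating User-Agent.
COOKIES_ENABLED = False
</code></pre>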