<p>Alternatively, you can run it like this: save the following code in the same directory as your project's scrapy.cfg (my Scrapy version is 1.3.3):</p>
<pre><code>from scrapy.utils.project import get_project_settings
from scrapy.crawler import CrawlerProcess

settings = get_project_settings()
process = CrawlerProcess(settings)

# spider_loader lists every spider registered in the project
# (process.spiders is a deprecated alias for it in Scrapy 1.x)
for spider_name in process.spider_loader.list():
    print("Running spider %s" % spider_name)
    # "query" is a custom argument forwarded to the spider's __init__
    process.crawl(spider_name, query="dvh")
process.start()
</code></pre>
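<p>For context, Scrapy forwards the keyword arguments you pass to <code>process.crawl()</code> into the spider's <code>__init__</code>, where the base class stores extra kwargs as attributes. The sketch below illustrates that mechanism with a minimal stand-in base class (the <code>DvhSpider</code> name and the <code>query</code> parameter are illustrative, not part of Scrapy itself):</p>

```python
# Minimal stand-in showing how crawl(spider_name, query="dvh") reaches
# the spider. Scrapy's real base Spider behaves similarly: extra keyword
# arguments become instance attributes.
class Spider:
    def __init__(self, name=None, **kwargs):
        self.name = name
        # Scrapy's base Spider stores unrecognized kwargs as attributes.
        self.__dict__.update(kwargs)


class DvhSpider(Spider):
    """Hypothetical spider that accepts a custom `query` argument."""

    name = "dvh_spider"

    def __init__(self, query=None, **kwargs):
        super().__init__(**kwargs)
        self.query = query  # "dvh" when launched via process.crawl(..., query="dvh")


spider = DvhSpider(query="dvh")
print(spider.query)  # dvh
```

<p>Inside a real spider you would then use <code>self.query</code> to build <code>start_urls</code> or filter results.</p>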