<p>I think what you are looking for is something like this:</p>
<pre><code>import scrapy
from scrapy.crawler import CrawlerProcess

class MySpider1(scrapy.Spider):
    # Your first spider definition
    ...

class MySpider2(scrapy.Spider):
    # Your second spider definition
    ...

process = CrawlerProcess()
process.crawl(MySpider1)
process.crawl(MySpider2)
process.start()  # the script will block here until all crawling jobs are finished
</code></pre>
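<p>As a minimal concrete sketch (the spider names and the shared <code>finished</code> list are hypothetical, just to make completion observable; the spiders have empty <code>start_urls</code> so they close immediately without network access):</p>
<pre><code>import scrapy
from scrapy.crawler import CrawlerProcess

finished = []  # hypothetical helper: records which spiders have completed

class QuotesSpider(scrapy.Spider):
    name = "quotes"      # hypothetical spider name
    start_urls = []      # empty so the spider closes right away

    def closed(self, reason):
        # Scrapy calls closed() when the spider finishes
        finished.append(self.name)

class AuthorsSpider(scrapy.Spider):
    name = "authors"     # hypothetical spider name
    start_urls = []

    def closed(self, reason):
        finished.append(self.name)

process = CrawlerProcess(settings={"LOG_ENABLED": False})
process.crawl(QuotesSpider)
process.crawl(AuthorsSpider)
process.start()  # blocks until both spiders have finished

print(sorted(finished))
</code></pre>
<p>Both spiders run in the same Twisted reactor, so <code>process.start()</code> returns only after every scheduled crawl is done.</p>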
<p>You can read more at <a href="http://doc.scrapy.org/en/1.1/topics/practices.html#running-multiple-spiders-in-the-same-process" rel="noreferrer">running-multiple-spiders-in-the-same-process</a>.</p>