Running Multiple Spiders
To launch several spiders from a single entry point, create a crawl.py file in the project's root directory with the following content:
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# Load the project settings so CrawlerProcess can resolve spiders by name
process = CrawlerProcess(get_project_settings())

# 'myspd1', 'myspd2', and 'myspd3' are the spider names
# (the name attribute of each spider class in the project)
process.crawl('myspd1')
process.crawl('myspd2')
process.crawl('myspd3')

# Start all scheduled crawls; blocks until every spider has finished
process.start()
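Note that CrawlerProcess runs all of the scheduled spiders concurrently in the same process. If the spiders must run one after another instead (for example, when a later spider depends on data produced by an earlier one), Scrapy's documentation describes a sequential pattern based on CrawlerRunner and Twisted deferreds. A minimal sketch, assuming the same project layout and spider names as above:

```python
from twisted.internet import defer, reactor

from scrapy.crawler import CrawlerRunner
from scrapy.utils.log import configure_logging
from scrapy.utils.project import get_project_settings

# CrawlerRunner does not set up logging or the reactor itself,
# so configure logging explicitly and drive the reactor manually.
configure_logging()
runner = CrawlerRunner(get_project_settings())

@defer.inlineCallbacks
def crawl_sequentially():
    # Each yield waits for the previous spider to finish
    # before the next one starts.
    yield runner.crawl('myspd1')
    yield runner.crawl('myspd2')
    yield runner.crawl('myspd3')
    reactor.stop()

crawl_sequentially()
reactor.run()  # blocks until the last spider finishes
```

This script, like crawl.py above, must be run from inside the Scrapy project so that get_project_settings() can find the project's settings module.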