Running multiple spiders in one Scrapy project concurrently

苍志文
2023-12-01


Running multiple spiders


Define a launcher script to start them all at once
Create a crawl.py file in the project root with the following content:

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# Load the project settings so the spiders behave exactly as they do
# when launched with `scrapy crawl`
process = CrawlerProcess(get_project_settings())

# 'myspd1', 'myspd2', and 'myspd3' are the spiders' `name` attributes;
# crawl() only schedules them, nothing runs yet
process.crawl('myspd1')
process.crawl('myspd2')
process.crawl('myspd3')

# start() runs all three spiders concurrently in the same Twisted reactor
# and blocks until every crawl has finished
process.start()
