Running multiple spiders at once in Scrapy

2018-06-09  by 俊采星驰_87e0

Straight to the code:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# Load the project's settings.py and create one process shared by all spiders
process = CrawlerProcess(get_project_settings())

# Schedule each spider by the name set in its `name` attribute
process.crawl('A_spider')
process.crawl('B_spider')
process.crawl('C_spider')

# Run all scheduled crawls concurrently; blocks until every spider finishes
process.start()
