
Pass Scraped URLs From One Spider To Another

How can I send the scraped URLs from one spider to the start_urls of another spider? Specifically, I want to run one spider which gets a list of URLs from an XML page. After the …

Solution 1:

From the first spider you can save the URLs to a database, or push them to a queue (ZeroMQ, RabbitMQ, Redis), for example via an item pipeline.
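As a minimal sketch of such a pipeline, assuming redis-py is installed and that your items carry a url field (UrlExportPipeline and the scraped_urls key are just illustrative names):

import redis

class UrlExportPipeline:
    def open_spider(self, spider):
        # Connection details are assumptions; point this at your own Redis instance.
        self.client = redis.Redis(host='localhost', port=6379)

    def process_item(self, item, spider):
        # Push every scraped URL onto a Redis list for the second spider to consume.
        self.client.rpush('scraped_urls', item['url'])  # assumes the item has a 'url' field
        return item

Remember to enable the pipeline in ITEM_PIPELINES in your project settings.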

The second spider can then fetch those URLs in its start_requests method:

import scrapy

class MySpider(scrapy.Spider):
    name = 'myspider'

    def start_requests(self):
        urls = my_db.orm.get('urls')  # placeholder: load the URLs saved by the first spider
        for url in urls:
            yield scrapy.Request(url)

Alternatively, the URLs can be passed to the spider from a queue broker via the CLI or an API, or the spider can be launched by the broker directly and pick up its URLs in start_requests.
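For the CLI route, Scrapy passes -a arguments to the spider's __init__, so a comma-separated list of URLs could be handed over on the command line. A rough sketch (the second_spider name and start argument are illustrative, not part of the original answer):

# scrapy crawl second_spider -a start="http://example.com/a,http://example.com/b"
import scrapy

class SecondSpider(scrapy.Spider):
    name = 'second_spider'

    def __init__(self, start='', *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Split the comma-separated argument into start_urls for the default crawl flow.
        self.start_urls = [u.strip() for u in start.split(',') if u.strip()]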

There are many ways to do this. Which one fits best depends on why you need to pass URLs from one spider to another.

You can also check out these projects: Scrapy-Cluster and Scrapy-Redis. They may be what you are looking for.
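As a sketch of the scrapy-redis route, assuming the package is installed and its Redis connection settings are configured, a spider can consume its start URLs straight from a Redis list (the spider name and the scraped_urls key below are illustrative and match the pipeline sketch above):

from scrapy_redis.spiders import RedisSpider

class UrlConsumerSpider(RedisSpider):
    name = 'url_consumer'
    redis_key = 'scraped_urls'  # the list the first spider (or its pipeline) pushes URLs onto

    def parse(self, response):
        # Process each page fetched from a URL taken off the Redis list.
        self.logger.info('Got %s', response.url)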

Solution 2:

Write the URLs to a file as strings. Read them from the same file in the other spider.
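A minimal sketch of the reading side, assuming the first spider writes one URL per line to a shared urls.txt (the UrlFileSpider name and file path are illustrative):

import scrapy

class UrlFileSpider(scrapy.Spider):
    name = 'url_file_spider'

    def start_requests(self):
        # Read back the URLs the first spider appended, one per line.
        with open('urls.txt') as f:
            for line in f:
                url = line.strip()
                if url:
                    yield scrapy.Request(url)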

Solution 3:

Why do you want to use different spiders for such a requirement?

You can just have one spider and, instead of passing URLs to another spider, yield another Request from your parse method.

from scrapy import Request
from scrapy.spiders import SitemapSpider

class Daily(SitemapSpider):
    name = 'daily'
    sitemap_urls = ['http://example.com/sitemap.xml']

    def parse(self, response):
        # Follow the extracted URL directly instead of handing it to another spider.
        yield Request(url="URL here", callback=callback_function)
