
Unable to use multiple proxies within Scrapy spider


You can also set the meta key proxy per-request, to a value like http://some_proxy_server:port or http://username:password@some_proxy_server:port.

From the official docs: https://doc.scrapy.org/en/latest/topics/downloader-middleware.html#scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware
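A minimal sketch of the per-request approach: build the `meta` dict with a proxy chosen from a pool and pass it to each request. The proxy URLs and the `proxy_meta` helper name are illustrative, not from the original; in a spider you would use the returned dict as `yield scrapy.Request(url, meta=proxy_meta())`.

```python
import random

# Hypothetical proxy pool -- substitute your own servers/credentials.
PROXIES = [
    "http://some_proxy_server:8080",
    "http://username:password@other_proxy_server:8080",
]

def proxy_meta(proxies=PROXIES):
    """Return a request meta dict with a randomly chosen proxy.

    Usage inside a spider:
        yield scrapy.Request(url, meta=proxy_meta())
    """
    return {"proxy": random.choice(proxies)}
```

Picking randomly per request spreads load across the pool, but it does nothing about failures; that is what the middleware below is for.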

So you need to write your own downloader middleware that would:

  1. Catch failed responses.
  2. If a response failed because of the proxy:
    1. Replace the request.meta['proxy'] value with a new proxy address.
    2. Reschedule the request.
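The steps above can be sketched as a rotating-proxy middleware. This is an assumption-laden sketch, not a drop-in implementation: the proxy URLs are placeholders, and treating HTTP 403/407/429 as "failed because of the proxy" is a guess you should tune for your target site. Returning a `Request` from `process_response` is how Scrapy reschedules it; `dont_filter=True` keeps the duplicate filter from dropping the retry.

```python
import itertools

# Hypothetical proxy pool -- replace with your own servers.
PROXIES = [
    "http://proxy1.example.com:8080",
    "http://proxy2.example.com:8080",
]

class RotatingProxyMiddleware:
    """Downloader-middleware sketch: retry failed responses with a new proxy."""

    def __init__(self, proxies=PROXIES):
        # Cycle through the pool so each retry gets the next proxy.
        self._proxies = itertools.cycle(proxies)

    def process_request(self, request, spider):
        # Assign a proxy to every outgoing request that lacks one.
        request.meta.setdefault("proxy", next(self._proxies))

    def process_response(self, request, response, spider):
        # Assumption: these status codes indicate a banned/broken proxy.
        if response.status in (403, 407, 429):
            request.meta["proxy"] = next(self._proxies)  # step 2.1: swap proxy
            request.dont_filter = True                    # bypass the dupe filter
            return request                                # step 2.2: reschedule
        return response
```

Enable it via `DOWNLOADER_MIDDLEWARES` in `settings.py`; you may also want a `process_exception` hook for connection-level errors (timeouts, refused connections), which never reach `process_response`.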

Alternatively, you can look into Scrapy extension packages that already solve this, such as scrapy-rotating-proxies: https://github.com/TeamHG-Memex/scrapy-rotating-proxies
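If you go that route, configuration lives in `settings.py`. The fragment below follows the scrapy-rotating-proxies README as I recall it; double-check the middleware paths and priorities against the project's own documentation, and the proxy list is again a placeholder.

```python
# settings.py fragment for scrapy-rotating-proxies (verify against its README).
ROTATING_PROXY_LIST = [
    "proxy1.example.com:8080",
    "proxy2.example.com:8080",
]

DOWNLOADER_MIDDLEWARES = {
    "rotating_proxies.middlewares.RotatingProxyMiddleware": 610,
    "rotating_proxies.middlewares.BanDetectionMiddleware": 620,
}
```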