Unable to use multiple proxies within Scrapy spider
You can set the `proxy` meta key per request, to a value like `http://some_proxy_server:port` or `http://username:password@some_proxy_server:port`.
From the official docs: https://doc.scrapy.org/en/latest/topics/downloader-middleware.html#scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware
So you need to write your own middleware that would:

- Catch failed responses
- If a response failed because of the proxy:
  - replace the `request.meta['proxy']` value with a new proxy IP
  - reschedule the request
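The steps above can be sketched as a downloader middleware. Everything here is an assumption for illustration: the `PROXY_POOL` list and the class name are hypothetical, and the status codes that count as "failed because of proxy" depend entirely on the target site:

```python
import random

# Hypothetical proxy pool; in practice load it from settings or a file.
PROXY_POOL = [
    "http://proxy1.example.com:8080",
    "http://proxy2.example.com:8080",
]

class RotateProxyOnFailureMiddleware:
    """Sketch: assign a proxy per request; when a response looks
    proxy-banned (or the connection fails), swap in a different
    proxy and reschedule the request."""

    PROXIES = PROXY_POOL
    PROXY_ERROR_CODES = {403, 407, 429}  # assumed ban signals; tune per site

    def process_request(self, request, spider):
        # Give every request a proxy if it does not already carry one.
        request.meta.setdefault("proxy", random.choice(self.PROXIES))

    def process_response(self, request, response, spider):
        if response.status in self.PROXY_ERROR_CODES:
            # Reschedule with a different proxy; dont_filter bypasses
            # the duplicate-request filter so the retry is not dropped.
            retry = request.replace(dont_filter=True)
            retry.meta["proxy"] = random.choice(
                [p for p in self.PROXIES if p != request.meta.get("proxy")]
            )
            return retry
        return response

    def process_exception(self, request, exception, spider):
        # Connection-level failures (timeout, refused) also rotate the proxy.
        retry = request.replace(dont_filter=True)
        retry.meta["proxy"] = random.choice(self.PROXIES)
        return retry
```

Enable it in `DOWNLOADER_MIDDLEWARES` with a priority below 750, so it sets `meta['proxy']` before the built-in `HttpProxyMiddleware` reads it. In production you would also cap retries per request (e.g. a counter in `request.meta`) to avoid looping forever on a dead target.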
Alternatively, you can look into Scrapy extension packages that already solve this, such as scrapy-rotating-proxies: https://github.com/TeamHG-Memex/scrapy-rotating-proxies
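With that package the setup is just configuration, roughly like this (a sketch based on the project's README; the proxy addresses are placeholders, and middleware priorities may differ in your project):

```python
# settings.py
ROTATING_PROXY_LIST = [
    "proxy1.example.com:8000",   # hypothetical proxies
    "proxy2.example.com:8031",
]

DOWNLOADER_MIDDLEWARES = {
    "rotating_proxies.middlewares.RotatingProxyMiddleware": 610,
    "rotating_proxies.middlewares.BanDetectionMiddleware": 620,
}
```

It handles proxy rotation, ban detection, and retry bookkeeping for you, so you don't have to maintain the custom middleware above.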