
scrapy from script output in json


You need to set the FEED_FORMAT and FEED_URI settings manually:

settings.overrides['FEED_FORMAT'] = 'json'
settings.overrides['FEED_URI'] = 'result.json'
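Note that settings.overrides was removed in later Scrapy releases. A minimal sketch of the equivalent on a newer version, assuming Scrapy >= 1.0 where Settings.set() replaced the overrides dict:

from scrapy.utils.project import get_project_settings

settings = get_project_settings()
# Settings.set() replaced the removed overrides dict; these two
# settings still tell the feed exporter the format and target file
settings.set('FEED_FORMAT', 'json')
settings.set('FEED_URI', 'result.json')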

If you want to get the results into a variable, you can define a pipeline class that collects items into a list, and use a spider_closed signal handler to print the results:

# note: this uses the legacy (pre-1.0) Scrapy API and Python 2
from twisted.internet import reactor
from scrapy.crawler import Crawler
from scrapy import log, signals
from scrapy.utils.project import get_project_settings

results = []

class MyPipeline(object):
    def process_item(self, item, spider):
        results.append(dict(item))
        return item  # pass the item on to any further pipelines

def spider_closed(spider):
    print results

# set up spider
spider = TestSpider(domain='mydomain.org')

# set up settings
settings = get_project_settings()
settings.overrides['ITEM_PIPELINES'] = {'__main__.MyPipeline': 1}

# set up crawler
crawler = Crawler(settings)
crawler.signals.connect(spider_closed, signal=signals.spider_closed)
crawler.configure()
crawler.crawl(spider)

# start crawling
crawler.start()
log.start()
reactor.run()
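On current Scrapy versions the scrapy.log module, Crawler.configure() and settings.overrides no longer exist. A rough sketch of the same item-collecting idea using CrawlerProcess (the spider name 'myspider' is a placeholder for one registered in your project):

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

results = []

class CollectorPipeline:
    # accumulate every scraped item in a module-level list
    def process_item(self, item, spider):
        results.append(dict(item))
        return item

settings = get_project_settings()
settings.set('ITEM_PIPELINES', {'__main__.CollectorPipeline': 100})

process = CrawlerProcess(settings)
process.crawl('myspider')  # placeholder: a spider name from your project
process.start()            # blocks until the crawl is finished

print(results)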

FYI, look at how Scrapy parses command-line arguments.

Also see: Capturing stdout within the same process in Python.


I managed to make it work simply by adding the FEED_FORMAT and FEED_URI to the CrawlerProcess constructor, using the basic Scrapy API tutorial code as follows:

process = CrawlerProcess({
    'USER_AGENT': 'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)',
    'FEED_FORMAT': 'json',
    'FEED_URI': 'result.json'
})
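For completeness, a runnable sketch of that tutorial-style script with a placeholder spider (the spider class, URL and CSS selector are illustrative, and .get() assumes Scrapy >= 1.8):

import scrapy
from scrapy.crawler import CrawlerProcess

class MySpider(scrapy.Spider):
    # placeholder spider: substitute your own
    name = 'myspider'
    start_urls = ['http://example.com']

    def parse(self, response):
        yield {'title': response.css('title::text').get()}

process = CrawlerProcess({
    'USER_AGENT': 'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)',
    'FEED_FORMAT': 'json',
    'FEED_URI': 'result.json'
})
process.crawl(MySpider)
process.start()  # writes the scraped items to result.json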


Easy!

from scrapy import cmdline

cmdline.execute("scrapy crawl argos -o result.json -t json".split())

Put that script in the same directory as your scrapy.cfg.
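One caveat: cmdline.execute() calls sys.exit() once the crawl finishes, so code after that line in the same script will not run. To get the items back into Python, read the output file from a separate script (a sketch; assumes the crawl above completed and wrote result.json):

import json

# the JSON feed is a single array of items; note that -o appends to an
# existing file, so delete result.json before re-running to keep it valid
with open('result.json') as f:
    items = json.load(f)

print(len(items), 'items scraped')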