CURL or file_get_contents to update a list of feeds?
Fetching google.com using file_get_contents took (in seconds):
2.31319094, 2.30374217, 2.21512604, 3.30553889, 2.30124092
CURL took:
0.68719101, 0.64675593, 0.64326, 0.81983113, 0.63956594
This was using the benchmark class from http://davidwalsh.name/php-timer-benchmark
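For reference, the same comparison can be sketched with a plain microtime()-based timer (the linked benchmark class works along these lines); the URL is just the example used above:

```php
<?php
// Minimal timing sketch: measure one call to $fetch($url) in seconds.
function time_fetch(callable $fetch, string $url): float {
    $start = microtime(true);
    $fetch($url);
    return microtime(true) - $start;
}

// file_get_contents
$t1 = time_fetch('file_get_contents', 'http://google.com');

// cURL equivalent of the same simple GET
$t2 = time_fetch(function ($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    curl_close($ch);
}, 'http://google.com');

printf("file_get_contents: %.4fs, curl: %.4fs\n", $t1, $t2);
```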
Because you will be updating 50 feeds at once, I would strongly suggest using CURL for two reasons:
you can use the curl_multi_*() functions, which allow you to send all 50 requests at once, while file_get_contents() will only go one-by-one. The documentation for these functions is a bit sparse, so I would suggest using a lightweight library - it's much easier to work with. I personally use https://github.com/petewarden/ParallelCurl, but you will find many around.
as you are pinging the services, you do not really need the response body, I guess (as long as it's HTTP 200). So you could use the cURL option CURLOPT_NOBODY to turn it into a HEAD request, so the response contains only the headers. This should speed up the process even more.
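The two points above combined might look roughly like this, using the raw curl_multi_* API directly (ParallelCurl wraps the same calls); $feedUrls is a placeholder for your list of 50 feeds:

```php
<?php
// Sketch: fire parallel HEAD requests for a list of feed URLs and
// collect the HTTP status codes ($feedUrls is a hypothetical input).
function ping_feeds(array $feedUrls): array {
    $mh = curl_multi_init();
    $handles = [];
    foreach ($feedUrls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request: headers only
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Run all transfers at once instead of one-by-one
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
    } while ($running > 0);

    $statuses = [];
    foreach ($handles as $url => $ch) {
        $statuses[$url] = curl_getinfo($ch, CURLINFO_HTTP_CODE); // expect 200
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $statuses;
}
```

You would then check the returned array for anything other than 200 to spot feeds that failed to update.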
Put otherwise: file_get_contents might be faster for simple requests, but your situation is not simple. Firing 50 requests without really needing the whole document back is not a standard request.
Actually, I think cURL is faster than file_get_contents.
Googling a bit, I found some benchmarks here on SO: file_get_contents VS CURL, what has better performance?