Download multiple images from remote server with PHP (a LOT of images)
Getting them one by one might be quite slow. Consider splitting them into packs of 20-50 images and grabbing each pack with parallel cURL requests (curl_multi). Here's the code to get you started:
    // $targets holds the image URLs for this batch
    $tc  = count($targets);
    $chs = array();
    $cmh = curl_multi_init();

    // Create one cURL handle per URL and attach it to the multi handle
    for ($t = 0; $t < $tc; $t++) {
        $chs[$t] = curl_init();
        curl_setopt($chs[$t], CURLOPT_URL, $targets[$t]);
        curl_setopt($chs[$t], CURLOPT_RETURNTRANSFER, 1);
        curl_multi_add_handle($cmh, $chs[$t]);
    }

    // Run all transfers until none are still active
    $running = null;
    do {
        curl_multi_exec($cmh, $running);
        curl_multi_select($cmh); // wait for activity instead of busy-looping
    } while ($running > 0);

    // Save each result to disk and clean up the handles
    for ($t = 0; $t < $tc; $t++) {
        $path_to_file = 'your logic for file path';
        file_put_contents($path_to_file, curl_multi_getcontent($chs[$t]));
        curl_multi_remove_handle($cmh, $chs[$t]);
        curl_close($chs[$t]);
    }
    curl_multi_close($cmh);
I used that approach to grab a few million images recently, since fetching them one by one would have taken up to a month.

The number of images you grab at once should depend on their expected size and your memory limits.
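If it helps, here is a minimal sketch of the batching itself, assuming the full URL list is in $allUrls and that a hypothetical downloadBatch() helper wraps the curl_multi snippet above (both names are just for illustration):

    // Hypothetical wrapper: its body would be the curl_multi snippet above,
    // using the passed-in $targets as the batch of URLs.
    function downloadBatch(array $targets) {
        // ... curl_multi code from above ...
    }

    // Assumed: the complete list of image URLs
    $allUrls = array(/* ... */);

    // Grab the images in packs of 30 so memory use stays bounded
    foreach (array_chunk($allUrls, 30) as $batch) {
        downloadBatch($batch);
    }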
I used this function for that, and it worked pretty well:
    function saveImage($urlImage, $title) {
        $fullpath = '../destination/' . $title;

        // Download the raw image data
        $ch = curl_init($urlImage);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
        $rawdata = curl_exec($ch);
        curl_close($ch);

        // Replace any existing file of the same name
        if (file_exists($fullpath)) {
            unlink($fullpath);
        }
        $fp = fopen($fullpath, 'x');
        $r  = fwrite($fp, $rawdata);
        fclose($fp);

        // Adjust the memory limit only after the file is closed,
        // so getimagesize() reads the complete image
        setMemoryLimit($fullpath);

        return $r;
    }
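For instance, you could call it in a loop over an array of URL => filename pairs (the $images array below is just an assumption about how you keep track of them):

    // Hypothetical input: remote URL => local file name
    $images = array(
        'http://example.com/photos/1.jpg' => 'photo-1.jpg',
        'http://example.com/photos/2.jpg' => 'photo-2.jpg',
    );

    foreach ($images as $url => $filename) {
        saveImage($url, $filename);
    }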
Combined with this other function to avoid running out of memory:
    function setMemoryLimit($filename) {
        set_time_limit(50);
        $maxMemoryUsage = 258; // hard cap, in MB

        // Current limit; (int) drops the "M" suffix, so this assumes
        // memory_limit is expressed in megabytes (e.g. "128M")
        $size = (int) ini_get('memory_limit');

        // Estimate the memory needed to decode this image:
        // width * height * 4 bytes per pixel, ~50% overhead, plus 1 MB of headroom
        list($width, $height) = getimagesize($filename);
        $size += floor(($width * $height * 4 * 1.5 + 1048576) / 1048576);

        if ($size > $maxMemoryUsage) {
            $size = $maxMemoryUsage;
        }
        ini_set('memory_limit', $size . 'M');
    }