Interestingly, I have not been able to find a working example of this. Using PHP, I'm trying to scrape all the images from a given URL and redisplay them on another website. I know how to do this with text, but I'm not sure how with images. Does anyone know of a good working example? I understand how to grab the entire page contents, just not how to pull out only the images. For example, this fetches the whole page:
<?php
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, "https://en.wikipedia.org/wiki/Wikipedia:Picture_of_the_day");
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($curl);
curl_close($curl);
echo $result;
?>
Thanks much. -Wilson
*Ideally this would actually just grab the first image, such as in the example above. But I won't get ahead of myself; I'm just trying to get this basic fetch working first.
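For what it's worth, once the page HTML is in $result, one way to isolate just the image URLs (or only the first one) is to parse it with DOMDocument. This is only a minimal sketch, assuming the cURL fetch from the snippet above has already filled $result:

// Parse the HTML fetched above and collect the <img> src attributes.
$doc = new DOMDocument();
libxml_use_internal_errors(true);   // tolerate imperfect real-world HTML
$doc->loadHTML($result);
libxml_clear_errors();

$images = [];
foreach ($doc->getElementsByTagName('img') as $img) {
    $src = $img->getAttribute('src');
    if (strpos($src, '//') === 0) {  // resolve protocol-relative URLs like //upload.wikimedia.org/...
        $src = 'https:' . $src;
    }
    $images[] = $src;
}

// Redisplay all of them, or just the first with $images[0].
foreach ($images as $src) {
    echo '<img src="' . htmlspecialchars($src) . '">' . "\n";
}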
You can save the result directly to a file by passing an open file handle to cURL:
// $url is the page to fetch and $filename is where to save it (example values).
$url = 'https://en.wikipedia.org/wiki/Wikipedia:Picture_of_the_day';
$filename = 'page.html';
$begin_size = file_exists($filename) ? filesize($filename) : 0; // bytes already in the file ($begin_size was not defined in the original snippet)
$fp = fopen($filename, 'a+');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:29.0) Gecko/20100101 Firefox/29.0');
curl_setopt($ch, CURLOPT_ENCODING, 'gzip, deflate');
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_FILE, $fp); // stream the response body straight into the file
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_NOPROGRESS, false);
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, function ($resource, $dltotal, $dlnow, $ultotal, $ulnow) {
    // progress callback (PHP 5.5+ passes the cURL handle as the first argument); left empty here
});
curl_setopt($ch, CURLOPT_LOW_SPEED_LIMIT, 1); // abort if the transfer drops below 1 byte/s...
curl_setopt($ch, CURLOPT_LOW_SPEED_TIME, 8);  // ...for 8 seconds
curl_exec($ch);
$error = curl_error($ch);
$http_code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
$content_type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
$end_size = $begin_size + curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD);
Log::info('end_size=' . $end_size); // Laravel's logger; use error_log() in plain PHP
curl_close($ch);
fclose($fp);
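To tie this back to the question, the same file-saving approach can be pointed at each image URL instead of the page itself. A rough sketch, assuming a hypothetical download_to_file() wrapper around the snippet above and the $images list from the DOMDocument example earlier:

// Hypothetical wrapper: the cURL-to-file snippet above, with $url and $filename as parameters.
function download_to_file($url, $filename) {
    $fp = fopen($filename, 'w');
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);
}

// Save every scraped image locally, e.g. img_0.jpg, img_1.png, ...
foreach ($images as $i => $src) {
    $ext = pathinfo(parse_url($src, PHP_URL_PATH), PATHINFO_EXTENSION);
    download_to_file($src, 'img_' . $i . '.' . $ext);
}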