I'm displaying a gallery of pictures that I store below the web root for security. There is a thumbnail for each JPEG. When displaying the gallery, I have successfully set
<img src='./php/getfile.php?file=xyz-thumb.jpg'>
getfile.php serves each thumbnail with the code below; when a thumbnail is clicked, the same code loads the larger version of the image.
I can already tell this approach is slower than serving the images as plain static files, and with potentially 20-30 thumbnails on a page, I am debating whether I need to keep the thumbnails publicly visible under public_html for performance's sake. Is there a quicker way to display the thumbnails? Is fpassthru() any quicker or more desirable for other reasons? (I've sketched the fpassthru() variant I have in mind right after the code.)
// $fullfilename and $mode are set earlier in getfile.php (not shown here)

// File exists?
if (file_exists($fullfilename)) {
    // Parse info / get extension
    $fsize      = filesize($fullfilename);
    $path_parts = pathinfo($fullfilename);
    $ext        = strtolower($path_parts["extension"]);

    // Determine content type
    switch ($ext) {
        case "pdf":  $ctype = "application/pdf"; break;
        case "exe":  $ctype = "application/octet-stream"; break;
        case "zip":  $ctype = "application/zip"; break;
        case "doc":  $ctype = "application/msword"; break;
        case "xls":  $ctype = "application/vnd.ms-excel"; break;
        case "ppt":  $ctype = "application/vnd.ms-powerpoint"; break;
        case "gif":  $ctype = "image/gif"; break;
        case "png":  $ctype = "image/png"; break;
        case "jpeg":
        case "jpg":  $ctype = "image/jpeg"; break;        // the registered MIME type is image/jpeg, not image/jpg
        default:     $ctype = "application/octet-stream"; // generic binary type; "application/force-download" is not a registered MIME type
    }

    header("Pragma: public");                // required
    header("Expires: 0");
    header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
    header("Cache-Control: private", false); // required for certain browsers
    header("Content-Type: $ctype");

    if ($mode == "view") {
        // View file in the browser
        header('Content-Disposition: inline; filename="' . basename($fullfilename) . '"');
    } else {
        // Force a download
        header('Content-Disposition: attachment; filename="' . basename($fullfilename) . '"');
    }

    header("Content-Transfer-Encoding: binary");
    header("Content-Length: " . $fsize);

    // Discard anything already buffered so only the file bytes are sent
    if (ob_get_level()) {
        ob_clean();
    }
    flush();

    readfile($fullfilename);
    exit;
} else {
    die('File Not Found: ' . $fullfilename);
}
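For reference, the fpassthru() variant I'm weighing would just be a drop-in replacement for the readfile() call at the end; as far as I understand, the only real difference is that fpassthru() works from an already-open file handle:

// Replace readfile($fullfilename) with an explicit handle + fpassthru()
$fp = fopen($fullfilename, 'rb');
if ($fp !== false) {
    // Stream the file from the open handle straight to the output buffer
    fpassthru($fp);
    fclose($fp);
}
exit;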
Based on your comments above, I would say this sounds like a very inefficient way to do it, mostly because it prevents normal browser caching: the script sends Expires: 0 and must-revalidate on every request, so each visitor re-downloads every thumbnail on every page view. If somebody wants to automate scraping of your full-size images, they will find a way around this anyway (e.g. Selenium RC).
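If you do keep the PHP pass-through, one way to get caching back is to drop the no-cache headers and answer conditional requests instead. This is only a minimal sketch built around the variables already in your script ($fullfilename, $ctype); the headers themselves are standard HTTP:

$lastModified = filemtime($fullfilename);

// Let the browser cache the image for a day and revalidate with If-Modified-Since
header('Cache-Control: private, max-age=86400');
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');

// If the browser's copy is still current, answer 304 and skip the body entirely
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $lastModified) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

header('Content-Type: ' . $ctype);
header('Content-Length: ' . filesize($fullfilename));
readfile($fullfilename);
exit;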
If your only concern is someone scraping the images, then use a different approach. Here are some alternatives:
A honeypot is a very common implementation; a rough sketch is below.
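Assuming a honeypot along these lines (trap.php and blocked_ips.txt are hypothetical names, not anything in your existing code): hide a link in the gallery markup that no human will ever click; anything that requests it is almost certainly a bot, so record its IP and have getfile.php refuse to serve that IP afterwards.

// trap.php (hypothetical) -- linked from the gallery with an invisible anchor, e.g.
//   <a href="./php/trap.php" style="display:none" rel="nofollow">&nbsp;</a>
// Real visitors never see the link; anything that follows it gets its IP recorded.
file_put_contents(dirname(__FILE__) . '/blocked_ips.txt',
                  $_SERVER['REMOTE_ADDR'] . PHP_EOL,
                  FILE_APPEND | LOCK_EX);
header('HTTP/1.0 404 Not Found');
exit;

// At the top of getfile.php: refuse to serve images to any recorded scraper IP
$blocklist = dirname(__FILE__) . '/blocked_ips.txt';
if (file_exists($blocklist) &&
    in_array($_SERVER['REMOTE_ADDR'], file($blocklist, FILE_IGNORE_NEW_LINES))) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}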