I have the following scenario: users can download files from our server. A "normal" user gets limited bandwidth, for example 500 kbit/s. A premium user has no bandwidth limit and can download as fast as possible. How can I implement this? How is this kind of thing usually done?
Note: you can do this with PHP, but I would recommend letting the server itself handle the throttling. The first part of this answer deals with your options if you want to cap the download speed with PHP alone; below that you'll find how to manage download limits through the server.
There is a PECL extension that makes this a rather trivial task, called pecl_http, which contains the function http_throttle. The docs already contain a simple example of how to do this. The extension also contains an HttpResponse class, which isn't well documented at the moment, but I suspect playing around with its setThrottleDelay and setBufferSize methods should produce the desired result (a throttle delay of 0.001 s with a 20-byte buffer works out to roughly 20 KB/s). From the looks of things, this ought to work:
$download = new HttpResponse();
$download->setFile('yourFile.ext');
$download->setBufferSize(20);
$download->setThrottleDelay(.001);
//set headers using either the corresponding methods:
$download->setContentType('application/octet-stream');
//or the setHeader method
$download->setHeader('Content-Length', filesize('yourFile.ext'));
$download->send();
If you can't or don't want to install that extension, you can write a simple loop instead:
$file = array(
    'fname' => 'yourFile.ext',
    'size'  => filesize('yourFile.ext')
);
header('Content-Type: application/octet-stream');
header('Content-Description: file transfer');
header(
    sprintf(
        'Content-Disposition: attachment; filename="%s"',
        $file['fname']
    )
);
header('Content-Length: ' . $file['size']);
$fh = fopen($file['fname'], 'rb');
if (!$fh) {
    // handle the error, e.g. send a 500 response
    exit;
}
while (!feof($fh)) {
    echo fread($fh, 2048); // read 2 KB
    flush();
    usleep(100000);        // wait 1/10th of a second => ~20 KB/s
}
fclose($fh);
Of course, don't buffer the output if you do this :), and it might be best to add a set_time_limit(0); statement, too. If the file is big, it's quite likely your script will otherwise get killed mid-way through the download because it hits the max execution time.
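Tying this back to the question's normal-vs-premium split: a minimal sketch could pick the rate per user and fall back to an unthrottled fpassthru() for premium accounts. The isPremiumUser(), chunkSizeForRate() and sendFile() helpers here are hypothetical glue, not library functions:

```php
<?php
// Bytes to send per sleep interval for a given rate in kbit/s.
// E.g. 500 kbit/s = 62500 bytes/s; with a 0.1 s interval that is 6250 bytes.
function chunkSizeForRate(int $kbitPerSec, float $intervalSec): int
{
    return (int) ($kbitPerSec * 1000 / 8 * $intervalSec);
}

// Hypothetical helper: a null rate means "no limit" (premium user).
function sendFile(string $path, ?int $kbitPerSec): void
{
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($path));
    $fh = fopen($path, 'rb');
    if ($kbitPerSec === null) {
        fpassthru($fh); // premium: send as fast as possible
    } else {
        $chunk = chunkSizeForRate($kbitPerSec, 0.1);
        while (!feof($fh)) {
            echo fread($fh, $chunk);
            flush();
            usleep(100000); // sleep 0.1 s between chunks
        }
    }
    fclose($fh);
}

// Usage (isPremiumUser() stands in for however your app flags accounts):
// sendFile('yourFile.ext', isPremiumUser() ? null : 500);
```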
Another (and probably preferable) approach would be to limit the download speed through the server configuration:
I've never limited the download rates myself, but looking at the links, I think it's fair to say that nginx is the easiest by far:
location ^~ /downloadable/ {
    limit_rate_after 0m;
    limit_rate 20k;
}
This makes the rate limit kick in immediately, and sets it to 20k. Details can be found on the nginx wiki.
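Since the question distinguishes normal from premium users, a common pattern with nginx is to let PHP pick the per-request limit and hand the actual transfer off to the server via X-Accel-Redirect; the X-Accel-Limit-Rate header takes a value in bytes per second. This is only a sketch: the /protected/ location, the file paths, and the $isPremium flag are assumptions about your setup:

```php
<?php
// Sketch only: nginx needs a matching internal location, e.g.:
//
//   location /protected/ {
//       internal;              # reachable only via X-Accel-Redirect
//       alias /var/www/files/; # where the real files live
//   }
//
// Placeholder: replace with however your app flags premium users.
$isPremium = false;

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="yourFile.ext"');
if (!$isPremium) {
    // 500 kbit/s = 62500 bytes per second
    header('X-Accel-Limit-Rate: 62500');
}
// nginx serves /var/www/files/yourFile.ext and enforces the limit
header('X-Accel-Redirect: /protected/yourFile.ext');
```

The nice part of this approach is that PHP only makes the decision; the long-running transfer is handled by nginx, so max execution time is no longer a concern.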
As far as Apache is concerned, it's not that much harder, but it'll require you to enable the mod_ratelimit module:
LoadModule ratelimit_module modules/mod_ratelimit.so
Then it's a simple matter of telling Apache which files should be throttled (the rate-limit value is in KiB/s):
<IfModule mod_ratelimit.c>
    <Location /downloadable>
        SetOutputFilter RATE_LIMIT
        SetEnv rate-limit 20
    </Location>
</IfModule>
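To map this onto the two user classes, one untested sketch is to serve premium downloads from a separate, unfiltered path. Both paths, and the idea that your app hands premium users links under the second one, are assumptions about your setup:

```apache
<IfModule mod_ratelimit.c>
    # normal users: ~500 kbit/s (62 KiB/s)
    <Location /downloadable>
        SetOutputFilter RATE_LIMIT
        SetEnv rate-limit 62
    </Location>
    # premium users: no RATE_LIMIT output filter, so full speed
    <Location /premium-downloads>
    </Location>
</IfModule>
```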