I have a slow internet connection, around 128 kbit/s. If I write a script that fetches 100 pages from another website and puts their contents into my database with curl or anything else, it takes ages to leech all 100 pages.
$get = file_get_contents("http://www.google.com/?search=something");
$res = Parser::parse($get); // Parser is my own (placeholder) parsing class
foreach ($res as $r)
{
    $db->insert($r['title']);
}
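For example, the curl version I have in mind would look roughly like this; it is only a sketch, and the URLs, Parser::parse() and $db->insert() are placeholders for my own code, not a real library:

<?php
// Rough sketch: fetch several pages in parallel with curl_multi,
// then parse each response and store the titles.
$urls = ["http://example.com/page1", "http://example.com/page2"]; // placeholder URLs

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Run all transfers until they finish.
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($active && $status == CURLM_OK);

foreach ($handles as $ch) {
    $html = curl_multi_getcontent($ch);
    foreach (Parser::parse($html) as $r) { // placeholder parser from the snippet above
        $db->insert($r['title']);          // placeholder DB wrapper from the snippet above
    }
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);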
But what if I use AJAX and send an AJAX request to my server to activate my spider? My server's connection is much faster than my client's, and it's the server that runs the script now (at least I think so!). Why doesn't it make much difference in the speed of leeching those 100 pages, though?
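To make it concrete, here is a rough sketch of what I mean by "activating my spider" over AJAX; spider.php, the URL pattern, Parser::parse() and $db->insert() are all placeholders of mine:

<?php
// spider.php -- placeholder endpoint that the browser would hit with
// something like fetch('spider.php') or $.get('spider.php').
// The downloading itself happens here on the server; the browser only
// receives this small JSON summary over my slow 128k link.
header('Content-Type: application/json');

$pages = 0;
for ($i = 1; $i <= 100; $i++) {
    $html = file_get_contents("http://example.com/page/" . $i); // placeholder target site
    if ($html !== false) {
        foreach (Parser::parse($html) as $r) { // placeholder parser from the question
            $db->insert($r['title']);          // placeholder DB wrapper from the question
        }
        $pages++;
    }
}

echo json_encode(['pages_fetched' => $pages]);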