
Site is slow due to processing-intensive scripts


I have a site that has to crawl other sites to aggregate information. While the crawling scripts are running, the site slows down. I have done as much as I can to optimize the crawling, but it's really CPU- and RAM-intensive. These crawls have to occur in response to a user action (e.g. a search); "pre-crawling" the information is not an option because the information is time-sensitive.

What are the general strategies I can use to solve this? Here are two of my ideas:

  • Get more CPU and RAM on the current server
  • Offload these processing-intensive scripts onto a separate physical server (a rough sketch of this idea follows below)

I'm wondering about cloud computing, but I don't have any experience with it. Suggestions?
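
To make the second idea concrete, here is a minimal sketch of how I imagine handing a crawl off to a worker on another machine via a Redis-backed job queue (the rq library); the crawl_site function, the queue name, and the hostname are placeholders rather than my actual code:

    # web server side: enqueue the crawl instead of running it in-process
    from redis import Redis
    from rq import Queue

    from crawler import crawl_site   # placeholder crawl function; the worker imports the same module

    # Redis and the rq worker run on the machine that does the heavy lifting
    crawl_queue = Queue("crawls", connection=Redis(host="crawler-box.internal"))

    def handle_search(term: str) -> str:
        """Called from the web request; returns a job id the page can poll."""
        job = crawl_queue.enqueue(crawl_site, term, job_timeout=300)
        return job.id

The separate machine would then run "rq worker crawls", so the CPU- and RAM-heavy work never touches the web server.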


Solution

  • You've already identified the options. In this context, "cloud computing" doesn't mean much more than being able to quickly allocate a VPS with hourly pricing. It's the same as buying another physical server, except without waiting for the host to put it online and e-mail you access info, and without a monthly commitment. You still have to write your application to make use of multiple servers, you still have to write the code that "scales up" or "scales down" as needed (purchasing or terminating virtual servers and automatically starting whatever programs you need on them), and you still have to properly manage the servers (install and maintain an OS, keep packages patched with security fixes), etc.
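
If you do go that route, the "scale up"/"scale down" part usually boils down to a few calls to the provider's API. Here's a rough sketch using AWS EC2 via boto3; the AMI id, instance type, and region are placeholders, and real code would also need error handling plus some policy for deciding when to add or remove capacity:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")   # placeholder region

    def scale_up() -> str:
        """Launch one extra crawler VM; the UserData script starts your worker on boot."""
        resp = ec2.run_instances(
            ImageId="ami-0123456789abcdef0",              # placeholder image with the crawler pre-installed
            InstanceType="t3.medium",                     # placeholder size
            MinCount=1,
            MaxCount=1,
            UserData="#!/bin/bash\nsystemctl start crawler-worker\n",
        )
        return resp["Instances"][0]["InstanceId"]

    def scale_down(instance_id: str) -> None:
        """Terminate the VM once the crawl backlog drains."""
        ec2.terminate_instances(InstanceIds=[instance_id])

None of that is hard, but it is code and operations work you own, which is the trade-off to weigh against simply buying a bigger box.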