I need to make a bunch of HTTP requests and then do a lot of processing on the resulting data. It gets intensive enough that I can't reliably do it on one server anymore, so I created two more Linode VPSes in addition to the main one to spread the workload across.
The setup is basically: the main server sends each worker server a batch of commands, each of which includes an HTTP request to make and what to do with the results; the workers then perform the fetches, manipulate the data, and store it in a central database.
This works for now, but the workload keeps growing, I'll need to spin up another VPS soon, and configuring each new one is kind of annoying.
Is there a better way to do this? Some automated way to create more servers as needed that isn't super hard to understand? Creating new VPSes is easy, just time-consuming.
If each HTTP request plus its processing finishes within Lambda's 15-minute execution limit, AWS Lambda is a great option. There is ZERO infrastructure/autoscaling configuration needed, and it is extremely simple to get code up and running: you upload your code in one of a variety of supported languages, then add a trigger that kicks off your function.
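To give a feel for how little code is involved, here is a minimal Python handler sketch. The event shape, table name, and `transform` step are assumptions standing in for your own fetch-and-process logic and whatever your central database actually is (DynamoDB is used purely as an illustration):

```python
import json
import urllib.request

import boto3

# Illustration only: swap for however you reach your central database.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("processed_results")  # assumed table name


def lambda_handler(event, context):
    """Fetch one URL described by the triggering event, process it, store the result.

    The event shape is an assumption: {"url": "...", "job_id": "..."}.
    """
    url = event["url"]

    with urllib.request.urlopen(url, timeout=30) as resp:
        raw = resp.read().decode("utf-8")

    # Whatever per-item processing you currently do on your worker VPSes.
    processed = transform(raw)

    table.put_item(Item={"job_id": event["job_id"], "result": json.dumps(processed)})
    return {"status": "ok", "job_id": event["job_id"]}


def transform(raw):
    # Placeholder for your real data manipulation.
    return json.loads(raw)
```

The main server (or any trigger source, e.g. an SQS queue) would invoke this once per URL, and Lambda handles the scaling for you.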
Another option is AWS Batch, which distributes jobs over a scalable number of instances. You populate a Batch job queue with the requests you want made; Batch then runs each "job" by passing a run command to a container, on a pool of instances that auto-scales within the limits you set in your compute environment and job definition.
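Populating the queue from your main server could look roughly like this with boto3; the queue name, job definition name, and the environment-variable convention for passing the URL are all placeholders for whatever you register in your Batch setup:

```python
import boto3

batch = boto3.client("batch")

# Hypothetical list of fetch-and-process tasks the main server wants done.
urls = [
    "https://example.com/data/1",
    "https://example.com/data/2",
]

for i, url in enumerate(urls):
    batch.submit_job(
        jobName=f"fetch-job-{i}",
        jobQueue="fetch-queue",          # your Batch job queue
        jobDefinition="fetch-worker",    # your registered job definition
        containerOverrides={
            # Read by the container's entrypoint; see the worker sketch below.
            "environment": [{"name": "TARGET_URL", "value": url}],
        },
    )
```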
Setting up a Batch compute environment requires minimal configuration (mostly min/max vCPU and memory limits), but it does require that your application be packaged as a Docker image and stored in ECR. Once the environment is set up properly, there is very little ongoing configuration or maintenance.
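The container side can stay very small too. A sketch of the worker script you'd bake into the ECR image, assuming the URL arrives via the `TARGET_URL` environment variable set above and results go to the same illustrative DynamoDB table as before:

```python
import json
import os
import urllib.request

import boto3


def main():
    url = os.environ["TARGET_URL"]  # set per job via containerOverrides

    with urllib.request.urlopen(url, timeout=30) as resp:
        raw = resp.read().decode("utf-8")

    processed = json.loads(raw)  # stand-in for your real processing

    # Illustration only: write wherever your central database actually lives.
    boto3.resource("dynamodb").Table("processed_results").put_item(
        Item={"source_url": url, "result": json.dumps(processed)}
    )


if __name__ == "__main__":
    main()
```

Batch shines when individual jobs are too long or too heavy for Lambda's limits, since each job is just a container run and can take as much time and memory as the job definition allows.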