I have an application that I would like to host on AWS. It will need access to a relational database, which I could host through Aurora, RDS, etc., all of which offer per-hour pricing.
Does this mean that in order to have this application (or at least the DB side of it) available all the time I would be paying (hourly price)*24*30 dollars per month?
Or is the per-hour rate based on the time actually spent running queries on the database? For example, if in a month I run a query that takes 0.1 seconds a total of 1000 times, would I pay (1000*0.1)/60/60 * (hourly price) for that month?
Yes, it's based on how long the instance is running, not how long it spends executing queries. As long as the instance exists and is in a running state, you are billed for it, even if it serves no traffic at all.
From the pricing page (https://aws.amazon.com/rds/mysql/pricing/):

> On-Demand DB Instances let you pay for compute capacity by the hour your DB Instance runs
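To make the difference concrete, here is the arithmetic from your question with a hypothetical $0.10/hour rate (not a real quote, check the pricing page for your instance class and region):

```python
HOURLY_RATE = 0.10  # hypothetical on-demand rate, USD per instance-hour

# What you actually pay: the instance is billed for every hour it runs,
# so an always-on database costs hourly rate * 24 * 30 in a 30-day month.
hours_in_month = 24 * 30
always_on_cost = HOURLY_RATE * hours_in_month

# The query-time interpretation (which is NOT how RDS bills):
# 1000 queries at 0.1 seconds each.
query_seconds = 1000 * 0.1
query_time_cost = query_seconds / 3600 * HOURLY_RATE

print(f"Always-on instance: ${always_on_cost:.2f}/month")
print(f"Query time only:    ${query_time_cost:.4f}/month")
```

So at that hypothetical rate you would pay around $72/month for an always-on instance, regardless of how few queries you run.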