I am looking at the pricing of various cloud computing platforms, particularly Amazon's EC2, and a lot of the quotes are based on a unit called an Instance-Hour.
I am trying to get a handle on the exact definition of an instance-hour to better compare the costs of continuing to host a web-application versus putting it out on the cloud.
(1) Does it correspond to any of the Windows performance counters in such a way that I could benchmark our current implementation and use it in their pricing calculators?
(2) How does a multi-processor instance figure into the instance-hour calculation?
An instance-hour is simply a regular clock hour during which the instance was available to you, whether or not you actually used it. Amazon prices each instance type differently, so you pay for the type of resource you are provisioning, not for how heavily you use it.
So... 1. No, it's just a regular clock hour, so there's no Windows performance counter to map to it. 2. It doesn't; the number of processors is already factored into the hourly price of that instance type.
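If it helps to see the arithmetic, here's a minimal sketch of how the billing works out. The rates and the 730-hour month are placeholders I made up for illustration, not Amazon's actual prices:

```python
# Rough cost sketch: instance-hours are just wall-clock hours the instance
# is running, multiplied by the hourly rate for that instance type.
# The rates below are hypothetical -- check the current EC2 price list.

HOURS_PER_MONTH = 730  # ~365 * 24 / 12, a common billing approximation

def monthly_cost(hourly_rate, instances=1, hours=HOURS_PER_MONTH):
    """Cost = instances * hours running * rate per instance-hour.
    CPU count and actual utilization don't enter the formula; they are
    already baked into the per-type hourly rate."""
    return instances * hours * hourly_rate

# One always-on instance at a hypothetical $0.10 per instance-hour
print(monthly_cost(0.10))               # 73.0  -> about $73/month
# Two instances of a larger (pricier) hypothetical type
print(monthly_cost(0.40, instances=2))  # 584.0 -> about $584/month
```

So for a cost comparison you only need to estimate how many instances of which type you'd run, and for how many hours per month, then multiply by the published rate for that type.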