
ECS Fargate Pricing


I'd like to make sure I'm interpreting AWS's ECS Fargate pricing model correctly when compared to an m4.large EC2 instance (2 vCPU, 8 GB memory) running non-stop (even if utilization drops to 1% CPU/memory) for an entire month (730 hrs).

# Monthly cost estimates

Fargate:
    cpu = 730hrs * 2vCPU * $0.0506 = $73.88
    mem = 730hrs * 8GB Mem * $0.0127 = $74.17
    total = $73.88 + $74.17 = $148.05

EKS EC2 node (1-yr reserved, no upfront):
    total = 730hrs * $0.062 = $45.26

EKS EC2 node (on-demand):
    total = 730hrs * $0.10 = $73.00

It appears Fargate would be roughly 3x the cost of a reserved EC2 instance (and about 2x on-demand). Does my Fargate pricing look accurate? I'm assuming Fargate isn't intended for something like a 24/7 website, but rather for one-off jobs, analogous perhaps to a Lambda function that runs a container image.

Am I correct that I'm billed for the entire Fargate task cpu & mem allocation, regardless if I'm utilizing 1% or 100% of the resources?
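The arithmetic above can be sketched as a small helper (a hypothetical function for illustration; the rates are the pre-2019 Fargate prices quoted in the question, and the key point is that the full allocation is billed regardless of utilization):

```python
def fargate_monthly(hours, vcpu, gb, cpu_rate, gb_rate):
    """Fargate bills the full task allocation per hour, at 1% or 100% utilization."""
    cpu = round(hours * vcpu * cpu_rate, 2)
    mem = round(hours * gb * gb_rate, 2)
    return cpu, mem, round(cpu + mem, 2)

# 2 vCPU / 8 GB task running 730 hrs at the rates quoted above
cpu, mem, total = fargate_monthly(730, 2, 8, cpu_rate=0.0506, gb_rate=0.0127)
print(cpu, mem, total)        # 73.88 74.17 148.05

# m4.large on-demand for comparison: flat $0.10/hr, utilization irrelevant
print(round(730 * 0.10, 2))   # 73.0
```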

Solution

  • Your calculations seem correct to me.

    I run a bunch of 24/7 websites as 0.25vCPU and 0.5GB RAM Fargate tasks just because of the ease of setting them up. They don't have lots of traffic and they are cached pretty heavily, but if they need to they can scale to 10x based on target CPU.

    Used that way I think they are pretty cost efficient.

    Update: AWS updated Fargate prices January 7, 2019. The prices now are $0.04048 per vCPU per hour and $0.004445 per GB memory per hour. Your example would now be:

    Fargate:
      cpu = 730hrs * 2vCPU * $0.04048 = $59.10
      mem = 730hrs * 8GB Mem * $0.004445 = $25.96
      total = $59.10 + $25.96 = $85.06
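Re-running the same arithmetic at the January 2019 rates (a quick sketch; the $0.10/hr on-demand m4.large figure comes from the question):

```python
HOURS = 730  # one month

# Fargate rates effective January 7, 2019, as quoted in the update above
cpu = round(HOURS * 2 * 0.04048, 2)    # vCPU charge
mem = round(HOURS * 8 * 0.004445, 2)   # memory charge
total = round(cpu + mem, 2)

# Compare against an on-demand m4.large at the question's $0.10/hr
ec2 = round(HOURS * 0.10, 2)
print(cpu, mem, total, ec2)  # 59.1 25.96 85.06 73.0
```

With the new rates the gap versus on-demand EC2 narrows to roughly 1.2x, down from about 2x before the price cut.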