Tags: google-cloud-platform, google-cloud-storage, google-cloud-logging

Google Cloud Storage - How to get the count of objects that are less than 100 days old


We are trying to move our GCS buckets from one storage class to another:

  • A few buckets from Standard class to Archive
  • A few buckets from Standard class to Coldline (files that are 50+ days old).

The storage is very large (90 TB), so we want a cost estimate for this move.

I read in the Google documentation that the charge for moving objects from one storage class to another is $0.10 per 1,000 objects.

From the Google Cloud console, I can get the total object count in a project. But I'm struggling to find the number of objects that are 50+ days old, so that we can estimate the cost of this move.

Is there any gcloud query to find that?


Solution

  • GCS object-count metrics can be grouped by bucket, but not by object prefix (e.g. "folder/folder"). Have a look at the Storage Insights inventory reports feature.

    The Storage Insights inventory report feature helps you manage your object storage at scale. It is a faster, scheduled alternative to the Objects: list API operation.

    Inventory reports contain metadata about your objects, such as each object's storage class, ETag, and content type. This information helps you analyze your storage costs, audit and validate your objects, and ensure data security and compliance. You can export inventory reports as comma-separated value (CSV) or Apache Parquet files so you can further analyze them using tools such as BigQuery.
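    If a one-off count is enough and waiting for a scheduled inventory report is not an option, you can also list the objects with the `google-cloud-storage` Python client and count those whose creation time is older than the cutoff. A minimal sketch (the bucket name is a placeholder, and listing 90 TB worth of objects may take a while and incur Class A operation charges):

    ```python
    from datetime import datetime, timedelta, timezone

    def count_older_than(objects, days):
        """Count (name, time_created) pairs created more than `days` days ago.

        `objects` is any iterable of (name, timezone-aware datetime) pairs,
        e.g. extracted from a blob listing.
        """
        cutoff = datetime.now(timezone.utc) - timedelta(days=days)
        return sum(1 for _name, created in objects if created < cutoff)

    def count_bucket_objects_older_than(bucket_name, days):
        """List a bucket and count objects older than `days` days.

        Requires `pip install google-cloud-storage` and application
        default credentials with read access to the bucket.
        """
        from google.cloud import storage  # imported lazily; assumed installed

        client = storage.Client()
        pairs = ((b.name, b.time_created) for b in client.list_blobs(bucket_name))
        return count_older_than(pairs, days)

    # Example (placeholder bucket name):
    # n = count_bucket_objects_older_than("your-bucket", days=50)
    # print(f"objects 50+ days old: {n}")
    ```

    Dividing the resulting count by 1,000 and multiplying by the per-1,000-object charge gives a rough cost estimate for the class change.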