I am currently trying to investigate some strange activity on one of our GCP Cloud Functions, and one thing that would really help would be to know the number of log messages produced by that cloud function in a given period. This would enable us to compare day by day and see whether there is any significant difference.
Note that the period in question is in the past; we have not been capturing logs to BigQuery, and we do not have any logs-based metrics in place.
Is there any way in which I can count the number of log messages output by a cloud function in a given period?
My immediate thought is to sink the Cloud Logging logs in question to BigQuery and then run queries against BQ. However, if the logs were written in the past, you can't sink those historical logs directly into BQ: a sink only routes log entries that arrive after it has been created. What you can do instead is run a Cloud Logging filter that returns the superset of logs you are interested in and have the matching entries written to a GCS bucket object. From there, you can process them with Pandas in a Jupyter notebook, or manually ingest them into a BQ table and then run your queries, as sketched below.
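Once you have the export in GCS, counting per day in Pandas is straightforward. Here is a minimal sketch, assuming the entries were exported as newline-delimited JSON and that the bucket and object names (placeholders here) are yours; reading a `gs://` path from Pandas also assumes the `gcsfs` package is installed:

```python
import pandas as pd

# Hypothetical path to the exported log file. Cloud Logging exports
# log entries as newline-delimited JSON, one entry per line.
EXPORT_PATH = "gs://my-log-export-bucket/my-function-logs.json"

# Requires gcsfs for gs:// paths; alternatively, download the object
# locally first and point read_json at the local file.
df = pd.read_json(EXPORT_PATH, lines=True)

# Each exported entry carries an RFC 3339 timestamp field.
df["timestamp"] = pd.to_datetime(df["timestamp"])

# Tally entries per day so periods can be compared day by day.
daily_counts = df.set_index("timestamp").resample("D").size()
print(daily_counts)
```

If you go the BQ route instead, the count is a simple aggregate. This sketch assumes a hypothetical dataset and table into which you have ingested the entries:

```python
from google.cloud import bigquery

client = bigquery.Client()

# `my-project.my_dataset.function_logs` is a placeholder for wherever
# you loaded the exported entries.
query = """
    SELECT DATE(timestamp) AS day, COUNT(*) AS entries
    FROM `my-project.my_dataset.function_logs`
    GROUP BY day
    ORDER BY day
"""
for row in client.query(query).result():
    print(row.day, row.entries)
```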
At the highest level, start by looking at your cloud logs and validating that they contain the information you need. Then make a plan for how you would query that data to generate the answers you want.
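For that first validation step you don't need an export at all; the Logging API can read recent entries directly. Here is a minimal sketch using the google-cloud-logging client, with placeholder project and function names (note that first-generation Cloud Functions log under the `cloud_function` resource type, while second-generation functions log under `cloud_run_revision`):

```python
from google.cloud import logging

client = logging.Client(project="my-project")  # placeholder project id

# Standard Cloud Logging query-language filter; the function name is a
# placeholder, and this resource type matches 1st-gen functions.
FILTER = (
    'resource.type="cloud_function" '
    'AND resource.labels.function_name="my-function"'
)

# Peek at a few recent entries to confirm they carry the fields
# (timestamp, severity, payload) your analysis will need.
for entry in client.list_entries(filter_=FILTER, max_results=5):
    print(entry.timestamp, entry.severity, entry.payload)
```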