I have deployed a Python 3.7 function to Google Cloud. The following statements set up the logging for the function. If I execute the function from the command line, the recommender_logs.log file is stored in the current working directory.
import logging

logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(name)-12s %(levelname)-8s %(message)s',
                    datefmt='%m-%d %H:%M',
                    filename='recommender_logs.log',
                    filemode='w')
console = logging.StreamHandler()
console.setLevel(logging.DEBUG)
# set a format which is simpler for console use
formatter = logging.Formatter('%(name)-12s: %(levelname)-8s %(message)s')
# tell the handler to use this format
console.setFormatter(formatter)
# add the handler to the root logger
logging.getLogger('').addHandler(console)
When I execute the function from the Google Cloud console, where do I need to set up this file path so that the function's log file will be stored there? If I do not set up any path, where will the file be stored by default?
I can see the logs in the Logging console, but apart from that, is there anywhere else to see this file, or is a Cloud Storage bucket the only option?
Every function deployed in GCP runs in a separate container, and if it needs to store any data it can only do so in the /tmp directory of the container it's running in.
It doesn't matter whether you deploy it from your computer, Cloud Shell, or the API. Your function's file system is isolated from any other function, and even from other instances of itself.
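As a minimal sketch, the logging setup from the question could simply point its file handler at /tmp; the file name is the one from the question, everything else is unchanged:

import logging

# /tmp is the only writable location inside the function's container.
logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(name)-12s %(levelname)-8s %(message)s',
                    datefmt='%m-%d %H:%M',
                    filename='/tmp/recommender_logs.log',
                    filemode='w')

Keep in mind that /tmp is an in-memory file system, so anything written there counts against the function's memory allocation and disappears when the instance is recycled.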
If you want to store your file in a persistent way, you have to export it, for example to a GCP bucket. Have a look at this Stack answer - it may be useful to you.
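A hedged sketch of that export step, assuming the google-cloud-storage client library and a hypothetical bucket name (replace it with a bucket your function's service account can write to):

from google.cloud import storage

def upload_log_file():
    # Hypothetical bucket and object names, used here for illustration only.
    client = storage.Client()
    bucket = client.bucket('my-recommender-logs-bucket')
    blob = bucket.blob('recommender_logs.log')
    # Copy the log file written to /tmp into the bucket before the
    # function returns and the instance is recycled.
    blob.upload_from_filename('/tmp/recommender_logs.log')

Calling upload_log_file() at the end of the function body, after the logging handlers have flushed, keeps a copy of the file outside the container.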