I have some Python code in a Jupyter notebook and I need to run it automatically every day, so I would like to know if there is a way to set this up. I really appreciate any advice on this.
It's better to combine your notebook with Airflow if you want a more robust setup. I packaged them together in a Docker image: https://github.com/michaelchanwahyan/datalab.
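For the scheduling part, here is a minimal sketch of a daily Airflow DAG. It assumes Airflow 2.x and simply shells out to `jupyter nbconvert`; the DAG id, schedule, and notebook path are placeholders you would adapt:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator  # Airflow 2.x import path

with DAG(
    dag_id="daily_notebook_run",      # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",       # run once per day
    catchup=False,
) as dag:
    # Execute the notebook and save an executed copy named after the run date.
    # /path/to/notebook.ipynb is a placeholder; {{ ds }} is Airflow's run date.
    run_notebook = BashOperator(
        task_id="execute_notebook",
        bash_command=(
            "jupyter nbconvert --to notebook --execute "
            "--output executed_{{ ds }} /path/to/notebook.ipynb"
        ),
    )
```

If you don't need Airflow's monitoring and retries, a plain cron entry running the same nbconvert command once a day also works.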
It works by modifying the open-source package nbparameterise so that arguments such as execution_date can be passed into the notebook. When the notebook is executed, graphs can be generated on the fly, and the output is updated and saved inside the notebook itself.
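Roughly, the parameter injection looks like the sketch below. The exact nbparameterise API may differ between versions, and the file names and the 2024-01-01 date are placeholders; it assumes execution_date is defined in the notebook's first code cell:

```python
import nbformat
from nbconvert.preprocessors import ExecutePreprocessor
from nbparameterise import extract_parameters, parameter_values, replace_definitions

# Load the notebook; "input.ipynb" is a placeholder path.
with open("input.ipynb") as f:
    nb = nbformat.read(f, as_version=4)

# Read the parameter definitions from the first code cell and override execution_date.
orig_params = extract_parameters(nb)
new_params = parameter_values(orig_params, execution_date="2024-01-01")
nb = replace_definitions(nb, new_params)

# Execute the parameterised notebook so graphs and outputs are regenerated.
ep = ExecutePreprocessor(timeout=600, kernel_name="python3")
ep.preprocess(nb, {"metadata": {"path": "."}})

# Save the executed notebook with its outputs.
with open("output_2024-01-01.ipynb", "w") as f:
    nbformat.write(nb, f)
```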
In addition, the image installs and configures common tools such as Spark, Keras, TensorFlow, etc.