When running a Jupyter notebook natively, it is simple to import functions and utilities from a saved .py script.
When I work in a Jupyter notebook running on a Google Cloud Platform Dataproc cluster and try the same thing (after having uploaded a .py script to my Dataproc Jupyter notebook's folder, so it is in the cloud***), I am unable to import the functions into the (Dataproc) notebook.
Does anyone know how I can do this? Does it just come down to figuring out the correct, but not obvious, path? (I am trying to import a .py file from the same folder as the Jupyter notebook, so if this were running natively it wouldn't require a path, but perhaps it is different with Dataproc.)
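To be concrete, the kind of import I am attempting looks like this (my_utils.py and helper are placeholder names standing in for my actual script and function):

```python
# my_utils.py sits in the same folder as the notebook.
from my_utils import helper

result = helper()
```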
*** I am not making the mistake of trying to import a desktop/native .py script into a GCP Dataproc notebook.
Any help or leads would be very much appreciated!
Unfortunately this is not supported. As a workaround, you can download the .py file onto the cluster and then import it; details can be found in the answer to a similar question:
Dataproc import python module stored in google cloud storage (gcs) bucket.
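As a minimal sketch of that workaround from a notebook cell: the bucket path gs://your-bucket/notebooks/jupyter/my_utils.py and the module name my_utils are hypothetical placeholders for wherever your script was actually uploaded, and gsutil comes preinstalled on Dataproc nodes.

```python
import subprocess
import sys

# Hypothetical GCS path -- replace with the actual location of your uploaded script.
GCS_PATH = "gs://your-bucket/notebooks/jupyter/my_utils.py"

# Copy the .py file from the bucket onto the cluster's local filesystem.
subprocess.check_call(["gsutil", "cp", GCS_PATH, "/tmp/my_utils.py"])

# Make the download location importable, then import as usual.
sys.path.insert(0, "/tmp")
import my_utils  # noqa: E402

# Functions defined in my_utils.py can now be called directly, e.g.:
# result = my_utils.some_function(...)
```

Equivalently, you can run the copy step as a shell command in a notebook cell (!gsutil cp ... /tmp/) and then do the sys.path/import part in Python.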