Tags: google-cloud-dataflow, google-cloud-ml, google-cloud-ml-engine

Dataflow Error when running flowers-sample preprocessing


I am trying to use the preprocessing script contained in the flowers-sample (I saw that it was modified today and is no longer deprecated). However, after installing the required packages, the pipeline fails with these error logs:

(caeb3b0a930d0a6): Workflow failed. Causes: (caeb3b0a930d587): S01:Save to disk/Write/WriteImpl/InitializeWrite failed.

and

(d50acb0dd46c44c6): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 666, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 411, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 230, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
    module = unpickler.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1090, in load_global
    klass = self.find_class(module, name)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 423, in find_class
    return StockUnpickler.find_class(self, module, name)
  File "/usr/lib/python2.7/pickle.py", line 1124, in find_class
    __import__(module)
ImportError: No module named util

I get the same errors when running the process from two different Google Compute Engine instances on which I have installed the packages listed in requirements.txt.

Does the error refer to the util.py file in the trainer directory, or are there additional packages I should install to avoid it?


Solution

  • I have found a workaround: in preprocess.py I replaced the import of the util package with the definition of get_cloud_project() taken from util.py (a sketch of the change follows this answer).

    I don't know whether the issue is caused by using a local package on Dataflow. I don't think that is the case, because get_cloud_project() is not called inside the pipeline definition, but this is the first time I have used Dataflow.

    If someone else knows whether it is possible to make the code work without modifying it, please tell me! (One possible approach, staging the local module with the job, is sketched below as well.)
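
    Below is a minimal sketch of the workaround, assuming get_cloud_project() shells out to the gcloud CLI to read the active project; the exact body should be copied verbatim from util.py in your checkout:

        # preprocess.py -- inline the helper instead of importing the
        # local module, which is not available on the Dataflow workers.
        # import util  # fails on workers with "No module named util"

        import subprocess


        def get_cloud_project():
            # Sketch only: copy the real implementation from util.py.
            # Assumed here: read the active project via the gcloud CLI.
            cmd = ['gcloud', 'config', 'list', 'project',
                   '--format=value(core.project)']
            return subprocess.check_output(cmd).strip()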
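
    For reference, Beam/Dataflow also supports staging local dependencies with the job, which may make the unmodified import work on the workers; a sketch, assuming util.py sits next to preprocess.py (the package name here is illustrative):

        # setup.py, placed next to preprocess.py -- stages util.py to
        # the Dataflow workers so "import util" also resolves there.
        import setuptools

        setuptools.setup(
            name='flowers-preprocessing',  # illustrative name
            version='0.1',
            py_modules=['util'],
        )

    The job would then be launched with the --setup_file pipeline option (e.g. python preprocess.py ... --setup_file ./setup.py) so each worker installs the module before unpickling the main session.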