Tags: python-3.x, configuration, pyspark, google-cloud-platform, google-cloud-dataproc

How to run Python 3 on Google's Dataproc PySpark


I want to run a PySpark job through Google Cloud Platform Dataproc, but I can't figure out how to set up PySpark to run Python 3 instead of the default 2.7.

The best I've been able to find is adding initialization commands (the init.sh shown below).

However, when I SSH into the cluster,
(a) the python command still runs Python 2, and
(b) my job fails due to a Python 2 incompatibility.
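
To double-check, a tiny job like this prints which interpreter the driver and executors actually use (a minimal sketch; submit it with spark-submit on the cluster):

import sys
from pyspark import SparkContext

sc = SparkContext()
print("driver:  ", sys.version)
# ask one executor which interpreter it runs
print("executor:", sc.parallelize([0])
                     .map(lambda _: __import__("sys").version)
                     .first())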

I've tried uninstalling Python 2 and also adding alias python='python3' in my init.sh script, but with no success; the alias doesn't stick, presumably because aliases only affect interactive shells, not the interpreter Spark launches.

I create the cluster like this:

cluster_config = {
    "projectId": self.project_id,
    "clusterName": cluster_name,
    "config": {
        "gceClusterConfig": gce_cluster_config,
        "masterConfig": master_config,
        "workerConfig": worker_config,
        # a single list of actions (not a nested list)
        "initializationActions": [{
            "executableFile": executable_file_uri,
            "executionTimeout": execution_timeout,
        }],
    },
}
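
For reference, the other config dicts follow the Dataproc v1 REST schema; the values below are placeholders rather than my real setup:

gce_cluster_config = {
    "zoneUri": "us-central1-a",  # placeholder zone
}
master_config = {
    "numInstances": 1,
    "machineTypeUri": "n1-standard-4",  # placeholder machine type
}
worker_config = {
    "numInstances": 2,
    "machineTypeUri": "n1-standard-4",
}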

# imports needed for the API client
from oauth2client.client import GoogleCredentials
from googleapiclient.discovery import build

credentials = GoogleCredentials.get_application_default()
api = build('dataproc', 'v1', credentials=credentials)

response = api.projects().regions().clusters().create(
    projectId=self.project_id,
    region=self.region,
    body=cluster_config,
).execute()
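
For completeness, submitting the PySpark job afterwards goes through the same client, roughly like this (the GCS path is a placeholder):

job_body = {
    "job": {
        "placement": {"clusterName": cluster_name},
        "pysparkJob": {"mainPythonFileUri": "gs://my-bucket/jobs/main.py"},
    }
}
response = api.projects().regions().jobs().submit(
    projectId=self.project_id,
    region=self.region,
    body=job_body,
).execute()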

My executable_file_uri sits on Google Cloud Storage. Here is init.sh:

#!/bin/bash

# install dev headers and pip, then common dependencies
apt-get -y update
apt-get install -y python-dev
wget -O /root/get-pip.py https://bootstrap.pypa.io/get-pip.py
python /root/get-pip.py
apt-get install -y python-pip
pip install --upgrade pip
pip install --upgrade six
pip install --upgrade gcloud
pip install --upgrade requests
pip install numpy

Solution

  • I found an answer to this here; my initialization script now looks like this:

    #!/bin/bash
    
    # Install tools
    apt-get -y install python3 python-dev build-essential python3-pip
    easy_install3 -U pip
    
    # Install requirements
    pip3 install --upgrade google-cloud==0.27.0
    pip3 install --upgrade google-api-python-client==1.6.2
    pip3 install --upgrade pytz==2013.7
    
    # Setup python3 for Dataproc
    echo "export PYSPARK_PYTHON=python3" | tee -a  /etc/profile.d/spark_config.sh  /etc/*bashrc /usr/lib/spark/conf/spark-env.sh
    echo "export PYTHONHASHSEED=0" | tee -a /etc/profile.d/spark_config.sh /etc/*bashrc /usr/lib/spark/conf/spark-env.sh
    echo "spark.executorEnv.PYTHONHASHSEED=0" >> /etc/spark/conf/spark-defaults.conf