Tags: python, pyspark, pyscripter

How to run a pyspark script without showing the running process in the window


I have a pyspark file test.py on a server, and I want to run this file with an argument (teo):

spark-submit --driver-memory=32g --executor-memory=32g test.py teo 2>&1 test.logs

When I run it, it prints the whole process to the window, but I do not want that. Instead, I want to store all the running output in a .logs file.


Solution

  • Please try the following:

    spark-submit --driver-memory=32g --executor-memory=32g test.py teo >> test.logs 2>&1 
    

    This redirects both stdout and stderr to test.logs: `>>` appends the output to the file, and `2>&1` sends stderr to the same place as stdout. (Your original command was missing the `>>` redirect, so the output still went to the terminal.)
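You can verify the same redirection pattern with any command; in this sketch a shell one-liner stands in for spark-submit, and the `demo.logs` file name is just illustrative:

```shell
# Stand-in for spark-submit: writes one line to stdout and one to stderr.
{ echo "job started"; echo "WARN: low memory" >&2; } >> demo.logs 2>&1

# Both streams land in demo.logs; nothing appears in the terminal.
cat demo.logs
```

Note that `>>` appends on each run, so the log grows across repeated submissions; use a single `>` instead if you want each run to overwrite the previous log.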