
How to change Spark's default log4j profile


I'm running PySpark in the Spyder IDE, and these warnings come up every time:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/02/15 17:05:12 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/02/15 17:05:29 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped

I tried editing the file C:\spark\spark-3.2.1-bin-hadoop2.7\conf\log4j.properties.template to change the log level from WARN to ERROR, but it had no effect.


Solution

    1. Rename log4j.properties.template to log4j.properties — Spark only reads the template file's renamed copy, which is why editing the .template had no effect.
    2. Make sure log4j.properties is on the classpath or under $SPARK_HOME/conf/ (here, C:\spark\spark-3.2.1-bin-hadoop2.7\conf\).
    3. In log4j.properties, set the root logger level to ERROR.
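For step 3, a minimal sketch of what the renamed log4j.properties might contain, based on the structure of the template Spark ships (Spark 3.2.x still uses the log4j 1.x properties format; the appender names below follow the template's conventions):

```properties
# Log everything to the console at ERROR level instead of WARN
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Optionally silence the specific classes from the warnings above
log4j.logger.org.apache.spark.executor.ProcfsMetricsGetter=ERROR
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
```

Alternatively, as the startup message itself suggests, the level can be changed at runtime with `sc.setLogLevel("ERROR")`, though that only takes effect after the SparkContext is created, so the early startup warnings will still appear.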