Generally, Python doesn't parallelize well with multithreading because of the Global Interpreter Lock (GIL). Does this also affect PySpark applications running in multi-threaded local mode (local[n])?
No. Parallelism in PySpark is achieved by daemon.py calling os.fork() (on Unix-like systems)
to create a separate Python worker process for each task, so each worker has its own
interpreter and its own GIL, and task execution isn't serialized by it. Only the driver's
own Python code runs in a single process and is subject to the usual GIL constraints.
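To illustrate the underlying principle without needing a Spark installation, here is a minimal sketch using the standard-library `multiprocessing` module: like daemon.py's forked workers, each pool worker is a separate process with its own interpreter and GIL, so CPU-bound work runs in parallel rather than serializing on one lock. (This uses `multiprocessing.Pool` for illustration; it is not PySpark's actual worker mechanism.)

```python
import multiprocessing as mp
import os

def cpu_bound(n: int) -> tuple[int, int]:
    # Pure-Python CPU work; in threads this would serialize on the GIL,
    # but each pool worker is a separate process with its own GIL.
    total = 0
    for i in range(n):
        total += i * i
    return os.getpid(), total

if __name__ == "__main__":
    with mp.Pool(processes=4) as pool:
        results = pool.map(cpu_bound, [200_000] * 4)
    pids = {pid for pid, _ in results}
    # Up to 4 distinct PIDs, confirming the work ran in separate processes.
    print(f"distinct worker PIDs: {len(pids)}")
```

In PySpark's local[n] mode the JVM schedules n tasks on threads, but each task's Python code still executes in one of these forked worker processes, which is why the GIL doesn't become a bottleneck there.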