I am using Python 2.7 and Spark 2.2.0. I have created a data frame in PySpark which has a string column type and contains URLs.
df = spark.createDataFrame([('example.com?title=%D0%BF%D1%80%D0%B0%D0%B2%D0%BE%D0%B2%D0%B0%D1%8F+%D0%B7%D0%B0%D1%89%D0%B8%D1%82%D0%B0',)], ['url'])
df.show(1, False)
+-------------------------------------------------------------------------------------------------------+
|url |
+-------------------------------------------------------------------------------------------------------+
|example.com?title=%D0%BF%D1%80%D0%B0%D0%B2%D0%BE%D0%B2%D0%B0%D1%8F+%D0%B7%D0%B0%D1%89%D0%B8%D1%82%D0%B0|
+-------------------------------------------------------------------------------------------------------+
To decode all the URLs in the column I tried to use urllib. I created a udf and I'm using it like this:
import urllib
from pyspark.sql.types import StringType
from pyspark.sql.functions import udf
decode_url = udf(lambda val: urllib.unquote(val).decode('utf8'), StringType())
After applying the udf over my column I was expecting this:
+---------------------------------+
|url |
+---------------------------------+
|example.com?title=правовая+защита|
+---------------------------------+
But I got an error:
UnicodeEncodeError: 'ascii' codec can't encode characters in position 18-33: ordinal not in range(128)
at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:193)
at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:234)
at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:152)
at org.apache.spark.sql.execution.python.BatchEvalPythonExec$$anonfun$doExecute$1.apply(BatchEvalPythonExec.scala:144)
at org.apache.spark.sql.execution.python.BatchEvalPythonExec$$anonfun$doExecute$1.apply(BatchEvalPythonExec.scala:87)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
If I take a URL out of the column and decode it separately, it works fine:
import urllib
url='example.com?title=%D0%BF%D1%80%D0%B0%D0%B2%D0%BE%D0%B2%D0%B0%D1%8F+%D0%B7%D0%B0%D1%89%D0%B8%D1%82%D0%B0'
print urllib.unquote(url).decode('utf8')
example.com?title=правовая+защита
It seems like under the hood there is some strange encoding going on.

Why don't you explicitly encode it yourself? In Python 2, Spark hands the column value to the UDF as a unicode object, not a byte string; urllib.unquote then returns unicode, and calling .decode('utf8') on a unicode value makes Python implicitly encode it with the ascii codec first, which is exactly the UnicodeEncodeError you see.
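You can reproduce this outside Spark by unquoting a unicode literal instead of a byte string (a minimal sketch of what happens inside the UDF):

>>> import urllib
>>> u = u'example.com?title=%D0%BF%D1%80%D0%B0%D0%B2%D0%BE%D0%B2%D0%B0%D1%8F+%D0%B7%D0%B0%D1%89%D0%B8%D1%82%D0%B0'
>>> urllib.unquote(u).decode('utf8')  # implicit ascii encode of the unicode result -> UnicodeEncodeError

Encoding the value to UTF-8 bytes first keeps unquote working on a byte string, so the final decode succeeds: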
>>> decode_udf = udf(lambda val: urllib.unquote(val.encode('utf-8')).decode('utf-8'), StringType())
>>> df.withColumn('decoded_url', decode_udf('url')).show(truncate=False)
+-------------------------------------------------------------------------------------------------------+---------------------------------+
|url |decoded_url |
+-------------------------------------------------------------------------------------------------------+---------------------------------+
|example.com?title=%D0%BF%D1%80%D0%B0%D0%B2%D0%BE%D0%B2%D0%B0%D1%8F+%D0%B7%D0%B0%D1%89%D0%B8%D1%82%D0%B0|example.com?title=правовая+защита|
+-------------------------------------------------------------------------------------------------------+---------------------------------+
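If you would rather keep the decoding on the JVM side and skip the Python round trip entirely, a sketch using the reflect SQL function to call java.net.URLDecoder should also work (assuming reflect is available in your Spark build); note that URLDecoder turns + into a space, unlike urllib.unquote:

>>> from pyspark.sql.functions import expr
>>> df.withColumn('decoded_url', expr("reflect('java.net.URLDecoder', 'decode', url, 'UTF-8')")).show(truncate=False)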