
I want to select all records from one DataFrame whose value exists (or does not exist) in another DataFrame. How can I do this with PySpark DataFrames?


I have two PySpark DataFrames. I want to select all records from voutdf whose "hash" does not exist in vindf.tx_hash.

How can I do this with PySpark DataFrames? I tried a semi join but I end up with out-of-memory errors.

voutdf = sqlContext.createDataFrame(voutRDD,["hash", "value","n","pubkey"])

vindf = sqlContext.createDataFrame(vinRDD,["txid", "tx_hash","vout"])

Solution

  • You can do it with left-anti join:

    df = voutdf.join(vindf.withColumnRenamed("tx_hash", "hash"), "hash", 'left_anti')
    

    Left anti join:

    It returns all rows from the left dataset that have no matching row in the right dataset.
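    For reference, here is a minimal self-contained sketch of the same pattern. The sample rows and the SparkSession setup are assumptions for illustration (the question uses sqlContext, which works the same way here); only the "hash" column is taken from vindf before joining, so no extra columns are carried along:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Assumed sample data mirroring the question's schemas
    voutdf = spark.createDataFrame(
        [("h1", 1.0, 0, "pk1"), ("h2", 2.0, 1, "pk2"), ("h3", 3.0, 0, "pk3")],
        ["hash", "value", "n", "pubkey"])
    vindf = spark.createDataFrame(
        [("t1", "h2", 0), ("t2", "h3", 1)],
        ["txid", "tx_hash", "vout"])

    # Rename tx_hash so both sides share the join column name,
    # then keep only voutdf rows with no matching hash in vindf
    result = voutdf.join(
        vindf.withColumnRenamed("tx_hash", "hash").select("hash"),
        on="hash", how="left_anti")

    result.show()   # only the "h1" row remains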