Tags: python, pandas, dataframe, pyspark, spark-koalas

TypeError: 'module' object is not callable for time on Koalas dataframe


I am facing a small issue with a line of code that I am converting from pandas to Koalas.

Note: I am executing my code in Databricks.

The following line is pandas code:

input_data['t_avail'] = np.where(input_data['purchase_time'] != time(0, 0), 1, 0)

I did the conversion to Koalas as follows. Note that I have already defined the input_data dataframe as a Koalas dataframe before the following line of code.

# Add a new column called 't_avail' to the input_data Koalas dataframe
input_data = input_data.assign(
    t_avail=(input_data['purchase_time'] != time(0, 0))
)

I get the following error with the Koalas conversion: TypeError: 'module' object is not callable

I am not sure what the issue is with the time module, since all I want is to set the t_avail column based on whether each entry in the purchase_time column has a non-empty time.

Could someone help me resolve the issue? I think I am missing something silly.

Thank you to all.


Solution

  • As you said, you imported the time module in your code.

    That is the cause of the error: when you write time(0, 0), time refers to the module, and you are calling it as if it were a function (the class you actually want is datetime.time).

    As a workaround, you can treat purchase_time as a string and check that it is not empty (an import-side alternative is sketched after the snippet):

     input_data = input_data.assign(
         t_avail=(input_data['purchase_time'].str.strip() != "")
     )
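
    Alternatively, if you actually want the original datetime.time comparison, fixing the import is enough. Below is a minimal sketch, assuming purchase_time holds values that can be compared to datetime.time (input_data and the column name are taken from the question; the .astype(int) cast mirrors the 1/0 output of the original np.where call):

     # Import the time *class* from datetime, not the time module
     from datetime import time

     midnight = time(0, 0)   # no TypeError: datetime.time is callable

     # Keep the original comparison and cast the boolean result to 0/1
     input_data = input_data.assign(
         t_avail=(input_data['purchase_time'] != midnight).astype(int)
     )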