Tags: python, pandas, datetime, group-by, mean-square-error

pandas apply function to data grouped by day


I have a dataset that looks like this:

date,value1,value2
2016-01-01 00:00:00,3,0
2016-01-01 01:00:00,0,0
2016-01-01 02:00:00,0,0
2016-01-01 03:00:00,0,0
2016-01-01 04:00:00,0,0
2016-01-01 05:00:00,0,0
2016-01-01 06:00:00,0,0
2016-01-01 07:00:00,0,2
2016-01-01 08:00:00,3,11
2016-01-01 09:00:00,14,14
2016-01-01 10:00:00,12,13
2016-01-01 11:00:00,11,13
2016-01-01 12:00:00,11,9
2016-01-01 13:00:00,17,21
2016-01-01 14:00:00,9,22
2016-01-01 15:00:00,10,9
2016-01-01 16:00:00,11,9
2016-01-01 17:00:00,8,8
2016-01-01 18:00:00,4,2
2016-01-01 19:00:00,5,7
2016-01-01 20:00:00,5,5
2016-01-01 21:00:00,3,4
2016-01-01 22:00:00,2,4
2016-01-01 23:00:00,2,4
2016-01-02 00:00:00,0,0
2016-01-02 01:00:00,0,0
2016-01-02 02:00:00,0,0
2016-01-02 03:00:00,0,0
2016-01-02 04:00:00,0,0
2016-01-02 05:00:00,0,0
2016-01-02 06:00:00,1,0
2016-01-02 07:00:00,0,0
2016-01-02 08:00:00,0,0
2016-01-02 09:00:00,0,0
2016-01-02 10:00:00,0,0
2016-01-02 11:00:00,0,0
2016-01-02 12:00:00,0,0
2016-01-02 13:00:00,1,0
2016-01-02 14:00:00,0,0
2016-01-02 15:00:00,0,0
2016-01-02 16:00:00,0,0
2016-01-02 17:00:00,0,0
2016-01-02 18:00:00,0,0
2016-01-02 19:00:00,0,0
2016-01-02 20:00:00,1,0
2016-01-02 21:00:00,0,0
2016-01-02 22:00:00,0,0
2016-01-02 23:00:00,0,0

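I load the CSV so that date becomes a DatetimeIndex, roughly like this (data.csv is just a placeholder filename):

import pandas as pd

# parse the timestamps and use them as the index so the rows can be
# grouped by day via df.index
df = pd.read_csv('data.csv', parse_dates=['date'], index_col='date')
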
What I want to do is calculate the RMSE between value1 and value2 per day. So basically, I want to run the function 31 times (once per day), with the 24 hourly entries of that day (one per hour) as the input. I tried using

rmse(df.groupby([df.index.day]).mean().value1, 
    df.groupby([df.index.day]).mean().value2)

but it gave me a single value, and what I want is a list with the rmse of each day, such as

daily_rmse = [rmse01_01, rmse01_02, ..., rmse01_31]

Solution

  • Use sklearn's mean_squared_error. Your original call first reduces each day to its mean and then computes one RMSE across those daily means, which is why it returns a single number; grouping by the calendar date and applying the metric within each group gives one value per day:

    from sklearn.metrics import mean_squared_error
    
    # group the rows by calendar date, then take the square root of the
    # MSE within each group to get one RMSE value per day
    df.groupby(df.date.dt.date).apply(
        lambda x: mean_squared_error(x.value1, x.value2) ** .5)
    
    date
    2016-01-01    3.494043
    2016-01-02    0.377964
    dtype: float64
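
  • Note that df.date.dt.date assumes date is still a regular column. If date has been set as the DatetimeIndex (as the df.index.day in the question suggests), a rough equivalent without sklearn would be:

    import numpy as np
    
    # group by calendar date and take the square root of the mean
    # squared difference between the two columns
    daily_rmse = df.groupby(df.index.date).apply(
        lambda x: np.sqrt(((x.value1 - x.value2) ** 2).mean()))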