python, pandas, group-by, unique, distinct

How to group by two columns in pandas where the combination of the two is unique


I have a dataframe with two columns where the combination of the two is unique, and I would like to group by them and show the result. For example:

import pandas as pd

d = {'label': ['A','A','B','B','B','C','E','E'],
     'value': [30,30,40,40,40,0,10,10]}
df = pd.DataFrame(d)
print(df)

results in

  label  value
0     A     30
1     A     30
2     B     40
3     B     40
4     B     40
5     C      0
6     E     10
7     E     10

Notice how all A's are 30, B's are 40, etc.
If I try to group by both columns:

grouped = df.groupby(['label','value'])
print(grouped)

results in <pandas.core.groupby.generic.DataFrameGroupBy object at 0x00000208BD8470D0>.
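
(As an aside, I gather the GroupBy object is lazy and only shows something once an aggregation is applied to it; for example, calling .size() on it does at least list each unique label/value pair with its count:)

print(df.groupby(['label', 'value']).size())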

I'd really like to just get rid of these duplicate lines in the dataframe.

My goal is to get the resulting dataframe:

  label  value
1     A     30
2     B     40
3     C      0
4     E     10

Solution

  • head can be used to get the first n rows from each group:

    import pandas as pd
    
    d = {'label': ['A', 'A', 'B', 'B', 'B', 'C', 'E', 'E'],
         'value': [30, 30, 40, 40, 40, 0, 10, 10]}
    df = pd.DataFrame(d)
    
    print(df.groupby(['label', 'value']).head(1))
    

    drop_duplicates can be used to remove duplicate rows from the DataFrame as a whole:

    print(df.drop_duplicates())
    

    Output For Both:

      label  value
    0     A     30
    2     B     40
    5     C      0
    6     E     10
    
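    If a fresh consecutive index is wanted in the result (as in the goal above), one option is to reset the index after dropping duplicates, e.g.:

    print(df.drop_duplicates().reset_index(drop=True))

    which should give:

      label  value
    0     A     30
    1     B     40
    2     C      0
    3     E     10
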

    Assuming there are other columns in the df, and duplicate 'label'/'value' pairs are the only thing that should be considered, the subset parameter can be used:

    import pandas as pd
    
    d = {'label': ['A', 'A', 'B', 'B', 'B', 'C', 'E', 'E'],
         'value': [30, 30, 40, 40, 40, 0, 10, 10],
         'third': [1, 2, 3, 4, 5, 6, 7, 8]}
    df = pd.DataFrame(d)
    
    print(df.drop_duplicates(subset=['label', 'value']))
    

    Output:

      label  value  third
    0     A     30      1
    2     B     40      3
    5     C      0      6
    6     E     10      7