Tags: python-2.7, scikit-learn, gis, image-segmentation

How to calculate dice coefficient for measuring accuracy of image segmentation in python


I have an image of land cover, and I segmented it using k-means clustering. Now I want to measure the accuracy of my segmentation algorithm. I read somewhere that the Dice coefficient is the substantive evaluation measure, but I am not sure how to calculate it. I use Python 2.7. Are there any other effective evaluation methods? Please give a summary or a link to a source. Thank you!

Edit: I used the following code to measure the Dice similarity between my original and segmented images, but it seems to take hours to compute:

for i in xrange(0, 7672320):
  for j in xrange(0, 3):
    # seg is the segmented image and gt is the original image; both are the same size
    dice = np.sum([seg == gt]) * 2.0 / (np.sum(seg) + np.sum(gt))
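Two things stand out in the code above. First, the loops are unnecessary: the NumPy expression is already vectorized over the whole image, so it computes the same value roughly 23 million times. Second, `np.sum([seg == gt])` counts every pixel where the two arrays agree, including background pixels where both are 0, which is not the Dice intersection. A minimal sketch with hypothetical binary masks standing in for the real `seg` and `gt` arrays (the shapes here are made up for illustration):

```python
import numpy as np

# Hypothetical 0/1 masks standing in for the real seg and gt arrays
seg = np.zeros((2160, 3552), dtype=int)
seg[500:1500, 500:1500] = 1
gt = np.zeros((2160, 3552), dtype=int)
gt[600:1600, 600:1600] = 1

# Dice = 2 * |intersection| / (|seg| + |gt|), computed once with no Python loops;
# seg[gt == 1] selects seg values over the ground-truth foreground, so its sum
# is the intersection count for 0/1 masks
dice = np.sum(seg[gt == 1]) * 2.0 / (np.sum(seg) + np.sum(gt))
print(dice)  # -> 0.81
```

This runs in milliseconds on an image of that size, because all the work happens inside NumPy.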

Solution

  • Please refer to the Dice similarity coefficient article on Wikipedia.

    Here is a sample code segment for your reference. Note that you need to replace k with your desired cluster label, since you are using k-means.

    import numpy as np
    
    k=1
    
    # segmentation
    seg = np.zeros((100,100), dtype='int')
    seg[30:70, 30:70] = k
    
    # ground truth
    gt = np.zeros((100,100), dtype='int')
    gt[30:70, 40:80] = k
    
    # Dice = 2 * |intersection| / (|seg| + |gt|)
    dice = np.sum(seg[gt==k])*2.0 / (np.sum(seg) + np.sum(gt))
    
    print 'Dice similarity score is {}'.format(dice)
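    Since k-means produces several clusters, you may want one Dice score per label rather than a single k. A sketch of a per-label helper (the function name and the choice to treat 0 as background are assumptions, not part of the original answer):

    ```python
    import numpy as np

    def dice_per_label(seg, gt):
        """Dice score for every non-zero label found in either array.
        A sketch: treats label 0 as background and skips it."""
        scores = {}
        for k in np.union1d(np.unique(seg), np.unique(gt)):
            if k == 0:
                continue
            seg_k = (seg == k)
            gt_k = (gt == k)
            denom = seg_k.sum() + gt_k.sum()
            inter = np.logical_and(seg_k, gt_k).sum()
            # 2 * |intersection| / (|seg_k| + |gt_k|); NaN if the label is absent
            scores[int(k)] = 2.0 * inter / denom if denom else float('nan')
        return scores

    # Same toy arrays as in the answer above
    seg = np.zeros((100, 100), dtype=int)
    seg[30:70, 30:70] = 1
    gt = np.zeros((100, 100), dtype=int)
    gt[30:70, 40:80] = 1
    scores = dice_per_label(seg, gt)
    print(scores)  # {1: 0.75}
    ```

    As for other evaluation methods: the Jaccard index (intersection over union) is closely related to Dice, with J = D / (2 - D), so either can be derived from the other.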