I have two images as numpy arrays, each 180x180 with R,G,B values, a total of 97200 individual values. I traverse each pixel and each R,G,B channel, calculate the difference between the corresponding values of the two images, and sum the differences into an integer. It takes approximately 5 seconds. How can I speed up this procedure?
Using numpy you can do it directly:

result = (array1 - array2).sum()
You can also sum along only one axis:

result = (array1 - array2).sum(axis=0)
result = (array1 - array2).sum(axis=1)
result = (array1 - array2).sum(axis=2)
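A small sketch (with hypothetical zero/one arrays of the shape from the question, height x width x channels) of what the axis sums return: summing over one axis keeps the other two, so each result is a 2D array rather than a single number:

```python
import numpy as np

# hypothetical arrays with the question's shape: 180 x 180 x 3
array1 = np.zeros((180, 180, 3), dtype=int)
array2 = np.ones((180, 180, 3), dtype=int)
diff = array1 - array2

print(diff.sum())              # one number: sums all 97200 values
print(diff.sum(axis=0).shape)  # (180, 3): summed over rows
print(diff.sum(axis=1).shape)  # (180, 3): summed over columns
print(diff.sum(axis=2).shape)  # (180, 180): summed over the R,G,B channels
```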
On my old computer, for an 800x600 image, it takes about 0.003 seconds.
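For comparison, a minimal sketch (using made-up random data in place of the original images) of the pixel-by-pixel Python loop from the question versus the vectorized version — both compute the same sum, but the loop is orders of magnitude slower:

```python
import numpy as np

# hypothetical test data standing in for the two 180x180 RGB images
h, w = 180, 180
rng = np.random.default_rng(0)
img1 = rng.integers(0, 256, size=(h, w, 3)).astype(int)
img2 = rng.integers(0, 256, size=(h, w, 3)).astype(int)

# slow: explicit Python loop over every pixel and channel
total_loop = 0
for y in range(h):
    for x in range(w):
        for c in range(3):
            total_loop += img1[y, x, c] - img2[y, x, c]

# fast: one vectorized numpy expression
total_numpy = (img1 - img2).sum()

print(total_loop, total_numpy)  # same value
```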
Example with cv2, which loads an image as a numpy array:
import cv2
import time
img1 = cv2.imread('image1.jpg')
img2 = cv2.imread('image2.jpg')
print('shape:', img1.shape)
start = time.time()
result = (img1 - img2).sum()
end = time.time()
print('result:', result)
print(' time:', end-start)
EDIT: a numpy array with an image may use the data type uint8, which can hold only the values 0..255, so the subtraction 1-2 wraps around and gives 255 instead of -1. You may convert the data to int to get negative values (-1 instead of 255), and then use abs() or **2 to convert the negative values to positive and get a correct sum - like in mean squared error.
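The wraparound itself can be seen with toy uint8 values (a sketch, not the images from the question):

```python
import numpy as np

a = np.array([1], dtype=np.uint8)
b = np.array([2], dtype=np.uint8)
print(a - b)                          # [255] -- wrapped around, not -1
print(a.astype(int) - b.astype(int))  # [-1]  -- correct after converting to int
```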
print(img1.dtype, img2.dtype)
img1 = img1.astype(int)
img2 = img2.astype(int)
diff = img1 - img2
print( diff.sum() )
print( (diff**2).sum() )     # sum of squared differences
print( np.abs(diff).sum() )  # sum of absolute differences (needs `import numpy as np`)
All these calculations are still fast.
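As an alternative sketch (not part of the answer above): you can get the absolute difference while staying in uint8, without converting to int, by always subtracting the smaller value from the larger one. `np.maximum(a, b) - np.minimum(a, b)` never goes negative, so it cannot wrap around, and `.sum()` accumulates in a wider integer dtype by itself:

```python
import numpy as np

# hypothetical random "images" in uint8
rng = np.random.default_rng(1)
a = rng.integers(0, 256, size=(180, 180, 3), dtype=np.uint8)
b = rng.integers(0, 256, size=(180, 180, 3), dtype=np.uint8)

# elementwise |a - b| without leaving uint8: larger minus smaller
abs_diff = np.maximum(a, b) - np.minimum(a, b)
result = abs_diff.sum()  # .sum() uses a wider accumulator, so no overflow here

# same value as converting to int first
check = np.abs(a.astype(int) - b.astype(int)).sum()
print(result, check)
```

OpenCV also provides cv2.absdiff(img1, img2), which does the same saturating absolute difference directly on uint8 images.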