
MATLAB: will two different cameras give me different results?


The following code reads a photograph of a grape (the filename is stored in 'full_img') and calculates the area of the grape:

RGB = imread(full_img);
GRAY = rgb2gray(RGB);

threshold = graythresh(GRAY);
originalImage = im2bw(GRAY, threshold);

originalImage = bwareaopen(originalImage,250);
SE = strel('disk',10);
IM2 = imclose(originalImage,SE);
originalImage = IM2;

labeledImage = bwlabel(originalImage, 8);     % Label each blob so we can make measurements of it

blobMeasurements = regionprops(labeledImage, originalImage, 'all');   
numberOfBlobs = length(blobMeasurements);

pixperinch=get(0,'ScreenPixelsPerInch');   % resolution of the display (not the camera)
dpix=blobMeasurements(numberOfBlobs).Area; % area of the last blob, in pixels
dinch=dpix/pixperinch;                     % attempt to convert pixels to inches
dcm=dinch*2.54;                            % convert inches to cm
blobArea = dcm;                            % final "area" of the grape

If I photograph the same grape under the same conditions with two different cameras (from the same distance and under the same lighting), will I get the same result? (What if one camera is 5 megapixels and the other is 12 megapixels?)


Solution

  • No, it won't. You go from image coordinates to world coordinates using dpix/pixperinch. In general this is wrong: it can only work for one specific image, and only if you actually know the pixperinch for that image. To recover the geometric characteristics of an object in an image (e.g. length or area), you must back-project the image pixels into Cartesian space using the camera matrix and the inverse projective transformation, so that you obtain Cartesian coordinates (let alone calibrating the camera for lens distortion, which is a nonlinear problem). Only then can you perform the measurements. Your code won't give consistent results even for the same camera. See this for more.
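
As an illustration of that back-projection, here is a minimal MATLAB sketch, assuming the Computer Vision Toolbox is available and that the grape lies on a known plane (e.g. the table a calibration checkerboard was placed on). The folder name, square size, and variable names are illustrative assumptions, not part of the original post, and lens distortion is ignored for brevity:

% Calibrate the camera once with several checkerboard photos
files = imageDatastore('calibration_images');   % hypothetical folder
[imagePoints, boardSize] = detectCheckerboardPoints(files.Files);
squareSizeMM = 25;                              % known square size, in mm
worldPoints = generateCheckerboardPoints(boardSize, squareSizeMM);
I = imread(files.Files{1});
cameraParams = estimateCameraParameters(imagePoints, worldPoints, ...
    'ImageSize', [size(I,1) size(I,2)]);

% Recover the pose of the plane the grape sits on (one checkerboard view)
[R, t] = extrinsics(imagePoints(:,:,1), worldPoints, cameraParams);

% Segment the grape as in the question, then back-project its boundary
% pixels onto that plane to obtain coordinates in millimetres
B = bwboundaries(originalImage);
boundaryPix = fliplr(B{1});                     % [row col] -> [x y]
worldXY = pointsToWorld(cameraParams, R, t, boundaryPix);

% Area of the boundary polygon in world units (mm^2 -> cm^2)
areaCm2 = polyarea(worldXY(:,1), worldXY(:,2)) / 100;

Because the area is computed in world units on the calibrated plane, it no longer depends on the sensor's pixel count, which is why a 5-megapixel and a 12-megapixel camera can then agree.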