Tags: java, algorithm, image-processing, distortion, fisheye

Barrel distortion correction algorithm to correct FishEye lens - failing to implement with Java


I have a large number of photographs taken with a fisheye lens. Since I want to do some image processing (e.g. edge detection) on the photos, I want to remove the barrel distortion, which heavily affects my results.

After some research and reading a lot of articles I found this page, which describes an algorithm (and some formulas) to solve the problem:

M = a * rcorr^3 + b * rcorr^2 + c * rcorr + d
rsrc = M * rcorr = (a * rcorr^3 + b * rcorr^2 + c * rcorr + d) * rcorr

rsrc = distance of a pixel from the center of the source image
rcorr = distance of a pixel from the center of the corrected image
a, b, c = distortion of the image
d = linear scaling of the image
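
In code, the mapping boils down to something like this (just a sketch of the two formulas above, with rsrc used as an illustrative helper name, not a full correction routine):

// Sketch of the radial mapping: given rcorr (the distance of a pixel from the
// center of the corrected image) and the coefficients a, b, c, d, it returns
// rsrc, the radius at which to sample the distorted source image.
static double rsrc(double rcorr, double a, double b, double c, double d) {
    double m = a * rcorr * rcorr * rcorr
             + b * rcorr * rcorr
             + c * rcorr
             + d;
    return m * rcorr;
}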

I used these formulas and tried to implement them in a Java application. Unfortunately it doesn't work, and I haven't been able to make it work: the "corrected" images look nothing like the original photographs and instead show some mysterious circles in the middle. Look here:

http://imageshack.us/f/844/barreldistortioncorrect.jpg/ (this used to be a photograph of a white cow in front of a blue wall)

Here is my code:

protected int[] correction(int[] pixels) {

    // work on a copy of the source pixels so reads are not affected by writes
    int[] pixelsCopy = pixels.clone();

    // parameters for correction
    double paramA = 0.0; // affects only the outermost pixels of the image
    double paramB = -0.02; // most cases only require b optimization
    double paramC = 0.0; // most uniform correction
    double paramD = 1.0 - paramA - paramB - paramC; // describes the linear scaling of the image

    // iterate over every pixel of the destination image
    for(int x = 0; x < dstView.getImgWidth(); x++) {
        for(int y = 0; y < dstView.getImgHeight(); y++) {

            int dstX = x;
            int dstY = y;

            // center of dst image
            double centerX = (dstView.getImgWidth() - 1) / 2.0;
            double centerY = (dstView.getImgHeight() - 1) / 2.0;

            // difference between center and point
            double diffX = centerX - dstX;
            double diffY = centerY - dstY;
            // distance or radius of dst image
            double dstR = Math.sqrt(diffX * diffX + diffY * diffY);

            // distance or radius of src image (with formula)
            double srcR = (paramA * dstR * dstR * dstR + paramB * dstR * dstR + paramC * dstR + paramD) * dstR;

            // comparing old and new distance to get factor
            double factor = Math.abs(dstR / srcR);
            // coordinates in source image
            double srcXd = centerX + (diffX * factor);
            double srcYd = centerY + (diffY * factor);

            // no interpolation yet (just nearest point)
            int srcX = (int)srcXd;
            int srcY = (int)srcYd;

            if(srcX >= 0 && srcY >= 0 && srcX < dstView.getImgWidth() && srcY < dstView.getImgHeight()) {

                int dstPos = dstY * dstView.getImgWidth() + dstX;
                pixels[dstPos] = pixelsCopy[srcY * dstView.getImgWidth() + srcX];
            }
        }
    }

    return pixels;
}

My questions are:
1) Is this formula correct?
2) Did I make a mistake turning that formula into a piece of software?
3) There are other algorithms out there (e.g. How to simulate fisheye lens effect by openCV? or wiki/Distortion_(optics)). Are they better?

Thanks for your help!


Solution

  • The main bug you have is that the algorithm specifies that r_corr and r_src are in units of min((xDim-1)/2, (yDim-1)/2). That needs to be done to normalise the calculation so that the parameter values are not dependent on the size of the source image. With the code as it is you'll need to use much smaller values for paramB, e.g. it worked ok for me with paramB = 0.00000002 (for an image with dimensions 2272 x 1704).

    You also have a bug in calculating the difference from the center that causes the resulting image to be rotated 180 degrees compared to the source image.

    Fixing both these bugs should give you something like this:

    protected static int[] correction2(int[] pixels, int width, int height) {
        int[] pixelsCopy = pixels.clone();
    
        // parameters for correction
        double paramA = -0.007715; // affects only the outermost pixels of the image
        double paramB = 0.026731; // most cases only require b optimization
        double paramC = 0.0; // most uniform correction
        double paramD = 1.0 - paramA - paramB - paramC; // describes the linear scaling of the image
    
        for (int x = 0; x < width; x++) {
            for (int y = 0; y < height; y++) {
                int d = Math.min(width, height) / 2;    // radius of the circle
    
                // center of dst image
                double centerX = (width - 1) / 2.0;
                double centerY = (height - 1) / 2.0;
    
                // cartesian coordinates of the destination point (relative to the centre of the image)
                double deltaX = (x - centerX) / d;
                double deltaY = (y - centerY) / d;
    
                // distance or radius of dst image
                double dstR = Math.sqrt(deltaX * deltaX + deltaY * deltaY);
    
                // distance or radius of src image (with formula)
                double srcR = (paramA * dstR * dstR * dstR + paramB * dstR * dstR + paramC * dstR + paramD) * dstR;
    
                // comparing old and new distance to get factor
                double factor = Math.abs(dstR / srcR);
    
                // coordinates in source image
                double srcXd = centerX + (deltaX * factor * d);
                double srcYd = centerY + (deltaY * factor * d);
    
                // no interpolation yet (just nearest point)
                int srcX = (int) srcXd;
                int srcY = (int) srcYd;
    
                if (srcX >= 0 && srcY >= 0 && srcX < width && srcY < height) {
                    int dstPos = y * width + x;
                    pixels[dstPos] = pixelsCopy[srcY * width + srcX];
                }
            }
        }
    
        return pixels;
    }
    

    With this version you can use parameter values from existing lens databases like LensFun (though you'll need to flip the sign of each parameter). The page describing the algorithm can now be found at http://mipav.cit.nih.gov/pubwiki/index.php/Barrel_Distortion_Correction
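
    Both versions still sample the nearest source pixel (the "no interpolation yet" comment in the code). If the blockiness that nearest-neighbour sampling produces gets in the way of your edge detection, the lookup can be replaced by bilinear interpolation. Here is a minimal sketch of one way to do that, assuming the pixels are packed 32-bit ARGB ints as in the code above (bilinear is just an illustrative helper, not part of the algorithm itself):

    // Bilinear sample of a packed ARGB pixel array at fractional coordinates (x, y).
    // A sketch only: it clamps the sample point to the image border instead of
    // skipping out-of-range pixels like the code above does.
    static int bilinear(int[] src, int width, int height, double x, double y) {
        x = Math.max(0, Math.min(x, width - 1));
        y = Math.max(0, Math.min(y, height - 1));
        int x0 = (int) x, y0 = (int) y;                // top-left neighbour
        int x1 = Math.min(x0 + 1, width - 1);          // right neighbour
        int y1 = Math.min(y0 + 1, height - 1);         // bottom neighbour
        double fx = x - x0, fy = y - y0;               // fractional offsets

        int p00 = src[y0 * width + x0], p10 = src[y0 * width + x1];
        int p01 = src[y1 * width + x0], p11 = src[y1 * width + x1];

        int result = 0;
        for (int shift = 0; shift <= 24; shift += 8) { // blend B, G, R, A channels separately
            double top    = ((p00 >> shift) & 0xFF) * (1 - fx) + ((p10 >> shift) & 0xFF) * fx;
            double bottom = ((p01 >> shift) & 0xFF) * (1 - fx) + ((p11 >> shift) & 0xFF) * fx;
            result |= ((int) Math.round(top * (1 - fy) + bottom * fy) & 0xFF) << shift;
        }
        return result;
    }

    With that in place, pixels[dstPos] = pixelsCopy[srcY * width + srcX] in correction2 would become pixels[dstPos] = bilinear(pixelsCopy, width, height, srcXd, srcYd).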