Say I've got a texture mapped to a screen-aligned grid mesh. It looks something like:
The vertex positions are:
(-1, -1), (1, -1), (-1, 1), (1, 1)
The UVs:
(0, 0), (1, 0), (0, 1), (1,1)
I warp the image by moving the vertices around, and save the output with glReadPixels().
The new warped vertex positions are:
(-1, -1), (0.8, -0.8), (-0.6, 0.6), (0.4, 0.4)
And the produced output is like:
Next time, as input, I use the warped image I've just saved. What I'm trying to do is reverse the warping effect I did before by modifying the vertex coordinates. Initially I thought the coordinates that would unwarp the image must be something of the sort:
x_unwarp = 2 * x_original - x_warped
But it's not working. The warping effect doesn't get undone. What I get is something like:
Any idea what I'm doing wrong and how I should modify the vertex coords, or perhaps the UVs? I'm sure I've got the math wrong.
Thanks!
It seems that my formula is wrong. I should've used matrices:
transform_matrix * x_original = x_warped
and then:
x_unwarped = inverse(transform_matrix) * x_original
As the transformation I'm doing is pure scaling, the transform matrix is like:
| S  0 |
| 0  S |
Where S is the scale factor. Therefore the inverse would be:
| 1/S   0  |
|  0   1/S |
Thus this would give the unwarping vertex positions as:
(-1, -1), (1.25, -1.25), (-1.67, 1.67), (2.5, 2.5)
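Those inverse positions can be checked with a quick numerical sketch (pure Python; assuming, as above, an independent uniform scale factor S per corner, recovered from the warped positions):

```python
# Per-corner scale factors recovered from the warped positions above:
# each warped corner is S * original, with the original corners at +/-1.
original = [(-1.0, -1.0), (1.0, -1.0), (-1.0, 1.0), (1.0, 1.0)]
warped = [(-1.0, -1.0), (0.8, -0.8), (-0.6, 0.6), (0.4, 0.4)]

unwarp = []
for (ox, oy), (wx, wy) in zip(original, warped):
    s = wx / ox  # uniform per-corner scale S (here wx/ox == wy/oy)
    unwarp.append((ox / s, oy / s))  # inverse scaling: (1/S) * original

print(unwarp)
```

This reproduces the list above: (-1, -1), (1.25, -1.25), roughly (-1.67, 1.67), and (2.5, 2.5).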
Seems straighter, but still incorrect:
I seem to have started it wrong from the beginning. There's been some random factor in all the renders. I'm going to fix that and try it again.
Done it using UV warping this time.
Warped using these UVs:
(0.0, 0.0), (0.9, 0.0), (0.0, 0.8), (0.7, 0.7)
This produced:
And then tried to unwarp using the inverses:
(0.0, 0.0), (1/0.9, 0.0), (0.0, 1/0.8), (1/0.7, 1/0.7)
Which looks like:
This didn't do it either. I'm starting to worry about the 1/0 cases that I've just been overlooking.
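For what it's worth, those reciprocals work out as follows (a small sketch; I've pinned the 0 components at 0, since 1/0 is undefined). Note that several of them land outside the [0, 1] range, so those lookups sample outside the texture unless a wrap/clamp mode covers it:

```python
warp_uvs = [(0.0, 0.0), (0.9, 0.0), (0.0, 0.8), (0.7, 0.7)]

def inv(c):
    # 1/0 is undefined; leave the components pinned at 0 alone.
    return 0.0 if c == 0.0 else 1.0 / c

unwarp_uvs = [(inv(u), inv(v)) for u, v in warp_uvs]
print(unwarp_uvs)
# Flag every coordinate that falls outside the texture:
print([c > 1.0 for u, v in unwarp_uvs for c in (u, v)])
```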
Completely clueless.
One thing you're running into is perspective-incorrect texture mapping (you can see in the first picture that the texture coordinates are interpolated differently for each of the two triangles the quad is made of). You can either use perspective correction on a single large quad (using a fragment shader and implementing the algorithm there), or subdivide your quad into smaller patches.
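The subdivision option can be sketched like this (pure Python; the helper name and layout are my own): tessellate the screen quad into an n-by-n grid of patches, so each small triangle's affine interpolation stays close to correct:

```python
def make_grid(n):
    """Subdivide the screen quad into an n-by-n patch grid; returns
    parallel lists of NDC positions in [-1, 1] and UVs in [0, 1]."""
    positions, uvs = [], []
    for j in range(n + 1):
        for i in range(n + 1):
            u, v = i / n, j / n
            positions.append((2.0 * u - 1.0, 2.0 * v - 1.0))
            uvs.append((u, v))
    return positions, uvs

positions, uvs = make_grid(8)  # 9 x 9 vertices for an 8 x 8 patch grid
```

Displacing the grid vertices then warps the image piecewise instead of across two big triangles.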
Now reversing the texture warping is straightforward if you think about it like that: the distortion coordinates of your first step become the UV coordinates for the second step. Of course, this requires the distorted vertex positions to be in a certain range. Recall the OpenGL transformation pipeline:
Modelview → Projection → Clipping → Normalized Device Coordinates. Screen space is the last step, with the viewport dimensions applied. NDC coordinates are in the range [-1,1], but getting to the [0,1] range is easy enough: (x+1)/2
So what you do is perform the transformation pipeline yourself (you can omit the clipping, but you must apply the perspective divide); this gives you the UV coordinates for the de-distortion.
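A sketch of that in pure Python (names are hypothetical): push each vertex through the combined modelview-projection matrix, apply the perspective divide, then remap NDC [-1, 1] into [0, 1] UV space:

```python
def dedistortion_uv(position, mvp):
    """Transform a vertex by a 4x4 modelview-projection matrix (row-major),
    apply the perspective divide, and map NDC x/y into [0, 1] UV space."""
    x, y, z = position
    p = (x, y, z, 1.0)  # homogeneous coordinate
    clip = [sum(mvp[r][c] * p[c] for c in range(4)) for r in range(4)]
    ndc = [clip[i] / clip[3] for i in range(3)]  # perspective divide
    return ((ndc[0] + 1.0) / 2.0, (ndc[1] + 1.0) / 2.0)

# With an identity MVP, NDC equals the input position:
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
print(dedistortion_uv((0.5, -0.5, 0.0), identity))  # -> (0.75, 0.25)
```

With the actual distortion MVP plugged in, the returned pairs are the UVs to assign to the undistorted grid for the second pass.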