# warpPerspective() to a different origin [SOLVED]

I need to apply a perspective transformation to a pixel that is not the origin of the warped image. The warpPerspective() method applies:

But I need to apply the following centered perspective transformation:

The perspective transformation that I need to apply is the following ... Here it is applied to the center of the square image, (cx,cy)=(0.5,0.5). But if I warp the image from (cx,cy)=(0,0), there is a translation error, as in the following example ...

The perspective transformation (here, a rotation) must be applied about the center of the warped image. How can I use OpenCV to perform such a centered perspective transform in C++? Thanks for helping.

Best regards,

Dr. F. Le Coat CNRS / Paris / France

https://hebergement.universite-paris-saclay.fr/lecoat



Modifying the Mij parameters and using the original warpPerspective() method is possible, but the image must be resized and the result cropped, because it is translated by half of the image's size. You can see the transformation I performed, which has a translation error. This is not an elegant way to proceed ...

Implementing my own warpPerspective() method is also possible, but that would mean not using OpenCV at all. My question is about what I can write in C++ using the library. The issue is that OpenCV's warpPerspective() method is not generic for perspective transformations.

I need a warping method that can be centered on a point other than the origin (cx,cy)=(0,0) of the image. The warpPerspective() method was implemented for a very particular use case, with a particular homography form.

I need the general formulation of perspective transforms: three translation factors, three rotation angles and two shear angles, applied from the center (cx,cy) of the image ... Thanks for helping.

Best regards,

Dr. F. Le Coat CNRS / Paris / France


Sorry, I still don't get why you cannot just change M13, M23 and M33, and change your (cx,cy)=(0.5,0.5) to (cx,cy)=(image_width/2,image_height/2).

In the end, the function assumes X' = HX, with X and X' in homogeneous coordinates. I don't see why it cannot be generic enough for your case.

If you are talking about the image coordinate system, OpenCV indeed uses (0,0) for the top-left pixel and (320,240) (in (x,y) order) for the middle pixel of a 640x480 image. This is the image convention in OpenCV and, I believe, in most image processing and computer vision libraries.
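That per-pixel mapping can be sketched in a few lines of plain C++ (a library-free illustration, not OpenCV's implementation; the names `Pt` and `applyHomography` are made up for this sketch, and cv::perspectiveTransform does the equivalent for point arrays):

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat3 = std::array<std::array<double, 3>, 3>;

// The mapping discussed in the thread: (x', y', w') = H * (x, y, 1),
// followed by the perspective divide by w'.
struct Pt { double x, y; };

Pt applyHomography(const Mat3& H, Pt p) {
    double xp = H[0][0] * p.x + H[0][1] * p.y + H[0][2];
    double yp = H[1][0] * p.x + H[1][1] * p.y + H[1][2];
    double wp = H[2][0] * p.x + H[2][1] * p.y + H[2][2];
    return {xp / wp, yp / wp};  // perspective divide
}
```

With the identity matrix every pixel maps to itself, and a matrix whose last column is (tx, ty, 1) translates by (tx, ty), which is the behavior the comments below build on.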

( 2020-09-22 03:07:33 -0500 )

You are not right about centered perspective warping of an image. You can't just modify M13, M23 and M33 and then apply the existing warpPerspective() method, because M13, M23 and M33 act as multiplicative factors; the effect is not additive like a translation ...

What I could do is translate the image by (-cx,-cy), then apply the original warpPerspective() method from OpenCV, and translate the result back by (cx,cy).

But doing it that way, I would lose at least three quarters of the image content, in terms of image area.

Can you tell me how I should modify M13, M23 and M33? This is not an elegant way to proceed.

The warpPerspective() method is missing two (cx,cy) parameters that would turn it into a generic perspective warp.

Regards,

Dr. F. Le Coat CNRS / Paris / France

( 2020-09-22 08:51:08 -0500 )

Sorry but I still don't get you.

As input to warpPerspective() you pass M (the 3x3 transformation or mapping matrix) to perform the mapping (x', y', 1) = M × (x, y, 1).

It is like that in all the libraries.

I would have done: M13_new = M13 - M11×cx - M12×cy, M23_new = M23 - M21×cx - M22×cy, and the same for M33 (M33_new = M33 - M31×cx - M32×cy). The other coefficients remain unchanged.
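For what it's worth, that coefficient update is exactly a post-multiplication of M by a translation of (−cx, −cy), which is why it only solves half of the problem, as noted further down. A small sketch with plain 3x3 arrays (the helper names `mul` and `shiftLastColumn` are made up for this sketch, not OpenCV API):

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat3 = std::array<std::array<double, 3>, 3>;

// Plain 3x3 matrix product.
Mat3 mul(const Mat3& A, const Mat3& B) {
    Mat3 C{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}

// The suggested update: only the last column changes,
// Mi3_new = Mi3 - Mi1*cx - Mi2*cy for each row i.
Mat3 shiftLastColumn(Mat3 M, double cx, double cy) {
    for (int i = 0; i < 3; ++i)
        M[i][2] = M[i][2] - M[i][0] * cx - M[i][1] * cy;
    return M;
}
```

One can check numerically that shiftLastColumn(M, cx, cy) equals M × translate(−cx, −cy), i.e. it pre-translates the input points before applying M, without the matching back-translation.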

( 2020-09-22 11:09:02 -0500 )

That's not really the transformation I want to apply with OpenCV's warpPerspective() method.

Let's imagine that the 3x3 matrix M is the identity ... Then the transform you are suggesting is a translation of the image by (x,y)=(-cx,-cy). That's a good starting point, but the image must then be translated back by (x,y)=(cx,cy) to recover the result (the identity), and we will have lost three quarters of the image content, in terms of informative image area.

You have solved half of the problem. This can't be the only thing to do to perform a centered perspective warp. I don't think it can be solved that way ... warpPerspective() is centered at (0,0), not at (cx,cy), unfortunately.

Regards,

Dr. F. Le Coat CNRS / Paris / France

( 2020-09-22 14:56:52 -0500 )

I guess you are on your own now...

My final attempt:

• why not do something similar to a homogeneous transformation?
• something like X' = T^-1 . M . T . X
• with M your transformation matrix expressed in your coordinate system
• T the transformation from the classical image coordinate system to yours
• X a point in the classical image coordinate system, and X' the transformed point in the classical image coordinate system
• and pass M' = T^-1 . M . T to the function?
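A minimal sketch of that conjugation with plain 3x3 arrays (the names `mul` and `recentre` are illustrative, not OpenCV API; with this convention T shifts image coordinates by (−cx, −cy), i.e. classical image coordinates to coordinates centered on (cx, cy)):

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat3 = std::array<std::array<double, 3>, 3>;

// Plain 3x3 matrix product.
Mat3 mul(const Mat3& A, const Mat3& B) {
    Mat3 C{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}

// M' = T^-1 . M . T: shift the centre to the origin, apply M, shift back.
Mat3 recentre(const Mat3& M, double cx, double cy) {
    Mat3 T   {{{1, 0, -cx}, {0, 1, -cy}, {0, 0, 1}}};  // image -> centred
    Mat3 Tinv{{{1, 0,  cx}, {0, 1,  cy}, {0, 0, 1}}};  // centred -> image
    return mul(Tinv, mul(M, T));
}
```

For example, recentring a 90° rotation about (320,240) leaves the point (320,240) fixed, which is exactly the "centered" behavior asked for; the resulting 3x3 matrix is what one would then hand to warpPerspective().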

Finally:

> warpPerspective() is centered in (0,0). Not in (cx,cy) unfortunately.

This does not make sense to me. These are only conventions for the image coordinate system.

My final suggestion is to look at other libraries, like Matlab, or at how this is done in Python ... You may find better advice.

( 2020-09-23 15:15:22 -0500 )

I have the 3x3 matrix M and I want to apply M' centered at (cx,cy). So T=((1,0,cx),(0,1,cy),(0,0,1)) and T^-1=((1,0,-cx),(0,1,-cy),(0,0,1)). Can you please tell me what 3x3 matrix M' I should apply? That's the question I'm asking: using OpenCV's warpPerspective() method with a different origin, in C++ ... Thanks for helping.

Best regards,

Dr. F. Le Coat CNRS / Paris / France

( 2020-09-24 10:35:40 -0500 )

Again, for me this is similar to a homogeneous transformation.

So I would do: M' = T^-1 . M . T

Then you pass the input image and M' to the function, no?

( 2020-09-24 13:24:36 -0500 )

The perspective transformation is working. Here is the C++ code with OpenCV:

```cpp
// center the transformation
// M00..M22 are the coefficients of the original homography M, and N is
// the side of the square N x N image, so the center is (cx,cy) = (N/2,N/2).
H.at<double>(0, 0) = M00 + (M20*N)/2;
H.at<double>(0, 1) = M01 + (M21*N)/2;
H.at<double>(0, 2) = M02 - N*(2*M00 + 2*M01 - 2*M22 + M20*N + M21*N)/4;
H.at<double>(1, 0) = M10 + (M20*N)/2;
H.at<double>(1, 1) = M11 + (M21*N)/2;
H.at<double>(1, 2) = M12 - N*(2*M10 + 2*M11 - 2*M22 + M20*N + M21*N)/4;
H.at<double>(2, 0) = M20;
H.at<double>(2, 1) = M21;
H.at<double>(2, 2) = M22 - (M20 + M21)*N/2;
H /= H.at<double>(2, 2);   // normalize the homography
H.at<double>(2, 2) = 1.0;  // force an exact 1 after the division
```


With Mat H(3,3,CV_64FC1), cx = cy = N/2, and H obtained by conjugating M with the centering translation (with the T defined earlier, these coefficients expand T·M·T^-1, i.e. the center is first shifted to the origin, M applied, then shifted back). Thanks for helping. Best regards, Dr. F. Le Coat CNRS / Paris / France
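As an independent check (a sketch with plain 3x3 arrays rather than cv::Mat, so it runs without OpenCV; the names `mul` and `postedH` are made up for this sketch), the hand-expanded coefficients above can be compared against the product translate(+N/2,+N/2) × M × translate(−N/2,−N/2), before the final normalization:

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat3 = std::array<std::array<double, 3>, 3>;

// Plain 3x3 matrix product.
Mat3 mul(const Mat3& A, const Mat3& B) {
    Mat3 C{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}

// The posted coefficients, written out for cx = cy = N/2.
Mat3 postedH(const Mat3& M, double N) {
    double M00 = M[0][0], M01 = M[0][1], M02 = M[0][2];
    double M10 = M[1][0], M11 = M[1][1], M12 = M[1][2];
    double M20 = M[2][0], M21 = M[2][1], M22 = M[2][2];
    Mat3 H{};
    H[0][0] = M00 + (M20*N)/2;
    H[0][1] = M01 + (M21*N)/2;
    H[0][2] = M02 - N*(2*M00 + 2*M01 - 2*M22 + M20*N + M21*N)/4;
    H[1][0] = M10 + (M20*N)/2;
    H[1][1] = M11 + (M21*N)/2;
    H[1][2] = M12 - N*(2*M10 + 2*M11 - 2*M22 + M20*N + M21*N)/4;
    H[2][0] = M20;
    H[2][1] = M21;
    H[2][2] = M22 - (M20 + M21)*N/2;
    return H;
}
```

Expanding the two translation products by hand reproduces every coefficient above term by term, which is why the warp comes out centered.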

( 2020-09-24 15:44:14 -0500 )
