Get position of source points in destination after stitching

asked 2017-01-24 18:53:13 -0500 by aPiso

updated 2017-01-24 18:54:02 -0500

Hi all,

I have some regions of interest in my source images which I would like to visualise in the final panorama after stitching.

I have tried drawing them on the source images before stitching, but they can get covered in the overlap between images and disappear.

So, I tried to warp the points using the warpPoint function so that I could re-draw these ROIs on the final image.

Basically, I am going through the stitching_detailed example and warping the points just before the images themselves are warped:

    cameras[img_idx].K().convertTo(K, CV_32F);
    p1Warp = warper->warpPoint(p1, K, cameras[img_idx].R);
    p2Warp = warper->warpPoint(p2, K, cameras[img_idx].R);
    cout << "p1 warp: " << p1Warp << " p2 warp: " << p2Warp << "\n";
    // Warp the current image
    warper->warp(img, K, cameras[img_idx].R, INTER_LINEAR, BORDER_REFLECT, img_warped);

This does not seem to work either, as I am getting negative coordinates for my points.

TL;DR: How do I find the position of a region of interest in the destination image after stitching?




Huh. That's how I would have done it, but I'm getting negative points too.

Tetragramm ( 2017-01-24 20:52:06 -0500 )

I tried to print out the new warped image size:

    Rect roi = warper->warpRoi(sz, K, cameras[i].R); // line 802 of stitching_detailed.cpp
    cout << "orig size: " << sz << "warped size: " << roi << "\n";

That is what I get: orig size: [720 x 1280]warped size: [789 x 1020 from (-704, 372)]

Does this mean the warped photo has an origin different from (0,0), and so I need to offset the points based on that?

aPiso ( 2017-01-24 21:23:18 -0500 )

It's worth a shot.

Tetragramm ( 2017-01-24 21:25:59 -0500 )

But how do I get the coordinates after the stitching has completed? This only gives me the coordinates in the warped image; if I draw rectangles there, they will still disappear during the compositing/blending phase. How can I get the coordinates of a point in the final stitched image?

aPiso ( 2017-01-24 21:49:37 -0500 )

Hmm. I don't know what's going on. I'm going to have to look at the code to see. It'll take a bit. I'll try to see tomorrow, could be longer.

Tetragramm ( 2017-01-24 23:36:49 -0500 )

Thanks a lot! I think I can get the warped point coordinates now by subtracting the offset, but I am still struggling to understand how to map this point after the images get fed into the blender.

aPiso ( 2017-01-25 00:03:59 -0500 )

My first guess: your ROI is defined in the source image's coordinate system, and after warping there is no guarantee that this coordinate system still makes sense. How about finding keypoints inside your ROI and using those instead?

StevenPuttemans ( 2017-01-25 03:41:55 -0500 )

Hi Steven, thanks for replying! I can already get the coordinates after the warp, which I could also get with keypoints; good idea. My issue is that when the images go into the blender they get stitched with some variable overlap, so even if I use keypoints after the warp, how can I get the corresponding coordinates in the final stitched image?

aPiso ( 2017-01-25 09:44:18 -0500 )

Hi all! One thing worth mentioning: in the detailed example the images are loaded twice (imread), so what you are warping and drawing on is probably not what is actually used for the stitching.

slackwar ( 2017-10-16 18:49:16 -0500 )

1 answer


answered 2018-08-02 18:47:20 -0500 by cjsb


Were you able to solve this issue? I am having the same problem.



As commented above, you could perform keypoint matching between the original image and the stitched image to find a homography, and then in theory use that to warp the coordinates.

StevenPuttemans ( 2018-08-03 03:31:17 -0500 )

I thought the warpPoint function was able to calculate the pixel position after the warping.

I don't have enough corresponding points to calculate the homography between the two pictures.

cjsb ( 2018-08-06 16:35:15 -0500 )

It worked! You just need to take the return of the warpPoint function and subtract the offset (the origin of the rect that warpRoi returns).

cjsb ( 2018-08-06 16:50:51 -0500 )
