Finding the difference between a rotated image and its reference (sample)

asked 2016-07-11 12:25:17 -0500

Kiryl

Hello! I can't find a method for computing the difference between a rotated image and its reference (sample). For example, if there are artifacts (such as lines, dots or spots), I need to find them with pinpoint accuracy! If the image contains a dot of a single pixel, I must know about it! I have the homography transform matrix, and I use it to restore the image; then I compute the difference between the two by subtracting one from the other pixel by pixel in single-channel format. But the OpenCV method warpPerspective introduces some error: each pixel is displaced by one or two positions in a random direction. So the per-pixel difference between the images contains a large error! Does anyone know how to solve this?
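One way to illustrate the problem (this is my own sketch, not anything from the thread): since the warp can misplace pixels by one or two positions, a plain per-pixel subtraction flags every slightly shifted pixel as a difference. A tolerant comparison that matches each pixel against a small neighborhood of the reference absorbs that jitter while still catching real single-pixel artifacts. The images here are plain nested lists standing in for single-channel 8-bit data; the function name is hypothetical.

```python
def neighborhood_diff(img, ref, radius=1):
    """Per-pixel difference that tolerates small (+/- radius px) misalignment:
    each pixel of img is compared against a (2*radius+1)^2 neighborhood of ref,
    and the smallest absolute difference is kept. 1-2 px warp jitter then
    reads as zero, while a genuine isolated artifact still shows up."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            best = 255  # worst case for 8-bit single-channel data
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        best = min(best, abs(img[y][x] - ref[ny][nx]))
            out[y][x] = best
    return out
```

The trade-off is that a real artifact that happens to sit within `radius` pixels of an identical reference feature would also be suppressed, so `radius` should stay as small as the warp error allows.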



Are you sure the error is in warpPerspective? It's more likely the error is in your homography matrix. How did you get the homography matrix?

Tetragramm ( 2016-07-11 17:46:59 -0500 )

Yes, I'm sure! If I transform my image and decompose the homography matrix, I get correct results. But homography or warpPerspective introduces some error after recovery (about ~0.3 degrees in the estimated rotation angle, a shift of 1-2 pixels left or right, and a slight change in pixel color). The color change is minor, but the other errors matter a lot when computing the difference. I call Imgproc.findHomography() with FeatureDetector.ORB, DescriptorExtractor.ORB and DescriptorMatcher.BRUTEFORCE.

Kiryl ( 2016-07-12 01:30:21 -0500 )

So, one tip is to use as few parameters as you can. A homography does a lot more than just rotation and translation. So, if that's all you're doing, I suggest using estimateRigidTransform with fullAffine = false. Then it's only finding translation, rotation, and scale.
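To make the "fewer parameters" point concrete, here is a minimal pure-Python sketch of the 4-DOF model that a non-full-affine fit estimates (rotation, uniform scale, translation). This is not OpenCV's estimateRigidTransform itself, just the underlying least-squares similarity fit, written with complex arithmetic; the function name is mine.

```python
import cmath

def fit_similarity(src, dst):
    """Least-squares similarity transform (rotation + uniform scale +
    translation, 4 DOF) between matched 2-D point sets.
    Treat points as complex numbers: dst_i ~= alpha * src_i + beta,
    where alpha = scale * e^(i*angle) and beta is the translation.
    Returns (scale, angle_radians, (tx, ty))."""
    a = [complex(x, y) for x, y in src]
    b = [complex(x, y) for x, y in dst]
    n = len(a)
    am, bm = sum(a) / n, sum(b) / n          # centroids
    num = sum((bi - bm) * (ai - am).conjugate() for ai, bi in zip(a, b))
    den = sum(abs(ai - am) ** 2 for ai in a)
    alpha = num / den                        # closed-form least squares
    beta = bm - alpha * am
    return abs(alpha), cmath.phase(alpha), (beta.real, beta.imag)
```

With only four parameters to estimate instead of a homography's eight, each parameter is constrained by more matches, which is why the fit tends to be more stable when the true motion really is rigid plus scale.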

Second, try to remove outliers (bad matches) by checking the results. Find the transformation, and then transform the first point set with that transformation. If the result is very far from the second point (say more than 50 pixels), throw that match away. Then re-calculate the transformation with just the good points.
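The outlier-rejection step above can be sketched as follows (my illustration; the helper name and the callable interface are hypothetical, not OpenCV API). `transform` stands for whatever estimated mapping you have, e.g. applying the homography or affine matrix to a point:

```python
import math

def reprojection_filter(matches, transform, max_err=50.0):
    """Drop outlier matches by reprojection error: apply `transform`
    (a callable (x, y) -> (x, y)) to each source point and discard pairs
    whose transformed point lands more than `max_err` pixels away from
    its matched point."""
    good = []
    for (sx, sy), (dx, dy) in matches:
        tx, ty = transform(sx, sy)
        if math.hypot(tx - dx, ty - dy) <= max_err:
            good.append(((sx, sy), (dx, dy)))
    return good
```

After filtering, re-run the transformation estimate on the surviving pairs only, as described above.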

Tetragramm ( 2016-07-12 07:44:49 -0500 )

Thanks! I will definitely look into estimateRigidTransform! Yes, you are right: I only need the rotation angle, scale, and translation (shift). Right now I'm using the homography without filtering for good matches, because I tried defining the parameters many times with and without good matches (with various values for min and max distance), and in the second case I got more accurate results!
I read about estimateRigidTransform and tried it; it's certainly easier to use, but it has a slightly larger measurement error. If I didn't need such precise values, I would definitely use estimateRigidTransform!

Kiryl ( 2016-07-12 10:18:32 -0500 )