# multiplying homography and image matrices

I have been trying to understand how warpPerspective() works. My understanding is that, among other things, it multiplies the homography matrix by the source image matrix. The source image matrix is generated by

```
Mat img_src = imread(imageName, IMREAD_GRAYSCALE);
```

and the homography matrix generated by

```
Mat h = findHomography(pts_src, pts_dst);
```

where pts_src and pts_dst are filled with four points each, e.g.

```
pts_dst.push_back(Point2f(0,0));
```

This works well with warpPerspective(), but the code

```
Mat im=h*img_src;
imshow("Image", im);
```

compiles fine, yet imshow("Image", im) generates an error:

```
Assertion failed (type == B.type() && (type == CV_32FC1 || type == CV_64FC1 || type == CV_32FC2 || type == CV_64FC2))
```

I guess this is probably a type mismatch between img_src and h, which seems to be of type double, because only

```
cout << h.at<double>(i,j) << endl;
```

works on it.

How could this be sorted out?

you got the idea wrong,

not the pixel values but their positions are transformed, and it's also not a matrix multiplication

in the end, it's a case of remapping
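To illustrate what "remapping" means here, below is a minimal sketch in plain C++ (no OpenCV, nearest-neighbour sampling, and the function name `warpSketch` is made up). It is not OpenCV's actual implementation, but it shows the idea: for every destination pixel *position*, map it back into the source image with the inverted homography and copy the value found there:

```cpp
#include <array>
#include <vector>
#include <cmath>

using Mat3 = std::array<std::array<double, 3>, 3>;
using Gray = std::vector<std::vector<unsigned char>>;

// For each destination pixel (x, y), form the homogeneous position
// (x, y, 1), multiply by the *inverse* homography, divide by the
// third coordinate, and sample the source image at that point.
// Nearest-neighbour here; real implementations interpolate.
Gray warpSketch(const Gray& src, const Mat3& Hinv, int dstRows, int dstCols)
{
    Gray dst(dstRows, std::vector<unsigned char>(dstCols, 0));
    for (int y = 0; y < dstRows; ++y) {
        for (int x = 0; x < dstCols; ++x) {
            double sx = Hinv[0][0] * x + Hinv[0][1] * y + Hinv[0][2];
            double sy = Hinv[1][0] * x + Hinv[1][1] * y + Hinv[1][2];
            double sw = Hinv[2][0] * x + Hinv[2][1] * y + Hinv[2][2];
            int srcX = (int)std::lround(sx / sw);  // back to 2d coords
            int srcY = (int)std::lround(sy / sw);
            if (srcY >= 0 && srcY < (int)src.size() &&
                srcX >= 0 && srcX < (int)src[0].size())
                dst[y][x] = src[srcY][srcX];
        }
    }
    return dst;
}
```

Note that the 3x3 matrix multiplies a 3x1 position vector, never the whole 640x480 pixel array.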

Thank you. I've just started learning OpenCV, yet from the introductory lesson I am under the strong impression that the pixels' positions (i.e. their coordinates) do not change, but their values do. That is, the value of pixel (x,y) of the source image is assigned to pixel (x',y') of the destination image according to a certain rule. It is indeed remapping (in the general meaning of the word), and for simple remappings like reflecting or scaling (as in your link) no matrix is needed.

Anyway, terminology aside, what is going on under the hood of warpPerspective(), which takes the source image matrix, the destination image matrix and the homography matrix as its arguments? And how do I sort out the error I am getting?

the error is from the matrix multiplication, not from imshow() (both args to gemm() need to be of float type, and a.cols == b.rows must hold, which is clearly not what you have)

and yes, once you start moving pixels around, you have to interpolate the results

exactly.

You've got me there! Thanks. Indeed a.cols != b.rows. Multiplying a 3x3 homography matrix by a 640x480 image matrix is a stupid mistake.

Do you think that converting the homography matrix h from double to float and then looping through the image matrix applying h * img_src.at<uint8_t>(r,c) would sort this bit out?

no, not the pixel value again, but the position (in homogeneous 3d coords), like:

thank you. Trying to digest. I guess z is the 3rd coordinate of 3D space. Where shall I take it from? Is z = 1? Has the z coordinate not already been taken into account when calculating H using the two sets of four points?

homogeneous 2/3d coords, please consult your maths book

Consulted, but I still do not get how to calculate z in my case here. And my maths book still confirms that homogeneous 2d/3d coords are taken into account at the previous step, when calculating the homography matrix. The formula for cv::warpPerspective() (Dense perspective transform, page 313 of O'Reilly's Learning OpenCV by Adrian Kaehler) suggests that z = 1, if I read it correctly (I cannot copy it here as it is a two-storey construction with low indices).

p.z = 1 before the multiplication
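Putting the answers together: set z = 1 before the multiplication, multiply the homogeneous position by H, then divide by the resulting z to get back to 2d coordinates. A plain C++ sketch of just that arithmetic (the names `Pt` and `applyHomography` are made up; this is not OpenCV's perspectiveTransform):

```cpp
#include <array>

using Mat3 = std::array<std::array<double, 3>, 3>;
struct Pt { double x, y; };

// Transform a single pixel *position* with a homography H:
// (x, y) -> (x, y, 1) -> H * (x, y, 1)^T -> divide by z.
Pt applyHomography(const Mat3& H, Pt p)
{
    double x = H[0][0] * p.x + H[0][1] * p.y + H[0][2] * 1.0;  // z = 1 here
    double y = H[1][0] * p.x + H[1][1] * p.y + H[1][2] * 1.0;
    double z = H[2][0] * p.x + H[2][1] * p.y + H[2][2] * 1.0;
    return { x / z, y / z };  // back from homogeneous to 2d coords
}
```

For a pure translation homography the division by z is a no-op (z stays 1); for a general perspective transform it is what produces the foreshortening.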