# cvResize - Pixel Shifting

Hello,

I'm using the cvResize function from OpenCV's C API to resize an image from its original size. The scale factor is determined by the ratio of the target height to the original image's height; the width is multiplied by the same factor so that the aspect ratio is preserved.

However, depending on the final calculated height/width, the pixels start shifting in the first row, and the shift grows with each subsequent row until the end of the image. The result is an image that is sheared diagonally (each row "wraps around" into the following row). I suspect the problem occurs when calculating the scale factor (variable "d" in the code) as a double: cvSize takes only integers, so the height/width aspect ratio is not preserved exactly. However, if I round the calculated dimensions to the nearest integer, the resulting size is too big. I've tried many variants of calculating the height/width ratios but I obtain the same bizarre result. Any suggestions on how to prevent this problem from occurring?

```cpp
bool shrink = false;
const double dbShift = pow(10.0, 1);  // 1 decimal of precision (currently unused)

if (_native_rgb_img->height > newHeight) {
    shrink = true;
}

// Scale factor between the original and projected image, based on height
double d = (double) newHeight / (double) _native_rgb_img->height;
CvSize size = cvSize((int)(_native_rgb_img->width * d), (int)(_native_rgb_img->height * d));

_resized_rgb_img = cvCreateImage(size, _native_rgb_img->depth, _native_rgb_img->nChannels);

if (shrink) {
    // CV_INTER_AREA avoids moire artifacts when shrinking
    cvResize(_native_rgb_img, _resized_rgb_img, CV_INTER_AREA);
} else {
    cvResize(_native_rgb_img, _resized_rgb_img, CV_INTER_LINEAR);
}
```
