OpenCV GPU resize interpolation bug/problem

asked 2013-09-27 10:14:10 -0500 by JG

updated 2013-09-27 11:05:02 -0500 by berak


When comparing the output of the CPU version of resize to the GPU version, I noticed that the GPU resize function is not interpolating correctly. In fact, the result looks like simple pixel decimation rather than any kind of interpolation. According to the docs, there are three supported interpolation methods, but in my testing all of them produced the same result. I've tried the CV_INTER_LINEAR definition as well as cv::INTER_LINEAR, both resulting in the same output. The CPU resize output is correct.
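To make the symptom concrete, here is a small illustration (pure Python, no OpenCV; the helper names are my own, not OpenCV API) of the difference one should see between nearest-neighbor "decimation" and linear interpolation when downscaling a signal by 2x:

```python
# Illustration only: downscaling a 1-D signal by 2x with
# nearest-neighbor "decimation" vs. linear interpolation.
# Helper names are hypothetical, not part of the OpenCV API.

def decimate(src):
    # Pick every other sample -- what the GPU output appeared to do.
    return src[::2]

def linear_halve(src):
    # Average adjacent pairs -- at an exact 2x reduction, linear
    # interpolation at the destination sample centers reduces to
    # averaging each source pair.
    return [(src[2 * i] + src[2 * i + 1]) / 2.0 for i in range(len(src) // 2)]

src = [0, 10, 0, 10, 0, 10, 0, 10]
print(decimate(src))      # [0, 0, 0, 0]   -- high-frequency detail aliased away
print(linear_halve(src))  # [5.0, 5.0, 5.0, 5.0] -- detail averaged, not dropped
```

If the GPU output matches the `decimate` pattern (all high-frequency content dropped) regardless of the interpolation flag, that points at the flag being ignored rather than at rounding differences.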

Is there a configuration setting that enables this support at compile time? Or some other function/option/switch that needs to be set or invoked before a GPU resize call?

Thanks for any help!

Edit: I found a mistake in my comparison and corrected it. The CPU vs. GPU resize results are still different, though closer. What differences should be expected, if any? Should the outputs be exactly identical?
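On the exact-match question: to the best of my understanding, the CPU path for 8-bit images uses fixed-point arithmetic while the GPU path works in floating point, so bit-exact equality between the two should not be expected; small (typically within 1 LSB) per-pixel differences are normal. A sketch of a tolerance-based comparison instead of exact equality (the two lists below stand in for the flattened CPU and GPU results; the values are hypothetical):

```python
# Sketch: compare two resize outputs with a tolerance rather than ==.
# cpu_out / gpu_out stand in for cv::resize and gpu::resize results,
# flattened to plain lists for illustration.

def max_abs_diff(a, b):
    # Largest per-pixel difference between two equally sized images.
    return max(abs(x - y) for x, y in zip(a, b))

cpu_out = [10, 20, 30, 40]
gpu_out = [10, 21, 30, 39]   # hypothetical 1-LSB rounding differences

# Tolerate rounding noise; a decimation bug would show much larger diffs.
assert max_abs_diff(cpu_out, gpu_out) <= 1
```

A max difference of 1 suggests rounding; differences on the order of the image's contrast range suggest the interpolation mode really is being ignored.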




I also observed the difference between the CPU and GPU versions of resize, but I am still not able to figure out the reason behind it.

Ishita ( 2013-10-05 14:01:44 -0500 )

I'm surprised no one else has commented on or addressed this. Does no one else care that resizing produces significant differences, from a data perspective, between CPU and GPU?

JG ( 2013-10-24 10:42:48 -0500 )