Different results between imread as grayscale and cvtColor

asked 2016-09-30 15:27:54 -0600

simbaforrest

Did anyone notice that if you load a color image directly as grayscale, the result is slightly different from loading it as a color image and then converting it to grayscale with cvtColor? Does anyone know why?

import cv2
import numpy as np

img1 = cv2.imread("imge.jpg", 0)               # decode directly to grayscale
img2 = cv2.imread("imge.jpg", 1)               # decode to BGR...
img2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)  # ...then convert to grayscale
print(np.max(img1 != img2))
# True
print(np.max(img1.astype('float32') - img2.astype('float32')))
# 2.0

BTW, this happens on both Linux and Windows, in both C++ and Python.


Comments

JPEGs are lossy.

berak ( 2016-09-30 19:44:20 -0600 )

Thanks for the comment. I understand that. But I thought that cv2.imread(..., 0) loads the image as color and then converts it to gray. The results are different, though, so this doesn't seem to be related to JPEG...

simbaforrest ( 2016-10-07 15:45:04 -0600 )
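
To make the comparison concrete, here is a small sketch (not from the thread) that applies the weights the cvtColor() documentation gives for COLOR_BGR2GRAY (Y = 0.299 R + 0.587 G + 0.114 B) by hand; the file name is the one from the question. The manual result should track cvtColor() to within rounding, while the grayscale that imread() returns can drift further, which suggests imread() is not simply calling cvtColor() internally.

import cv2
import numpy as np

bgr = cv2.imread("imge.jpg", 1)                 # decode once as 3-channel BGR
b, g, r = cv2.split(bgr)

# Apply the documented COLOR_BGR2GRAY weights by hand: Y = 0.299 R + 0.587 G + 0.114 B
manual = np.rint(0.299 * r + 0.587 * g + 0.114 * b).astype(np.uint8)

gray_cvt = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)   # OpenCV's own conversion
gray_dec = cv2.imread("imge.jpg", 0)               # grayscale produced at decode time

print(np.abs(manual.astype(int) - gray_cvt.astype(int)).max())  # expect 0 or 1 (fixed-point rounding)
print(np.abs(manual.astype(int) - gray_dec.astype(int)).max())  # can be larger (codec did the conversion)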

The reason is that there are multiple implementations of the grayscale conversion in play. cvtColor() is the OpenCV implementation and will be consistent across platforms. When you ask imread() for grayscale, the conversion is handed off to the image codec that decodes the file (libjpeg for JPEGs, for example), so you are at the mercy of that codec's implementation, which may use different coefficients or rounding than cvtColor(). I wouldn't be surprised if imread() returns slightly different grayscale values on each platform.

I can't think of any reason to ever convert to grayscale in imread.

mdg ( 2019-10-23 13:55:13 -0600 )
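
A minimal sketch of that advice (the load_gray helper name and the error check are illustrative, not part of the answer; the file name comes from the question):

import cv2

def load_gray(path):
    # Decode as 3-channel BGR and let OpenCV itself do the grayscale conversion,
    # so the result does not depend on which image codec handled the file.
    bgr = cv2.imread(path, cv2.IMREAD_COLOR)
    if bgr is None:
        raise IOError("could not read " + path)
    return cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)

gray = load_gray("imge.jpg")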