Encoding corrupts image!

asked 2015-03-23 16:21:14 -0500

Hey!

I am using OpenCV for face recognition. For some reason I have to encode the face image into a string and send it to a server, where the recognition is done. However, the results are not the same as when I run the recognition directly on the captured picture.


import cv2
import numpy

# recognizer: a trained face recognizer (training code omitted here)
img = cv2.imread('/path/to/file.jpg')

recognizer.predict(img)  # returns some id

# encode data
encode_param = [int(cv2.IMWRITE_JPEG_QUALITY), 90]
result, imgencode = cv2.imencode('.jpg', img, encode_param)
stringData = imgencode.tobytes()  # imencode already returns a numpy array

# decode
data = numpy.frombuffer(stringData, dtype='uint8')
decimg = cv2.imdecode(data, 1)

recognizer.predict(decimg)  # should return the same id, but does not

What could be the problem?

Thanks a lot in advance. :) Arminius


Comments

The code looks OK and works for me. Could you save the image after it was transferred? And could you be more specific about how the results are not the same? (Is it just a slight difference that could be explained by the JPEG conversion?)

FooBar ( 2015-03-24 02:23:56 -0500 )

Yes, I can save the image after it was transferred. However, the sending part is omitted in the above code to rule out problems concerning the connection.

The predict method returns an ID and a distance with respect to the recognized image in the training database. Both the ID and the distance differ before and after encoding. The prediction before encoding is always correct, whereas the prediction after encoding is incorrect most of the time.

If I look at the images before and after encoding with cv2.imshow(), they look the same, so the JPEG encoding is not complete nonsense but somehow still destroys the pattern. (I'm using Local Binary Pattern Histograms for recognition.) Is there a way to encode img losslessly, without altering it, so that img == decimg?

Arminius ( 2015-03-24 06:18:54 -0500 )