On the receiver side of a TCP connection, OpenCV 'imdecode' returns an empty image in a while loop

asked 2020-04-15 23:35:39 -0600

jj2

I am trying to display an image stream built from a series of images transmitted over a TCP connection. On the sender side, multiple JPEG image files are encoded to binary buffers with 'imencode' and transmitted one after another. On the receiver side, the received data is decoded with 'imdecode' inside a while loop and stored in a cv::Mat variable.
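A minimal sketch of the kind of sender described above (not my exact code; the 4-byte length prefix and the sendFrame/sock names are only illustrative):

#include <winsock2.h>
#include <opencv2/opencv.hpp>
#include <vector>

// Encode one frame as JPEG and push the bytes onto an already-connected socket.
// The length prefix shown here is an assumption, not part of the original code.
void sendFrame(SOCKET sock, const cv::Mat& frame) {
    std::vector<uchar> encoded;
    cv::imencode(".jpg", frame, encoded);                       // JPEG-compress the frame

    u_long len = htonl(static_cast<u_long>(encoded.size()));
    send(sock, reinterpret_cast<const char*>(&len), sizeof(len), 0);   // size first
    send(sock, reinterpret_cast<const char*>(encoded.data()),
         static_cast<int>(encoded.size()), 0);                         // then the JPEG bytes
}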

The first received buffer is decoded correctly and stored in the cv::Mat variable, but the second one is not: at that point 'imdecode' returns an empty Mat and all of its data is NULL.

How can I decode the binary data continuously in a while loop?

Below is the relevant part of my code:

////*  code  *////
while (1) {
        cv::Mat img_rcv;
        std::vector<char> buffer;
        char recvbuf[FULL_BUFLEN];      // fixed-size receive buffer
        size_t len = sizeof(recvbuf);
        char *p = recvbuf;
        dataReceived = 0;

        // keep calling recv() until FULL_BUFLEN bytes have been read
        while (len > 0 && (dataReceived = recv(ConnectSocket, p, len, 0)) > 0) {
            p += dataReceived;
            len -= (size_t)dataReceived;
        }

        // copy the raw bytes into a vector and decode them as a JPEG
        buffer.assign(recvbuf, recvbuf + sizeof(recvbuf));
        img_rcv = cv::imdecode(buffer, cv::IMREAD_COLOR);

        // Display the resulting frame
        cv::imshow("received Image", img_rcv);
        ...

    }

'img_rcv' is generated correctly in the first iteration, so I can see the image with cv::imshow, but from the second iteration on 'img_rcv' comes back empty. I verified that 'recvbuf' is in fact updated in the second iteration.
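For reference, cv::imdecode signals failure by returning an empty cv::Mat rather than throwing, so a guard like the following sketch makes the failure visible instead of passing an empty Mat to cv::imshow (the showReceivedFrame helper is only illustrative, not part of my program):

#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

// Decode one received buffer and report failure instead of handing an
// empty Mat to cv::imshow (which raises an assertion error).
bool showReceivedFrame(const std::vector<char>& buffer) {
    cv::Mat img_rcv = cv::imdecode(buffer, cv::IMREAD_COLOR);
    if (img_rcv.empty()) {
        // imdecode returns an empty Mat when the bytes are not a complete,
        // valid JPEG, e.g. when the buffer mixes parts of two frames.
        std::cerr << "imdecode failed: buffer does not hold a full JPEG\n";
        return false;
    }
    cv::imshow("received Image", img_rcv);
    return true;
}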
