Voluntarily delaying a live feed from a cv::VideoCapture
Hi,
I have a class Camera inheriting from cv::VideoCapture, which its core method is to convert a cv::Mat that I get from a live stream to a QImage :
QImage Camera::getFrame() {
    if(isOpened()) {
        cv::Mat image;
        (*this) >> image;
        cv::cvtColor(image, image, CV_BGR2RGB);
        return QImage((uchar*) image.data, image.cols, image.rows, image.step, QImage::Format_RGB888);
    }
    else return QImage();
}
And an encapsulating class CameraDelayedView that calls this method and adds a delay:
void CameraDelayedView::timerEvent(QTimerEvent *evt) {
    if(cam != NULL) {
        buffer.enqueue(cam->getFrame());
        if(buffer.size() > delay*fps) {
            setPixmap(QPixmap::fromImage(buffer.dequeue()));
        }
    }
}
The first frame is delayed as I want, but from that moment on the video plays as a normal live feed, exactly as it would without the queue system. Apparently the images all share an underlying buffer that keeps being updated: when I create a QImage from image.data, the pointer refers to a frame that will be overwritten later. I want to avoid that and keep each frame as an independent snapshot.
Do I have to copy each image.data myself to make the frames independent, or is there a more efficient way? Please don't hesitate to tell me if the code above can be made more efficient as well.
Thanks in advance.
Regards, Mister Mystère