waitKey(1) timing issues causing frame rate slow down - fix?

Hello,

I am using a camera that can run at 120fps. I am accessing its frames via its SDK and then formatting them into OpenCV Mat data types for display.

At 120fps I am hoping that the while(1) loop will achieve a period of 8.3ms, i.e. the camera fps will be the bottleneck. However, currently I am only achieving around a 13ms loop (~70fps).

Using a timer with microsecond resolution, I have timed the components of the while(1) loop and can see that the bottleneck is actually the waitKey(1); line.

If I'm not mistaken, I should expect roughly a 1ms delay here? Instead I see around 13ms spent on this line.

i.e. waitKey(1); is the bottleneck.
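
For reference, the measurement looks roughly like this (a minimal, self-contained sketch using std::chrono rather than my actual capture code; the dummy image and window name are placeholders):

#include <chrono>
#include <cstdio>

#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>

int main(){
    // Placeholder window and image so that waitKey has a GUI event queue to service.
    cv::Mat dummy(480, 640, CV_8UC1, cv::Scalar(0));
    cv::imshow("timing", dummy);

    for(int i = 0; i < 100; i++){
        auto t0 = std::chrono::steady_clock::now();
        cv::waitKey(1);                                 // the call being measured
        auto t1 = std::chrono::steady_clock::now();
        double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
        std::printf("waitKey(1) took %.2f ms\n", ms);   // consistently ~12-13 ms per call for me
    }
    return 0;
}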

Also of note: if I try waitKey(200); I see a 200ms delay, but anything lower than around waitKey(20); does not give a delay that reflects the waitKey input parameter:

waitKey(1) gives a 12ms delay

waitKey(5), waitKey(10) and waitKey(15) all give a 12ms delay

waitKey(20) gives 26ms

waitKey(40) gives 42ms

etc.

It would seem that everything other than the waitKey takes around 2ms, which would leave me 6ms or so for other OpenCV processing while still keeping up with 120fps - that is my goal.

How can I either 'fix' or avoid waitKey? Or perhaps at least understand where I am going wrong conceptually? :)
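
For example, one thing I have wondered about (just a sketch, not something I know to be correct) is only calling waitKey every Nth frame so its ~12ms cost is amortised; the divisor of 8 below is arbitrary and only for illustration:

    // Sketch: amortise the waitKey cost over N frames instead of paying it every frame.
    int frameCount = 0;
    while(1){
        // ... GetLatestFrame(), Rasterize(), imshow() and Release() exactly as in the code below ...

        frameCount++;
        if(frameCount % 8 == 0){          // 8 is arbitrary, chosen only for illustration
            if(waitKey(1) == 27) break;   // still lets me quit with Esc
        }
    }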

The code follows; hopefully it's clear despite the camera's SDK calls:

#include "cameralibrary.h"
#include "supportcode.h"  

#include "opencv2/opencv.hpp"
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>

using namespace cv;
using namespace std;
using namespace CameraLibrary;

int main(int argc, char** argv){ 

    CameraManager::X().WaitForInitialization();
    Camera *camera = CameraManager::X().GetCamera();
    camera->SetVideoType(Core::GrayscaleMode);  

    int cameraWidth  = camera->Width();
    int cameraHeight = camera->Height();

    Mat imageCV(cv::Size(cameraWidth, cameraHeight), CV_8UC1);
    const int BITSPERPIXEL = 8;

    Frame *frame;

    namedWindow("frame", WINDOW_AUTOSIZE);

    camera->Start();

    while(1){

        //maximum of 2 micro seconds:
        frame = camera->GetLatestFrame();   

        //maximum of 60 micro seconds:
        if(frame){                      
            frame->Rasterize(cameraWidth, cameraHeight, imageCV.step, BITSPERPIXEL, imageCV.data);
        }

        //maximum of 0.75 ms
        imshow("V120", imageCV); 

        // **PROBLEM HERE**  12 ms on average
        waitKey(1);                     

        //maximum of 2 micro seconds:
        if(frame){                      // guard against GetLatestFrame() returning null
            frame->Release();
        }

    }

    camera->Release();
    CameraManager::X().Shutdown();
    return 0;
}
