
kmeans causes segmentation fault

I'm doing object classification in videos. For this I build a vocabulary for the Bag-of-Visual-Words approach: I iterate through the videos, take every xth frame, detect densely sampled feature points (cv::DenseFeatureDetector), and describe them with ORB (cv::DescriptorExtractor::create("ORB")). The resulting descriptors are collected in a Mat object and handed to cv::BOWKMeansTrainer. You can find the whole method at the end of this text.

I am using Ubuntu 13.10 on a 12 core server with 32 GB of RAM. OpenCV version is 2.4.8.

My problem is a segmentation fault that occurs often (but not always); see the following backtrace from a core dump.

Program terminated with signal 11, Segmentation fault.
#0  __GI___libc_free (mem=0x1ff5133c010) at malloc.c:2892
(gdb) bt
#0  __GI___libc_free (mem=0x1ff5133c010) at malloc.c:2892
#1  0x00007f85b336311f in cv::kmeans(cv::_InputArray const&, int, cv::_OutputArray const&, cv::TermCriteria, int, int, cv::_OutputArray const&) ()
    from /usr/lib/libopencv_core.so.2.4
#2  0x00007f85b3014fd2 in cv::BOWKMeansTrainer::cluster(cv::Mat const&) const
    () from /usr/lib/libopencv_features2d.so.2.4
#3  0x00007f85b30176c4 in cv::BOWKMeansTrainer::cluster() const ()
    from /usr/lib/libopencv_features2d.so.2.4
#4  0x0000000000475631 in ORBClassifier::buildVocabulary (this=0x7fff6f5b34f0,
    dictionarySize=1024, stepSize=6, frameSkip=160)
    at ../src/orb/ORBClassifier.cpp:555
#5  0x0000000000474d2f in ORBClassifier::buildVocabulary (this=0x7fff6f5b34f0)
    at ../src/orb/ORBClassifier.cpp:435
#6  0x000000000045a3c5 in startJob (pm=...) at ../src/segmentation.cpp:937
#7  0x000000000045a90d in main () at ../src/segmentation.cpp:966

I am using 14 video chunks with an overall length of about two hours. The vocabularies to be built have sizes of 1k, 2k and 4k. The number of descriptors the clusters are built from goes up to 60,000,000. The segmentation fault also occurs with far fewer descriptors (around 4 million). On the other hand, I have had successful runs using all videos (about 60 million descriptors), too. Restarting the same binary could, and sometimes did, lead to a segfault again.

What I have done so far:

- Tested with single videos
- Tested with different sets of videos
- Tested with different parameters
- Ran Valgrind (no apparent abnormalities in a single-video run)

I am thankful for all help and information. Tschey

cv::Mat ORBClassifier::buildVocabulary(int dictionarySize, int stepSize,
    int frameSkip) 
{
    std::vector<boost::filesystem::path> videoFiles;
    Tools::fillVideoList(videoFiles, pm.getStringValue(PARAM_BUILD_VOCABULARY_DIRECTORY));

    // define detector and descriptor extractor
    cv::Ptr<cv::FeatureDetector> detector(
        new cv::DenseFeatureDetector(1.f, 1, 0.1f, stepSize, 0, true,
                false));
    cv::Ptr<cv::DescriptorExtractor> descriptor(
        cv::DescriptorExtractor::create("ORB"));

    // start with 0 rows; constructing with 1 row would leave an
    // uninitialized first row in the training data
    cv::Mat training_descriptors(0, descriptor->descriptorSize(), CV_32F);

    for (std::vector<boost::filesystem::path>::iterator it = videoFiles.begin();
        it != videoFiles.end(); ++it)
    {
        // load video
        VideoHandler videoHandler;
        videoHandler.initializeVideo(*it);
        cv::Mat frame;

        bool isRun = true;
        long frameNumber = 0;

        while (isRun)
        {
            // skip frameSkip frames; stop early if the video ends mid-skip
            for (int i = 0; i < frameSkip; ++i)
            {
                frameNumber = videoHandler.readNextFrame(frame);
                if (frameNumber < 0)
                    break;
            }

            if (frameNumber >= 0)
            {
                frameNumber = videoHandler.readNextFrame(frame);
            }
            if (frameNumber < 0)
            {
                isRun = false;
                continue;
            }

            std::vector<cv::KeyPoint> keypoints;
            cv::Mat descriptors;
            detector->detect(frame, keypoints);
            descriptor->compute(frame, keypoints, descriptors);
            if (!descriptors.empty())  // frames without keypoints yield no rows
            {
                descriptors.convertTo(descriptors, CV_32FC1);
                training_descriptors.push_back(descriptors);
            }
        }
    }

    cv::Mat vocabulary;

    // defining terms for bowkmeans trainer
    cv::TermCriteria tc(CV_TERMCRIT_ITER | CV_TERMCRIT_EPS, 10, 0.001);  
    int retries = 1;
    int flags = cv::KMEANS_PP_CENTERS;
    cv::BOWKMeansTrainer bowTrainer(dictionarySize, tc, retries, flags);

    bowTrainer.add(training_descriptors);
    vocabulary = bowTrainer.cluster();

    // store vocabulary to filesystem
    std::string fileName = "vocabulary.yml";
    cv::FileStorage fs1(fileName, cv::FileStorage::WRITE);
    fs1 << "vocabulary" << vocabulary;
    fs1.release();

    return vocabulary;
}
