TrainSVM convert_to_ml error

Hi,

I am attempting to train the defaultPeopleDetector for the HOG person detection C++ example. I get an out-of-memory error at the convert_to_ml stage.

/*
* Convert training/testing set to be used by OpenCV Machine Learning algorithms.
* TrainData is a matrix of size (#samples x max(#cols,#rows) per sample), in 32FC1.
* Samples are transposed if needed.
*/
void convert_to_ml(const std::vector< cv::Mat > & train_samples, cv::Mat& trainData)
{
    //--Convert data
    const int rows = (int)train_samples.size();
    const int cols = (int)std::max(train_samples[0].cols, train_samples[0].rows);
    cv::Mat tmp(1, cols, CV_32FC1); //< used for transposition if needed
    trainData = cv::Mat(rows, cols, CV_32FC1);
    std::vector< cv::Mat >::const_iterator itr = train_samples.begin();
    std::vector< cv::Mat >::const_iterator end = train_samples.end();
    for (int i = 0; itr != end; ++itr, ++i)
    {
        CV_Assert(itr->cols == 1 ||
            itr->rows == 1);
        if (itr->cols == 1)
        {
            transpose(*(itr), tmp);
            tmp.copyTo(trainData.row(i));
        }
        else if (itr->rows == 1)
        {
            itr->copyTo(trainData.row(i));
        }
    }
}
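
In case it helps narrow things down: trainData is allocated as rows x cols in CV_32FC1, so the memory it needs grows with the number of samples times the HOG descriptor length. Here is a rough back-of-the-envelope sketch of that allocation (the sample count below is a made-up placeholder, not from my actual run; 3780 is the descriptor length for the default 64x128 HOG window):

#include <cstddef>
#include <cstdio>

int main()
{
    // Hypothetical numbers for illustration only.
    const std::size_t samples = 100000;        // placeholder: total positive + negative windows
    const std::size_t descriptor_len = 3780;   // default HOG descriptor length for a 64x128 window
    const std::size_t bytes = samples * descriptor_len * sizeof(float); // CV_32FC1 = 4 bytes/element

    std::printf("trainData alone would need about %.1f MB\n", bytes / (1024.0 * 1024.0));
    return 0;
}

With that element size, a few hundred thousand negative windows could already push the single trainData matrix into the gigabyte range, on top of the per-sample Mats still held in train_samples.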

Does anyone know why this would occur? I'm using the INRIA person data set for training.