
for machine learning with opencv, you need a continuous MxN float Mat (CV_32F), where M (rows) is the number of feature vectors, and N (cols) is the number of features per vector. you also need an Mx1 int Mat (CV_32S) with the labels, one per feature row.

like this:

  feature1        1
  feature2        1
  feature3        2
  ...

you can construct your training data manually (pseudocode):

Mat data, labels; // initially empty.
for each featurevec:
    // this is the same format we need later for testing, see below !:
    Mat row;
    row.push_back(2.2f); // compactness
    row.push_back(1.2f); // aspect
    row.push_back(22.0f); // orient
    data.push_back(row.reshape(1,1)); // flat row

    labels.push_back(17); // the (int) class label for this sample

Ptr<ml::TrainData> td = ml::TrainData::create(data, ml::ROW_SAMPLE, labels);

but if you already have a csv file, and it looks like this:

compactness, aspect_ratio, orientation, label
2.2, 1.4, 27, 1
3.2, 3.4, 7, 1
1.2, 2.4, 17, 2
2.6, 1.2, 2, 2

then it's a piece of cake:

    // train:
    Ptr<ml::TrainData> td = ml::TrainData::loadFromCSV("my.csv", 1); // skip 1 header row
    Ptr<ml::KNearest> knn = ml::KNearest::create();
    knn->train(td);

    // later, test:
    Mat test;
    test.push_back(2.2f); // compactness
    test.push_back(1.2f); // aspect
    test.push_back(22.0f); // orient
    // reshape to row-vec, and predict:
    Mat res;
    knn->findNearest(test.reshape(1,1), 3, res);
    cerr << res << endl;
