Converting OpenCV grayscale Mat to Caffe blob

asked 2015-08-25 08:28:49 -0600

jackbrucesimspon

updated 2015-08-25 08:31:16 -0600

I've been following an example I was referred to that shows how to convert an OpenCV Mat into a Caffe blob I can make predictions from. From what I understand, the first section sets a scale factor for the image and then initialises Caffe's TransformationParameter and a DataTransformer:

const float img_to_net_scale = 0.0039215684;
TransformationParameter input_xform_param;
input_xform_param.set_scale( img_to_net_scale );
DataTransformer<float> input_xformer( input_xform_param, TEST );
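As far as I can tell, that scale factor is just 1/255, so the transformer maps the 8-bit pixel values into [0, 1]. My understanding (this is my own sketch, not part of the example) is that it is roughly equivalent to doing the conversion manually in OpenCV:

// My own sketch: "gray" is a hypothetical 8-bit grayscale image, and
// convertTo rescales it into a float Mat with values in [0, 1].
cv::Mat gray = cv::imread( "patch.png", CV_LOAD_IMAGE_GRAYSCALE );
cv::Mat gray_float;
gray.convertTo( gray_float, CV_32FC1, 1.0 / 255.0 );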

Then the OpenCV Mat "patch" is converted into "input_blob". I've changed this part because I load my image in grayscale instead of colour.

cv::Mat patch = cv::imread( input_file_str, CV_LOAD_IMAGE_GRAYSCALE );
Blob<float> input_blob;
input_blob.Reshape(1, patch.channels(), patch.rows, patch.cols );
input_xformer.Transform( patch, &input_blob );
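To sanity-check the grayscale change I've also been comparing the blob's shape against the net's input blob. This is my own addition, not from the example, and it assumes "net" is the already-loaded Net<float>:

// My own check: the deploy prototxt needs to expect a single channel
// if the patch is grayscale, and the spatial size has to match too.
Blob<float>* net_input = net->input_blobs()[0];
CHECK_EQ( net_input->channels(), patch.channels() );
CHECK_EQ( net_input->height(), patch.rows );
CHECK_EQ( net_input->width(), patch.cols );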

Finally, I'm not too sure what this last section does: if I've already converted my OpenCV Mat to a Caffe blob, why do I need to push it onto the "input" vector and pass that to the net? Can't I pass input_blob directly into the net to get my prediction back?

std::vector<Blob<float>*> input;
input.push_back( &input_blob );

std::vector<Blob<float>*> output = net->Forward( input );
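For reference, this is how I'm reading the prediction back out afterwards, based on my (possibly wrong) understanding that output[0] holds the class scores as a flat float array:

#include <algorithm>  // for std::max_element

// My own sketch for reading the result: cpu_data() exposes the scores,
// and the index of the largest score is the predicted class.
const float* scores = output[0]->cpu_data();
const int num_outputs = output[0]->count();
const int predicted_class =
    std::max_element( scores, scores + num_outputs ) - scores;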

Comments

I think this is a Caffe-related question, or simply that Net::Forward(const vector<Blob<Dtype>*>& bottom, ...) requires a vector of Blobs.

pklab ( 2015-08-25 09:45:06 -0600 )