Is there a way to import a RapidMiner MLP-ANN in OpenCV?

I trained and validated an MLP model in RapidMiner Studio. My input values are already normalized to [-1, 1]. As far as I understand, the MLP is fully defined by its weights. As you can see here, the ANN has one hidden layer: http://i.stack.imgur.com/qhVP0.png

Now I'm trying to import this model into OpenCV, as I don't want to retrain it from scratch. I exported all weights per node, plus the bias values, from RapidMiner.

OpenCV offers the function CvANN_MLP::load(), which can read an XML or YML file. I tried to modify an existing YML config for my needs; as you can see below, I already defined the dimensions of the layers/data.
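
Here is a minimal sketch of how I load the file (the name "mlp_from_rapidminer.yml" is just a placeholder for the file shown further down, and "mlp" is the top-level node name used in that file):

    #include <opencv2/core/core.hpp>
    #include <opencv2/ml/ml.hpp>
    #include <iostream>

    int main()
    {
        // Load the hand-written YAML; "mlp" is the node name from the file below.
        CvANN_MLP mlp;
        mlp.load("mlp_from_rapidminer.yml", "mlp");

        // Quick sanity check that the layer sizes were picked up (I expect 3 layers).
        std::cout << "layer count: " << mlp.get_layer_count() << std::endl;
        return 0;
    }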

I have 23 inputs, 15 hidden nodes and 5 outputs, so I get (23 + 1) * 15 = 360 weights for the hidden layer and (15 + 1) * 5 = 80 weights for the output layer.

My main questions are:

  • Is this even possible?
  • What is the correct order for the values?
  • Where are the bias values located in the weights arrays?
  • What exactly does the output scaling (output_scale / inv_output_scale) do?
  • How do I determine the predicted label from the response matrix in OpenCV? (Is it the index of the largest value? See the sketch right after this list for what I currently assume.)
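
For the last point, this is the sketch I currently have in mind (again with the placeholder file name from above); I simply take the column index of the largest value in the response row, but I'm not sure this is the intended interpretation:

    #include <opencv2/core/core.hpp>
    #include <opencv2/ml/ml.hpp>
    #include <iostream>

    int main()
    {
        CvANN_MLP mlp;
        mlp.load("mlp_from_rapidminer.yml", "mlp");   // placeholder name, see above

        // One sample with 23 features, already scaled to [-1, 1] (dummy zeros here).
        cv::Mat sample = cv::Mat::zeros(1, 23, CV_32FC1);

        cv::Mat response;                  // predict() fills this with a 1 x 5 row
        mlp.predict(sample, response);

        // My assumption: the predicted class is the column holding the largest value.
        cv::Point maxLoc;
        cv::minMaxLoc(response, 0, 0, 0, &maxLoc);
        std::cout << "predicted label index: " << maxLoc.x << std::endl;
        return 0;
    }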

I already tried to import the modified file, and the program compiles. It computes something, but I'm not sure it actually worked.

Here is my YML File:

%YAML:1.0
mlp: !!opencv-ml-ann-mlp
   layer_sizes: !!opencv-matrix
      rows: 1
      cols: 3
      dt: i
      data: [ 23, 15, 5 ]
   activation_function: SIGMOID_SYM
   f_param1: 0.6666666666666666
   f_param2: 1.7159000000000000
   min_val: -0.9500000000000000
   max_val: 0.9500000000000000
   min_val1: -0.9800000000000000
   max_val1: 0.9800000000000000
   training_params:
      train_method: RPROP
      dw0: 0.1000000000000000
      dw_plus: 1.2000000000000000
      dw_minus: 0.5000000000000000
      dw_min: 1.1920928955078125e-07
      dw_max: 50.
      term_criteria: { epsilon:9.9999997764825821e-03, iterations:1000 }
   input_scale: [ 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1, 1, -1 ]
   output_scale: [ 1, -1, 1, -1, 1, -1, 1, -1, 1, -1 ]
   inv_output_scale: [ 1, -1, 1, -1, 1, -1, 1, -1, 1, -1 ]
   weights:
      - [ 10.063, 8.812, 3.996, 19.716, -10.239, 2.720, -21.349, 16.635, 0.797, -0.906, -3.003, -5.882, -2.595, -0.957, -4.255, -2.008, 2.978, 17.362, 2.246, 9.740, 0.491, 3.492, 23.299, 10.214,
          31.730, -23.089, 0.369, -72.193, 2.193, -9.687, 4.192, -26.858, 2.780, 5.791, 0.348, -3.331, 2.822, -15.520, -9.149, -16.861, -10.512, -17.079, -14.414, -14.371, -0.278, 10.420, -3.733, -1.921,
          -0.198, 50.929, 0.355, 3.136, 4.892, 0.496, -10.206, -2.844, 0.606, 1.570, -3.054, 6.012, 1.654, -2.043, -2.194, -3.776, -4.745, -6.988, -4.795, -0.397, -2.280, -7.741, -12.301, -4.852,
          21.052, 17.829, -5.563, -4.054, -4.946, 0.678, 12.326, 3.211, -0.077, 0.134, 2.265, 1.193, -0.252, -1.217, 2.801, -6.759, -0.816, -1.864, 3.791, -1.765, -0.300, -0.838, 2.362, 4.991,
          9.959, 5.587, 0.087, 1.134, -0.840, -2.012, 9.899, 3.960, -1.249, 2.009, -1.026, 0.461, 0.856, 0.985, -1.318, -3.725, -0.370, -1.032, 3.027, 1.670, 3.671, 3.363, -0.966, 0.772,
          2.141, 6.319, 1.173, 37.751, 2.052, -4.105, -33.639, -18.147, -1.244, -1.832, -1.618, 2.928, 0.035, -6.696, -3.505, -4.559, 0.323, -1.206, -10.850, 0.375, 2.033, -6.208, 7.072, -2.194,
          -106.173, 23.689, -1.622, -32.154, 12.832, -55.287, -5.667, 20.985, 16.064, 4.698, 3.423, 9.451, 3.948, 1.577, -2.676, 15.881, -3.287, -6.384, 2.212, 3.454, -17.373, 0.468, -0.158, -12.710,
          -50.325, -19.556, 11.445, 4.624, 17.463, 40.003, -18.085, -40.158, -4.420, -3.204, 9.482, 16.973, -0.983, 5.422, -3.963, 3.192, 4.276, 0.048, 0.945, -0.705, 12.549, -20.505, -23.406, -17.500,
          2.170, 25.429, -0.580, -43.367, -12.480, 45.753, -14.348, -52.088, 7.153, -0.057, -2.941, -3.973, -1.758, -2.525, -4.490, -4.230, -5.430, -4.688, -3.771, -3.549, -0.440, -7.770, -2.080, 12.518,
          11.252, 7.898, -0.520, -17.707, -4.407, -5.331, 30.505, 13.269, 0.283, 4.721, -1.135, -1.546, -0.606, 7.173, -1.892, -7.053, -7.691, 0.806, 5.935, 9.022, -0.864, 12.893, -4.822, 4.370,
          -1.093, 9.921, -0.802, -7.914, -1.993, -1.787, -6.430, -5.814, 0.252, 0.518, -0.331, -3.952, -1.088, 0.780, 0.433, 0.413, -1.318, 2.360, 0.420, 1.616, 0.146, 0.591, 0.732, 2.001,
          57.532, -41.869, -12.159, -5.874, 5.427, -8.832, 23.333, 30.359, -1.924, 0.381, 2.149, 5.306, 1.218, -0.096, 6.807, 0.768, -1.179, -2.081, 3.161, 3.027, -4.115, 4.467, -8.611, -5.322,
          -7.083, 2.329, -1.774, -7.084, 3.629, 2.580, 8.041, -30.396, 4.485, 3.507, 5.641, 3.479, 4.649, 16.726, -10.149, -2.149, 12.574, -3.936, -14.388, -16.592, 10.642, -16.776, -28.675, -3.649,
          -20.185, -3.553, -0.956, 21.986, 2.851, 5.577, -15.121, 2.955, 1.307, -1.009, 2.845, -0.853, 1.353, 2.904, 3.493, 0.152, -0.209, 15.398, 13.453, -1.700, -4.743, -0.650, 16.701, -2.937,
          7.023, 0.194, 1.299, -50.949, -2.719, -6.921, 22.979, 25.758, 1.773, 1.856, 1.130, -0.082, 0.491, 3.782, 0.131, -2.763, -2.333, -2.575, -3.118, 3.226, 2.422, -0.072, 1.284, 2.726 ]
      - [ -7.520, -9.371, -22.550, -18.724, -6.839, -2.827, 8.158, 7.286, -7.619, -14.201, -1.802, -7.981, 17.489, -12.048, -0.711, -2.863,
          5.659, 9.175, 17.781, -3.963, -16.634, 17.243, -7.931, -7.002, 7.120, -19.737, -13.348, 7.506, -17.768, 12.578, 1.903, -10.863,
          7.543, -7.573, 0.975, 6.101, -3.182, -8.354, -0.534, -1.656, -7.454, 11.196, -3.254, -3.823, 0.541, -9.587, -25.693, -1.856,
          -1.394, -12.486, -0.036, -2.638, 6.597, -12.324, -0.735, -6.608, 0.773, 2.329, -13.445, -0.192, -0.605, 5.244, 15.621, -13.113,
          -4.430, 17.325, 3.450, 3.499, -11.163, -14.549, 0.957, -9.089, 1.989, 9.820, 22.517, 7.370, -9.074, -3.695, 1.741, -16.864 ]

By the way, sorry for asking this question twice, here and on Stack Overflow; I simply found this forum too late and think it's still the appropriate place to ask. :) Any help is appreciated.