Loading OpenCV SVM from xml model on iOS gives incorrect alpha and support vector indices

asked 2017-06-09 13:14:20 -0500

updated 2017-06-09 18:33:02 -0500


The issue is that when loading an OpenCV SVM model on iOS, the loaded alpha values and support vector indices differ from those loaded on OS X. Loading appears to go through cv::FileNode::readRaw (for float and double data); what would cause this to read differently on iOS? The values load correctly on OS X, but they are not stored in the e^a form.

A test environment for this can be found here.


An OpenCV SVM model is being trained on OS X using OpenCV 3.2.0, built from source for macOS, with the corresponding framework used on iOS. The issue can also be reproduced with OpenCV 3.1 and OpenCV 2.4. The model is trained with cv::ml::SVM::trainAuto, using C-Support Vector Classification (C_SVC) and a radial basis function (RBF) kernel; a linear kernel has also been tried.


Loading uses the generic cv::ml::SVM::load(path) function.


Clone the repository and run the program. To retrieve the SVM info text file, go to Xcode > Window > Devices > select the device the app is running on > select the app in the Installed Apps list > choose Download Container from the settings (gear) button. The file will be located at svminfo.txt inside the container.

On the OS X/Linux/Windows side, the tests are located in CVLoadTest:

mkdir build && cd build
cmake ..
make
./src/main
