Loading OpenCV SVM from xml model on iOS gives incorrect alpha and support vector indices
Overview
The issue is that when loading an OpenCV SVM model from XML, the loaded alpha values and support vector indices differ between OS X and iOS. Loading appears to go through cv::FileNode::readRaw (for float and double values) — what would cause this to read differently on iOS? The values look correct in the XML file, but they are not stored in the e^a (scientific-notation exponent) form.
A test environment for this can be found here.
Train
An OpenCV SVM model is trained on OS X using OpenCV 3.2.0, built from source for the Mac and as a framework for iOS. The issue can also be reproduced with OpenCV 3.1 and OpenCV 2.4. Training uses cv::ml::SVM::trainAuto with type C_SVC (C-Support Vector Classification) and a radial basis function (RBF) kernel; a linear kernel has also been tried with the same result.
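For reference, a minimal sketch of the training setup described above. The toy samples and labels here are placeholders (the real data lives in the linked test repository); the type, kernel, and trainAuto call match what the question describes:

```cpp
// Minimal training sketch: C_SVC + RBF kernel, parameters found via trainAuto.
// The sample data below is hypothetical; the real model uses the repo's dataset.
#include <opencv2/ml.hpp>

int main() {
    using namespace cv;
    using namespace cv::ml;

    // Toy 2-D samples with two classes (placeholder data).
    Mat samples = (Mat_<float>(4, 2) << 0.f, 0.f, 1.f, 1.f, 0.f, 1.f, 1.f, 0.f);
    Mat labels  = (Mat_<int>(4, 1)   << 1, 1, -1, -1);

    Ptr<SVM> svm = SVM::create();
    svm->setType(SVM::C_SVC);   // C-Support Vector Classification
    svm->setKernel(SVM::RBF);   // radial basis function kernel

    Ptr<TrainData> data = TrainData::create(samples, ROW_SAMPLE, labels);
    svm->trainAuto(data);       // cross-validated grid search over C, gamma, etc.

    svm->save("model.xml");     // serialized to XML via cv::FileStorage
    return 0;
}
```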
Load
Loading uses the generic cv::ml::SVM::load(path) function.
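A sketch of the loading side, including how to dump the alpha values and support vector indices so they can be diffed between the OS X and iOS runs (getDecisionFunction exposes exactly the data that differs; the file name is assumed):

```cpp
// Load the serialized model and print the raw decision-function data
// (rho, alpha, support-vector indices) for cross-platform comparison.
#include <opencv2/ml.hpp>
#include <iostream>

int main() {
    using namespace cv;
    using namespace cv::ml;

    // Equivalent to cv::ml::SVM::load(path) in 3.x.
    Ptr<SVM> svm = Algorithm::load<SVM>("model.xml");

    Mat alpha, svidx;
    double rho = svm->getDecisionFunction(0, alpha, svidx);

    std::cout << "rho:   " << rho   << "\n"
              << "alpha: " << alpha << "\n"
              << "svidx: " << svidx << "\n"
              << "support vectors:\n" << svm->getSupportVectors() << std::endl;
    return 0;
}
```

Running this on both platforms and comparing the printed alpha and svidx values should isolate whether the divergence happens at parse time.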
Reproducing
Clone the repository and run the program on an iOS device. To retrieve the SVM info text file, go to Xcode > Window > Devices, select the connected device, select the app in the Installed Apps list, and download the container via the settings (gear) button. The output is located in svminfo.txt inside the container.
On the OS X/Linux/Windows side, the tests are located in CVLoadTest. Build and run with: mkdir build && cd build && cmake .. && make && ./src/main