
The documentation for the neural network class needs some serious updating. For starters, it does not support a standard sigmoid activation function, i.e. one whose output lies in [0, 1]. Instead it supports the tanh function, whose output lies in [-1, 1], so your training output matrix should be in [-1, 1] as well. But even then, the output of predict() isn't capped to [-1, 1], because it actually applies 1.7159·tanh(2x/3), a popular modification to avoid numerical saturation. So your output is really capped to [-1.7159, 1.7159], which can be VERY confusing if you're new.
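To see why predicted values can exceed 1, here is a minimal sketch of that scaled tanh (the function name is mine, just for illustration; the constants 1.7159 and 2/3 are the ones mentioned above):

```python
import numpy as np

def scaled_tanh(x):
    # f(x) = 1.7159 * tanh(2x/3): outputs approach +/-1.7159,
    # not +/-1 as plain tanh would
    return 1.7159 * np.tanh(2.0 / 3.0 * x)

x = np.linspace(-10, 10, 1001)
y = scaled_tanh(x)
print(y.min(), y.max())  # close to -1.7159 and 1.7159
```

So a "confident" prediction for a class trained with target 1 will come out near 1.7159, not near 1.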

Putting all that aside, you just have to binarise your output to get the desired results. For example:

  • [1.4031053, 0.99364734], max at class 0 so ==> [1,0] (classified wrong)
  • [1.0640485, -0.40310526], max at class 0 so ==> [1,0]
  • [-0.40303019, 1.4031053], max at class 1 so ==> [0,1]
  • [0.99389786, -0.40310526], max at class 0 so ==> [1,0]
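In code, that binarisation is just an argmax followed by one-hot encoding. A small sketch using the four example rows above (NumPy assumed, since predict() output is just a matrix of floats):

```python
import numpy as np

# The four raw predict() outputs from the example above
raw = np.array([
    [ 1.4031053,   0.99364734],
    [ 1.0640485,  -0.40310526],
    [-0.40303019,  1.4031053 ],
    [ 0.99389786, -0.40310526],
])

# Index of the winning class in each row
labels = np.argmax(raw, axis=1)                 # [0, 0, 1, 0]

# One-hot encode: pick the matching row of the identity matrix
onehot = np.eye(raw.shape[1], dtype=int)[labels]
print(onehot)
# [[1 0]
#  [1 0]
#  [0 1]
#  [1 0]]
```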
