The documentation for the neural network class needs some serious updating. For starters, it does not support a standard sigmoid activation function, whose output lies in [0,1]. Instead it supports the tanh function, which outputs values in [-1,1], so your output matrix should be scaled to [-1,1] as well. But even then, the output of predict() isn't capped to [-1,1], because it actually applies 1.7159*tanh(2/3*x), a popular modification to avoid numerical saturation. So your output really lies in [-1.7159, 1.7159], which can be VERY confusing if you're new.
Putting all that aside, you just have to binarise your output to get the desired results. For example:
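A minimal sketch of the binarisation step, assuming the raw predict() outputs have been copied into a NumPy array (the sample values here are made up for illustration): since the scaled tanh is symmetric around 0, thresholding at 0 maps each value to a clean 0/1 label.

```python
import numpy as np

# Hypothetical raw outputs from predict(); because the network applies
# 1.7159 * tanh(2/3 * x), these values lie in [-1.7159, 1.7159].
raw = np.array([-1.2, 0.03, 1.5, -0.4])

# Binarise by thresholding at 0: positive -> 1, non-positive -> 0.
binary = (raw > 0).astype(int)
print(binary)  # -> [0 1 1 0]
```

If you trained against targets of -1 and +1 instead, map back with `np.where(raw > 0, 1, -1)`.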