Why doesn't the weight vector's norm grow with unregularized logistic regression?

asked 2020-08-12 05:53:33 -0600

I'm training on a linearly separable dataset, without regularization, for thousands of iterations with a learning rate of 0.001 (in fact I have tried several combinations, and several datasets).

Given that the dataset is clearly linearly separable, I would expect two things: 100% accuracy, and an ever-increasing norm ("module") of the vector defined by the learned weights (discarding the first one, i.e. the bias term). After all, with no regularization the logistic loss on separable data can always be reduced further by scaling the weights up, so the optimum lies at infinity.

But when I compare the weights learned with and without L2 regularization, they only differ in the fourth or fifth decimal digit...
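For reference, here is a minimal sketch of the kind of comparison I mean (plain NumPy batch gradient descent, not my actual training code; the two-blob dataset and the lambda values are made up for illustration):

import numpy as np

rng = np.random.default_rng(0)

# Linearly separable 2-D dataset: two Gaussian blobs far apart.
n = 100
X = np.vstack([rng.normal(-3, 1, (n, 2)), rng.normal(3, 1, (n, 2))])
y = np.hstack([np.zeros(n), np.ones(n)])
Xb = np.hstack([np.ones((2 * n, 1)), X])  # prepend a bias column

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(lam, lr=0.001, iters=10000):
    # Batch gradient descent on the logistic loss; lam is the
    # L2 strength (0 disables regularization). Illustrative values.
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        grad = Xb.T @ (sigmoid(Xb @ w) - y) / len(y)
        grad[1:] += lam * w[1:]  # L2 penalty, bias excluded
        w -= lr * grad
    return w

for lam in (0.0, 1.0):
    w = train(lam)
    print(f"lambda={lam}: ||w[1:]|| = {np.linalg.norm(w[1:]):.6f}")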

Is that correct?


Comments

"an increasing module of the vector"

can you explain "module" here?

berak ( 2020-08-12 07:30:16 -0600 )

can you show your code? Also, a mini dataset for reproducing your problem would be useful.

berak ( 2020-08-12 07:40:52 -0600 )