Finding a good regularization parameter lambda ( λ ) for a machine learning model is important to avoid under-fitting (high bias) and over-fitting (high variance).
If lambda is too large, all theta ( θ ) values are penalized heavily, so the hypothesis ( h ) tends to zero (high bias, under-fitting).
If lambda is too small, regularization has almost no effect (high variance, over-fitting).
A cross-validation set can be used to select a good lambda: plot the error versus lambda for both the training data and the validation data, and pick the lambda where the validation error is lowest.
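The selection procedure above can be sketched in Octave. This is a minimal illustration, not the post's actual code: it uses regularized linear regression solved by the normal equation on synthetic data, and the lambda grid is an assumption.

```octave
% Sketch: pick lambda by comparing training error vs. validation error.
% Synthetic data and the lambda grid below are illustrative assumptions.
rand('seed', 1); randn('seed', 1);
m = 50;
X = [ones(m,1), rand(m,1)*10];      % design matrix with bias column
y = 2 + 0.5*X(:,2) + randn(m,1);    % noisy linear target
Xval = [ones(20,1), rand(20,1)*10]; % cross-validation set
yval = 2 + 0.5*Xval(:,2) + randn(20,1);

lambdas   = [0 0.01 0.1 1 10 100];
err_train = zeros(size(lambdas));
err_val   = zeros(size(lambdas));
for i = 1:numel(lambdas)
  L = lambdas(i) * eye(2);
  L(1,1) = 0;                        % do not regularize the bias term
  theta = (X'*X + L) \ (X'*y);      % regularized normal equation
  err_train(i) = mean((X*theta - y).^2) / 2;     % unregularized errors
  err_val(i)   = mean((Xval*theta - yval).^2) / 2;
end
% plot(lambdas, err_train, lambdas, err_val);  % errors vs. lambda
[~, best] = min(err_val);            % lambda with lowest validation error
printf('best lambda = %g\n', lambdas(best));
```

With very large lambda the fitted theta shrinks toward zero and both errors rise (high bias); the validation-error minimum marks the useful middle ground.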
Wednesday, March 4, 2015
Thursday, February 26, 2015
Handwritten Digits Recognition, Experiment with Octave's Neural Network Package "nnet", and RSNNS
This is a note on implementing handwritten digit recognition, training a neural network with the Octave nnet package (or the MATLAB Neural Network Toolbox).
At the end, I play around with R code and the RSNNS library (Stuttgart Neural Network Simulator for R).
GitHub, Octave/MATLAB:
https://github.com/flyingdisc/handwritten-digits-recognition-octave-nnet
GitHub, R - RSNNS:
https://github.com/flyingdisc/handwritten-digits-recognition-RSNNS
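For readers unfamiliar with the nnet package, the training flow looks roughly like this. This is a hedged sketch of the MATLAB-toolbox-style API (`newff` / `train` / `sim`), not the repository's code; the tiny random "digit" data and the layer sizes are placeholder assumptions (the real project uses 20x20-pixel images, i.e. 400-dimensional inputs).

```octave
% Sketch of a feed-forward network for digit recognition with Octave's
% nnet package. Data here is random placeholder, not real digit images.
pkg load nnet;                      % Octave-Forge nnet package
nInput = 400; nHidden = 25; nOutput = 10;
P = rand(nInput, 100);              % one training example per column
T = eye(10)(:, randi(10, 1, 100));  % one-hot digit labels per column
% Network: 25 hidden tansig units, 10 linear outputs, Levenberg-Marquardt.
net = newff(minmax(P), [nHidden nOutput], {'tansig', 'purelin'}, 'trainlm');
net.trainParam.epochs = 50;         % keep the sketch cheap to run
net = train(net, P, T);
Y = sim(net, P);                    % network outputs, 10 x 100
[~, predicted] = max(Y);            % predicted digit class per column
```

The RSNNS version in the second repository follows the same shape, with `mlp()` taking the place of `newff`/`train`.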
----