Neural Networks Demystified, Part 6: Training

After all that work, it's finally time to train our neural network. We'll use the BFGS numerical optimization algorithm and have a look at the results.

Supporting code: http://nbviewer.ipython.org/github/stephencwelch/Neural-Networks-Demysitifed/blob/master/Part%206%20Training.ipynb
Yann LeCun's Efficient BackProp paper: http://yann.lecun.com/exdb/publis/pdf/lecun-98b.pdf
More on BFGS: http://en.wikipedia.org/wiki/Broyden%E2%80%93Fletcher%E2%80%93Goldfarb%E2%80%93Shanno_algorithm

In this series, we will build and train a complete Artificial Neural Network in Python.
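A minimal sketch of what training with BFGS looks like in Python, using SciPy's `minimize` routine. The objective here is a toy quadratic standing in for the network's cost function; in the videos, the parameter vector would be the network's unrolled weights.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for the network's cost function J(params).
# The target [1, -2] is arbitrary, chosen so the minimum is known.
TARGET = np.array([1.0, -2.0])

def cost(params):
    return np.sum((params - TARGET) ** 2)

def cost_grad(params):
    # Analytic gradient of the toy cost (what backprop supplies
    # for a real network).
    return 2 * (params - TARGET)

# BFGS iteratively builds an approximation to the inverse Hessian
# from gradient evaluations, so we pass the gradient via jac=.
res = minimize(cost, x0=np.zeros(2), jac=cost_grad, method='BFGS')
```

`res.x` converges to the known minimum `[1, -2]`; for the real network, the optimized parameters would be reshaped back into the weight matrices.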

Neural Networks Demystified, Part 5: Numerical Gradient Checking

When building complex systems like neural networks, checking portions of your work can save hours of headaches. Here we'll check our gradient computations.

Supporting code: http://nbviewer.ipython.org/github/stephencwelch/Neural-Networks-Demysitifed/blob/master/Part%205%20Numerical%20Gradient%20Checking.ipynb
Link to excellent Stanford tutorial: http://ufldl.stanford.edu/wiki/index.php/UFLDL_Tutorial

In this series, we will build and train a complete Artificial Neural Network in Python. New videos every other Friday.
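The core idea of numerical gradient checking is to compare the analytic gradient against a central-difference approximation. A minimal sketch on a toy cost function (the function and values are illustrative, not the series' actual network cost):

```python
import numpy as np

def f(w):
    return np.sum(w ** 2)        # toy cost function

def analytic_grad(w):
    return 2 * w                 # its known (analytic) gradient

def numerical_gradient(f, w, eps=1e-4):
    # Central difference: perturb one parameter at a time by +/- eps.
    grad = np.zeros_like(w)
    for i in range(w.size):
        perturb = np.zeros_like(w)
        perturb[i] = eps
        grad[i] = (f(w + perturb) - f(w - perturb)) / (2 * eps)
    return grad

w = np.array([0.5, -1.5, 2.0])
num = numerical_gradient(f, w)
ana = analytic_grad(w)

# Relative error should be tiny if the analytic gradient is correct.
err = np.linalg.norm(num - ana) / np.linalg.norm(num + ana)
```

If the analytic gradient has a bug, `err` jumps by many orders of magnitude, which is exactly what makes this check so useful before training.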

Neural Networks Demystified, Part 4: Backpropagation

Backpropagation, as simple as possible, but no simpler. Perhaps the most misunderstood part of neural networks, backpropagation of errors is the key step that allows ANNs to learn. In this video, I give the derivation and the thought process behind backpropagation using high-school-level calculus.
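One common way to write the backward pass for a small sigmoid network is a layer-by-layer application of the chain rule. The sketch below uses hypothetical 2-3-1 layer sizes and random data purely for illustration; the variable names (`delta3`, `dJdW2`, etc.) follow a common convention, not necessarily the exact code in the video:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1 - s)

# Hypothetical 2-input, 3-hidden, 1-output network with random data.
rng = np.random.default_rng(0)
X = rng.random((4, 2))
y = rng.random((4, 1))
W1 = rng.standard_normal((2, 3))
W2 = rng.standard_normal((3, 1))

# Forward pass.
z2 = X @ W1
a2 = sigmoid(z2)
z3 = a2 @ W2
yhat = sigmoid(z3)

# Backward pass: chain rule applied one layer at a time,
# for cost J = 0.5 * sum((y - yhat)^2).
delta3 = -(y - yhat) * sigmoid_prime(z3)   # error at the output layer
dJdW2 = a2.T @ delta3                      # gradient w.r.t. W2
delta2 = (delta3 @ W2.T) * sigmoid_prime(z2)  # error pushed back a layer
dJdW1 = X.T @ delta2                       # gradient w.r.t. W1
```

Each gradient matrix has the same shape as the weight matrix it corresponds to, which is a quick sanity check worth doing before training.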

Neural Networks Demystified, Part 3: Gradient Descent

This time we'll work on strategies for training our neural network. 

Neural Networks Demystified @stephencwelch

Supporting code: http://nbviewer.ipython.org/github/stephencwelch/Neural-Networks-Demysitifed/blob/master/Part%203%20Gradient%20Descent.ipynb
Link to Yann LeCun's talk: http://videolectures.net/eml07_lecun_wia/

In this short series, we will build and train a complete Artificial Neural Network in Python. New videos every other Friday.
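The core of gradient descent fits in a few lines: repeatedly step the parameters opposite the gradient of the cost. A minimal sketch on a one-dimensional toy cost (the function and learning rate here are illustrative, not the network's actual cost):

```python
# Minimize J(w) = (w - 3)^2 with plain gradient descent.
w = 0.0
learning_rate = 0.1

for _ in range(100):
    grad = 2 * (w - 3)        # dJ/dw, computed analytically
    w -= learning_rate * grad  # step downhill
```

After 100 iterations `w` sits very close to the minimum at 3. With a learning rate that is too large the iterates overshoot and diverge; too small and convergence is painfully slow, which is part of why the series later moves to BFGS.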

Neural Networks Demystified, Part 2: Forward Propagation

In Part 2 we'll cover moving inputs across our network, introduce the equations we'll need, and write some code.

Neural Networks Demystified @stephencwelch

Supporting code: http://nbviewer.ipython.org/github/stephencwelch/Neural-Networks-Demysitifed/blob/master/Part%202%20Forward%20Propagation.ipynb

In this short series, we will build and train a complete Artificial Neural Network in Python. New videos every other Friday.
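Forward propagation is two matrix multiplications with a nonlinearity after each. A minimal sketch, assuming a 2-input, 3-hidden, 1-output network with a sigmoid activation (the layer sizes and random weights here are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Three examples with two input features each, normalized to [0, 1]
# so the sigmoid isn't driven into saturation by large raw inputs.
X = np.array([[3.0, 5.0], [5.0, 1.0], [10.0, 2.0]])
X = X / X.max(axis=0)

# Random initial weights; training (later in the series) adjusts these.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((2, 3))   # input -> hidden
W2 = rng.standard_normal((3, 1))   # hidden -> output

z2 = X @ W1         # hidden-layer activity
a2 = sigmoid(z2)    # hidden-layer activation
z3 = a2 @ W2        # output-layer activity
yhat = sigmoid(z3)  # predictions, one per example
```

Because the weights are random, `yhat` is meaningless at this stage; the point is only that inputs flow through the network and come out with the right shape.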