Topics to be covered:
- Multi-layer Perceptrons
- The neural viewpoint
Additional learning materials:
- Linear backprop example
- Colah's blog post on backprop
- Michael Nielsen's tutorial on backpropagation
- Neural Networks and Deep Learning book, Chapter 2
- Why do you need backpropagation in neural networks?
- How do you compute the partial derivative of the loss function with respect to each of the parameters?
- How do you differentiate each of the various non-linearities, like sigmoid, tanh and ReLU?
- Why do we need a vectorised implementation?
- What is the Jacobian matrix?
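The questions above can be made concrete with a short sketch: the local derivatives of the common non-linearities, a centred finite-difference check, and the observation that an elementwise non-linearity has a diagonal Jacobian. All function names and test points here are illustrative, not part of the course materials.

```python
import numpy as np

# Each non-linearity paired with its local gradient (names are illustrative).

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)            # d/dx sigmoid(x) = s(x) * (1 - s(x))

def tanh_grad(x):
    return 1.0 - np.tanh(x) ** 2    # d/dx tanh(x) = 1 - tanh(x)^2

def relu_grad(x):
    return (x > 0).astype(x.dtype)  # subgradient: 1 where x > 0, else 0

# Sanity-check one analytic gradient against a centred finite difference.
x = np.array([-2.0, -0.5, 0.3, 1.7])
h = 1e-5
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
assert np.allclose(sigmoid_grad(x), numeric, atol=1e-8)

# For an elementwise non-linearity the Jacobian dy/dx is diagonal,
# which is why vectorised backprop can multiply elementwise instead of
# materialising the full matrix. E.g. for ReLU at a 3-vector:
J = np.diag(relu_grad(np.array([-1.0, 2.0, 3.0])))  # 3x3, diagonal
```

Because the Jacobian of an elementwise op is diagonal, frameworks never build it explicitly; they just multiply the upstream gradient elementwise by the local gradient.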
Hands-on code implementation / Assignment:
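As a warm-up for the hands-on part, here is a minimal sketch of a two-layer MLP trained with manually derived, fully vectorised backpropagation. The data, layer sizes, learning rate, and step count are all illustrative assumptions, not the actual assignment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 20 samples, 4 features, 3 classes (all sizes illustrative).
X = rng.normal(size=(20, 4))
y = rng.integers(0, 3, size=20)

# Two-layer MLP: affine -> ReLU -> affine -> softmax cross-entropy.
W1 = rng.normal(scale=0.1, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 3)); b2 = np.zeros(3)

lr = 0.1
for step in range(300):
    # Forward pass
    z1 = X @ W1 + b1
    h = np.maximum(0, z1)                          # ReLU
    scores = h @ W2 + b2
    scores = scores - scores.max(axis=1, keepdims=True)  # numerical stability
    exp = np.exp(scores)
    probs = exp / exp.sum(axis=1, keepdims=True)
    loss = -np.log(probs[np.arange(len(y)), y]).mean()

    # Backward pass: one matrix op per layer, no per-sample Python loops.
    dscores = probs.copy()
    dscores[np.arange(len(y)), y] -= 1             # softmax + CE gradient
    dscores /= len(y)
    dW2 = h.T @ dscores; db2 = dscores.sum(axis=0)
    dh = dscores @ W2.T
    dz1 = dh * (z1 > 0)                            # ReLU local gradient
    dW1 = X.T @ dz1; db1 = dz1.sum(axis=0)

    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

After training, the loss should fall below the chance-level value ln(3) ≈ 1.10, showing that the hand-written gradients point downhill.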