Wiki: CS231n Lecture 4 – Introduction to Neural Networks



Topics to be covered:

  • Backpropagation
  • Multi-layer Perceptrons
  • The neural viewpoint

Lecture Slides
Lecture video

Additional learning materials:

Backprop Notes
Linear backprop example
Derivatives Notes
Efficient BackProp
Colah Blogpost on Backprop
Michael Nielsen tutorial on backpropagation
Neural Networks and Deep Learning Book Chapter 2
MIT Lecture

Discussion Questions:

  • Why do you need backpropagation in neural networks?
  • How do you get the partial derivative of the loss function with respect to each of the parameters?
  • How do you differentiate each of the various non-linearities, such as sigmoid, tanh, and ReLU?
  • Why do we need a vectorised implementation?
  • What is the Jacobian matrix?
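As a starting point for the questions above, here is a minimal sketch (not from the lecture itself) of the derivatives of the three non-linearities, checked against a numerical gradient computed with central differences:

```python
import numpy as np

# Activations and their derivatives, used when backpropagating
# through each non-linearity via the chain rule.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def d_sigmoid(x):
    s = sigmoid(x)
    return s * (1.0 - s)            # sigma'(x) = sigma(x) * (1 - sigma(x))

def d_tanh(x):
    return 1.0 - np.tanh(x) ** 2    # tanh'(x) = 1 - tanh(x)^2

def d_relu(x):
    return (x > 0).astype(x.dtype)  # ReLU'(x) = 1 if x > 0 else 0

# Sanity check: compare against a numerical gradient (central differences).
x = np.array([-1.5, -0.1, 0.2, 2.0])
h = 1e-5
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
assert np.allclose(numeric, d_sigmoid(x), atol=1e-6)
```

The same central-difference check works for any scalar function, which makes it a handy way to debug hand-derived gradients.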

Hands-on code implementation / Assignment:
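As a warm-up (this is an illustrative sketch, not the assignment's API), a vectorized forward and backward pass through a single affine layer shows why vectorization matters: one matrix multiply handles the whole batch, and the Jacobian never needs to be formed explicitly:

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, M = 4, 5, 3  # batch size, input dim, output dim (arbitrary for the demo)

x = rng.standard_normal((N, D))
W = rng.standard_normal((D, M))
b = rng.standard_normal(M)

# Forward: y = xW + b for all N examples at once (no Python loop).
y = x @ W + b

# Backward: given an upstream gradient dL/dy, the chain rule gives
#   dL/dx = dL/dy @ W^T,  dL/dW = x^T @ dL/dy,  dL/db = sum over the batch.
# Each product implicitly applies the (huge, sparse) Jacobian of y
# with respect to x, W, or b without ever materializing it.
dy = rng.standard_normal((N, M))
dx = dy @ W.T
dW = x.T @ dy
db = dy.sum(axis=0)

# Gradients always have the same shape as the values they correspond to.
assert dx.shape == x.shape and dW.shape == W.shape and db.shape == b.shape
```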