Wiki: Lecture 2 – Overview of Deep Learning from a Practical Point of View


Discussion on Lecture 2 – Overview of Deep Learning from a Practical Point of View

Lecture video

Emergence of simple cells
ImageNet Classification with Deep Convolutional Neural Networks (AlexNet)
Very Deep Convolutional Networks for Large-Scale Image Recognition (VGG)
Going Deeper with Convolutions (GoogLeNet)
Deep Residual Learning for Image Recognition (ResNet)
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Visualizing and Understanding Convolutional Networks

An Intuitive Guide to Deep Network Architectures
Neural Network Architectures

Deep Visualization Toolbox


Just sharing some of my notes.

Things the mentors at the Microsoft campus pointed out:

  1. Brute force is in a similar vein to alchemy. The term “alchemy” was coined in Ali Rahimi’s (and Ben Recht’s) test-of-time award talk at Neural Information Processing Systems (NIPS) 2017.

Ali Rahimi and Ben Recht’s blog post “Back when we were kids”.

Yann LeCun’s response to Ali Rahimi’s NIPS lecture.

Note: an update is in progress as I have more to share…


On slide 22/50, the lecturer mentioned that backpropagation for a CNN differs from that of classical neural networks: you “need to sum over the gradients from all spatial positions”.
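To make this concrete, here is a minimal sketch (my own illustration, not from the lecture) of a 1-D convolution: because the same kernel weights are reused at every spatial position, the gradient with respect to each weight is the sum of the contributions from all positions where it was applied.

```python
import numpy as np

def conv1d(x, w):
    """Valid 1-D cross-correlation: y[i] = sum_k x[i+k] * w[k]."""
    n = len(x) - len(w) + 1
    return np.array([np.dot(x[i:i + len(w)], w) for i in range(n)])

def conv1d_grad_w(x, w, dy):
    """dL/dw[k] = sum_i dy[i] * x[i+k] -- the sum over spatial positions i
    exists precisely because w is shared across all output positions."""
    dw = np.zeros_like(w)
    for i, g in enumerate(dy):      # one contribution per output position
        dw += g * x[i:i + len(w)]
    return dw

x = np.array([1.0, 2.0, -1.0, 3.0])
w = np.array([0.5, -0.5])
dy = np.ones(3)                     # upstream gradient of L = sum(y)
dw = conv1d_grad_w(x, w, dy)        # [2.0, 4.0]
```

In a fully connected layer each weight touches exactly one pre-activation, so no such sum over positions appears; weight sharing is what introduces it.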

For those who have questions about this statement, I found the following blogs useful:

You may also need to refer to the chain rule for higher-dimensional functions:
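As a quick reminder (my own summary, not from the lecture), the multivariable chain rule sums over all intermediate variables through which the output depends on the input:

```latex
% For L = f(y_1, \dots, y_m) with each y_j a function of x:
\frac{\partial L}{\partial x}
  = \sum_{j=1}^{m} \frac{\partial L}{\partial y_j}\,\frac{\partial y_j}{\partial x}
```

Applied to a shared convolution weight, each spatial output position is one such intermediate variable, which is exactly why the CNN gradient sums over all spatial positions.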