Lesson 3 AI6 KL Summary


Session 1: Dogs vs Cats to Homer vs Bender
We trained the dogs vs cats model from fastai to distinguish between Homer and Bender pictures.
(See refresher_lesson1_onlycode.ipynb in https://github.com/jamsawamsa/AI6-wk1-refresher)

Session 2: Intro to Convolutional Neural Networks

  1. What is an activation function?
    It determines what the output of a node (unit) in a neural network looks like, given its inputs (the least mathematical answer I can think of :p)

  2. What is a linear function? Give an example.
    For example, f(x) = 3x + 5. It simply multiplies the input (x) by a number (3) and adds another number (5) to the result.
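A minimal sketch of the example above in plain Python (the function name `f` is just the one used in the notes):

```python
def f(x):
    """Linear (affine) function: multiply the input by 3, then add 5."""
    return 3 * x + 5

print(f(2))  # 3*2 + 5 = 11
```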

  3. Neural Networks work well partly because of the use of non-linear activation functions. Why?
    A non-linear function can approximate a much wider range of functions (do more stuff).

  4. What is max pooling?
    It takes the maximum of each input region, which reduces the dimensions of the output (shown in the Excel sheet).
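A rough NumPy sketch of 2x2 max pooling with stride 2 (not the Excel sheet from the session; assumes an input with even height and width):

```python
import numpy as np

def max_pool_2x2(x):
    """Keep the maximum of each non-overlapping 2x2 region."""
    h, w = x.shape
    # Split into (h/2, 2, w/2, 2) blocks and take the max over each block.
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

x = np.array([[1,  2,  3,  4],
              [5,  6,  7,  8],
              [9, 10, 11, 12],
              [13, 14, 15, 16]])
print(max_pool_2x2(x))
# [[ 6  8]
#  [14 16]]  -> a 4x4 input becomes a 2x2 output
```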

  5. What is the Rectified Linear Unit (ReLU) function?
    An activation function that passes the input through only if it is > 0, and outputs 0 otherwise: ReLU(x) = max(0, x).
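As a one-liner in plain Python:

```python
def relu(x):
    """ReLU: pass positive inputs through unchanged, clamp negatives to 0."""
    return max(0.0, x)

print(relu(2.5))   # 2.5
print(relu(-3.0))  # 0.0
```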

  6. What is a tensor?
    A tensor is a multidimensional array.

  7. What does the softmax function do? How does it help us with classification problems?
    It converts the output of a NN into a probability distribution (each output is between 0 and 1, and the outputs sum to 1), so the largest value can be read as the predicted class.
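A NumPy sketch of softmax (the max-subtraction is a standard numerical-stability trick, not something covered in the notes):

```python
import numpy as np

def softmax(z):
    """Exponentiate and normalize so the outputs form a probability distribution."""
    e = np.exp(z - z.max())  # subtracting the max avoids overflow
    return e / e.sum()

p = softmax(np.array([2.0, 1.0, 0.1]))
print(p)        # each entry is between 0 and 1
print(p.sum())  # 1.0
```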

  8. Describe the relationship between the natural logarithm (ln) with exponential (e)
    If e^x = y, then x = ln(y); the two are inverses of each other.
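Checking the inverse relationship with Python's standard `math` module (`math.log` is the natural logarithm):

```python
import math

y = math.exp(3.0)   # e^3
print(math.log(y))  # ln(e^3) = 3.0
```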

  9. Why is it not encouraged to use softmax for multi label classification?
    Because one input can fall into more than one class, while softmax pushes the output towards a single dominant class.

  10. What is the function used in the final layer for multi label classification?
    Sigmoid. It takes in any real number and “squashes” it into the range between (and including) 0 and 1, independently for each label.
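A sketch of sigmoid in plain Python; applying it to each label's output independently (and thresholding, commonly at 0.5) is what makes it suitable for multi label classification:

```python
import math

def sigmoid(x):
    """Map any real number into (0, 1)."""
    return 1 / (1 + math.exp(-x))

print(sigmoid(0))   # 0.5 -- the midpoint
scores = [2.3, -1.1, 0.8]                 # hypothetical per-label outputs
labels = [sigmoid(s) > 0.5 for s in scores]
print(labels)       # more than one label can be True at once
```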

Session 3: Multilabel Classification: the Kaggle Planet Competition
We encourage you to participate. A good guide can be found at: https://medium.com/ai-saturdays/kaggle-planet-competition-how-to-land-in-top-4-a679ff0013ba.