AI Saturdays (AI6) Kuala Lumpur


#1

Discussions related to AI Saturdays for the Kuala Lumpur chapter. Malaysia boleh!

Important Links

  1. Meetup
  2. Twitter
  3. Google Drive with weekly material
  4. FB Group
  5. Slack invite - Make sure to look for your city’s channel! :ok_hand:

Volunteer Mentors

We’re always on the lookout for more helping hands. Interested in becoming a mentor for AI6 KL? Register to be one here.


List of all City Chapter Threads
#2

Here are some guides for setting up a GPU - https://github.com/howkhang/fastai-v2-setup/blob/master/README.md


#3

Answer to Desmon: KL’s Week 2 theme, in my humble opinion, is ‘Generalize’.
Fast.ai’s week 2 builds on the week 1 image classifiers with the ability to generalize better, using cosine annealing on a new dataset from the Kaggle dog breed challenge. Restarting the schedule with cycle_mult=2 lets the optimizer explore more of the parameter space and settle into a wide, bowl-shaped minimum without overfitting. Correct me if I am wrong.
To make it interesting, Jeremy introduces the ResNeXt architecture and gives credit to AWS. How we wish Amazon would give MOOC participants access to AWS VMs and GPUs.
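For anyone curious what cosine annealing with restarts actually does to the learning rate, here is a minimal plain-Python sketch (the function name and parameters are my own illustration, not the fastai API; fastai exposes this via `cycle_len` and `cycle_mult` on `fit`):

```python
import math

def cosine_annealing_schedule(lr_max, lr_min, cycle_len, cycle_mult, n_epochs):
    """Return per-epoch learning rates for SGDR-style cosine annealing.

    Each cycle decays the LR from lr_max to lr_min along a cosine curve,
    then restarts at lr_max; with cycle_mult=2 every cycle is twice as
    long as the previous one, so later cycles anneal more slowly and can
    settle into a wide minimum.
    """
    lrs = []
    t, T = 0, cycle_len  # position within the current cycle, cycle length
    for _ in range(n_epochs):
        lrs.append(lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * t / T)))
        t += 1
        if t >= T:       # cycle finished: restart and lengthen the next cycle
            t, T = 0, T * cycle_mult

    return lrs

# 7 epochs = cycles of length 1, 2, and 4; the LR jumps back to lr_max
# at the start of each cycle (epochs 0, 1, and 3).
schedule = cosine_annealing_schedule(lr_max=0.01, lr_min=0.0,
                                     cycle_len=1, cycle_mult=2, n_epochs=7)
```

The restarts are the point: each jump back to `lr_max` can kick the optimizer out of a narrow, sharp minimum, while the lengthening cycles give it time to descend into a broad one.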


#4

You’re pretty much spot-on. :slight_smile:

‘Generalize’ basically means being able to identify the broader set of indicators that a certain class of things shares. For example, if I have a neural network that can identify and classify a wide range of cats accurately (meaning it recognizes a good number of features that cats have), I’d say the neural network generalizes well to cat images.


#5

20 January AI6: Fastai lesson 3 discussion questions and answers

I welcome constructive feedback.

  1. What is an activation function?
    It defines what the output of a NN layer should look like (the least mathematical answer I can think of :p)

  2. What is a linear function? Give an example.
    For example, f(x) = 3x + 5. It simply multiplies an input (x) by a number (3) and adds another number (5) to the result.

  3. Neural Networks work well partly because of the use of non-linear activation functions. Why?
    You can use non-linear functions to approximate more functions (do more stuff).

  4. What is max pooling?
    Find the maximum of each input region. It reduces the dimensions of the output (shown in the Excel sheet)

  5. What is the RELU function?
    An activation function that only allows a neuron to fire if the input is > 0

  6. What is a tensor?
    Tensor is a multidimensional array

  7. What does the softmax function do? How does it help us with classification problems?
    It converts the output of a NN into a probability distribution (each output is between 0 and 1, and the outputs sum to 1)

  8. Describe the relationship between the natural logarithm (ln) with exponential (e)
    If e^x = y, then x = ln y

  9. What it is not encouraged to use softmax for multi label classification?
    Because one input can fall into more than one class.

  10. What is the function used in the final layer for multilabel classification?
    Sigmoid. It takes in any real number and “squashes” it in a range between (and including) 0 and 1.
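The answers for ReLU and softmax above can be sketched in a few lines of plain Python (a minimal illustration using only the standard library, not the fastai or PyTorch implementations):

```python
import math

def relu(xs):
    """Rectified Linear Unit: pass positive inputs through, zero out the rest."""
    return [max(0.0, x) for x in xs]

def softmax(xs):
    """Turn raw scores into a probability distribution that sums to 1."""
    exps = [math.exp(x) for x in xs]   # e^x makes every score positive
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, -1.0, 0.5]
activated = relu(logits)   # negative score is zeroed: [2.0, 0.0, 0.5]
probs = softmax(logits)    # each in (0, 1), and the three values sum to 1
```

Note how softmax uses the exponential from question 8: exponentiating makes every score positive, and dividing by the total normalizes them into probabilities.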


#6

I was looking for such a non-math definition as well. This is by far the best one :slight_smile: It could be even better if you rephrased it to:

It defines what an output of a node (unit) in a neural network should look like.

Below are some recommended edits:

For question 2:

It simply multiplies an input, x with a number, 3, and adds another number, 5 to the result.

For question 5:

What is the Rectified Linear Unit (RELU) function?

For question 9:

Why is it not encouraged to use softmax for multi label classification?

For question 10:

What is the function used in the final layer for multi label classification?


#7

We don’t use softmax for multi label classification because a softmax’s outputs always sum to 1. A sigmoid is preferred here since each label’s probability is independent of the other labels (i.e., the sum of all the probabilities can be more than 1).
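A tiny plain-Python sketch of this point (an illustration with made-up scores, not the fastai code): with sigmoid, each label gets its own independent probability, so the total can exceed 1, while softmax forces the labels to compete for a fixed budget of 1.

```python
import math

def sigmoid(x):
    """Squash any real number into (0, 1); each output is independent."""
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    """Force the outputs to compete: they always sum to exactly 1."""
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

scores = [3.0, 2.5, -4.0]   # e.g. an image containing both a cat AND a dog
multi_label = [sigmoid(s) for s in scores]   # both cat and dog can be ~0.95
single_label = softmax(scores)               # cat and dog must share the 1.0
```

With sigmoid, the first two labels can both be confidently "present" at the same time; softmax would have to split the probability mass between them.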


#8

Hear ye! Hear ye!
Today, on the 10th of February year 2018.

Lord James has declared the 17th of February, in the year of the cute puppies 2018, to be a public holiday for all to feast and celebrate in the KL chapter, spanning across the land known for delicious food and “Truly Asia.”

All international counterparts and readers are welcome to visit Malaysia. Traffic will be cleared to welcome your arrival. For free food, kindly google “Open House Malaysia 2018”

To all ushering in the lunar new year - Happy Lunar New Year!


#9

Hahaha happy new year @KCKhoo !