Wiki: Lecture 2 – Word Vector Representations



Lecture Video

Lecture Notes

Readings:

  • Word2Vec Tutorial - The Skip-Gram Model
  • Distributed Representations of Words and Phrases and their Compositionality
  • Efficient Estimation of Word Representations in Vector Space

Paper spotlight:
Deep Contextualized Word Representations. Write a summary of the paper in a reply to this thread, highlighting the key features of this new word representation.

Assignment:

  • Start working on assignment 1.
  • Training the skip-gram model with a full softmax over the vocabulary can be computationally infeasible. The authors of word2vec addressed this in their second paper, Distributed Representations of Words and Phrases and their Compositionality. Summarise the innovations outlined in that paper by replying to this thread (a toy sketch of one of them is given after this list).
  • Complete this hands-on tutorial for word2vec.
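To make the efficiency issue in the second bullet concrete, below is a minimal sketch of skip-gram training with negative sampling on a toy corpus. The corpus, the hyperparameters (dim, window, k, lr, epochs) and the uniform draw of negatives are illustrative assumptions, not the settings from the papers or from the assignment starter code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus and vocabulary (illustrative only).
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
ids = [word2id[w] for w in corpus]

# Hypothetical hyperparameters, not tuned.
V, dim, window, k, lr, epochs = len(vocab), 16, 2, 5, 0.05, 200
W_in = rng.normal(scale=0.1, size=(V, dim))   # center-word ("input") vectors
W_out = rng.normal(scale=0.1, size=(V, dim))  # context-word ("output") vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(epochs):
    for pos, center in enumerate(ids):
        lo, hi = max(0, pos - window), min(len(ids), pos + window + 1)
        for ctx_pos in range(lo, hi):
            if ctx_pos == pos:
                continue
            context = ids[ctx_pos]
            # One positive pair (label 1) plus k random negatives (label 0).
            # Each update touches only k + 1 output vectors instead of the
            # whole vocabulary, which is the point of negative sampling.
            # (The paper samples negatives from a smoothed unigram
            # distribution; a uniform draw is used here for brevity.)
            negatives = rng.integers(0, V, size=k)
            samples = np.concatenate(([context], negatives))
            labels = np.zeros(k + 1)
            labels[0] = 1.0
            v = W_in[center]                 # (dim,)
            u = W_out[samples]               # (k + 1, dim)
            scores = sigmoid(u @ v)          # (k + 1,)
            grad = scores - labels           # gradient of the logistic loss
            W_out[samples] -= lr * grad[:, None] * v
            W_in[center] -= lr * (grad @ u)

# Sanity check: nearest neighbours of "fox" by cosine similarity.
q = W_in[word2id["fox"]]
sims = (W_in @ q) / (np.linalg.norm(W_in, axis=1) * np.linalg.norm(q) + 1e-9)
print([vocab[i] for i in np.argsort(-sims)[:3]])
```

The design point to notice is the update cost: with negative sampling, each (center, context) pair updates only k + 1 output vectors, whereas the naive softmax objective requires a sum over the entire vocabulary for every update.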