Word Embeddings MCQ

📖 NLP Word Embeddings quiz

20 questions on word2vec, GloVe, and FastText. No answers are pre-marked: select, check, and learn.

Difficulty: 6 easy · 8 medium · 6 hard

🔍 Word embeddings concepts covered

This quiz covers word embeddings: dense vector representations of words learned from text (word2vec, GloVe, FastText), the distributional hypothesis, vector similarity, and how embeddings capture semantic and syntactic relations.

Word2Vec (CBOW, skip-gram)
GloVe
FastText
Distributional hypothesis
Similarity & analogy
Out-of-vocabulary (OOV) handling
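As a taste of the similarity and analogy questions above, here is a minimal sketch of how embedding similarity and the classic king − man + woman ≈ queen analogy are computed. The 4-dimensional vectors are illustrative toy values (real word2vec/GloVe vectors typically have 100–300 dimensions and are learned from corpora, not hand-picked):

```python
import numpy as np

# Toy 4-dimensional vectors standing in for real embeddings
# (hand-picked illustrative values, not trained vectors).
vectors = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "man":   np.array([0.5, 0.9, 0.0, 0.1]),
    "woman": np.array([0.5, 0.1, 0.9, 0.1]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Similarity: semantically related words should have nearby vectors.
print(round(cosine(vectors["king"], vectors["queen"]), 3))

# Analogy: king - man + woman should land nearest to queen.
target = vectors["king"] - vectors["man"] + vectors["woman"]
best = max((w for w in vectors if w != "king"),
           key=lambda w: cosine(target, vectors[w]))
print(best)  # with these toy vectors: "queen"
```

The same nearest-neighbor search, run over real trained embeddings, is what the analogy questions in the quiz refer to.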