Instruction: Describe what word embeddings are and why they are useful in NLP.
Context: This question aims to test the candidate's knowledge of advanced techniques for representing words in vector space.
The way I'd explain it in an interview is this: Word embeddings are dense vector representations of words that place semantically or syntactically related words close together in vector space. Instead of representing words as isolated symbols (as one-hot vectors do), embeddings capture patterns of usage learned from large corpora, so relationships like similarity and analogy become geometric operations you can measure, for example with cosine similarity. That's why they're useful across NLP: they give models a compact, generalizable input representation, letting what a model learns about one word transfer to words with similar vectors.
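A minimal sketch of the "geometry encodes meaning" point, using hand-crafted toy vectors (real embeddings such as word2vec or GloVe are learned from corpora, not written by hand):

```python
import math

# Toy, hand-picked 3-d vectors purely for illustration -- real embeddings
# are learned, higher-dimensional, and come from a trained model.
embeddings = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.82, 0.15],
    "apple": [0.10, 0.20, 0.90],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

sim_kq = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_ka = cosine_similarity(embeddings["king"], embeddings["apple"])
print(f"king~queen: {sim_kq:.3f}, king~apple: {sim_ka:.3f}")
```

Because "king" and "queen" point in nearly the same direction while "apple" does not, the first similarity comes out much higher, which is exactly the property a learned embedding space exhibits for related words.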