Word Embedding
Word embedding represents words as dense, continuous vectors that capture semantic relationships and meaning based on context. Unlike traditional representations such as one-hot encoding, embeddings place semantically similar words close together in the vector space, which enables vector arithmetic such as "king" - "man" + "woman" ≈ "queen". Popular methods for creating word embeddings include Word2Vec, GloVe, and FastText. These embeddings are foundational for tasks such as sentiment analysis, machine translation, and contextual understanding.
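A minimal sketch of the "king" - "man" + "woman" ≈ "queen" analogy, assuming pretrained GloVe vectors loaded through gensim's downloader (the model name "glove-wiki-gigaword-100" and first-run download are assumptions, not part of the original text):

```python
# Sketch: vector arithmetic over pretrained GloVe embeddings via gensim.
# Assumes the gensim-data package "glove-wiki-gigaword-100" (downloads on first use).
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-100")  # KeyedVectors of 100-dimensional GloVe embeddings

# king - man + woman -> nearest neighbors in the embedding space
result = vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3)
print(result)  # "queen" is typically the top match

# Semantically similar words sit close together (high cosine similarity)
print(vectors.similarity("king", "queen"))   # relatively high
print(vectors.similarity("king", "banana"))  # relatively low
```

The same `most_similar` / `similarity` calls work with Word2Vec or FastText vectors loaded as gensim KeyedVectors, since all three methods produce embeddings that can be compared with cosine similarity.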