AI Engineering Glossary

Word Embedding

Word embedding represents words as continuous vectors, capturing semantic relationships and meanings based on the contexts in which words appear. Unlike sparse representations such as one-hot encoding, embeddings place semantically similar words close together in the vector space; a classic example is that the vector for "king" - "man" + "woman" lies near the vector for "queen." Popular methods for training word embeddings include Word2Vec, GloVe, and FastText. Embeddings are foundational for tasks like sentiment analysis, machine translation, and contextual understanding.
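To make the analogy arithmetic concrete, here is a minimal sketch using small hand-made vectors; the values are invented for illustration, not trained embeddings, and real models use vectors learned from large corpora with 50-300 or more dimensions.

```python
import numpy as np

# Toy 4-dimensional "embeddings" -- values invented for illustration only;
# real embeddings (Word2Vec, GloVe, FastText) are learned from text corpora.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1, 0.6]),
    "queen": np.array([0.9, 0.1, 0.8, 0.6]),
    "man":   np.array([0.5, 0.9, 0.1, 0.2]),
    "woman": np.array([0.5, 0.1, 0.9, 0.2]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 means same direction, 0.0 means orthogonal.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The classic analogy: king - man + woman should land near queen.
target = vectors["king"] - vectors["man"] + vectors["woman"]

# Rank all words by similarity to the composed vector; "queen" comes out on top.
for word, vec in sorted(vectors.items(), key=lambda kv: -cosine(target, kv[1])):
    print(f"{word:>6}: {cosine(target, vec):.3f}")
```

With pretrained vectors the same query is a single call, for example gensim's model.most_similar(positive=["king", "woman"], negative=["man"]).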
