AI Engineering Glossary

Contextual Embedding

A contextual embedding captures the meaning of a word by taking its neighboring words into account, so the same word receives a different representation in different contexts. For example, 'bank' may refer to a financial institution or to a river's edge, and its embedding shifts accordingly. Contextual embeddings, used in models like BERT, outperform traditional static embeddings, which assign a fixed vector to each word regardless of how it is used. This advancement improves tasks such as translation, sentiment analysis, and question answering by providing a more nuanced understanding of language.
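A minimal sketch of this idea, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint (the model choice and example sentences are illustrative, not part of the definition): the same word 'bank' is encoded twice, once in a financial context and once in a river context, and the two resulting vectors differ.

```python
# Sketch: extract BERT's contextual vector for the token "bank" in two sentences
# and compare them. Assumes `torch` and `transformers` are installed.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def bank_embedding(sentence: str) -> torch.Tensor:
    """Return the contextual embedding of the token 'bank' in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Find where the 'bank' token landed after tokenization.
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    idx = tokens.index("bank")
    # last_hidden_state has shape (batch, sequence_length, hidden_size).
    return outputs.last_hidden_state[0, idx]

vec_finance = bank_embedding("She deposited cash at the bank on Friday.")
vec_river = bank_embedding("They had a picnic on the bank of the river.")

# With a static embedding these would be identical; here the cosine
# similarity falls well below 1 because the contexts differ.
similarity = torch.cosine_similarity(vec_finance, vec_river, dim=0)
print(f"cosine similarity between the two 'bank' vectors: {similarity.item():.3f}")
```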

