AI Engineering Glossary

Self-Attention

Self-attention is a mechanism used in transformer models to determine which parts of an input sequence (such as the words in a sentence) are most relevant when generating predictions. For each position in the sequence, the model computes a score against every other position and uses those scores to weight how much each position contributes to the representation of the current one. In natural language processing, this lets the model focus on the words that carry context and meaning, improving tasks like machine translation and text generation.
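The scoring described above can be sketched as scaled dot-product self-attention. This is a minimal NumPy illustration, not a production implementation; the projection matrices `W_q`, `W_k`, `W_v` and the dimensions are arbitrary example values.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    # Project the input sequence into queries, keys, and values
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    # Score every position against every other position, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Normalize scores into attention weights that sum to 1 per position
    weights = softmax(scores, axis=-1)
    # Each output vector is a weighted average of the value vectors
    return weights @ V

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8  # e.g. 4 tokens, 8-dimensional embeddings
X = rng.normal(size=(seq_len, d_model))
W_q = rng.normal(size=(d_model, d_model))
W_k = rng.normal(size=(d_model, d_model))
W_v = rng.normal(size=(d_model, d_model))

out = self_attention(X, W_q, W_k, W_v)
print(out.shape)  # (4, 8): one context-aware vector per token
```

Each row of `weights` sums to 1, so every output is a convex combination of the value vectors, weighted by how strongly that token attends to the others.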
