AI Engineering Glossary

Autoregressive Language Models

Autoregressive language models work by predicting the next token in a sequence using the context provided by the preceding tokens. By repeating this step, appending each prediction to the context before predicting the next one, they generate coherent text. This contrasts with non-autoregressive models, which predict multiple tokens or entire sequences simultaneously. Models like GPT are prime examples of autoregressive systems: each token prediction is conditioned on all prior outputs, which underlies tasks like text completion and translation.
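The generate-one-token-then-condition-on-it loop can be sketched with a toy model. The `next_token_probs` function below is a hypothetical stand-in (a hard-coded bigram table, not a neural network); real autoregressive LMs replace it with a learned model, but the decoding loop has the same shape:

```python
def next_token_probs(prefix):
    # Hypothetical toy "model": a hard-coded bigram table keyed on the
    # most recent token. A real LM would condition on the entire prefix.
    table = {
        "<s>": {"the": 1.0},
        "the": {"cat": 0.6, "dog": 0.4},
        "cat": {"sat": 1.0},
        "dog": {"ran": 1.0},
        "sat": {"</s>": 1.0},
        "ran": {"</s>": 1.0},
    }
    return table[prefix[-1]]

def generate(max_len=10):
    tokens = ["<s>"]
    for _ in range(max_len):
        probs = next_token_probs(tokens)   # condition on all prior tokens
        tok = max(probs, key=probs.get)    # greedy decoding: pick the argmax
        tokens.append(tok)                 # the prediction becomes context
        if tok == "</s>":                  # stop at end-of-sequence
            break
    return tokens

print(generate())
```

With greedy decoding this toy model always emits `<s> the cat sat </s>`; sampling from `probs` instead of taking the argmax would sometimes produce the `dog ran` continuation, which is how real systems trade determinism for diversity.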
