AI Engineering Glossary

BERT

BERT, which stands for Bidirectional Encoder Representations from Transformers, is a pre-trained language model developed by Google. Built from transformer encoder layers, it learns context by attending to the words on both sides of each token at once, rather than reading text only left to right. This bidirectional view lets BERT disambiguate a word's meaning from its full surrounding context, which improves its effectiveness on tasks like question answering and sentiment analysis. That bidirectional training objective distinguishes it from earlier unidirectional models such as the original GPT.
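The difference between unidirectional and bidirectional context can be sketched with a toy example. This is not BERT itself, just an illustration of what context each style of model can condition on when predicting a masked word; the sentence and helper names are hypothetical.

```python
# Toy illustration (not an actual model): which tokens are visible
# when predicting the token at mask_index.

def unidirectional_context(tokens, mask_index):
    """A left-to-right model (e.g. the original GPT) sees only preceding tokens."""
    return tokens[:mask_index]

def bidirectional_context(tokens, mask_index):
    """BERT-style masked language modeling sees tokens on both sides of the mask."""
    return tokens[:mask_index] + tokens[mask_index + 1:]

tokens = ["the", "bank", "of", "the", "[MASK]", "was", "muddy"]
i = tokens.index("[MASK]")

print(unidirectional_context(tokens, i))  # ['the', 'bank', 'of', 'the']
print(bidirectional_context(tokens, i))   # ['the', 'bank', 'of', 'the', 'was', 'muddy']
```

Here the right-hand context ("was muddy") is what suggests the masked word is "river" rather than, say, a financial institution; a purely left-to-right model never sees it.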

