AI Engineering Glossary

Cross entropy

Cross entropy quantifies the difference between two probability distributions, typically a model's predicted probabilities and the actual (target) distribution. For a true distribution p and a predicted distribution q over outcomes x, it is defined as H(p, q) = -Σ p(x) log q(x). In classification tasks, particularly in neural networks, cross entropy loss measures how well the predicted distribution matches the target outputs. It is central to tasks like image recognition and text classification where multiple outcomes are possible, and its gradient drives parameter updates during gradient descent.
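
As a concrete illustration, here is a minimal sketch of the formula above applied to a single three-class prediction. The helper name and example values are illustrative, not part of this glossary entry:

import numpy as np

def cross_entropy(target_probs, predicted_probs, eps=1e-12):
    # H(p, q) = -sum over x of p(x) * log q(x); eps guards against log(0)
    predicted_probs = np.clip(predicted_probs, eps, 1.0)
    return -np.sum(target_probs * np.log(predicted_probs))

# True class is index 1 (one-hot target); q is the model's predicted distribution
p = np.array([0.0, 1.0, 0.0])
q = np.array([0.1, 0.7, 0.2])
print(cross_entropy(p, q))  # ~0.357; the loss shrinks as q concentrates on the true class

In practice, deep learning frameworks provide built-in cross entropy losses that combine this computation with a softmax for numerical stability, but the underlying quantity is the same.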
