AI Engineering Glossary

Gradient Descent

Gradient Descent is a core optimization technique for minimizing a function, typically a model's error or loss. Starting from an initial guess for the model parameters, it computes the gradient (the derivative with respect to the parameters) and takes a step in the opposite direction, since the gradient points uphill. Over many iterations the parameters converge toward a minimum, improving the model's accuracy. It underpins the training of neural networks and is akin to finding the lowest point in a valley by taking small downhill steps.
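The update rule described above can be sketched in a few lines of Python. This is a minimal illustration, not a production optimizer; the function, learning rate, and step count are chosen for the example.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a function by repeatedly stepping against its gradient.

    grad  -- function returning the derivative at a point
    x0    -- initial guess for the parameter
    lr    -- learning rate (size of each downhill step)
    steps -- number of iterations
    """
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # move opposite the gradient (downhill)
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
# The minimum is at x = 3, and the iterates converge toward it.
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(minimum, 4))  # → 3.0
```

In practice the learning rate matters: too small and convergence is slow, too large and the steps overshoot the minimum and may diverge.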

