AI Engineering Glossary

Activation Function

Activation functions decide whether, and how strongly, a neuron fires based on its inputs. They introduce non-linearity into the model, which is what lets a network learn complex patterns: with purely linear activations, any stack of layers collapses into a single linear transformation. Common examples include ReLU, Sigmoid, and Tanh. In deep networks, ReLU in particular helps mitigate the vanishing-gradient problem that Sigmoid and Tanh can cause.
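As an illustration, here is a minimal sketch of the three activation functions named above in plain Python (scalar versions; real frameworks apply them element-wise to tensors):

```python
import math

def relu(x):
    # Rectified Linear Unit: passes positive inputs through, zeros out negatives.
    return max(0.0, x)

def sigmoid(x):
    # Squashes any real input into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Squashes any real input into the range (-1, 1); zero-centered.
    return math.tanh(x)

for x in (-2.0, 0.0, 2.0):
    print(x, relu(x), round(sigmoid(x), 3), round(tanh(x), 3))
```

Note that ReLU's gradient is exactly 1 for positive inputs, while sigmoid's gradient peaks at 0.25 and shrinks toward zero for large inputs; this is the mechanism behind ReLU's resistance to vanishing gradients.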

