Gradient Flow and Vanishing Gradients
7 questions · Difficulty 2-7 · 3 foundation, 3 intermediate, 1 advanced · Adapts to your performance
Question 1 of 7 · 120s · foundation (2/10) · conceptual
Why does Leaky ReLU sometimes help compared with ordinary ReLU?
A. It changes the loss function from cross-entropy to squared error
B. It removes the need for backpropagation by making the network exactly linear
C. It gives negative inputs a small nonzero slope, so gradients can still flow
D. It forces all positive activations to be smaller than one, preventing overconfidence
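To make the contrast the options are drawing concrete, here is a minimal NumPy sketch of the gradients of the two activations; the negative-side slope of 0.01 is a commonly used default and is assumed here, not specified by the question.

```python
import numpy as np

def relu_grad(x):
    # ReLU passes gradient 1 for positive inputs and exactly 0 otherwise,
    # so units stuck in the negative regime stop receiving any gradient.
    return (x > 0).astype(float)

def leaky_relu_grad(x, alpha=0.01):
    # Leaky ReLU keeps a small slope alpha on the negative side,
    # so a reduced gradient still flows through those units.
    return np.where(x > 0, 1.0, alpha)

x = np.array([-2.0, -0.5, 0.3, 1.5])
print(relu_grad(x))        # [0. 0. 1. 1.]
print(leaky_relu_grad(x))  # [0.01 0.01 1.   1.  ]
```

With plain ReLU, any unit whose pre-activation stays negative contributes exactly zero gradient, the "dying ReLU" failure mode that the small nonzero slope of Leaky ReLU is meant to mitigate.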