# Derivatives
Derivatives are the foundation of machine learning optimization. Every time a neural network learns, it's using derivatives to figure out how to adjust its parameters.
## Resources
Same as the previous page.
I recommend watching only Professor Leonard's lectures and studying the *Calculus I with Integrated Precalculus* book, solving all of its problems.
| Resource | Type | Cost | Link | Notes |
|---|---|---|---|---|
| 3Blue1Brown Calculus | Video Series | Free | YouTube | Best intuitive introduction available |
| Khan Academy Calculus | Interactive Course | Free | khanacademy.org | Solid practice problems and explanations |
| MIT OCW 18.01 | Full Course | Free | ocw.mit.edu | Rigorous treatment with problem sets |
| Paul's Online Math Notes | Reference | Free | tutorial.math.lamar.edu | Great for quick lookups and examples |
| Professor Leonard | Video Lectures | Free | YouTube | |
| Calculus I with Integrated Precalculus | Book | Paid | Book link | |
## The Core Concept
A derivative measures how much a function changes when you change its input by a tiny amount. If f(x) is your function, f'(x) tells you the slope of the curve at point x.
In ML terms: if your loss function is L(w) where w is a weight, then L'(w) tells you how much the loss increases or decreases when you change that weight slightly.
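As a minimal sketch of this idea (the function names `numerical_derivative` and `loss` and the toy quadratic loss are illustrative, not from any particular library), you can estimate L'(w) with a finite difference and see the slope directly:

```python
# Estimate L'(w) with a central finite difference:
# L'(w) ≈ (L(w + h) - L(w - h)) / (2h) for small h.
def numerical_derivative(f, x, h=1e-5):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def loss(w):
    # Toy loss with its minimum at w = 3.
    return (w - 3.0) ** 2

# At w = 5 the exact derivative is 2 * (5 - 3) = 4:
# the loss increases by about 4 units per unit increase in w.
slope = numerical_derivative(loss, 5.0)
print(round(slope, 4))  # → 4.0
```

A positive slope means increasing the weight increases the loss, so an optimizer would move w in the negative direction.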
## Essential Rules
See the derivatives section of the Calculus Cheat Sheet for the standard rules (power, product, quotient, and chain rules).
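As a quick sanity check on the standard rules (a sketch, with an assumed helper `numerical_derivative`; none of this comes from the cheat sheet itself), each symbolic rule can be verified against a finite-difference estimate:

```python
import math

def numerical_derivative(f, x, h=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.0

# Power rule: d/dx x^3 = 3x^2  → 12 at x = 2
assert abs(numerical_derivative(lambda t: t**3, 2.0) - 12.0) < 1e-4

# Product rule: d/dx [x * sin x] = sin x + x cos x
expected = math.sin(x) + x * math.cos(x)
assert abs(numerical_derivative(lambda t: t * math.sin(t), x) - expected) < 1e-4

# Chain rule: d/dx sin(x^2) = 2x * cos(x^2)
expected = 2 * x * math.cos(x**2)
assert abs(numerical_derivative(lambda t: math.sin(t**2), x) - expected) < 1e-4
```

Checking symbolic derivatives numerically like this is also a standard way to debug hand-written gradients in ML code.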
## Why This Matters
When you hear "gradient descent," that's just following derivatives downhill to minimize a function. When you see "backpropagation," that's computing derivatives using the chain rule through a network.
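The "following derivatives downhill" idea can be sketched in a few lines (the `gradient_descent` helper, learning rate, and toy loss are illustrative assumptions, not from the source):

```python
# Gradient descent sketch: repeatedly step opposite the derivative
# to minimize L(w) = (w - 3)^2, whose minimum is at w = 3.
def gradient_descent(grad, w, lr=0.1, steps=100):
    for _ in range(steps):
        w -= lr * grad(w)  # move downhill: opposite the slope
    return w

grad = lambda w: 2 * (w - 3.0)  # L'(w) for L(w) = (w - 3)^2

w_final = gradient_descent(grad, w=0.0)
print(round(w_final, 4))  # → 3.0, the minimizer
```

Backpropagation is the same idea at scale: the chain rule supplies `grad` for every parameter in the network, and the update step is unchanged.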
Understanding derivatives geometrically (as slopes) and algebraically (as limits) gives you the intuition to debug optimization problems and understand why learning algorithms work the way they do.