Single Variable Calculus
Calculus is the math of change, and machine learning is all about optimization: finding the best way to change model parameters. You can't understand gradient descent, backpropagation, or most ML algorithms without solid calculus fundamentals.
Resources
| Resource | Type | Cost | Link | Notes |
|---|---|---|---|---|
| 3Blue1Brown Calculus | Video Series | Free | YouTube | Best intuitive introduction available |
| Khan Academy Calculus | Interactive Course | Free | khanacademy.org | Solid practice problems and explanations |
| MIT OCW 18.01 | Full Course | Free | ocw.mit.edu | Rigorous treatment with problem sets |
| Paul's Online Math Notes | Reference | Free | tutorial.math.lamar.edu | Great for quick lookups and examples |
| Professor Leonard | Video Lectures | Free | YouTube | |
| Calculus I with integrated Precalculus | Book | Paid | Book link | |
What You Need to Know
The core concepts are limits, derivatives, and integrals. Limits help you understand what happens at boundary cases. Derivatives tell you how fast things change. Integrals let you accumulate change over time or space.
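Both operations can be approximated numerically, which makes the definitions concrete. A minimal sketch below, using f(x) = x² as an illustrative function (the function and step sizes are my choices, not from the text): a central difference estimates the derivative, and a midpoint Riemann sum accumulates the integral.

```python
def derivative(f, x, h=1e-6):
    """Central-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def integral(f, a, b, n=100_000):
    """Midpoint Riemann-sum approximation of the integral of f over [a, b]."""
    width = (b - a) / n
    return sum(f(a + (i + 0.5) * width) for i in range(n)) * width

f = lambda x: x ** 2
print(derivative(f, 3.0))     # close to 6.0, since d/dx x^2 = 2x
print(integral(f, 0.0, 1.0))  # close to 1/3, since the integral of x^2 from 0 to 1 is 1/3
```

As the step size h shrinks, the difference quotient approaches the limit that defines the derivative, which is exactly how the three concepts connect.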
For ML specifically, you need to understand what a derivative represents geometrically and how to compute derivatives of common functions. The chain rule is absolutely critical, since neural networks are compositions of functions.
Integration becomes essential for probability theory and more advanced topics.
The Big Picture
A derivative is the slope of a curve at a given point. When training neural networks, you're constantly asking "which direction should I adjust this parameter to reduce the error?" That's exactly what gradients tell you.
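That question can be answered in a few lines. A minimal gradient-descent sketch on a one-parameter model y = w·x with mean squared error; the data, learning rate, and starting point are all illustrative assumptions (the true parameter is w = 2):

```python
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]  # generated by y = 2 * x

def loss_gradient(w):
    # Derivative with respect to w of the mean squared error
    # (1/n) * sum((w*x - y)^2), which works out to (2/n) * sum((w*x - y) * x).
    n = len(xs)
    return (2 / n) * sum((w * x - y) * x for x, y in zip(xs, ys))

w = 0.0    # start far from the true value
lr = 0.05  # learning rate
for _ in range(200):
    w -= lr * loss_gradient(w)  # step in the direction that reduces the error

print(round(w, 4))  # → 2.0
```

The sign of the gradient tells you which way to move, and its magnitude tells you how steep the error surface is; the update rule just walks downhill.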
Don't get bogged down in integration techniques initially. Focus on understanding what derivatives mean and how to compute them confidently.