Single Variable Calculus

Calculus is the mathematics of change, and machine learning is fundamentally about optimization: finding the best way to adjust model parameters. You can't understand gradient descent, backpropagation, or most ML algorithms without solid calculus fundamentals.

Resources

| Resource | Type | Cost | Link | Notes |
|---|---|---|---|---|
| 3Blue1Brown Calculus | Video Series | Free | YouTube | Best intuitive introduction available |
| Khan Academy Calculus | Interactive Course | Free | khanacademy.org | Solid practice problems and explanations |
| MIT OCW 18.01 | Full Course | Free | ocw.mit.edu | Rigorous treatment with problem sets |
| Paul's Online Math Notes | Reference | Free | tutorial.math.lamar.edu | Great for quick lookups and examples |
| Professor Leonard | Video Lectures | Free | YouTube | |
| Calculus I with integrated Precalculus | Book | Paid | Book link | |

What You Need to Know

The core concepts are limits, derivatives, and integrals. Limits help you understand what happens at boundary cases. Derivatives tell you how fast things change. Integrals let you accumulate change over time or space.
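To make the limit-derivative connection concrete, here is a minimal sketch: a derivative is the limit of a difference quotient, so shrinking `h` in a symmetric difference gives a numerical approximation. The function names and the choice of f(x) = x² are illustrative, not from any particular library.

```python
def finite_difference(f, x, h=1e-6):
    """Approximate f'(x) with the symmetric difference quotient,
    i.e. the limit definition of the derivative with a small h."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x ** 2          # f'(x) = 2x, so f'(3) = 6
approx = finite_difference(f, 3.0)
print(approx)  # close to 6.0
```

This is also a handy sanity check when you start computing derivatives by hand: compare your analytic answer against the numerical one.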

For ML specifically, you need to understand what a derivative represents geometrically and how to compute derivatives of common functions. The chain rule is absolutely critical, since neural networks are compositions of functions.
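A quick sketch of the chain rule in action, using sin(x²) as an assumed example composition (the same outer-derivative-times-inner-derivative pattern backpropagation applies layer by layer), checked against a numerical derivative:

```python
import math

def f(x):
    # composition: outer function sin, inner function x**2
    return math.sin(x ** 2)

def f_prime(x):
    # chain rule: d/dx sin(u) = cos(u) * du/dx, with u = x**2
    return math.cos(x ** 2) * 2 * x

x, h = 1.3, 1e-6
numeric = (f(x + h) - f(x - h)) / (2 * h)  # difference-quotient check
print(numeric, f_prime(x))  # the two values should agree closely
```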

Integration becomes essential for probability theory and more advanced topics.
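As a preview of why: probabilities for continuous distributions are integrals of a density function. A hedged sketch, using a midpoint Riemann sum over the standard normal density (the numbers and function names here are illustrative):

```python
import math

def pdf(x):
    # standard normal probability density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def riemann(f, a, b, n=10_000):
    # midpoint Riemann sum: accumulate f over [a, b] in n slices
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

prob = riemann(pdf, -1.0, 1.0)
print(prob)  # roughly 0.6827 -- the familiar "68% within one sigma" rule
```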

The Big Picture

The derivative is the slope of a curve at a point. When training a neural network, you're constantly asking "which direction should I adjust this parameter to reduce the error?" That's exactly what the gradient tells you.
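That question-and-answer loop can be sketched in a few lines of gradient descent on a toy one-parameter loss, L(w) = (w − 4)²; the loss, learning rate, and step count are illustrative choices, not from any ML framework:

```python
def grad(w):
    # dL/dw for L(w) = (w - 4)**2; its sign says which way to move w
    return 2 * (w - 4.0)

w = 0.0    # starting guess for the parameter
lr = 0.1   # learning rate (illustrative choice)
for _ in range(100):
    w -= lr * grad(w)  # step opposite the gradient to reduce the loss
print(w)  # converges to the minimizer, 4.0
```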

Don't get bogged down in integration techniques initially. Focus on understanding what derivatives mean and how to compute them confidently.