How I Study AI - Learn AI Papers & Lectures the Easy Way
🧮Foundations
∂

Calculus & Optimization

Learn differentiation, gradients, and optimization techniques essential for training neural networks

Recommended for: 🔬 ML Researcher · 📊 Data Scientist · 🤖 LLM Engineer
🌱

Beginner

Understand derivatives and their intuition

What to Learn

  • Derivatives as rate of change
  • Chain rule and product rule
  • Partial derivatives
  • Gradients as direction of steepest ascent
  • Basic optimization: finding minima/maxima

Resources

  • 📚 3Blue1Brown: Essence of Calculus
  • 📚 Khan Academy Calculus
  • 📚 Calculus Made Easy by Thompson
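To connect the ideas above, here is a small illustrative check in Python (the function f(x) = sin(x²) is just an example) that a finite-difference approximation agrees with the derivative obtained from the chain rule:

```python
import math

def numerical_derivative(f, x, h=1e-6):
    # Central difference: (f(x+h) - f(x-h)) / (2h) approximates f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

# f(x) = sin(x^2); by the chain rule, f'(x) = cos(x^2) * 2x.
f = lambda x: math.sin(x ** 2)
analytic = lambda x: math.cos(x ** 2) * 2 * x

x = 1.3
print(numerical_derivative(f, x), analytic(x))  # the two values agree closely
```

Comparing a numerical derivative against a hand-derived one like this is also exactly the "gradient check" trick used later when debugging backpropagation.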
🌿

Intermediate

Apply calculus to machine learning

What to Learn

  • Gradient descent and variants (SGD, momentum)
  • Learning rate schedules
  • Convex vs non-convex optimization
  • Jacobian and Hessian matrices
  • Backpropagation derivation

Resources

  • 📚 Deep Learning book Chapter 4
  • 📚 Stanford CS231n backprop notes
  • 📚 Implement gradient descent from scratch
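The last resource suggests implementing gradient descent yourself; one minimal from-scratch sketch (heavy-ball momentum on a toy quadratic, with hyperparameters chosen purely for illustration) might look like:

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.05, momentum=0.9, steps=500):
    """Gradient descent with heavy-ball momentum (plain GD when momentum=0)."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = momentum * v + grad(x)  # accumulate an exponentially decaying velocity
        x = x - lr * v              # step against the accumulated gradient
    return x

# Toy problem: f(x, y) = x^2 + 10*y^2, gradient (2x, 20y), minimum at (0, 0).
grad = lambda p: np.array([2 * p[0], 20 * p[1]])
print(gradient_descent(grad, [3.0, -2.0]))  # converges to ≈ [0, 0]
```

Setting momentum=0 recovers vanilla gradient descent; comparing the two on this ill-conditioned quadratic is a good exercise in seeing why momentum helps.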
🌳

Advanced

Advanced optimization for research

What to Learn

  • Second-order optimization methods
  • Constrained optimization and Lagrangians
  • Loss landscape analysis
  • Adaptive learning rate methods (Adam, AdaGrad)
  • Optimization theory and convergence proofs
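As a concrete anchor for second-order methods, here is a minimal Newton-step sketch on a toy quadratic (the problem and numbers are illustrative, not from any particular reference):

```python
import numpy as np

def newton_step(grad, hess, x):
    # Solve H d = g for the Newton direction d, then step x <- x - d.
    # A quadratic is minimized exactly in one step; near a minimum of a
    # general smooth function, iterating this converges quadratically.
    return x - np.linalg.solve(hess(x), grad(x))

# f(x, y) = x^2 + 10*y^2: gradient (2x, 20y), constant Hessian diag(2, 20).
grad = lambda p: np.array([2 * p[0], 20 * p[1]])
hess = lambda p: np.diag([2.0, 20.0])
print(newton_step(grad, hess, np.array([3.0, -2.0])))  # → [0. 0.]
```

Note that unlike gradient descent, the Newton step is unaffected by the ill-conditioning of this problem, which is the main appeal of second-order methods.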

Resources

  • 📚 Convex Optimization by Boyd
  • 📚 Optimization for Machine Learning (Sra)
  • 📚 Research papers on Adam, LAMB optimizers
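As a starting point before reading the papers, here is a simplified from-scratch sketch of the Adam update rule (bias-corrected first and second moment estimates; the toy problem and hyperparameters are illustrative):

```python
import numpy as np

def adam(grad, x0, lr=0.01, b1=0.9, b2=0.999, eps=1e-8, steps=5000):
    """Adam: adaptive per-parameter steps from bias-corrected moment estimates."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first moment (running mean of gradients)
    v = np.zeros_like(x)  # second moment (running uncentered variance)
    for t in range(1, steps + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g ** 2
        m_hat = m / (1 - b1 ** t)  # correct bias from zero initialization
        v_hat = v / (1 - b2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Ill-conditioned toy quadratic: f(x, y) = x^2 + 100*y^2, minimum at (0, 0).
grad = lambda p: np.array([2 * p[0], 200 * p[1]])
print(adam(grad, [3.0, -2.0]))  # ends up near [0, 0]
```

The division by sqrt(v_hat) gives each parameter its own effective learning rate, which is why Adam handles the mismatched curvatures here without manual tuning.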
#math #optimization #gradients #calculus