🧮 Foundations
Calculus & Optimization
Learn differentiation, gradients, and optimization techniques essential for training neural networks
🌱 Beginner
Understand derivatives and their intuition
What to Learn
- Derivatives as rate of change
- Chain rule and product rule
- Partial derivatives
- Gradients as the direction of steepest ascent (see the numerical sketch after this list)
- Basic optimization: finding minima and maxima
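A quick way to build intuition for gradients as the steepest-ascent direction is to estimate them numerically. Below is a minimal sketch; the example function `f` and step size `h` are illustrative choices, not taken from any of the resources listed here.

```python
import numpy as np

def f(v):
    # Example surface: f(x, y) = x^2 + 3y^2 (a simple bowl)
    x, y = v
    return x**2 + 3 * y**2

def numerical_gradient(f, v, h=1e-5):
    """Estimate the gradient with central differences: (f(v+h) - f(v-h)) / (2h) per coordinate."""
    grad = np.zeros_like(v)
    for i in range(len(v)):
        step = np.zeros_like(v)
        step[i] = h
        grad[i] = (f(v + step) - f(v - step)) / (2 * h)
    return grad

point = np.array([1.0, 2.0])
g = numerical_gradient(f, point)
print(g)                               # approx [2., 12.], matching the analytic gradient (2x, 6y)
print(f(point + 0.01 * g) > f(point))  # True: a small step along +gradient increases f
```

Comparing the numerical estimate against the analytic gradient is also a handy sanity check for hand-derived derivatives later on.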
Resources
- 📚 3Blue1Brown: Essence of Calculus
- 📚 Khan Academy Calculus
- 📚 Calculus Made Easy by Thompson
🌿 Intermediate
Apply calculus to machine learning
What to Learn
- Gradient descent and variants (SGD, momentum); a minimal sketch follows this list
- Learning rate schedules
- Convex vs. non-convex optimization
- Jacobian and Hessian matrices
- Backpropagation derivation (a worked example follows the resources below)
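To make the first item concrete, here is a minimal sketch of the classical (heavy-ball) momentum update on a toy quadratic; the objective, learning rate, and momentum coefficient are illustrative assumptions.

```python
import numpy as np

def grad(w):
    # Gradient of the toy objective f(w) = 0.5 * w^T A w with A = diag(1, 10)
    A = np.array([[1.0, 0.0], [0.0, 10.0]])
    return A @ w

def sgd_momentum(w0, lr=0.05, beta=0.9, steps=100):
    """Classical momentum: v <- beta*v + grad(w); w <- w - lr*v."""
    w, v = w0.copy(), np.zeros_like(w0)
    for _ in range(steps):
        v = beta * v + grad(w)
        w = w - lr * v
    return w

w = sgd_momentum(np.array([5.0, 5.0]))
print(w)  # close to the minimizer [0, 0]
```

Setting `beta=0` recovers vanilla gradient descent, which makes this a convenient testbed for comparing the variants listed above.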
Resources
- 📚 Deep Learning book, Chapter 4
- 📚 Stanford CS231n backpropagation notes
- 📚 Exercise: implement gradient descent from scratch
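As a companion to the CS231n notes and the from-scratch exercise, this sketch derives backprop by hand for a one-hidden-layer regression network; the architecture, data, and learning rate are illustrative assumptions, and every gradient line is just the chain rule applied layer by layer.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))                   # 64 samples, 3 features
y = (X @ np.array([1.0, -2.0, 0.5]))[:, None]  # linear target, shape (64, 1)

W1, b1 = rng.normal(size=(3, 8)) * 0.1, np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)) * 0.1, np.zeros(1)

for step in range(500):
    # Forward pass
    h = np.tanh(X @ W1 + b1)        # hidden activations, (64, 8)
    pred = h @ W2 + b2              # predictions, (64, 1)
    loss = np.mean((pred - y) ** 2)

    # Backward pass: chain rule, from the loss back to each parameter
    dpred = 2 * (pred - y) / len(X)  # dL/dpred
    dW2 = h.T @ dpred                # dL/dW2
    db2 = dpred.sum(axis=0)
    dh = dpred @ W2.T                # dL/dh
    dz = dh * (1 - h ** 2)           # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dz
    db1 = dz.sum(axis=0)

    # Gradient descent update
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= 0.1 * g

print(round(loss, 4))  # the loss shrinks toward 0 as training proceeds
```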
🌳 Advanced
Advanced optimization for research
What to Learn
- Second-order optimization methods
- Constrained optimization and Lagrangians (a worked example closes this section)
- Loss landscape analysis
- Adaptive learning rate methods (Adam, AdaGrad); see the sketch after this list
- Optimization theory and convergence proofs
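As a reference point for the adaptive-methods item, here is a minimal, illustrative reimplementation of the Adam update. The hyperparameter defaults mirror the Adam paper, but the code itself is a sketch with a toy objective, not a library API.

```python
import numpy as np

def adam(grad_fn, w0, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, steps=2000):
    """Adam: bias-corrected first/second moment estimates scale each coordinate's step."""
    w = w0.copy()
    m = np.zeros_like(w)  # first moment (running mean of gradients)
    v = np.zeros_like(w)  # second moment (running mean of squared gradients)
    for t in range(1, steps + 1):
        g = grad_fn(w)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)  # bias correction for zero initialization
        v_hat = v / (1 - beta2**t)
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

# Toy use: minimize f(w) = (w - 3)^2, whose gradient is 2(w - 3)
w = adam(lambda w: 2 * (w - 3.0), np.array([0.0]), lr=0.05)
print(w)  # approaches [3.0]
```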
Resources
- 📚 Convex Optimization by Boyd and Vandenberghe
- 📚 Optimization for Machine Learning (Sra et al.)
- 📚 Research papers on the Adam and LAMB optimizers
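For the constrained-optimization item, the core object is the Lagrangian. As a minimal worked example (the specific problem is an illustrative choice): minimize f(x, y) = x² + y² subject to x + y = 1.

```latex
\mathcal{L}(x, y, \lambda) = x^2 + y^2 + \lambda\,(x + y - 1), \qquad
\nabla\mathcal{L} = 0 \;\Longrightarrow\;
2x + \lambda = 0,\quad 2y + \lambda = 0,\quad x + y = 1
\;\Longrightarrow\; x = y = \tfrac{1}{2},\; \lambda = -1.
```

The same stationarity-of-the-Lagrangian pattern generalizes to the KKT conditions covered in Boyd and Vandenberghe.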