πŸŽ“How I Study AIHISA
πŸ“–Read
πŸ“„PapersπŸ“°Blogs🎬Courses
πŸ’‘Learn
πŸ›€οΈPathsπŸ“šTopicsπŸ’‘Concepts🎴Shorts
🎯Practice
πŸ“Daily Log🎯Prompts🧠Review
SearchSettings
How I Study AI - Learn AI Papers & Lectures the Easy Way

🎬 AI Lectures (18)

πŸ“šAllπŸ“LLM🎯PromptsπŸ”RAG🀝Agents🧠Deep LearningπŸ’¬NLPπŸ€–MLπŸ“–Basics
Difficulty:
AllBeginnerIntermediateAdvanced
Stanford CS230 | Autumn 2025 | Lecture 8: Agents, Prompts, and RAG
Basics

Beginner
Stanford

This session sets up course logistics and introduces core machine learning ideas. You learn when and how class meets, where to find materials, how grading works, and why MATLAB is used. It also sets expectations: the course is challenging, homeworks are crucial, and live attendance is encouraged.

#machine learning#supervised learning#unsupervised learning
Stanford CS230 | Autumn 2025 | Lecture 4: Adversarial Robustness and Generative Models
Basics

Beginner
Stanford

The lecture explains why we use machine learning instead of writing step-by-step rules. Many real problems, like finding cats in photos, are too messy for hand-written rules because there are too many exceptions. With machine learning, we give the computer lots of examples and it discovers patterns on its own. This approach lets computers handle tasks we can’t easily explain in code.

#machine learning#supervised learning#unsupervised learning
Stanford CME295 Transformers & LLMs | Autumn 2025 | Lecture 3 - Transformers & Large Language Models
Basics

Beginner
Stanford

Artificial Intelligence (AI) is the science of making machines do tasks that would need intelligence if a person did them. Today’s AI mostly focuses on specific tasks like recognizing faces or recommending products, which is called narrow AI. A future goal is general AI, which would do any thinking task a human can, but it does not exist yet.

#artificial intelligence#narrow ai#general ai
Stanford CS329H: Machine Learning from Human Preferences | Autumn 2024 | Ethics
Basics

Beginner
Stanford

The lecture explains regularization, a method to reduce overfitting by adding a penalty to the cost (loss) function that discourages overly complex models. Overfitting is when a model memorizes noise in the training data and fails to generalize. Regularization keeps model parameters (weights) from growing too large, which helps models generalize better to new data.
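The penalty idea can be sketched in a few lines of Python. This is a minimal illustration of an L2 penalty on a squared-error loss, not code from the lecture; all names (loss_with_l2, lam) are illustrative.

```python
# Minimal sketch of L2 regularization: loss = MSE + lam * sum of squared weights.
# Names and numbers are illustrative, not taken from the lecture.

def loss_with_l2(predictions, targets, weights, lam):
    """Mean squared error plus a penalty that grows with weight magnitude."""
    mse = sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)
    penalty = lam * sum(w ** 2 for w in weights)  # discourages large weights
    return mse + penalty

# With perfect predictions the MSE term is 0, so only the penalty remains:
base = loss_with_l2([1.0, 2.0], [1.0, 2.0], weights=[3.0, 4.0], lam=0.0)   # 0.0
regularized = loss_with_l2([1.0, 2.0], [1.0, 2.0], weights=[3.0, 4.0], lam=0.1)  # 0.1 * 25 = 2.5
```

Raising lam trades training fit for smaller weights, which is the overfitting control the summary describes.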

#regularization#l1#l2
Stanford CS329H: Machine Learning from Human Preferences | Autumn 2024 | Voting
Basics

Beginner
Stanford

This lecture kicks off an Introduction to Machine Learning course by explaining how the class will run and what you will learn. The instructor, Byron Wallace, introduces the TAs (Max and Zohair), office hours, and where to find all materials (Piazza/Canvas). The course has weekly graded homework, a midterm, and a group final project with a proposal, report, and presentation. Lectures are mostly theory and recorded; hands-on coding happens in homework.

#supervised learning#classification#regression
Stanford CS329H: ML from Human Preferences | Autumn 2024 | Model-based Preference Optimization
Basics

Beginner
Stanford

Decision trees are flowchart-like models used to predict a class (like yes/no) by asking a series of questions about features. You start at the root and follow branches based on answers until you reach a leaf with a class label. Each internal node tests one attribute, each branch is an outcome of that test, and each leaf gives the prediction.

#decision tree#entropy#information gain
Chapter 1: Vectors, what even are they? | Essence of Linear Algebra
Basics

Beginner
3Blue1Brown Korean

This lesson explains what a vector really is from three connected views: a directed arrow in space, a coordinate pair like (4, 3), and a list of numbers like [4, 3]. Thinking of vectors as arrows makes direction and length feel natural, while coordinates make calculation easy. Both are the same thing described in different ways. You can move an arrow anywhere without changing the vector, as long as its direction and length stay the same.

#vector#arrow representation#components
Chapter 12: A geometric interpretation of Cramer's rule | Essence of Linear Algebra
Basics

Beginner
3Blue1Brown Korean

This lesson explains Cramer's rule using geometry. A linear system like 2x + 5y = 7 and βˆ’3x + 4y = βˆ’1 can be written as Ax = v, where A’s columns are two vectors a1 and a2. The solution (x, y) tells how much to stretch and add a1 and a2 to land exactly on v.

#cramer's rule#determinant#parallelogram area
Chapter 6: The determinant | Essence of Linear Algebra
Basics

Beginner
3Blue1Brown Korean

The determinant is a single number attached to every square matrix that tells how a linear transformation scales area in 2D or volume in 3D. Its absolute value is the scale factor, and its sign tells whether the transformation keeps orientation or flips it. Think of dropping a 1-by-1 square (or 1-by-1-by-1 cube) into the transformation and measuring what size it becomes.

#determinant#linear transformation#area scaling
Chapter 5: Three-dimensional linear transformations | Essence of Linear Algebra
Basics

Beginner
3Blue1Brown Korean

Three-dimensional linear transformations change the whole 3D space while keeping the origin fixed and all grid lines straight and parallel. This is just like in 2D, but now there are three axes: x, y, and z. These transformations stretch, rotate, shear (slant), or reflect the space without bending or curving it.

#3d linear transformation#basis vectors#i-hat j-hat k-hat
Chapter 9: Dot products and duality | Essence of Linear Algebra
Basics

Beginner
3Blue1Brown Korean

The dot product (also called the inner product) takes two vectors and gives one number by multiplying matching coordinates and adding them. For example, with V=(3,2) and W=(4,-1), the dot product is 3*4 + 2*(-1) = 10. This single number is not just arithmetic; it measures how much V goes in the direction of W. If V points with W, the number is positive; if against, it’s negative; if the vectors are perpendicular, it’s exactly zero.

#dot product#inner product#projection
Chapter 8: Nonsquare matrices as transformations between dimensions | Essence of Linear Algebra
Basics

Beginner
3Blue1Brown Korean

A matrix with different numbers of rows and columns models a transformation between spaces of different sizes. For example, a 3-by-2 matrix takes 2D vectors from the flat plane and turns them into 3D vectors in space. The columns of the matrix tell you exactly where the basic 2D directions (i-hat and j-hat) end up in 3D. Using this rule, any 2D input can be mapped by combining those columns.

#non-square matrix#rectangular matrix#linear transformation