6-Month Maths-Focused Machine Learning Program with Enhanced Resources
This program spreads the material from an intensive plan over 6 months, aiming for roughly 1-2 hours of focused study/coding per day on weekdays, with optional longer sessions on weekends for deeper dives or project work.
Month 1: Linear Algebra & Foundational Math
Goal: Build a solid understanding of vectors, matrices, and basic linear algebra operations crucial for ML.
Week 1: Introduction to Vectors
- Day 1: What are Vectors?
- MMLL: Chapter 2.1 (Vectors)
- SIA: Chapter 1.1 (Vectors and Linear Combinations)
- KA: Vectors introduction
- 3Blue1Brown: What is a vector?
- Medium: Vectors for Machine Learning Explained Simply
- Assignment: KA - Vector intro questions
- Day 2: Vector Operations
- MMLL: Chapter 2.1 (Vector Addition, Scalar Multiplication)
- SIA: Chapter 1.2 (Lengths and Dot Products)
- KA: Adding and subtracting vectors
- YouTube (Khan Academy): Vector operations
- Assignment: KA - Performing vector operations
- Day 3: Dot Product & Projections (Basic)
- MMLL: Chapter 2.1.2 (Inner Product/Dot Product), Chapter 2.2.3 (Orthogonal Projection - Basic idea)
- SIA: Chapter 1.2 (Lengths and Dot Products)
- KA: Dot product
- 3Blue1Brown: Dot products and duality
- YouTube (3Blue1Brown): Dot products and duality | Chapter 3, Essence of linear algebra
- Assignment: KA - Dot product & vector projections
- Day 4: Linear Combinations & Span
- MMLL: Chapter 2.1.3 (Linear Combination)
- SIA: Chapter 1.3 (Matrices) - focus on how columns combine.
- KA: Linear combinations and span
- Medium (CK-12): Linear Combinations and Span
- YouTube (3Blue1Brown): Linear combinations, span, and basis vectors | Chapter 2, Essence of linear algebra
- Assignment: KA - Determine if a vector is in a given span
- Day 5: Hands-on Vector Operations
- Assignment: Implement basic vector addition, scalar multiplication, and dot product using NumPy arrays. Create a function for vector magnitude.
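If you want a starting point, here is a minimal sketch of the Day 5 exercise (function names and the example vectors are illustrative, not prescribed by the plan):

```python
import numpy as np

def magnitude(v):
    """Length (Euclidean norm) of a vector: sqrt(v . v)."""
    return np.sqrt(np.dot(v, v))

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

print(a + b)            # vector addition        -> [5. 7. 9.]
print(2.5 * a)          # scalar multiplication  -> [2.5 5.  7.5]
print(np.dot(a, b))     # dot product            -> 32.0
print(magnitude(a))     # compare with np.linalg.norm(a)
```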
- Day 6: Review & Practice
- Review notes from Week 1. Redo any challenging KA problems.
- Assignment: Search for "linear algebra vector problems" online, solve 5-10.
- Day 7: Rest/Catch-up
Week 2: Matrices - The Basics
- Day 8: Matrix Definition & Types
- MMLL: Chapter 2.2 (Matrices)
- SIA: Chapter 1.4 (Multiplying Matrices) - focus on basic definition.
- KA: Introduction to matrices
- Medium (Machine Learning Mastery): A Complete Guide to Matrices for Machine Learning with Python
- YouTube (3Blue1Brown): Matrices (and how to multiply them) | Chapter 4, Essence of linear algebra
- Assignment: KA - Matrix dimensions
- Day 9: Matrix Addition & Scalar Multiplication
- MMLL: Chapter 2.2.1 (Matrix Operations)
- SIA: Chapter 2.1 (Solving Linear Equations by Elimination) - focus on matrix representation.
- KA: Adding & subtracting matrices
- YouTube (Khan Academy): Scalar multiplication and addition of matrices
- Assignment: KA - Add & subtract matrices
- Day 10: Matrix Multiplication (The Core)
- MMLL: Chapter 2.2.1 (Matrix Operations) - Understand row-column dot product.
- SIA: Chapter 1.4 (Multiplying Matrices)
- KA: Matrix multiplication
- 3Blue1Brown: Matrix multiplication as transformations
- YouTube (3Blue1Brown): Matrix multiplication as transformations | Chapter 4, Essence of linear algebra
- Assignment: KA - Multiplying matrices
- Day 11: Transpose & Identity Matrix
- MMLL: Chapter 2.2.1 (Matrix Operations), 2.2.2 (Identity Matrix)
- SIA: Chapter 1.4 (Transpose)
- KA: Transpose of a matrix
- YouTube (Khan Academy): Introduction to the identity matrix
- Assignment: KA - Transposing a matrix
- Day 12: Hands-on Matrix Operations
- Assignment: Implement matrix addition, scalar multiplication, and matrix multiplication using NumPy. Verify your implementations against np.dot and np.transpose.
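A possible sketch for Day 12, with a hand-rolled matrix product checked against NumPy (the loop-based matmul is purely for practice):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

def matmul(A, B):
    """Matrix product built from row-column dot products."""
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            C[i, j] = np.dot(A[i, :], B[:, j])
    return C

print(A + B)                                      # matrix addition
print(3 * A)                                      # scalar multiplication
print(np.allclose(matmul(A, B), np.dot(A, B)))    # True: matches np.dot
print(A.T)                                        # same as np.transpose(A)
```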
- Day 13: Review & Practice
- Review notes from Week 2. Focus on matrix multiplication intuition.
- Assignment: Search for "linear algebra matrix multiplication problems," solve 5-10.
- Day 14: Rest/Catch-up
Week 3: Systems of Linear Equations & Inverses
- Day 15: Systems of Linear Equations (Matrix Form)
- MMLL: Chapter 2.3 (Solving Systems of Linear Equations)
- SIA: Chapter 2.1 (Solving Linear Equations by Elimination)
- KA: Solving systems of linear equations
- YouTube (Khan Academy): Solving linear systems with matrices
- Assignment: KA - Solutions to systems of equations
- Day 16: Determinants (2x2, 3x3)
- MMLL: Chapter 2.2.4 (Determinant)
- SIA: Chapter 5.1 (The Properties of Determinants)
- KA: Determinant of a 2x2 matrix
- 3Blue1Brown: The determinant
- YouTube (3Blue1Brown): The determinant | Chapter 6, Essence of linear algebra
- Assignment: KA - Determinant of a 2x2, 3x3 matrix
- Day 17: Inverse Matrices (Conceptual & Calculation)
- MMLL: Chapter 2.2.5 (Matrix Inverse)
- SIA: Chapter 2.2 (Matrix Inverse)
- KA: Inverse of a 2x2 matrix
- YouTube (3Blue1Brown): Inverse matrices, column space and null space | Chapter 7, Essence of linear algebra
- Assignment: KA - Inverse of a 2x2 matrix
- Day 18: Solving Systems with Inverses
- MMLL: Chapter 2.3 (Using Matrix Inverse to Solve Systems)
- SIA: Chapter 2.2 (Solving with A−1)
- Assignment: Use np.linalg.solve to solve a system of linear equations in Python. Practice calculating inverses with np.linalg.inv.
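A short sketch of the Day 18 exercise (the example system is made up; any invertible A works):

```python
import numpy as np

# Solve the system  2x +  y = 5
#                    x + 3y = 10     i.e.  A @ v = b
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

v = np.linalg.solve(A, b)               # preferred: no explicit inverse
v_via_inverse = np.linalg.inv(A) @ b    # same answer when A is invertible

print(v, v_via_inverse)                 # both give the solution [1., 3.]
```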
- Day 19: Hands-on Determinants & Inverses
- Assignment: Write Python functions to calculate the determinant of a 2x2 matrix and the inverse of a 2x2 matrix from scratch (without np.linalg). Compare with NumPy.
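One way the Day 19 from-scratch functions might look (the 2x2 formulas are standard; the test matrix is arbitrary):

```python
import numpy as np

def det_2x2(M):
    """Determinant of a 2x2 matrix: ad - bc."""
    return M[0, 0] * M[1, 1] - M[0, 1] * M[1, 0]

def inv_2x2(M):
    """Inverse of a 2x2 matrix, defined only when the determinant is non-zero."""
    d = det_2x2(M)
    if d == 0:
        raise ValueError("matrix is singular (determinant is zero)")
    return (1.0 / d) * np.array([[ M[1, 1], -M[0, 1]],
                                 [-M[1, 0],  M[0, 0]]])

M = np.array([[4.0, 7.0], [2.0, 6.0]])
print(det_2x2(M), np.linalg.det(M))               # both 10.0
print(np.allclose(inv_2x2(M), np.linalg.inv(M)))  # True
```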
- Day 20: Review & Practice
- Review notes. Ensure you understand when an inverse exists (non-zero determinant).
- Assignment: MIT OpenCourseware (MIT 18.06SC Linear Algebra) Problem Set 3, Problem 1 (or similar problems related to inverses).
- Day 21: Rest/Catch-up
Week 4: Eigenvalues, Eigenvectors & Review
- Day 22: Eigenvalues & Eigenvectors - Intuition
- MMLL: Chapter 2.4 (Eigenvalues and Eigenvectors)
- SIA: Chapter 6.1 (Introduction to Eigenvalues)
- 3Blue1Brown: Eigenvectors and eigenvalues
- Medium: Eigenvalues and Eigenvectors: The Most Intuitive Explanation
- YouTube (3Blue1Brown): Eigenvectors and eigenvalues | Chapter 8, Essence of linear algebra
- Assignment: Watch the 3Blue1Brown video multiple times. Focus on geometric intuition.
- Day 23: Calculating Eigenvalues & Eigenvectors (2x2)
- MMLL: Chapter 2.4 (Calculation examples)
- SIA: Chapter 6.1 (Finding Eigenvalues and Eigenvectors)
- KA: Eigenvalues and eigenvectors
- Assignment: KA - Find eigenvalues and eigenvectors of a 2x2 matrix
- Day 24: Hands-on Eigen Decomposition
- Assignment: Use np.linalg.eig to find the eigenvalues and eigenvectors of a matrix. Verify that Av = λv.
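A minimal check of the eigendecomposition, using a small diagonal test matrix chosen only for illustration:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)   # columns of `eigenvectors` are the eigenvectors

for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]
    lam = eigenvalues[i]
    # Check that A v equals lambda * v (up to floating-point error)
    print(np.allclose(A @ v, lam * v))          # True for each eigenpair
```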
- Day 25: Orthogonality (Conceptual)
- MMLL: Chapter 2.1.2 (Orthogonal Vectors)
- SIA: Chapter 4.1 (Orthogonal Vectors and Subspaces)
- KA: Orthogonal vectors
- YouTube (Khan Academy): Orthogonal vectors
- Assignment: Identify orthogonal vectors.
- Day 26: Linear Algebra Review & Mini-Project Prep
- Review: All concepts from Month 1. Revisit problem areas.
- Project Prep: Understand the high-level goal of Principal Component Analysis (PCA) as a dimensionality reduction technique (you'll implement a basic version in Month 5). Focus on how it uses eigenvectors.
- Day 27: Rest/Catch-up
- Day 28: Monthly Review & Catch-up
Month 2: Calculus, Probability & First ML Algorithm
Goal: Grasp essential calculus concepts for optimization, fundamental probability/statistics, and apply them to your first ML model.
Week 5: Calculus - Derivatives
- Day 29: Functions, Limits, Continuity
- KA: Limits introduction
- YouTube (Khan Academy): Limits (full playlist)
- Assignment: KA - Evaluating limits from graphs & functions, Continuity problems
- Day 30: Derivatives - Intuition & Power Rule
- KA: Derivatives introduction
- 3Blue1Brown: What is a derivative?
- YouTube (3Blue1Brown): What's a derivative?
- Assignment: KA - Power rule problems
- Day 31: Product, Quotient, Chain Rule
- KA: Product rule
- YouTube (Khan Academy): The Chain Rule
- Assignment: KA - Practice applying the product, quotient, and chain rules.
- Day 32: Critical Points & Local Min/Max
- KA: Maxima & minima on an interval
- YouTube (Khan Academy): Finding critical points
- Assignment: KA - Find critical points and determine local extrema.
- Day 33: Hands-on Symbolic Differentiation (Optional)
- Assignment: Experiment with SymPy in Python to symbolically differentiate simple functions (e.g., import sympy; x = sympy.symbols('x'); f = x**2 + 3*x; sympy.diff(f, x)).
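The inline snippet above, expanded into a runnable sketch (the critical-point check is an extra illustration, not required by the plan):

```python
import sympy

x = sympy.symbols('x')
f = x**2 + 3*x

df = sympy.diff(f, x)                     # symbolic derivative: 2*x + 3
print(df)
print(sympy.solve(sympy.Eq(df, 0), x))    # critical point: x = -3/2
print(df.subs(x, 2))                      # derivative evaluated at x = 2 -> 7
```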
- Day 34: Review & Practice
- Assignment: Search for "univariate calculus differentiation problems," solve 5-10.
- Day 35: Rest/Catch-up
Week 6: Calculus - Gradients & Optimization Intro
- Day 36: Functions of Multiple Variables & Partial Derivatives
- MMLL: Chapter 3.1.2 (Partial Derivatives)
- KA: Partial derivatives introduction
- Medium (CodeSignal Learn): Derivatives for Multivariable Functions
- YouTube (Khan Academy): Partial derivatives
- Assignment: KA - Compute partial derivatives
- Day 37: Gradient Vector
- MMLL: Chapter 3.1.3 (Gradient)
- KA: The gradient
- 3Blue1Brown: The gradient
- YouTube (3Blue1Brown): The gradient, explained | Chapter 10, Essence of calculus
- Assignment: Compute gradient vectors for simple functions by hand.
- Day 38: Hessian Matrix (Conceptual)
- MMLL: Chapter 3.1.4 (Hessian) - Understand it represents curvature.
- KA: The Hessian matrix
- YouTube (mathematicalmonk): Hessian Matrix
- Assignment: No explicit problems; focus on a conceptual understanding of second partial derivatives and the Hessian's purpose.
- Day 39: Introduction to Optimization
- MMLL: Chapter 4 (Optimization) - Focus on 4.1 (Introduction) and 4.2 (Conditions for Optima).
- KA: Optimization problems (conceptual)
- Medium (Neural Concept): Machine Learning Optimization: Best Techniques and Algorithms
- Assignment: Understand the goal of optimization in ML (minimizing loss functions).
- Day 40: Hands-on Gradient Calculation
- Assignment: Write a Python function that calculates the gradient of a simple multivariate function.
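Since the plan leaves the example function open, here is one possible sketch using an arbitrary f(x, y) = x² + xy, with a numerical check of the hand-derived gradient:

```python
import numpy as np

def f(v):
    """Illustrative multivariate function f(x, y) = x**2 + x*y (not specified in the plan)."""
    x, y = v
    return x**2 + x * y

def grad_f(v):
    """Analytic gradient of f: [2x + y, x]."""
    x, y = v
    return np.array([2 * x + y, x])

def numerical_gradient(func, v, h=1e-6):
    """Central-difference approximation, useful for checking hand-derived gradients."""
    g = np.zeros_like(v, dtype=float)
    for i in range(len(v)):
        step = np.zeros_like(v, dtype=float)
        step[i] = h
        g[i] = (func(v + step) - func(v - step)) / (2 * h)
    return g

point = np.array([1.0, 2.0])
print(grad_f(point), numerical_gradient(f, point))   # both approximately [4., 1.]
```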
- Day 41: Review & Practice
- Assignment: Review partial derivatives and gradients. MIT OpenCourseware (MIT 18.02 Multivariable Calculus) Problem Set 1, Problem 1 (or similar).
- Day 42: Rest/Catch-up
Week 7: Probability Fundamentals
- Day 43: Basic Probability & Events
- MMLL: Chapter 5.1 (Probability)
- KA: Basic probability
- YouTube (Khan Academy): Basic Probability
- Assignment: KA - Simple probability
- Day 44: Conditional Probability & Bayes' Theorem
- MMLL: Chapter 5.1.2 (Conditional Probability), 5.1.3 (Bayes' Theorem)
- KA: Conditional probability
- YouTube (3Blue1Brown, highly visual): Conditional Probability and Bayes' Theorem
- Assignment: KA - Conditional probability, Bayes' theorem
- Day 45: Random Variables & Distributions Intro
- MMLL: Chapter 5.2 (Random Variables)
- KA: Random variables
- YouTube (Khan Academy): Random Variables
- Assignment: Distinguish between discrete and continuous random variables.
- Day 46: PMF, PDF, CDF
- MMLL: Chapter 5.2.1 (Probability Mass Function), 5.2.2 (Probability Density Function), 5.2.3 (Cumulative Distribution Function)
- KA: PMF, PDF, CDF (conceptual)
- Medium (Machine Learning Mastery): Understanding Probability Distributions for Machine Learning with Python
- YouTube (StatQuest with Josh Starmer): Probability Density Functions (PDFs) and Probability Mass Functions (PMFs)
- Assignment: Match descriptions to the correct function type (PMF, PDF, CDF).
- Day 47: Hands-on Probability Simulations
- Assignment: Write Python code to simulate coin flips, dice rolls. Calculate empirical probabilities and compare to theoretical.
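A minimal simulation sketch for Day 47 (sample sizes and the seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 100_000

# Coin flips: empirical P(heads) should approach the theoretical 0.5
flips = rng.integers(0, 2, size=n)          # 0 = tails, 1 = heads
print("P(heads) ~", flips.mean())

# Dice rolls: empirical P(roll == 6) should approach 1/6 ~ 0.1667
rolls = rng.integers(1, 7, size=n)
print("P(six) ~", (rolls == 6).mean())
```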
- Day 48: Review & Practice
- Assignment: Review probability concepts. Search for "probability problems with Bayes' theorem" and solve 2-3.
- Day 49: Rest/Catch-up
Week 8: Probability Distributions & Linear Regression (Math)
- Day 50: Common Discrete Distributions (Bernoulli, Binomial)
- MMLL: Chapter 5.2.5 (Bernoulli, Binomial)
- KA: Bernoulli
- YouTube (StatQuest with Josh Starmer): The Binomial Distribution
- Assignment: KA - Bernoulli and Binomial distribution problems
- Day 51: Normal (Gaussian) Distribution
- MMLL: Chapter 5.2.6 (Gaussian Distribution)
- KA: Normal distribution introduction
- YouTube (StatQuest with Josh Starmer): The Normal Distribution
- Assignment: KA - Z-scores and normal distribution probabilities
- Day 52: Expectation & Variance
- MMLL: Chapter 5.3 (Expectation), 5.4 (Variance and Covariance)
- KA: Expected value
- YouTube (The Organic Chemistry Tutor): Expected Value and Variance Explained
- Assignment: KA - Calculate expected value and variance for simple distributions.
- Day 53: Simple Linear Regression - Mathematical Formulation
- MMLL: Chapter 6.1 (Linear Regression) - Focus on 6.1.1 (Model Definition) and 6.1.2 (Squared Error Loss).
- Medium (Google for Developers): Linear regression | Machine Learning
- YouTube (StatQuest with Josh Starmer): Linear Regression - Fun and Easy Machine Learning
- Assignment: Understand the equation y = β₀ + β₁x and the concept of minimizing the Sum of Squared Errors (SSE).
- Day 54: Least Squares Method - Conceptual
- MMLL: Chapter 6.1.3 (Least Squares Estimation) - Focus on the intuition of finding the "best fit" line.
- YouTube (Khan Academy): Least Squares Regression
- Assignment: No coding; just understand the conceptual goal of Least Squares.
- Day 55: Monthly Review & Catch-up
- Assignment: Review all concepts from Month 2. Focus on the intuition of derivatives, gradients, and how probability distributions describe data.
- Day 56: Rest/Catch-up
Month 3: Regression & Core ML Concepts
Goal: Deepen understanding of regression, explore optimization for ML, and grasp bias-variance.
Week 9: Linear Regression Implementation
- Day 57: Derivation of Coefficients (Calculus)
- MMLL: Chapter 6.1.3 (Least Squares Estimation) - Understand the partial derivatives of the SSE.
- YouTube (StatQuest with Josh Starmer): Deriving the Normal Equation for Linear Regression
- Assignment: Walk through the derivation of the coefficients for simple linear regression (or watch a video explaining it).
- Day 58: Multiple Linear Regression & Normal Equation (Matrix Form)
- MMLL: Chapter 6.1.4 (Multiple Linear Regression), 6.1.5 (Normal Equation).
- YouTube (Andrew Ng's ML Course): The Normal Equation
- Assignment: Understand y = Xβ and the Normal Equation β = (XᵀX)⁻¹Xᵀy.
- Day 59: Hands-on Simple Linear Regression from Scratch
- Assignment: Implement simple linear regression from scratch using NumPy. Plot the regression line on a small dataset.
- Day 60: Hands-on Multiple Linear Regression (Normal Equation)
- Assignment: Implement multiple linear regression using the Normal Equation with NumPy. Test on a synthetic dataset.
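A possible sketch of the Normal Equation exercise on synthetic data (the true coefficients are made up so you can check that they are recovered):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Synthetic data: y = 4 + 3*x1 - 2*x2 + noise
n = 200
X = rng.normal(size=(n, 2))
y = 4 + 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=n)

# Add a column of ones so the intercept beta_0 is estimated too
X_b = np.c_[np.ones(n), X]

# Normal Equation: beta = (X^T X)^{-1} X^T y
beta = np.linalg.inv(X_b.T @ X_b) @ X_b.T @ y
print(beta)        # approximately [4., 3., -2.]
```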
- Day 61: Assumptions of Linear Regression
- MMLL: Chapter 6.1.6 (Assumptions)
- YouTube (StatQuest with Josh Starmer): Assumptions of Linear Regression
- Assignment: List and understand the key assumptions (linearity, independence, homoscedasticity, normality of errors).
- Day 62: Review & Practice
- Assignment: Review linear regression. Search for "linear regression normal equation problems" and solve one.
- Day 63: Rest/Catch-up
Week 10: Gradient Descent in Depth
- Day 64: Gradient Descent for Linear Regression
- MMLL: Chapter 6.1.7 (Gradient Descent)
- YouTube (StatQuest with Josh Starmer): Gradient Descent, Step-by-Step
- Assignment: Understand the update rule β_new = β_old − α∇J(β).
- Day 65: Learning Rate & Convergence
- MMLL: Chapter 4.3 (Gradient Descent) - Focus on learning rate.
- YouTube (sentdex): How to choose a learning rate for gradient descent
- Assignment: Experiment with different learning rates in your GD implementation from Day 60. Observe divergence/slow convergence.
- Day 66: Stochastic Gradient Descent (SGD) - Intuition
- MMLL: Chapter 4.3.2 (Stochastic Gradient Descent)
- Medium (GeeksforGeeks): Different Variants of Gradient Descent
- YouTube (StatQuest with Josh Starmer): Stochastic Gradient Descent, Clearly Explained!!!
- Assignment: Understand why SGD is faster for large datasets (it updates on a single example at a time).
- Day 67: Mini-batch Gradient Descent
- MMLL: Chapter 4.3.3 (Mini-batch Gradient Descent)
- YouTube (DeepLearning.AI): Mini-Batch Gradient Descent
- Assignment: Understand the trade-off between GD and SGD.
- Day 68: Hands-on SGD for Linear Regression
- Assignment: Implement SGD for linear regression in NumPy. Compare its performance to batch GD on a slightly larger dataset.
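One way Day 68's SGD loop might look (learning rate, epoch count, and the synthetic data are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 1_000
x = rng.uniform(-1, 1, size=n)
y = 2.0 + 5.0 * x + rng.normal(scale=0.2, size=n)   # true intercept 2, slope 5

beta = np.zeros(2)            # [intercept, slope]
alpha = 0.05                  # learning rate

for epoch in range(20):
    for i in rng.permutation(n):              # shuffle each epoch
        xi = np.array([1.0, x[i]])            # one example at a time
        error = xi @ beta - y[i]
        beta -= alpha * error * xi            # gradient of 0.5 * error**2

print(beta)       # approximately [2., 5.]
```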
- Day 69: Review & Practice
- Assignment: Review Gradient Descent variants. MIT 6.036 (Introduction to Machine Learning) problem set on Gradient Descent (search for recent versions).
- Day 70: Rest/Catch-up
Week 11: Regularization
- Day 71: Overfitting & Underfitting
- MMLL: Chapter 6.3 (Regularization) - Intro.
- YouTube (StatQuest with Josh Starmer): Overfitting vs. Underfitting
- Assignment: Understand the concepts of overfitting and underfitting. Identify them visually.
- Day 72: Ridge Regression (L2 Regularization) - Math
- MMLL: Chapter 6.3.1 (Ridge Regression)
- Medium: Ridge Regression: L2 Regularization Explained with Examples
- YouTube (StatQuest with Josh Starmer): L1 and L2 Regularization, Clearly Explained!!!
- Assignment: Understand the added λ‖β‖₂² penalty in the loss function and its effect on the coefficients.
- Day 73: Lasso Regression (L1 Regularization) - Math
- MMLL: Chapter 6.3.2 (Lasso Regression)
- Medium: Lasso Regression Explained with Examples
- YouTube (StatQuest with Josh Starmer): L1 and L2 Regularization
- Assignment: Understand the added λ‖β‖₁ penalty and its ability to induce sparsity (feature selection).
- Day 74: Hands-on Regularization (Scikit-learn)
- Assignment: Use sklearn.linear_model.Ridge and sklearn.linear_model.Lasso. Experiment with the alpha parameter on a dataset prone to overfitting.
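A small sketch of the Day 74 experiment, using high-degree polynomial features as the "dataset prone to overfitting" (the degree and alpha values are arbitrary):

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso, LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(seed=0)
x = np.sort(rng.uniform(-1, 1, size=30)).reshape(-1, 1)
y = np.sin(3 * x).ravel() + rng.normal(scale=0.2, size=30)

# High-degree polynomial features make plain least squares prone to overfitting
X_poly = PolynomialFeatures(degree=15).fit_transform(x)

for name, model in [("ols", LinearRegression()),
                    ("ridge", Ridge(alpha=1.0)),
                    ("lasso", Lasso(alpha=0.01, max_iter=50_000))]:
    model.fit(X_poly, y)
    print(name, np.round(model.coef_[:5], 2))   # regularized coefficients are much smaller
```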
- Day 75: Choosing Lambda (Regularization Strength)
- Assignment: Research cross-validation as a method for selecting hyperparameters like lambda. (No implementation yet, just conceptual).
- Day 76: Review & Practice
- Assignment: Search for "Ridge vs Lasso explained" or "regularization in linear regression problems."
- Day 77: Rest/Catch-up
Week 12: Polynomial Regression & Bias-Variance
- Day 78: Polynomial Regression
- MMLL: Chapter 6.2 (Polynomial Regression)
- YouTube (StatQuest with Josh Starmer): Polynomial Regression Explained
- Assignment: Understand how polynomial features are created. Implement polynomial regression using sklearn.preprocessing.PolynomialFeatures and LinearRegression.
- Day 79: Bias-Variance Trade-off - Intuition
- MMLL: Chapter 6.4 (Bias-Variance Decomposition) - Focus on 6.4.1 (Introduction).
- Medium (MLU-Explain, interactive): Bias Variance Tradeoff
- YouTube (StatQuest with Josh Starmer): Bias and Variance
- Assignment: Conceptualize bias (the model's simplifying assumptions) and variance (the model's sensitivity to the training data).
- Day 80: Mathematical Breakdown of Bias-Variance
- MMLL: Chapter 6.4.2 (Derivation) - Go through the derivation of the MSE decomposition (if comfortable, otherwise understand the terms).
- Assignment: Understand how MSE = Bias² + Variance + Noise.
- Day 81: Visualizing Bias-Variance
- Assignment: Search for online visualizations of the bias-variance trade-off (e.g., using target practice analogy).
- Day 82: Hands-on Bias-Variance Example
- Assignment: Create a synthetic dataset. Fit a low-degree polynomial (high bias) and a high-degree polynomial (high variance) to it. Plot and observe the fit and generalization.
- Day 83: Monthly Review & Project Prep
- Review: All concepts from Month 3. Focus on regression, optimization, regularization, and bias-variance.
- Project Prep: Brainstorm simple regression datasets you could use for a mini-project (e.g., house price prediction, car mileage).
- Day 84: Rest/Catch-up
Month 4: Classification Algorithms
Goal: Understand the mathematical underpinnings of key classification models.
Week 13: Logistic Regression
- Day 85: Logistic Regression - Concept & Sigmoid
- MMLL: Chapter 7.1 (Logistic Regression) - Focus on 7.1.1 (Binary Classification) and 7.1.2 (Sigmoid function).
- YouTube (StatQuest with Josh Starmer): Logistic Regression, Clearly Explained!!!
- Assignment: Understand how the sigmoid function maps a linear output to a probability between 0 and 1. Plot the sigmoid.
- Day 86: Cross-Entropy Loss Function
- MMLL: Chapter 7.1.3 (Likelihood and Loss Function) - Understand why squared error isn't suitable and why cross-entropy is used.
- Medium (DataCamp): Cross-Entropy Loss Function in Machine Learning: Enhancing Model Accuracy
- YouTube (A.I. in 5): Cross Entropy Demystified
- Assignment: Understand the formula for binary cross-entropy loss.
- Day 87: Gradient Descent for Logistic Regression
- MMLL: Chapter 7.1.4 (Gradient Descent) - Understand the update rule (similar to linear regression, but with different derivatives).
- YouTube (Andrew Ng's ML Course): Logistic Regression Details: Calculating the Gradient
- Assignment: Walk through the derivation of the gradients (or watch a detailed explanation).
- Day 88: Hands-on Logistic Regression from Scratch
- Assignment: Implement binary logistic regression from scratch using NumPy (forward pass, loss, and gradient descent update). Test on a simple synthetic dataset.
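A compact sketch of Day 88's from-scratch logistic regression (the synthetic data and hyperparameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Synthetic, roughly separable binary classification data in 2D
n = 500
X = rng.normal(size=(n, 2))
y = (X[:, 0] + 2 * X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)
b = 0.0
alpha = 0.1

for step in range(2000):
    p = sigmoid(X @ w + b)                 # forward pass: predicted probabilities
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    grad_w = X.T @ (p - y) / n             # gradients of binary cross-entropy
    grad_b = np.mean(p - y)
    w -= alpha * grad_w                    # gradient descent updates
    b -= alpha * grad_b

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(loss, accuracy)                      # accuracy should be high on this data
```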
- Day 89: Hands-on Logistic Regression (Scikit-learn)
- Assignment: Use sklearn.linear_model.LogisticRegression. Compare results with your custom implementation. Understand predict_proba.
- Day 90: Review & Practice
- Assignment: Review logistic regression. Search for "logistic regression explained math" and re-read.
- Day 91: Rest/Catch-up
Week 14: Softmax & SVMs
- Day 92: Softmax Regression (Multinomial Logistic Regression)
- MMLL: Chapter 7.2 (Softmax Regression)
- YouTube (Andrew Ng's ML Course): Softmax Regression
- Assignment: Understand the softmax function and its use for multi-class classification.
- Day 93: Categorical Cross-Entropy Loss
- MMLL: Chapter 7.2 (Loss function).
- YouTube (StatQuest with Josh Starmer; the same idea as the binary case, just extended): Categorical Cross Entropy Explained
- Assignment: Understand the extension of cross-entropy to multiple classes.
- Day 94: Support Vector Machines (SVMs) - Hyperplane & Margin
- MMLL: Chapter 8.1 (Support Vector Machines) - Focus on 8.1.1 (The Linear Classifier).
- YouTube (mathematics for machine learning): Support Vector Machines (SVM) - The math of intelligence
- Assignment: Understand the goal of SVM: finding the maximum-margin hyperplane.
- Day 95: Kernel Trick (Conceptual)
- MMLL: Chapter 8.2 (Non-linear SVM) - Focus on the idea of mapping data to higher dimensions implicitly.
- Medium (GeeksforGeeks): Kernel Trick in Support Vector Classification
- YouTube (StatQuest with Josh Starmer): SVMs and the Kernel Trick
- Assignment: No explicit math problem; focus on understanding how the kernel trick makes linearly inseparable data separable.
- Day 96: Hands-on SVM (Scikit-learn)
- Assignment: Use sklearn.svm.SVC. Experiment with different kernels (linear, rbf, poly) on a dataset like Iris or circles/moons.
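A minimal version of the Day 96 kernel comparison, using the moons dataset as one of the suggested options:

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for kernel in ["linear", "rbf", "poly"]:
    clf = SVC(kernel=kernel).fit(X_train, y_train)
    print(kernel, clf.score(X_test, y_test))   # rbf usually separates the moons best
```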
- Day 97: Review & Practice
- Assignment: Review Softmax and SVMs. Search for "SVM kernel trick explained" if needed.
- Day 98: Rest/Catch-up
Week 15: Decision Trees & Ensembles Intro
- Day 99: Decision Trees - Basics & Splitting
- MMLL: Chapter 10.1 (Decision Trees)
- YouTube (StatQuest with Josh Starmer): Decision Trees, Clearly Explained!!!
- Assignment: Understand how decision trees make predictions by splitting data.
- Day 100: Gini Impurity & Entropy (Mathematical Definitions)
- MMLL: Chapter 10.1.1 (Impurity Measures)
- YouTube (StatQuest with Josh Starmer): Decision Trees Part 2: Gini Impurity and Information Gain
- Assignment: Understand the formulas for Gini impurity and entropy. Calculate them for simple data splits.
- Day 101: Information Gain
- MMLL: Chapter 10.1.2 (Information Gain)
- YouTube (Machine Learning with Phil): Information Gain in Decision Tree
- Assignment: Understand how information gain is used to choose the best split.
- Day 102: Introduction to Ensemble Methods (Bagging)
- MMLL: Chapter 10.2 (Ensemble Methods) - Focus on 10.2.1 (Bagging/Random Forests).
- Medium (GeeksforGeeks): Random Forest Algorithm in Machine Learning
- YouTube (StatQuest with Josh Starmer): Bagging and Random Forests, Clearly Explained!!!
- Assignment: Understand the concept of "wisdom of crowds" and how bagging works.
- Day 103: Boosting (Conceptual)
- MMLL: Chapter 10.2.2 (Boosting).
- YouTube (StatQuest with Josh Starmer): Boosting (AdaBoost), Clearly Explained!!!
- Assignment: Understand the sequential nature of boosting (correcting previous errors).
- Day 104: Hands-on Decision Tree & Random Forest (Scikit-learn)
- Assignment: Use sklearn.tree.DecisionTreeClassifier and sklearn.ensemble.RandomForestClassifier. Compare their performance.
- Day 105: Rest/Catch-up
Week 16: KNN & Naive Bayes, Classification Project
- Day 106: K-Nearest Neighbors (KNN)
- Medium: K-Nearest Neighbors Algorithm: An Intuitive Guide
- YouTube (StatQuest with Josh Starmer): K-nearest Neighbors (KNN), Clearly Explained!!!
- Assignment: Understand the algorithm: "lazy learner," distance metrics (Euclidean, Manhattan). Implement KNN from scratch for a tiny dataset.
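A tiny from-scratch KNN sketch for Day 106 (the toy dataset is made up):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_new, k=3):
    """Classify x_new by majority vote among its k nearest (Euclidean) neighbours."""
    distances = np.linalg.norm(X_train - x_new, axis=1)
    nearest = np.argsort(distances)[:k]
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Tiny toy dataset: two clusters labelled 0 and 1
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                    [1.0, 1.0], [0.9, 1.1], [1.1, 0.9]])
y_train = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X_train, y_train, np.array([0.15, 0.1])))   # -> 0
print(knn_predict(X_train, y_train, np.array([0.95, 1.0])))   # -> 1
```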
- Day 107: Naive Bayes - Intuition
- MMLL: Chapter 5.1.3 (Bayes' Theorem application - conceptually)
- YouTube (StatQuest with Josh Starmer): Naive Bayes, Clearly Explained!!!
- Assignment: Understand the "naive" assumption of conditional independence.
- Day 108: Different Naive Bayes Variants (Conceptual)
- Assignment: Research Gaussian, Multinomial, and Bernoulli Naive Bayes and when to use them.
- Day 109: Hands-on Naive Bayes (Scikit-learn)
- Assignment: Use sklearn.naive_bayes.GaussianNB or MultinomialNB.
- Day 110: Monthly Review & Project Prep
- Review: All classification algorithms.
- Project Prep: Prepare for a classification project.
- Day 111: Classification Project
- Assignment: Choose a classification dataset (e.g., Pima Indians Diabetes, Titanic). Apply at least 3 different classification algorithms learned (e.g., Logistic Regression, SVM, Random Forest). Evaluate performance using appropriate metrics (accuracy, precision, recall, F1-score).
- Day 112: Rest/Catch-up
Month 5: Unsupervised Learning & Optimization Deep Dive
Goal: Explore methods for finding patterns in unlabeled data and delve deeper into optimization techniques.
Week 17: Clustering
- Day 113: K-Means Clustering - Objective Function
- MMLL: Chapter 9.1 (K-Means Clustering)
- YouTube (StatQuest with Josh Starmer): K-Means Clustering, Clearly Explained!!!
- Assignment: Understand the goal: minimizing the within-cluster sum of squares.
- Day 114: Lloyd's Algorithm
- MMLL: Chapter 9.1.1 (Algorithm).
- Assignment: Walk through the steps of Lloyd's algorithm.
- Day 115: Hands-on K-Means from Scratch
- Assignment: Implement K-Means from scratch using NumPy. Test on a simple 2D dataset and visualize clusters.
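One possible Day 115 implementation of Lloyd's algorithm (it skips edge cases such as empty clusters, which a fuller version should handle):

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Plain Lloyd's algorithm: alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: each point goes to its nearest centroid
        distances = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = distances.argmin(axis=1)
        # Update step: each centroid becomes the mean of its assigned points
        new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=(0, 0), size=(100, 2)),
               rng.normal(loc=(5, 5), size=(100, 2))])
centroids, labels = kmeans(X, k=2)
print(centroids)      # close to (0, 0) and (5, 5)
```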
- Day 116: Hierarchical Clustering (Conceptual)
- MMLL: Chapter 9.2 (Hierarchical Clustering) - Focus on agglomerative and dendrograms.
- YouTube (StatQuest with Josh Starmer): Hierarchical Clustering, Clearly Explained!!!
- Assignment: Understand how dendrograms are formed.
- Day 117: Hands-on Hierarchical Clustering (SciPy)
- Assignment: Use scipy.cluster.hierarchy to perform hierarchical clustering and plot a dendrogram.
- Day 118: Review & Practice
- Assignment: Review clustering algorithms. Search for "K-Means problems."
- Day 119: Rest/Catch-up
Week 18: Dimensionality Reduction (PCA)
- Day 120: PCA - Recap & Covariance Matrix
- MMLL: Chapter 11.1 (Principal Component Analysis) - Revisit from Linear Algebra section. Focus on the role of the covariance matrix.
- Medium (Wikipedia, good overview): Principal component analysis
- YouTube (StatQuest with Josh Starmer): PCA - Principal Component Analysis, Clearly Explained!!!
- Assignment: Understand the definition of covariance and the covariance matrix.
- Day 121: PCA - Eigenvalues & Eigenvectors for Reduction
- MMLL: Chapter 11.1.1 (PCA Algorithm)
- YouTube (mathematicalmonk): PCA Algorithm
- Assignment: Understand how eigenvectors correspond to principal components and eigenvalues to explained variance.
- Day 122: Singular Value Decomposition (SVD) for PCA
- MMLL: Chapter 11.1.2 (SVD for PCA)
- SIA: Chapter 7.2 (Singular Value Decomposition)
- YouTube (StatQuest with Josh Starmer): Singular Value Decomposition (SVD) and PCA
- Assignment: Understand that SVD provides a robust way to compute PCA.
- Day 123: Hands-on PCA from Scratch (using SVD)
- Assignment: Implement PCA from scratch using NumPy's np.linalg.svd. Apply it to a high-dimensional dataset (e.g., MNIST digits) and visualize the first 2 components.
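A sketch of PCA via SVD for Day 123; random data stands in for MNIST here just to keep the example self-contained:

```python
import numpy as np

def pca_svd(X, n_components=2):
    """PCA via SVD: centre the data, take the top right-singular vectors as components."""
    X_centered = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:n_components]                   # principal directions
    explained_variance = (S**2) / (len(X) - 1)       # eigenvalues of the covariance matrix
    return X_centered @ components.T, components, explained_variance[:n_components]

# Toy high-dimensional data (any (n_samples, n_features) array works here)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
X_2d, components, var = pca_svd(X, n_components=2)
print(X_2d.shape, var)      # (200, 2) and the variance captured by each component
```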
- Day 124: Hands-on PCA (Scikit-learn)
- Assignment: Use sklearn.decomposition.PCA and compare results with your custom implementation. Understand explained_variance_ratio_.
- Day 125: Review & Practice
- Assignment: Review PCA. Search for "PCA explained visually."
- Day 126: Rest/Catch-up
Week 19: Advanced Optimization
- Day 127: Limitations of Basic Gradient Descent
- Assignment: Understand local minima, saddle points, and issues with learning rate (e.g., slow convergence, oscillations).
- Day 128: Momentum
- MMLL: Chapter 4.3.4 (Momentum)
- YouTube (StatQuest with Josh Starmer): Gradient Descent with Momentum, Clearly Explained!!!
- Assignment: Understand how momentum helps accelerate GD and overcome local minima.
- Day 129: Adagrad & RMSprop (Conceptual)
- MMLL: Chapter 4.3.5 (AdaGrad), 4.3.6 (RMSprop)
- Medium (Neptune.ai): Deep Learning Optimization Algorithms
- YouTube (StatQuest with Josh Starmer): Adagrad, RMSProp, Adam, Clearly Explained!!!
- Assignment: Understand the concept of adaptive learning rates per parameter.
- Day 130: Adam Optimizer (Conceptual)
- MMLL: Chapter 4.3.7 (Adam)
- YouTube (StatQuest with Josh Starmer): Adam Optimizer, Clearly Explained!!!
- Assignment: Understand Adam as a combination of Momentum and RMSprop ideas.
- Day 131: Hands-on Optimizers (TensorFlow/PyTorch)
- Assignment: Build a simple linear regression model using TensorFlow or PyTorch. Experiment with SGD, Adam, and RMSprop optimizers, observing their convergence behavior.
- Day 132: Review & Practice
- Assignment: Search for "deep learning optimizers explained" videos/articles. Focus on their mathematical update rules.
- Day 133: Rest/Catch-up
Week 20: Convex Optimization (Conceptual) & Unsupervised Project
- Day 134: Convex Sets & Functions (Conceptual)
- MMLL: Chapter 4.2 (Conditions for Optima) - Focus on the definition of convex functions.
- Medium: Convex Optimization: Why it is so important in Machine Learning?
- YouTube (Andrew Ng's ML Course): Convexity in Machine Learning
- Assignment: Understand what makes a function convex and why it's desirable for optimization (any local minimum is the global minimum).
- Day 135: Why Convexity Matters in ML
- Assignment: Understand that many traditional ML models (linear regression, logistic regression with cross-entropy) have convex loss functions, guaranteeing convergence to global optima.
- Day 136: Anomaly Detection (Brief Introduction)
- MMLL: Chapter 9.3 (Anomaly Detection) - Basic overview.
- YouTube (StatQuest with Josh Starmer): Anomaly Detection, Clearly Explained!!!
- Assignment: Understand the goal of anomaly detection. Explore simple statistical methods (e.g., Z-score).
- Day 137: Monthly Review & Project Prep
- Review: All concepts from Month 5.
- Project Prep: Prepare for an unsupervised learning project.
- Day 138: Unsupervised Learning Project
- Assignment: Take a dataset (e.g., customers with features, or gene expression data). Perform K-Means clustering and PCA. Visualize the clusters in reduced dimensions. Interpret the results.
- Day 139: Rest/Catch-up
- Day 140: Monthly Review & Catch-up
Month 6: Deep Learning Fundamentals
Goal: Understand the core mathematical principles behind neural networks and popular architectures.
Week 21: Neural Network Basics & Forward Pass
- Day 141: Perceptrons & Biological Analogy
- MMLL: Chapter 12.1 (Feedforward Neural Networks) - Focus on the basic unit.
- YouTube (Andrew Ng's ML Course): The Perceptron
- Assignment: Understand how a single perceptron works.
- Day 142: Activation Functions (Sigmoid, Tanh, ReLU)
- MMLL: Chapter 12.1.2 (Activation Functions)
- YouTube (StatQuest with Josh Starmer): Activation Functions, Clearly Explained!!!
- Assignment: Understand the mathematical forms and properties of each activation function. Plot them.
- Day 143: Feedforward Neural Networks (MLPs) - Architecture
- MMLL: Chapter 12.1.1 (Layered Architectures).
- YouTube (Welch Labs): Neural Networks Demystified [Part 1: Data & Network Representation]
- Assignment: Draw an MLP architecture with input, hidden, and output layers.
- Day 144: Forward Propagation - Matrix Math
- MMLL: Chapter 12.1.3 (Feedforward Pass)
- YouTube (StatQuest with Josh Starmer): Forward Propagation, Clearly Explained!!!
- Assignment: Understand how each layer's output is calculated using matrix multiplication and activation functions.
- Day 145: Hands-on MLP Forward Pass (NumPy)
- Assignment: Implement a simple 2-layer MLP (input, 1 hidden, output) forward pass using NumPy. Use random weights and biases.
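A minimal Day 145 forward pass with random weights (layer sizes and the ReLU choice are illustrative):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n_input, n_hidden, n_output = 4, 5, 3

# Random weights and biases, as the assignment suggests
W1 = rng.normal(scale=0.1, size=(n_input, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_output))
b2 = np.zeros(n_output)

def relu(z):
    return np.maximum(0, z)

def forward(X):
    """Forward pass: linear map -> activation -> linear map (raw output scores)."""
    hidden = relu(X @ W1 + b1)
    return hidden @ W2 + b2

X = rng.normal(size=(2, n_input))    # a batch of 2 examples
print(forward(X).shape)              # (2, 3): one output vector per example
```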
- Day 146: Review & Practice
- Assignment: Review the forward pass. Search for "neural network forward propagation example" and trace the calculations.
- Day 147: Rest/Catch-up
Week 22: Backpropagation - The Core of Learning
- Day 148: Backpropagation - The Chain Rule in Action
- MMLL: Chapter 12.2 (Backpropagation) - Focus on 12.2.1 (Gradient Calculation).
- 3Blue1Brown: Backpropagation Calculus
- Medium (Scribd, based on a Medium article): Deep Learning (Part 27) - Backpropagation Intuition
- YouTube (3Blue1Brown): Backpropagation Calculus | Chapter 4, Deep learning
- Assignment: Understand that backpropagation is just repeated application of the chain rule.
- Day 149: Deriving Gradients for Output Layer
- MMLL: Chapter 12.2.2 (Output Layer Gradients)
- YouTube (Andrew Ng's ML Course): Backpropagation for a Neural Network
- Assignment: Walk through the derivation of the gradients for the output layer's weights and biases (e.g., for MSE loss).
- Day 150: Deriving Gradients for Hidden Layers
- MMLL: Chapter 12.2.3 (Hidden Layer Gradients)
- Assignment: Understand how errors are "propagated backward" to update hidden layer weights.
- Day 151: Hands-on Backpropagation (Simple MLP in NumPy)
- Assignment: Extend your MLP from Day 145 to include a loss function (e.g., MSE) and implement the backpropagation algorithm to update weights and biases. This is a significant challenge, but highly rewarding.
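Backpropagation for a one-hidden-layer network looks roughly like this, shown on a toy regression task with MSE loss and tanh units (all hyperparameters here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Toy regression task: learn y = sin(x) with a 1-hidden-layer MLP
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X)

n_hidden = 16
W1 = rng.normal(scale=0.5, size=(1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)
alpha = 0.05

for step in range(5000):
    # Forward pass
    z1 = X @ W1 + b1
    h = np.tanh(z1)
    y_hat = h @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: chain rule, layer by layer
    n = len(X)
    d_yhat = 2 * (y_hat - y) / n            # dL/dy_hat for MSE
    dW2 = h.T @ d_yhat
    db2 = d_yhat.sum(axis=0)
    d_h = d_yhat @ W2.T                     # propagate the error backwards
    d_z1 = d_h * (1 - np.tanh(z1) ** 2)     # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_z1
    db1 = d_z1.sum(axis=0)

    # Gradient descent updates
    W1 -= alpha * dW1; b1 -= alpha * db1
    W2 -= alpha * dW2; b2 -= alpha * db2

print(loss)   # should end up far below its starting value
```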
- Day 152: Loss Functions in Deep Learning (Recap)
- MMLL: Chapter 12.1.4 (Loss Functions)
- YouTube (StatQuest with Josh Starmer): Loss Functions for Machine Learning
- Assignment: Revisit MSE (for regression) and Cross-Entropy (for classification) in the context of NNs.
- Day 153: Hands-on Basic NN with Framework (TensorFlow/PyTorch)
- Assignment: Build a basic MLP using TensorFlow Keras or PyTorch. Train it on a simple dataset (e.g., MNIST digits). Focus on model.compile/model.fit or the training loop.
- Day 154: Rest/Catch-up
Week 23: CNNs - Convolutions Explained
- Day 155: Convolution Operation - Mathematical Definition
- 3Blue1Brown: What is a convolution?
- YouTube (3Blue1Brown): What is a convolution? | Chapter 2, Deep learning
- Assignment: Understand the concept of a filter sliding over an image, performing element-wise multiplication and summation.
- Day 156: Padding & Stride
- YouTube (DeepLearning.AI): CNNs Part 2: Padding and Strides
- Assignment: Understand how padding (same, valid) and stride affect the output dimensions of a convolution.
- Day 157: Pooling Layers (Max Pooling, Average Pooling)
- YouTube (DeepLearning.AI): CNNs Part 3: Pooling Layers
- Assignment: Understand the purpose of pooling (downsampling, translation invariance).
- Day 158: Hands-on Basic Convolution (NumPy)
- Assignment: Implement a simple 2D convolution operation (without padding/stride) using NumPy. Test it on a small matrix and a simple filter.
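A possible Day 158 implementation of a "valid" convolution with stride 1 (the 5x5 image and edge filter are just test inputs):

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2D convolution: no padding, stride 1 (cross-correlation, as used in CNNs)."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)
edge_filter = np.array([[1.0, 0.0, -1.0],
                        [1.0, 0.0, -1.0],
                        [1.0, 0.0, -1.0]])   # simple vertical-edge detector

print(conv2d(image, edge_filter))            # output is 3x3 for a 5x5 input and 3x3 filter
```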
- Day 159: CNN Architecture Overview (Conceptual)
- YouTube (StatQuest with Josh Starmer): Convolutional Neural Networks (CNNs), Clearly Explained!!!
- Assignment: Understand the typical sequence of layers in a CNN (Conv -> ReLU -> Pool -> Conv -> ReLU -> Pool -> Flatten -> Dense).
- Day 160: Hands-on CNN (TensorFlow/PyTorch)
- Assignment: Build a simple CNN for image classification (e.g., Fashion MNIST or CIFAR-10) using your chosen framework.
- Day 161: Rest/Catch-up
Week 24: RNNs & Advanced Concepts (High-Level)
- Day 162: Recurrent Neural Networks (RNNs) - Math of Recurrence
- MMLL: Chapter 12.4 (Recurrent Neural Networks) - Focus on the concept of sequence and shared weights.
- YouTube (StatQuest with Josh Starmer): Recurrent Neural Networks (RNN) and LSTMs, Clearly Explained!!!
- Assignment: Understand how RNNs process sequences by maintaining a hidden state.
- Day 163: Vanishing/Exploding Gradients in RNNs
- YouTube (DeepLearning.AI): Vanishing and Exploding Gradients in Neural Networks
- Assignment: Understand why standard RNNs struggle with long-term dependencies (conceptually, this relates to repeated matrix multiplication during backpropagation).
- Day 164: LSTMs & GRUs (Conceptual)
- Medium (Shiksha Online): The Ultimate Showdown: RNN vs LSTM vs GRU – Which is the Best?
- YouTube (StatQuest with Josh Starmer): LSTMs, Clearly Explained!!!
- Assignment: Understand that LSTMs and GRUs solve vanishing-gradient issues using "gates" (input, forget, output). No need for the full math; just the idea.
- Day 165: Attention Mechanisms & Transformers (Very High-Level)
- YouTube (StatQuest with Josh Starmer): Attention Mechanism (for LSTMs and RNNs), Clearly Explained!!!
- YouTube (StatQuest with Josh Starmer): The Transformer Neural Network, Clearly Explained!!!
- Assignment: Understand the basic concept of "attention" in neural networks (focusing on relevant parts of the input). Know that Transformers rely heavily on self-attention.
- Day 166: Monthly Review & Capstone Project Prep
- Review: All deep learning concepts, especially the role of math in NNs.
- Project Prep: Brainstorm ideas for your final capstone project.
- Day 167: Capstone Project Work Day 1
- Assignment: Begin working on your chosen project. Focus on data loading, preprocessing, and setting up the basic model.
- Day 168: Rest/Catch-up
Week 25 (Optional Extension/Buffer): Capstone Project & Future Learning
- Day 169: Capstone Project Work Day 2
- Assignment: Continue implementing and training your model.
- Day 170: Capstone Project Work Day 3
- Assignment: Evaluate your model, try different hyperparameters, and analyze results.
- Day 171: Information Theory for ML (Entropy, KL Divergence)
- MMLL: Chapter 5.5 (Entropy and Mutual Information) - Focus on 5.5.1, 5.5.2 (Entropy) and 5.5.3 (KL Divergence).
- Medium (Deep and Shallow): Deep Learning and Information Theory
- YouTube (StatQuest with Josh Starmer): Entropy (Information Theory), Clearly Explained!!!
- YouTube (StatQuest with Josh Starmer): Kullback-Leibler Divergence, Clearly Explained!!!
- Assignment: Understand their mathematical definitions and their significance in loss functions (cross-entropy) and in comparing distributions.
- Day 172: Causal Inference (Basic Concepts)
- MMLL: Chapter 15.3 (Causal Inference) - Very high-level introduction.
- Medium (Plain Concepts): Causal ML: What is it and what is its importance?
- YouTube (CrashCourse): Causal Inference: The Basics
- Assignment: Understand the difference between correlation and causation.
- Day 173: Comprehensive Math Review
- Assignment: Go back through your notes and MMLL. Revisit any challenging math concepts from Linear Algebra, Calculus, Probability, and Statistics.
- Day 174: Capstone Project Finalization
- Assignment: Prepare your project report/notebook. Clearly explain the problem, your approach, the models used, and the mathematical insights gained.
- Day 175: Final Project Presentation & Future Learning Plan
- Assignment: Present your project to yourself or a peer. Outline your next steps in ML and math.
- Day 176-180: Buffer/Deep Dive/Review