Monday, June 2, 2025

Detailed ML Learning Journey

6-Month Maths-Focused Machine Learning Program with Enhanced Resources

This program spreads the material from an intensive plan over 6 months, aiming for roughly 1-2 hours of focused study/coding per day on weekdays, with optional longer sessions on weekends for deeper dives or project work.


Month 1: Linear Algebra & Foundational Math

Goal: Build a solid understanding of vectors, matrices, and basic linear algebra operations crucial for ML.

  • Week 1: Introduction to Vectors

  • Week 2: Matrices - The Basics

  • Week 3: Systems of Linear Equations & Inverses

    • Day 15: Systems of Linear Equations (Matrix Form)
    • Day 16: Determinants (2x2, 3x3)
    • Day 17: Inverse Matrices (Conceptual & Calculation)
    • Day 18: Solving Systems with Inverses
      • MMLL: Chapter 2.3 (Using Matrix Inverse to Solve Systems)
      • SIA: Chapter 2.2 (Solving with A⁻¹)
      • Assignment: Use np.linalg.solve to solve a system of linear equations in Python. Practice calculating inverses with np.linalg.inv (see the code sketch at the end of this week).
    • Day 19: Hands-on Determinants & Inverses
      • Assignment: Write Python functions to calculate the determinant of a 2x2 matrix and the inverse of a 2x2 matrix from scratch (without np.linalg). Compare with NumPy.
    • Day 20: Review & Practice
      • Review notes. Ensure you understand when an inverse exists (non-zero determinant).
      • Assignment: MIT OpenCourseWare (MIT 18.06SC Linear Algebra) Problem Set 3, Problem 1 (or similar problems related to inverses).
    • Day 21: Rest/Catch-up
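    A minimal sketch for the Day 18-19 assignments, assuming a made-up 2x2 system:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([3.0, 5.0])

    # Day 18: solve Ax = b directly (preferred over forming the inverse explicitly).
    x = np.linalg.solve(A, b)

    # Day 19: 2x2 determinant and inverse from scratch, compared with NumPy.
    def det_2x2(M):
        return M[0, 0] * M[1, 1] - M[0, 1] * M[1, 0]

    def inverse_2x2(M):
        d = det_2x2(M)
        if np.isclose(d, 0.0):
            raise ValueError("Singular matrix: no inverse exists (zero determinant).")
        return np.array([[M[1, 1], -M[0, 1]],
                         [-M[1, 0], M[0, 0]]]) / d

    assert np.isclose(det_2x2(A), np.linalg.det(A))
    assert np.allclose(inverse_2x2(A), np.linalg.inv(A))
    assert np.allclose(inverse_2x2(A) @ b, x)
    print(x)  # solution of the system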
  • Week 4: Eigenvalues, Eigenvectors & Review

    • Day 22: Eigenvalues & Eigenvectors - Intuition
    • Day 23: Calculating Eigenvalues & Eigenvectors (2x2)
      • MMLL: Chapter 2.4 (Calculation examples)
      • SIA: Chapter 6.1 (Finding Eigenvalues and Eigenvectors)
      • KA: Eigenvalues and eigenvectors
      • Assignment: KA - Find eigenvalues and eigenvectors of a 2x2 matrix
    • Day 24: Hands-on Eigen Decomposition
      • Assignment: Use np.linalg.eig to find eigenvalues and eigenvectors of a matrix. Verify Av = λv for each eigenpair (see the code sketch at the end of this week).
    • Day 25: Orthogonality (Conceptual)
      • MMLL: Chapter 2.1.2 (Orthogonal Vectors)
      • SIA: Chapter 4.1 (Orthogonal Vectors and Subspaces)
      • KA: Orthogonal vectors
      • YouTube: Orthogonal vectors (Khan Academy)
      • Assignment: Identify orthogonal vectors.
    • Day 26: Linear Algebra Review & Mini-Project Prep
      • Review: All concepts from Month 1. Revisit problem areas.
      • Project Prep: Understand the high-level goal of Principal Component Analysis (PCA) as a dimensionality reduction technique (you'll implement a basic version in Month 5). Focus on how it uses eigenvectors.
    • Day 27: REST/Catch-up
    • Day 28: Monthly Review & Catch-up
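    A minimal sketch for Day 24's assignment, using a made-up 2x2 matrix:

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    eigenvalues, eigenvectors = np.linalg.eig(A)   # eigenvectors are the columns

    for i, lam in enumerate(eigenvalues):
        v = eigenvectors[:, i]
        # Each eigenpair should satisfy Av = λv (up to floating-point error).
        assert np.allclose(A @ v, lam * v)
        print(f"λ = {lam:.2f} verified")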

Month 2: Calculus, Probability & First ML Algorithm

Goal: Grasp essential calculus concepts for optimization, fundamental probability/statistics, and apply them to your first ML model.

  • Week 5: Calculus - Derivatives

    • Day 29: Functions, Limits, Continuity
    • Day 30: Derivatives - Intuition & Power Rule
    • Day 31: Product, Quotient, Chain Rule
    • Day 32: Critical Points & Local Min/Max
    • Day 33: Hands-on Symbolic Differentiation (Optional)
      • Assignment: Experiment with SymPy in Python to symbolically differentiate simple functions. (e.g., import sympy; x = sympy.symbols('x'); f = x**2 + 3*x; sympy.diff(f, x))
    • Day 34: Review & Practice
      • Assignment: Search for "univariate calculus differentiation problems," solve 5-10.
    • Day 35: Rest/Catch-up
  • Week 6: Calculus - Gradients & Optimization Intro

    • Day 36: Functions of Multiple Variables & Partial Derivatives
    • Day 37: Gradient Vector
    • Day 38: Hessian Matrix (Conceptual)
      • MMLL: Chapter 3.1.4 (Hessian) - Understand it represents curvature.
      • KA: The Hessian matrix
      • YouTube: Hessian Matrix (mathematicalmonk)
      • Assignment: No explicit problem, focus on conceptual understanding of second partial derivatives and the Hessian's purpose.
    • Day 39: Introduction to Optimization
    • Day 40: Hands-on Gradient Calculation
      • Assignment: Write a Python function that calculates the gradient of a simple multivariate function (e.g., f(x, y) = x² + y²; see the code sketch at the end of this week).
    • Day 41: Review & Practice
      • Assignment: Review partial derivatives and gradients. MIT OpenCourseWare (MIT 18.02 Multivariable Calculus) Problem Set 1, Problem 1 (or similar).
    • Day 42: Rest/Catch-up
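    A minimal sketch for Day 40's assignment, assuming the example function f(x, y) = x² + y² and adding an optional finite-difference check:

    import numpy as np

    def f(p):
        x, y = p
        return x**2 + y**2

    def grad_f(p):
        x, y = p
        return np.array([2 * x, 2 * y])     # ∂f/∂x = 2x, ∂f/∂y = 2y

    def numerical_grad(func, p, h=1e-5):
        # Central finite differences, one coordinate at a time.
        g = np.zeros_like(p, dtype=float)
        for i in range(len(p)):
            step = np.zeros_like(p, dtype=float)
            step[i] = h
            g[i] = (func(p + step) - func(p - step)) / (2 * h)
        return g

    p = np.array([1.0, -2.0])
    assert np.allclose(grad_f(p), numerical_grad(f, p), atol=1e-6)
    print(grad_f(p))    # [ 2. -4.]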
  • Week 7: Probability Fundamentals

  • Week 8: Probability Distributions & Linear Regression (Math)

    • Day 50: Common Discrete Distributions (Bernoulli, Binomial)
      • MMLL: Chapter 5.2.5 (Bernoulli, Binomial)
      • KA: Bernoulli
      • YouTube: The Binomial Distribution (StatQuest with Josh Starmer)
      • Assignment: KA - Bernoulli, Binomial distribution problems
    • Day 51: Normal (Gaussian) Distribution
    • Day 52: Expectation & Variance
    • Day 53: Simple Linear Regression - Mathematical Formulation
    • Day 54: Least Squares Method - Conceptual
      • MMLL: Chapter 6.1.3 (Least Squares Estimation) - Focus on the intuition of finding the "best fit" line.
      • YouTube: Least Squares Regression (Khan Academy)
      • Assignment: No coding, just understanding the conceptual goal of Least Squares.
    • Day 55: Monthly Review & Catch-up
      • Assignment: Review all concepts from Month 2. Focus on the intuition of derivatives, gradients, and how probability distributions describe data.
    • Day 56: Rest/Catch-up

Month 3: Regression & Core ML Concepts

Goal: Deepen understanding of regression, explore optimization for ML, and grasp bias-variance.

  • Week 9: Linear Regression Implementation

    • Day 57: Derivation of Coefficients (Calculus)
      • MMLL: Chapter 6.1.3 (Least Squares Estimation) - Understand the partial derivatives of the SSE.
      • YouTube: Deriving the Normal Equation for Linear Regression (StatQuest with Josh Starmer)
      • Assignment: Walk through the derivation of the coefficients for simple linear regression (or watch a video explaining it).
    • Day 58: Multiple Linear Regression & Normal Equation (Matrix Form)
      • MMLL: Chapter 6.1.4 (Multiple Linear Regression), 6.1.5 (Normal Equation).
      • YouTube: The Normal Equation (Andrew Ng's ML Course)
      • Assignment: Understand y = Xβ and the Normal Equation β = (XᵀX)⁻¹Xᵀy.
    • Day 59: Hands-on Simple Linear Regression from Scratch
      • Assignment: Implement simple linear regression from scratch using NumPy. Plot the regression line on a small dataset.
    • Day 60: Hands-on Multiple Linear Regression (Normal Equation)
      • Assignment: Implement multiple linear regression using the Normal Equation with NumPy. Test on a synthetic dataset (see the code sketch at the end of this week).
    • Day 61: Assumptions of Linear Regression
      • MMLL: Chapter 6.1.6 (Assumptions)
      • YouTube: Assumptions of Linear Regression (StatQuest with Josh Starmer)
      • Assignment: List and understand the key assumptions (linearity, independence, homoscedasticity, normality of errors).
    • Day 62: Review & Practice
      • Assignment: Review linear regression. Search for "linear regression normal equation problems" and solve one.
    • Day 63: Rest/Catch-up
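    A minimal sketch for the Day 60 assignment, using a synthetic dataset with arbitrarily chosen true coefficients:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, 3))])    # bias column + 3 features
    true_beta = np.array([1.0, 2.0, -3.0, 0.5])
    y = X @ true_beta + rng.normal(scale=0.1, size=n)

    # Normal Equation β = (XᵀX)⁻¹Xᵀy; solving the linear system is numerically
    # safer than explicitly inverting XᵀX.
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
    print(beta_hat)     # should be close to true_beta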
  • Week 10: Gradient Descent in Depth

    • Day 64: Gradient Descent for Linear Regression
      • MMLL: Chapter 6.1.7 (Gradient Descent)
      • YouTube: Gradient Descent, Step-by-Step (StatQuest with Josh Starmer)
      • Assignment: Understand the update rule β_new = β_old - α∇J(β).
    • Day 65: Learning Rate & Convergence
      • MMLL: Chapter 4.3 (Gradient Descent) - Focus on learning rate.
      • YouTube: How to choose a learning rate for gradient descent (sentdex)
      • Assignment: Experiment with different learning rates in your own gradient descent implementation for linear regression. Observe divergence and slow convergence.
    • Day 66: Stochastic Gradient Descent (SGD) - Intuition
    • Day 67: Mini-batch Gradient Descent
      • MMLL: Chapter 4.3.3 (Mini-batch Gradient Descent)
      • YouTube: Mini-Batch Gradient Descent (DeepLearning.AI)
      • Assignment: Understand the trade-off between GD and SGD.
    • Day 68: Hands-on SGD for Linear Regression
      • Assignment: Implement SGD for linear regression in NumPy. Compare its performance to batch GD on a slightly larger dataset (see the code sketch at the end of this week).
    • Day 69: Review & Practice
      • Assignment: Review Gradient Descent variants. MIT 6.036 (Introduction to Machine Learning) problem set on Gradient Descent (search for recent versions).
    • Day 70: Rest/Catch-up
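    A minimal sketch for Days 65 and 68, comparing batch gradient descent with SGD on a synthetic dataset (the learning rates and epoch counts are illustrative, not tuned):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 500
    X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, 1))])
    y = X @ np.array([2.0, -1.5]) + rng.normal(scale=0.2, size=n)

    def batch_gd(X, y, alpha=0.1, epochs=200):
        beta = np.zeros(X.shape[1])
        for _ in range(epochs):
            grad = 2 / len(y) * X.T @ (X @ beta - y)     # gradient of the MSE
            beta -= alpha * grad
        return beta

    def sgd(X, y, alpha=0.01, epochs=20):
        beta = np.zeros(X.shape[1])
        for _ in range(epochs):
            for i in rng.permutation(len(y)):            # one random sample at a time
                grad = 2 * X[i] * (X[i] @ beta - y[i])
                beta -= alpha * grad
        return beta

    print("batch GD:", batch_gd(X, y))
    print("SGD     :", sgd(X, y))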
  • Week 11: Regularization

    • Day 71: Overfitting & Underfitting
      • MMLL: Chapter 6.3 (Regularization) - Intro.
      • YouTube: Overfitting vs. Underfitting (StatQuest with Josh Starmer)
      • Assignment: Understand the concepts of overfitting and underfitting. Identify them visually.
    • Day 72: Ridge Regression (L2 Regularization) - Math
    • Day 73: Lasso Regression (L1 Regularization) - Math
    • Day 74: Hands-on Regularization (Scikit-learn)
      • Assignment: Use sklearn.linear_model.Ridge and sklearn.linear_model.Lasso. Experiment with the alpha parameter on a dataset prone to overfitting (see the code sketch at the end of this week).
    • Day 75: Choosing Lambda (Regularization Strength)
      • Assignment: Research cross-validation as a method for selecting hyperparameters like lambda. (No implementation yet, just conceptual).
    • Day 76: Review & Practice
      • Assignment: Search for "Ridge vs Lasso explained" or "regularization in linear regression problems."
    • Day 77: Rest/Catch-up
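    A minimal sketch for Day 74, using a high-degree polynomial fit to noisy data as the overfitting-prone example (the dataset and alpha values are illustrative choices):

    import numpy as np
    from sklearn.linear_model import Ridge, Lasso
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(2)
    x = np.linspace(0, 1, 30).reshape(-1, 1)
    y = np.sin(2 * np.pi * x).ravel() + rng.normal(scale=0.3, size=30)
    X = PolynomialFeatures(degree=12, include_bias=False).fit_transform(x)

    for alpha in (0.001, 0.1, 1.0):
        ridge = Ridge(alpha=alpha).fit(X, y)
        lasso = Lasso(alpha=alpha, max_iter=100000).fit(X, y)
        # L2 shrinks coefficients; L1 drives many of them exactly to zero.
        print(f"alpha={alpha}: ||ridge coefs||={np.linalg.norm(ridge.coef_):.2f}, "
              f"lasso nonzero coefs={np.sum(lasso.coef_ != 0)}")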
  • Week 12: Polynomial Regression & Bias-Variance

    • Day 78: Polynomial Regression
      • MMLL: Chapter 6.2 (Polynomial Regression)
      • YouTube: Polynomial Regression Explained (StatQuest with Josh Starmer)
      • Assignment: Understand how polynomial features are created. Implement polynomial regression using sklearn.preprocessing.PolynomialFeatures and LinearRegression.
    • Day 79: Bias-Variance Trade-off - Intuition
      • MMLL: Chapter 6.4 (Bias-Variance Decomposition) - Focus on 6.4.1 (Introduction).
      • Medium: Bias Variance Tradeoff (MLU-Explain - interactive)
      • YouTube: Bias and Variance (StatQuest with Josh Starmer)
      • Assignment: Conceptualize bias (model's simplifying assumptions) and variance (model's sensitivity to training data).
    • Day 80: Mathematical Breakdown of Bias-Variance
      • MMLL: Chapter 6.4.2 (Derivation) - Go through the derivation of the MSE decomposition (if comfortable, otherwise understand the terms).
      • Assignment: Understand how MSE = Bias² + Variance + Noise.
    • Day 81: Visualizing Bias-Variance
      • Assignment: Search for online visualizations of the bias-variance trade-off (e.g., using target practice analogy).
    • Day 82: Hands-on Bias-Variance Example
      • Assignment: Create a synthetic dataset. Fit a low-degree polynomial (high bias) and a high-degree polynomial (high variance) to it. Plot and observe the fit and generalization (see the code sketch at the end of this week).
    • Day 83: Monthly Review & Project Prep
      • Review: All concepts from Month 3. Focus on regression, optimization, regularization, and bias-variance.
      • Project Prep: Brainstorm simple regression datasets you could use for a mini-project (e.g., house price prediction, car mileage).
    • Day 84: Rest/Catch-up
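    A minimal sketch for Days 78 and 82; it prints train/test errors instead of plotting, and the sine dataset and polynomial degrees are illustrative choices:

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(3)
    x_train = rng.uniform(0, 1, size=(25, 1))
    y_train = np.sin(2 * np.pi * x_train).ravel() + rng.normal(scale=0.2, size=25)
    x_test = np.linspace(0, 1, 200).reshape(-1, 1)
    y_test = np.sin(2 * np.pi * x_test).ravel()

    for degree in (1, 3, 15):      # underfit, reasonable fit, overfit
        model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
        model.fit(x_train, y_train)
        print(f"degree {degree:2d}: "
              f"train MSE = {mean_squared_error(y_train, model.predict(x_train)):.3f}, "
              f"test MSE = {mean_squared_error(y_test, model.predict(x_test)):.3f}")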

Month 4: Classification Algorithms

Goal: Understand the mathematical underpinnings of key classification models.

  • Week 13: Logistic Regression

    • Day 85: Logistic Regression - Concept & Sigmoid
      • MMLL: Chapter 7.1 (Logistic Regression) - Focus on 7.1.1 (Binary Classification) and 7.1.2 (Sigmoid function).
      • YouTube: Logistic Regression, Clearly Explained!!! (StatQuest with Josh Starmer)
      • Assignment: Understand how the sigmoid function maps linear output to a probability between 0 and 1. Plot the sigmoid.
    • Day 86: Cross-Entropy Loss Function
    • Day 87: Gradient Descent for Logistic Regression
      • MMLL: Chapter 7.1.4 (Gradient Descent) - Understand the update rule (similar to linear regression, but with different derivatives).
      • YouTube: Logistic Regression Details: Calculating the Gradient (Andrew Ng's ML Course)
      • Assignment: Walk through the derivation of the gradients (or watch a detailed explanation).
    • Day 88: Hands-on Logistic Regression from Scratch
      • Assignment: Implement binary logistic regression from scratch using NumPy (forward pass, loss, and gradient descent update). Test on a simple synthetic dataset (see the code sketch at the end of this week).
    • Day 89: Hands-on Logistic Regression (Scikit-learn)
      • Assignment: Use sklearn.linear_model.LogisticRegression. Compare results with your custom implementation. Understand predict_proba.
    • Day 90: Review & Practice
      • Assignment: Review logistic regression. Search for "logistic regression explained math" and re-read.
    • Day 91: Rest/Catch-up
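    A minimal sketch for Day 88 on a synthetic dataset (the true weights, learning rate, and iteration count are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(4)
    n = 400
    X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, 2))])   # bias column + 2 features
    true_w = np.array([-0.5, 2.0, -1.0])
    y = (rng.uniform(size=n) < 1 / (1 + np.exp(-(X @ true_w)))).astype(float)

    def sigmoid(z):
        return 1 / (1 + np.exp(-z))

    w = np.zeros(X.shape[1])
    alpha = 0.1
    for _ in range(2000):
        p = sigmoid(X @ w)                 # forward pass
        grad = X.T @ (p - y) / n           # gradient of the mean cross-entropy loss
        w -= alpha * grad

    p = sigmoid(X @ w)
    loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
    print("weights:", np.round(w, 2), "loss:", round(loss, 3))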
  • Week 14: Softmax & SVMs

    • Day 92: Softmax Regression (Multinomial Logistic Regression)
      • MMLL: Chapter 7.2 (Softmax Regression)
      • YouTube: Softmax Regression (Andrew Ng's ML Course)
      • Assignment: Understand the softmax function and its use for multi-class classification.
    • Day 93: Categorical Cross-Entropy Loss
      • MMLL: Chapter 7.2 (Loss function).
      • YouTube: Categorical Cross Entropy Explained (StatQuest with Josh Starmer - same as binary, just extended)
      • Assignment: Understand the extension of cross-entropy to multiple classes.
    • Day 94: Support Vector Machines (SVMs) - Hyperplane & Margin
    • Day 95: Kernel Trick (Conceptual)
      • MMLL: Chapter 8.2 (Non-linear SVM) - Focus on the idea of mapping data to higher dimensions implicitly.
      • Medium: Kernel Trick in Support Vector Classification (GeeksforGeeks)
      • YouTube: SVMs and the Kernel Trick (StatQuest with Josh Starmer)
      • Assignment: No explicit math problem, focus on understanding the concept of making linearly inseparable data separable.
    • Day 96: Hands-on SVM (Scikit-learn)
      • Assignment: Use sklearn.svm.SVC. Experiment with different kernels (linear, rbf, poly) on a dataset like Iris or circles/moons (see the code sketch at the end of this week).
    • Day 97: Review & Practice
      • Assignment: Review Softmax and SVMs. Search for "SVM kernel trick explained" if needed.
    • Day 98: Rest/Catch-up
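    A minimal sketch for Day 96, using scikit-learn's "moons" toy dataset as the example:

    from sklearn.datasets import make_moons
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    for kernel in ("linear", "poly", "rbf"):
        clf = SVC(kernel=kernel).fit(X_train, y_train)
        # The linear kernel struggles here; rbf typically separates the moons well.
        print(f"{kernel:6s} kernel accuracy: {clf.score(X_test, y_test):.3f}")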
  • Week 15: Decision Trees & Ensembles Intro

    • Day 99: Decision Trees - Basics & Splitting
      • MMLL: Chapter 10.1 (Decision Trees)
      • YouTube: Decision Trees, Clearly Explained!!! (StatQuest with Josh Starmer)
      • Assignment: Understand how decision trees make predictions by splitting data.
    • Day 100: Gini Impurity & Entropy (Mathematical Definitions)
    • Day 101: Information Gain
      • MMLL: Chapter 10.1.2 (Information Gain)
      • YouTube: Information Gain in Decision Tree (Machine Learning with Phil)
      • Assignment: Understand how information gain is used to choose the best split.
    • Day 102: Introduction to Ensemble Methods (Bagging)
    • Day 103: Boosting (Conceptual)
    • Day 104: Hands-on Decision Tree & Random Forest (Scikit-learn)
      • Assignment: Use sklearn.tree.DecisionTreeClassifier and sklearn.ensemble.RandomForestClassifier. Compare their performance (see the code sketch at the end of this week).
    • Day 105: Rest/Catch-up
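    A minimal sketch for Day 104, using the Iris dataset as the example (accuracy on a held-out split is only one possible comparison):

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
    forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

    print("decision tree accuracy:", round(tree.score(X_test, y_test), 3))
    print("random forest accuracy:", round(forest.score(X_test, y_test), 3))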
  • Week 16: KNN & Naive Bayes, Classification Project

    • Day 106: K-Nearest Neighbors (KNN)
    • Day 107: Naive Bayes - Intuition
      • MMLL: Chapter 5.1.3 (Bayes' Theorem application - conceptually)
      • YouTube: Naive Bayes, Clearly Explained!!! (StatQuest with Josh Starmer)
      • Assignment: Understand the "naive" assumption of conditional independence.
    • Day 108: Different Naive Bayes Variants (Conceptual)
      • Assignment: Research Gaussian, Multinomial, and Bernoulli Naive Bayes and when to use them.
    • Day 109: Hands-on Naive Bayes (Scikit-learn)
      • Assignment: Use sklearn.naive_bayes.GaussianNB or MultinomialNB.
    • Day 110: Monthly Review & Project Prep
      • Review: All classification algorithms.
      • Project Prep: Prepare for a classification project.
    • Day 111: Classification Project
      • Assignment: Choose a classification dataset (e.g., Pima Indians Diabetes, Titanic). Apply at least 3 different classification algorithms learned (e.g., Logistic Regression, SVM, Random Forest). Evaluate performance using appropriate metrics (accuracy, precision, recall, F1-score).
    • Day 112: Rest/Catch-up

Month 5: Unsupervised Learning & Optimization Deep Dive

Goal: Explore methods for finding patterns in unlabeled data and delve deeper into optimization techniques.

  • Week 17: Clustering

    • Day 113: K-Means Clustering - Objective Function
    • Day 114: Lloyd's Algorithm
      • MMLL: Chapter 9.1.1 (Algorithm).
      • Assignment: Walk through the steps of Lloyd's algorithm.
    • Day 115: Hands-on K-Means from Scratch
      • Assignment: Implement K-Means from scratch using NumPy. Test on a simple 2D dataset and visualize clusters (see the code sketch at the end of this week).
    • Day 116: Hierarchical Clustering (Conceptual)
    • Day 117: Hands-on Hierarchical Clustering (SciPy)
      • Assignment: Use scipy.cluster.hierarchy to perform hierarchical clustering and plot a dendrogram.
    • Day 118: Review & Practice
      • Assignment: Review clustering algorithms. Search for "K-Means problems."
    • Day 119: Rest/Catch-up
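    A minimal sketch for Day 115 on made-up 2D blobs (it does not handle the empty-cluster edge case, which a robust implementation would):

    import numpy as np

    rng = np.random.default_rng(5)
    X = np.vstack([rng.normal(loc, 0.5, size=(100, 2))
                   for loc in ([0, 0], [4, 4], [0, 5])])         # three toy clusters

    def kmeans(X, k, n_iters=100):
        centroids = X[rng.choice(len(X), size=k, replace=False)]    # random initialisation
        for _ in range(n_iters):
            # Assignment step: each point goes to its nearest centroid.
            dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            # Update step: each centroid moves to the mean of its assigned points.
            new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
            if np.allclose(new_centroids, centroids):
                break
            centroids = new_centroids
        return centroids, labels

    centroids, labels = kmeans(X, k=3)
    print(np.round(centroids, 2))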
  • Week 18: Dimensionality Reduction (PCA)

    • Day 120: PCA - Recap & Covariance Matrix
    • Day 121: PCA - Eigenvalues & Eigenvectors for Reduction
      • MMLL: Chapter 11.1.1 (PCA Algorithm)
      • YouTube: PCA Algorithm (Mathematicalmonk)
      • Assignment: Understand how eigenvectors correspond to principal components and eigenvalues to explained variance.
    • Day 122: Singular Value Decomposition (SVD) for PCA
      • MMLL: Chapter 11.1.2 (SVD for PCA)
      • SIA: Chapter 7.2 (Singular Value Decomposition)
      • YouTube: Singular Value Decomposition (SVD) and PCA (StatQuest with Josh Starmer)
      • Assignment: Understand that SVD provides a robust way to compute PCA.
    • Day 123: Hands-on PCA from Scratch (using SVD)
      • Assignment: Implement PCA from scratch using NumPy's np.linalg.svd. Apply it to a high-dimensional dataset (e.g., MNIST digits) and visualize the first 2 components (see the code sketch at the end of this week).
    • Day 124: Hands-on PCA (Scikit-learn)
      • Assignment: Use sklearn.decomposition.PCA and compare results with your custom implementation. Understand explained_variance_ratio_.
    • Day 125: Review & Practice
      • Assignment: Review PCA. Search for "PCA explained visually."
    • Day 126: Rest/Catch-up
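    A minimal sketch for Days 123-124 on made-up correlated data rather than MNIST, cross-checking the SVD-based PCA against scikit-learn:

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(6)
    X = rng.normal(size=(300, 10)) @ rng.normal(size=(10, 10))   # correlated toy data

    def pca_svd(X, n_components):
        X_centered = X - X.mean(axis=0)
        U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
        components = Vt[:n_components]                  # principal directions (rows)
        explained_variance = S**2 / (len(X) - 1)
        return X_centered @ components.T, explained_variance[:n_components]

    Z, var = pca_svd(X, 2)
    sk = PCA(n_components=2).fit(X)
    # Component signs can flip between implementations, so compare magnitudes.
    print(np.allclose(var, sk.explained_variance_))
    print(np.allclose(np.abs(Z), np.abs(sk.transform(X))))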
  • Week 19: Advanced Optimization

    • Day 127: Limitations of Basic Gradient Descent
      • Assignment: Understand local minima, saddle points, and issues with learning rate (e.g., slow convergence, oscillations).
    • Day 128: Momentum
    • Day 129: Adagrad & RMSprop (Conceptual)
    • Day 130: Adam Optimizer (Conceptual)
      • MMLL: Chapter 4.3.7 (Adam)
      • YouTube: Adam Optimizer, Clearly Explained!!! (StatQuest with Josh Starmer)
      • Assignment: Understand Adam as a combination of Momentum and RMSprop ideas.
    • Day 131: Hands-on Optimizers (TensorFlow/PyTorch)
      • Assignment: Build a simple linear regression model using TensorFlow or PyTorch. Experiment with SGD, Adam, and RMSprop optimizers, observing their convergence behavior (see the code sketch at the end of this week).
    • Day 132: Review & Practice
      • Assignment: Search for "deep learning optimizers explained" videos/articles. Focus on their mathematical update rules.
    • Day 133: Rest/Catch-up
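    A minimal sketch for Day 131, choosing PyTorch as the framework (hyperparameters are illustrative; torch.optim.RMSprop can be swapped in the same way):

    import torch

    torch.manual_seed(0)
    X = torch.randn(500, 1)
    y = 3.0 * X + 1.0 + 0.1 * torch.randn(500, 1)

    def train(optimizer_cls):
        model = torch.nn.Linear(1, 1)
        opt = optimizer_cls(model.parameters(), lr=0.05)
        loss_fn = torch.nn.MSELoss()
        for _ in range(200):
            opt.zero_grad()
            loss = loss_fn(model(X), y)
            loss.backward()               # backprop computes the gradients
            opt.step()                    # the optimizer applies its update rule
        return loss.item()

    print("SGD  final loss:", train(torch.optim.SGD))
    print("Adam final loss:", train(torch.optim.Adam))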
  • Week 20: Convex Optimization (Conceptual) & Unsupervised Project

    • Day 134: Convex Sets & Functions (Conceptual)
    • Day 135: Why Convexity Matters in ML
      • Assignment: Understand that many traditional ML models (linear regression, logistic regression with cross-entropy) have convex loss functions, guaranteeing convergence to global optima.
    • Day 136: Anomaly Detection (Brief Introduction)
      • MMLL: Chapter 9.3 (Anomaly Detection) - Basic overview.
      • YouTube: Anomaly Detection, Clearly Explained!!! (StatQuest with Josh Starmer)
      • Assignment: Understand the goal of anomaly detection. Explore simple statistical methods (e.g., Z-score; see the code sketch at the end of this week).
    • Day 137: Monthly Review & Project Prep
      • Review: All concepts from Month 5.
      • Project Prep: Prepare for an unsupervised learning project.
    • Day 138: Unsupervised Learning Project
      • Assignment: Take a dataset (e.g., customers with features, or gene expression data). Perform K-Means clustering and PCA. Visualize the clusters in reduced dimensions. Interpret the results.
    • Day 139: Rest/Catch-up
    • Day 140: Monthly Review & Catch-up
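    A minimal sketch for the Z-score part of Day 136, on made-up 1D data with two planted outliers:

    import numpy as np

    rng = np.random.default_rng(7)
    data = np.concatenate([rng.normal(50, 5, size=500), [95.0, 12.0]])   # two planted anomalies

    z_scores = (data - data.mean()) / data.std()
    anomalies = data[np.abs(z_scores) > 3]     # flag points more than 3 standard deviations out
    print(anomalies)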

Month 6: Deep Learning Fundamentals

Goal: Understand the core mathematical principles behind neural networks and popular architectures.

  • Week 21: Neural Network Basics & Forward Pass

    • Day 141: Perceptrons & Biological Analogy
      • MMLL: Chapter 12.1 (Feedforward Neural Networks) - Focus on the basic unit.
      • YouTube: The Perceptron (Andrew Ng's ML Course)
      • Assignment: Understand how a single perceptron works.
    • Day 142: Activation Functions (Sigmoid, Tanh, ReLU)
      • MMLL: Chapter 12.1.2 (Activation Functions)
      • YouTube: Activation Functions, Clearly Explained!!! (StatQuest with Josh Starmer)
      • Assignment: Understand the mathematical forms and properties of each activation function. Plot them.
    • Day 143: Feedforward Neural Networks (MLPs) - Architecture
    • Day 144: Forward Propagation - Matrix Math
      • MMLL: Chapter 12.1.3 (Feedforward Pass)
      • YouTube: Forward Propagation, Clearly Explained!!! (StatQuest with Josh Starmer)
      • Assignment: Understand how each layer's output is calculated using matrix multiplication and activation functions.
    • Day 145: Hands-on MLP Forward Pass (NumPy)
      • Assignment: Implement a simple 2-layer MLP (input, 1 hidden, output) forward pass using NumPy. Use random weights and biases (see the code sketch at the end of this week).
    • Day 146: Review & Practice
      • Assignment: Review the forward pass. Search for "neural network forward propagation example" and trace the calculations.
    • Day 147: Rest/Catch-up
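    A minimal sketch for Day 145, with made-up layer sizes and random weights:

    import numpy as np

    rng = np.random.default_rng(8)

    def relu(z):
        return np.maximum(0, z)

    def sigmoid(z):
        return 1 / (1 + np.exp(-z))

    # Made-up sizes: 4 inputs, 8 hidden units, 1 output.
    W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

    def forward(X):
        h = relu(X @ W1 + b1)            # hidden layer: affine transform + non-linearity
        return sigmoid(h @ W2 + b2)      # output layer squashed to (0, 1)

    X = rng.normal(size=(5, 4))          # batch of 5 random examples
    print(forward(X))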
  • Week 22: Backpropagation - The Core of Learning

    • Day 148: Backpropagation - The Chain Rule in Action
    • Day 149: Deriving Gradients for Output Layer
      • MMLL: Chapter 12.2.2 (Output Layer Gradients)
      • YouTube: Backpropagation for a Neural Network (Andrew Ng's ML Course)
      • Assignment: Walk through the derivation of gradients for the output layer's weights and biases (e.g., for MSE loss).
    • Day 150: Deriving Gradients for Hidden Layers
      • MMLL: Chapter 12.2.3 (Hidden Layer Gradients)
      • Assignment: Understand how errors are "propagated backward" to update hidden layer weights.
    • Day 151: Hands-on Backpropagation (Simple MLP in NumPy)
      • Assignment: Extend your MLP from Day 145 to include a loss function (e.g., MSE) and implement the backpropagation algorithm to update weights and biases. This is a significant challenge, but highly rewarding (see the code sketch at the end of this week).
    • Day 152: Loss Functions in Deep Learning (Recap)
      • MMLL: Chapter 12.1.4 (Loss Functions)
      • YouTube: Loss Functions for Machine Learning (StatQuest with Josh Starmer)
      • Assignment: Revisit MSE (for regression) and Cross-Entropy (for classification) in the context of NNs.
    • Day 153: Hands-on Basic NN with Framework (TensorFlow/PyTorch)
      • Assignment: Build a basic MLP using TensorFlow Keras or PyTorch. Train it on a simple dataset (e.g., MNIST digits). Focus on model.compile/model.fit or the training loop.
    • Day 154: Rest/Catch-up
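    A minimal sketch for Day 151: a 1-hidden-layer regression MLP with MSE loss trained to approximate sin(x); the architecture, data, and hyperparameters are arbitrary choices:

    import numpy as np

    rng = np.random.default_rng(9)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X)

    W1, b1 = rng.normal(scale=0.5, size=(1, 16)), np.zeros(16)
    W2, b2 = rng.normal(scale=0.5, size=(16, 1)), np.zeros(1)
    alpha = 0.05

    for epoch in range(2000):
        # Forward pass.
        h = np.tanh(X @ W1 + b1)
        y_hat = h @ W2 + b2
        loss = np.mean((y_hat - y) ** 2)

        # Backward pass (chain rule, layer by layer).
        d_yhat = 2 * (y_hat - y) / len(X)     # dL/dŷ for the MSE
        dW2 = h.T @ d_yhat
        db2 = d_yhat.sum(axis=0)
        d_h = d_yhat @ W2.T                   # error propagated back to the hidden layer
        d_pre = d_h * (1 - h**2)              # tanh'(z) = 1 - tanh(z)^2
        dW1 = X.T @ d_pre
        db1 = d_pre.sum(axis=0)

        # Gradient descent update.
        for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
            param -= alpha * grad

    print("final MSE:", round(loss, 4))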
  • Week 23: CNNs - Convolutions Explained

    • Day 155: Convolution Operation - Mathematical Definition
    • Day 156: Padding & Stride
      • YouTube: CNNs Part 2: Padding and Strides (DeepLearning.AI)
      • Assignment: Understand how padding (same, valid) and stride affect the output dimensions of a convolution.
    • Day 157: Pooling Layers (Max Pooling, Average Pooling)
      • YouTube: CNNs Part 3: Pooling Layers (DeepLearning.AI)
      • Assignment: Understand the purpose of pooling (downsampling, translation invariance).
    • Day 158: Hands-on Basic Convolution (NumPy)
      • Assignment: Implement a simple 2D convolution operation (without padding/stride) using NumPy. Test it on a small matrix and a simple filter (see the code sketch at the end of this week).
    • Day 159: CNN Architecture Overview (Conceptual)
    • Day 160: Hands-on CNN (TensorFlow/PyTorch)
      • Assignment: Build a simple CNN for image classification (e.g., Fashion MNIST or CIFAR-10) using your chosen framework.
    • Day 161: Rest/Catch-up
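    A minimal sketch for Day 158 (strictly speaking this computes cross-correlation, which is what deep-learning libraries call "convolution"):

    import numpy as np

    def conv2d(image, kernel):
        # "Valid" convolution: no padding, stride 1.
        kh, kw = kernel.shape
        out_h = image.shape[0] - kh + 1
        out_w = image.shape[1] - kw + 1
        out = np.zeros((out_h, out_w))
        for i in range(out_h):
            for j in range(out_w):
                # Multiply the window element-wise by the kernel and sum.
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
        return out

    image = np.arange(25, dtype=float).reshape(5, 5)       # toy 5x5 "image"
    kernel = np.array([[1.0, 0.0, -1.0]] * 3)              # simple vertical-edge filter
    print(conv2d(image, kernel))                           # output is 3x3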
  • Week 24: RNNs & Advanced Concepts (High-Level)

    • Day 162: Recurrent Neural Networks (RNNs) - Math of Recurrence
    • Day 163: Vanishing/Exploding Gradients in RNNs
    • Day 164: LSTMs & GRUs (Conceptual)
    • Day 165: Attention Mechanisms & Transformers (Very High-Level)
    • Day 166: Monthly Review & Capstone Project Prep
      • Review: All deep learning concepts, especially the role of math in NNs.
      • Project Prep: Brainstorm ideas for your final capstone project.
    • Day 167: Capstone Project Work Day 1
      • Assignment: Begin working on your chosen project. Focus on data loading, preprocessing, and setting up the basic model.
    • Day 168: Rest/Catch-up
  • Week 25 (Optional Extension/Buffer): Capstone Project & Future Learning

    • Day 169: Capstone Project Work Day 2
      • Assignment: Continue implementing and training your model.
    • Day 170: Capstone Project Work Day 3
      • Assignment: Evaluate your model, try different hyperparameters, and analyze results.
    • Day 171: Information Theory for ML (Entropy, KL Divergence; see the code sketch at the end of this week)
    • Day 172: Causal Inference (Basic Concepts)
    • Day 173: Comprehensive Math Review
      • Assignment: Go back through your notes and MMLL. Revisit any challenging math concepts from Linear Algebra, Calculus, Probability, and Statistics.
    • Day 174: Capstone Project Finalization
      • Assignment: Prepare your project report/notebook. Clearly explain the problem, your approach, the models used, and the mathematical insights gained.
    • Day 175: Final Project Presentation & Future Learning Plan
      • Assignment: Present your project to yourself or a peer. Outline your next steps in ML and math.
    • Day 176-180: Buffer/Deep Dive/Review
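    A minimal sketch for Day 171, computing entropy and KL divergence for discrete distributions (the coin probabilities are made-up examples):

    import numpy as np

    def entropy(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                                       # 0 * log(0) is treated as 0
        return -np.sum(p * np.log2(p))                     # H(p) in bits

    def kl_divergence(p, q):
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0                                       # assumes q > 0 wherever p > 0
        return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

    fair, biased = [0.5, 0.5], [0.9, 0.1]
    print("H(fair coin)       =", entropy(fair))           # 1.0 bit
    print("H(biased coin)     =", round(entropy(biased), 3))
    print("KL(fair || biased) =", round(kl_divergence(fair, biased), 3))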
