Implementing AI Algorithms from Scratch: Gradient Descent: Building Optimization Algorithms from Scratch


English
Last updated Thu, 15-May-2025
Course overview

Delve into the intricacies of optimization techniques with this immersive course that focuses on the implementation of various algorithms from scratch. Bypass high-level libraries to explore Stochastic Gradient Descent, Mini-Batch Gradient Descent, and advanced optimization methods such as Momentum, RMSProp, and Adam.
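All of these methods refine the same core idea, plain gradient descent, which repeatedly steps parameters against the gradient of a loss. As a taste of the from-scratch style the course uses, here is a minimal sketch (illustrative only; the function name, learning rate, and step count are assumptions, not course material):

```python
# Minimal sketch of plain (batch) gradient descent on f(x) = (x - 3)^2.
# The learning rate and step count are arbitrary illustrative choices.
def gradient_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # move against the gradient
    return x

# f'(x) = 2 * (x - 3), so the minimum is at x = 3
x_min = gradient_descent(lambda x: 2 * (x - 3), 0.0)
```

The variants covered below change only how this step is computed: from which samples (SGD, mini-batch) and with what scaling (Momentum, RMSProp, Adam).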

What will I learn?

  • Coding and Data Algorithms
  • Machine Learning Model Development
  • Deep Learning and Neural Networks
Requirements
Curriculum for this course
27 Lessons 1 hr 44 mins
Implementing AI Algorithms from Scratch: Gradient Descent: Building Optimization Algorithms from Scratch
1 Lesson 01:44:00 Hours
  • Implementing AI Algorithms from Scratch: Gradient Descent: Building Optimization Algorithms from Scratch
    Preview 01:44:00
Stochastic Gradient Descent: Theory and Implementation in Python
5 Lessons
  • Lesson: Stochastic Gradient Descent: Theory and Implementation in Python
    Preview
  • Practice: Observing Stochastic Gradient Descent in Action
    Preview
  • Practice: Tuning the Learning Rate in SGD
    Preview
  • Practice: Stochastic Sidesteps: Updating Model Parameters
    Preview
  • Practice: Updating the Linear Regression Model Params with SGD
    Preview
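This section culminates in updating linear-regression parameters with SGD: one update per randomly visited sample. A minimal sketch of that kind of loop (illustrative only; the data, learning rate, and epoch count are assumptions, not taken from the course):

```python
import random

# Sketch of SGD for simple linear regression y = w*x + b:
# one parameter update per randomly visited sample.
def sgd_linear_regression(xs, ys, lr=0.02, epochs=500, seed=0):
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    order = list(range(len(xs)))
    for _ in range(epochs):
        rng.shuffle(order)             # visit samples in random order
        for i in order:
            err = (w * xs[i] + b) - ys[i]
            w -= lr * err * xs[i]      # gradient of 0.5 * err**2 w.r.t. w
            b -= lr * err              # ... and w.r.t. b
    return w, b

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]         # exactly y = 2x + 1
w, b = sgd_linear_regression(xs, ys)
```

Because each update uses a single sample, the trajectory is noisy ("stochastic sidesteps"), but each step is cheap compared with a full-batch gradient.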
Optimizing Machine Learning with Mini-Batch Gradient Descent
5 Lessons
  • Lesson: Optimizing Machine Learning with Mini-Batch Gradient Descent
    Preview
  • Practice: Mini-Batch Gradient Descent in Action
    Preview
  • Practice: Calculating Gradients and Errors in MBGD
    Preview
  • Practice: Calculating Gradients for Mini-Batch Gradient Descent
    Preview
  • Practice: Adjust the Batch Size in Mini-Batch Gradient Descent
    Preview
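Mini-batch gradient descent sits between SGD and full-batch descent: each update averages the gradient over a small batch. A minimal sketch (illustrative only; the batch size, learning rate, and data are assumptions, not course settings):

```python
import random

# Sketch of mini-batch gradient descent for y = w*x + b:
# average the gradient over a small batch of samples per update.
def minibatch_gd(xs, ys, lr=0.05, epochs=300, batch_size=2, seed=0):
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    order = list(range(len(xs)))
    for _ in range(epochs):
        rng.shuffle(order)
        for start in range(0, len(order), batch_size):
            batch = order[start:start + batch_size]
            # gradients of 0.5 * err**2, averaged over the batch
            gw = sum(((w * xs[i] + b) - ys[i]) * xs[i] for i in batch) / len(batch)
            gb = sum((w * xs[i] + b) - ys[i] for i in batch) / len(batch)
            w -= lr * gw
            b -= lr * gb
    return w, b

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]   # exactly y = 2x + 1
w, b = minibatch_gd(xs, ys)
```

Adjusting `batch_size` trades gradient noise against cost per update: `batch_size=1` recovers SGD, while `batch_size=len(xs)` recovers full-batch descent.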
Accelerating Convergence: Implementing Momentum in Gradient Descent Algorithms
5 Lessons
  • Lesson: Accelerating Convergence: Implementing Momentum in Gradient Descent Algorithms
    Preview
  • Practice: Visualizing Momentum in Gradient Descent
    Preview
  • Practice: Adjusting Momentum in Gradient Descent
    Preview
  • Practice: Adding Momentum to Gradient Descent
    Preview
  • Practice: Optimizing the Roll: Momentum in Gradient Descent
    Preview
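Momentum accumulates past gradients into a decaying "velocity", like a ball rolling downhill, which smooths the path and speeds up progress along consistent directions. A minimal sketch of one common formulation (illustrative only; `beta` and `lr` are typical defaults, not course settings):

```python
# Sketch of gradient descent with momentum on f(x) = x^2.
# beta controls how much past gradient "velocity" is retained.
def momentum_descent(grad, x0, lr=0.1, beta=0.9, steps=200):
    x, v = x0, 0.0
    for _ in range(steps):
        v = beta * v + grad(x)   # velocity: exponentially decaying gradient sum
        x -= lr * v              # step along the velocity, not the raw gradient
    return x

x_min = momentum_descent(lambda x: 2 * x, 5.0)   # minimum of x^2 is at 0
```

With `beta=0` this reduces to plain gradient descent; larger `beta` carries more inertia, which can overshoot on sharp turns but accelerates along shallow valleys.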
Understanding and Implementing RMSProp in Python
6 Lessons
  • Lesson: Understanding and Implementing RMSProp in Python
    Preview
  • Practice: RMSProp Assisted Space Navigation
    Preview
  • Practice: Scaling the Optimizer: Adjusting RMSProp with Gamma
    Preview
  • Practice: Adjust the Decay Rate in RMSProp Algorithm
    Preview
  • Practice: Implement RMSProp Update
    Preview
  • Practice: Implement RMSProp's Squared Gradient Update
    Preview
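RMSProp keeps a running average of each coordinate's squared gradients and divides the step by its square root, so steep and shallow directions progress at similar speeds. A minimal sketch (illustrative only; `gamma`, `lr`, `eps`, and the test function are assumptions using common defaults):

```python
import math

# Sketch of RMSProp on a badly scaled quadratic f(x, y) = 50*x^2 + y^2.
# gamma is the decay rate of the squared-gradient average.
def rmsprop(grad, params, lr=0.05, gamma=0.9, eps=1e-8, steps=500):
    p = list(params)
    s = [0.0] * len(p)
    for _ in range(steps):
        g = grad(p)
        for i in range(len(p)):
            s[i] = gamma * s[i] + (1 - gamma) * g[i] ** 2  # running avg of g^2
            p[i] -= lr * g[i] / (math.sqrt(s[i]) + eps)    # per-coordinate scale
    return p

# gradient of 50*x^2 + y^2 is (100*x, 2*y); minimum at (0, 0)
p = rmsprop(lambda p: [100 * p[0], 2 * p[1]], [1.0, 1.0])
```

Raising `gamma` averages over a longer gradient history, giving smoother but slower-to-adapt scaling; `eps` only guards against division by zero.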
Advanced Optimization: Understanding and Implementing ADAM
5 Lessons
  • Lesson: Advanced Optimization: Understanding and Implementing ADAM
    Preview
  • Practice: Optimizing Robot Movements with ADAM Algorithm
    Preview
  • Practice: Adjusting the Learning Rate in ADAM Optimization
    Preview
  • Practice: Optimize the Orbit: Tuning the ADAM Optimizer's Epsilon Parameter
    Preview
  • Practice: ADAM Optimizer: Implement the Coordinate Update
    Preview
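Adam combines a momentum-style first moment with an RMSProp-style second moment, plus bias correction for their zero initialization. A minimal sketch of the per-coordinate update (illustrative only; `beta1`, `beta2`, and `eps` are the commonly used defaults, while `lr`, the step count, and the test function are assumptions):

```python
import math

# Sketch of the Adam update on f(x) = x^2.
def adam(grad, x0, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8, steps=1000):
    x, m, v = x0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g       # first moment: mean of gradients
        v = beta2 * v + (1 - beta2) * g * g   # second moment: mean of g^2
        m_hat = m / (1 - beta1 ** t)          # correct the zero-initialization bias
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

x_min = adam(lambda x: 2 * x, 3.0)            # minimum of x^2 is at 0
```

The bias correction matters most in early steps, when `m` and `v` are still dominated by their zero initial values; `eps` keeps the division stable when gradients are tiny.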
Other related courses
About instructor
Includes:
  • 1 hr 44 mins of on-demand video
  • 27 Lessons
  • Access on mobile and TV
  • Full lifetime access