Gradient descent from scratch

A two-part tutorial series implementing the gradient descent algorithm without the use of any machine learning libraries.

Background

Gradient descent is a widely used optimization method in modern machine learning. This two-part series is intended to help readers gain a better understanding of how it works by implementing it without the use of any machine learning libraries.

A basic understanding of calculus, linear algebra, and Python programming is required to get the most out of these tutorials.
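To give a flavour of the idea before opening the notebooks, here is a minimal sketch (not taken from the tutorials) of gradient descent minimizing a single-variable function:

```python
# Minimal gradient descent sketch: minimize f(x) = (x - 3)^2,
# whose derivative is 2 * (x - 3). The minimum is at x = 3.

def gradient(x):
    return 2 * (x - 3)

x = 0.0               # starting guess
learning_rate = 0.1   # step size

for step in range(100):
    x = x - learning_rate * gradient(x)  # step against the gradient

print(x)  # converges towards 3.0
```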

Format

Jupyter notebooks that contain explanations of the underlying concepts, followed by code that can be run from within the notebook.

Part 1 - Introduction to gradient descent on a simple linear regression problem (a rough sketch of this kind of problem is shown below)

Part 2 - Training a neural network to classify handwritten digits

The code is written for readability and is heavily commented to aid beginners; it is not an exemplar of production code.
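As a rough illustration of the kind of problem Part 1 covers (a simplified sketch, not the notebook code), gradient descent can fit a line y = w*x + b by repeatedly updating w and b against the gradient of the mean squared error:

```python
import numpy as np

# Toy data roughly following y = 2x + 1 (made up for this sketch)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

w, b = 0.0, 0.0       # initial parameters
learning_rate = 0.01

for step in range(5000):
    y_pred = w * x + b
    error = y_pred - y
    # Gradients of the mean squared error with respect to w and b
    grad_w = 2 * np.mean(error * x)
    grad_b = 2 * np.mean(error)
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(w, b)  # should end up close to 2 and 1
```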

Prerequisites

Jupyter Notebook
Python 3.x
NumPy, Matplotlib

Usage
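In outline: with the prerequisites installed, clone the repository, start Jupyter Notebook from the repository root, open the notebook for the part you want to work through, and run the cells from top to bottom.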

Feedback

Any constructive feedback on how this could be improved is welcome.
