This project implements a tiny scalar-valued autograd engine, following Andrej Karpathy's excellent lecture on backpropagation. It also includes a small PyTorch-like neural network library built on top of the autograd engine. This repository is a direct copy of the code from the original source.
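The core idea of a scalar-valued autograd engine is that each number remembers how it was produced, so gradients can flow backward through the chain rule. The sketch below illustrates that idea in the spirit of micrograd's `Value` class; the class name and method names are illustrative assumptions, not necessarily this repository's exact API.

```python
class Value:
    """A scalar that records its computation graph and tracks its gradient.
    Illustrative sketch only -- supports + and * for brevity."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # local chain-rule step, set by each op
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1, d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other.data, d(out)/d(other) = self.data
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply each node's local
        # chain-rule step from the output back to the inputs.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()


# Example: c = a*b + a, so dc/da = b + 1 and dc/db = a.
a, b = Value(2.0), Value(3.0)
c = a * b + a
c.backward()
print(a.grad, b.grad)  # -> 4.0 2.0
```

The neural network library layers on top of this by composing `Value` operations into neurons and layers, so calling `backward()` on a loss populates every parameter's `.grad` for gradient descent.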