Jul 17, 2024

# micrograd

An autograd engine that implements backpropagation, built from scratch.

**Value Object:** Wraps scalar values in a custom class that supports the operations needed for forward and backward passes. Each value also records the operation (`op`) that produced it.
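
A minimal sketch of what such a wrapper might look like, supporting only `+`, `*`, and `tanh` (the real implementation covers more operations); `data`, `grad`, `_prev`, and `_op` are the bookkeeping fields described above:

```python
import math

class Value:
    """A scalar wrapped with enough bookkeeping to backpropagate through it."""

    def __init__(self, data, _children=(), _op=''):
        self.data = data
        self.grad = 0.0                # d(output)/d(self), filled in by backward()
        self._backward = lambda: None  # how to route gradient to children
        self._prev = set(_children)    # nodes that produced this value
        self._op = _op                 # the operation that produced this value

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other), '+')
        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other), '*')
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,), 'tanh')
        def _backward():
            self.grad += (1 - t**2) * out.grad  # d(tanh x)/dx = 1 - tanh(x)^2
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply each node's chain-rule step.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

    def __repr__(self):
        return f"Value(data={self.data}, grad={self.grad})"
```

Each operation returns a new `Value` whose `_backward` closure knows how to route the output gradient to its inputs; `backward()` simply replays those closures in reverse topological order.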

**Expression Graph:** A directed acyclic graph where nodes are values and edges represent operations. As an example, build a graph from inputs `a` and `b` leading to a final node `g` and observe both forward and backward passes.
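
For instance, using the `Value` sketch above (this particular expression is illustrative, not necessarily the one used in the lecture):

```python
a = Value(2.0)
b = Value(-3.0)
c = a * b              # internal node, _op = '*'
d = c + Value(10.0)    # internal node, _op = '+'
g = d.tanh()           # final node of the graph

# The forward pass already happened during construction; g.data holds the result.
g.backward()
print(g)               # Value(data=0.999..., grad=1.0)
print(a.grad, b.grad)  # dg/da and dg/db, filled in by backpropagation
```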

**Numerical Gradient Check:** An empirical approach using the limit definition of derivatives: for a function `f(x)`, the derivative is `(f(x + h) - f(x)) / h` as `h` approaches 0.
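
In code, with a small but finite `h` (the polynomial is, if memory serves, the scalar example from the lecture; any differentiable `f` works):

```python
def numerical_grad(f, x, h=1e-6):
    # Limit definition of the derivative, evaluated at a small finite h.
    return (f(x + h) - f(x)) / h

f = lambda x: 3*x**2 - 4*x + 5
print(numerical_grad(f, 3.0))       # ~14.0, since f'(x) = 6x - 4

# The same trick cross-checks autograd: rebuild the expression graph above
# as a function of a single input and compare against a.grad.
def g_of_a(a_val):
    return ((Value(a_val) * Value(-3.0)) + Value(10.0)).tanh().data

print(numerical_grad(g_of_a, 2.0))  # should closely match a.grad
```

The check is slow and approximate, which is exactly why analytic backpropagation is worth building, but it is invaluable for catching bugs in the `_backward` closures.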

**Chain Rule:** Fundamental for backpropagation, allowing the composition of derivatives through intermediate variables. For `g(f(x))`, the derivative `dg/dx` is `dg/df * df/dx`.

**Manual Backpropagation Example:** A step-by-step manual calculation of gradients for a simple expression, demonstrating the chain rule.
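
A worked version, assuming the expression `L = (a*b + c) * f` with the concrete inputs below (the lecture walks through a similarly small expression); every line is one application of the chain rule:

```python
a, b, c, f = 2.0, -3.0, 10.0, -2.0
e = a * b      # e = -6.0
d = e + c      # d = 4.0
L = d * f      # L = -8.0

# Walk the graph backward, multiplying local derivatives along the way:
dL_dd = f              # local: d(d*f)/dd = f
dL_df = d              # local: d(d*f)/df = d
dL_de = dL_dd * 1.0    # d(e+c)/de = 1, chained through d
dL_dc = dL_dd * 1.0    # d(e+c)/dc = 1, chained through d
dL_da = dL_de * b      # d(a*b)/da = b, chained through e
dL_db = dL_de * a      # d(a*b)/db = a, chained through e

print(dL_da, dL_db, dL_dc, dL_df)  # 6.0 -4.0 -2.0 4.0
```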

**Neuron:** Represents a single neuron with input weights and a bias; it performs a forward pass using a non-linear activation function such as `tanh` (sketched together with `Layer` and `MLP` below).

**Layer:** A collection of neurons, each evaluated independently on the same inputs.

**Multi-Layer Perceptron (MLP):** Stacks layers of neurons to form a full network.
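
Reusing the `Value` class from earlier, the three pieces might look like this (a sketch; note the published micrograd defaults to ReLU rather than `tanh`):

```python
import random

class Neuron:
    def __init__(self, nin):
        # One learnable weight per input, plus a learnable bias.
        self.w = [Value(random.uniform(-1, 1)) for _ in range(nin)]
        self.b = Value(random.uniform(-1, 1))

    def __call__(self, x):
        # w . x + b, squashed by the tanh non-linearity.
        act = sum((wi * xi for wi, xi in zip(self.w, x)), self.b)
        return act.tanh()

    def parameters(self):
        return self.w + [self.b]

class Layer:
    def __init__(self, nin, nout):
        # nout independent neurons, each reading the same nin inputs.
        self.neurons = [Neuron(nin) for _ in range(nout)]

    def __call__(self, x):
        outs = [n(x) for n in self.neurons]
        return outs[0] if len(outs) == 1 else outs

    def parameters(self):
        return [p for n in self.neurons for p in n.parameters()]

class MLP:
    def __init__(self, nin, nouts):
        # e.g. MLP(3, [4, 4, 1]): 3 inputs -> two hidden layers of 4 -> 1 output.
        sizes = [nin] + nouts
        self.layers = [Layer(sizes[i], sizes[i + 1]) for i in range(len(nouts))]

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

    def parameters(self):
        return [p for layer in self.layers for p in layer.parameters()]
```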

**Training:** Fit the network to data by running forward passes, measuring a loss, and updating parameters with gradients computed by backpropagation.
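
A bare-bones training loop under those assumptions (the four-example dataset is a toy; the `Value` sketch above lacks `-` and `**`, hence the slightly verbose loss):

```python
xs = [[2.0, 3.0, -1.0],
      [3.0, -1.0, 0.5],
      [0.5, 1.0, 1.0],
      [1.0, 1.0, -1.0]]
ys = [1.0, -1.0, -1.0, 1.0]   # target output per example

model = MLP(3, [4, 4, 1])

for step in range(50):
    # Forward pass: squared-error loss summed over the dataset.
    loss = Value(0.0)
    for x, y in zip(xs, ys):
        diff = model(x) + (-y)   # __add__ wraps the plain float
        loss = loss + diff * diff

    # Backward pass: reset stale gradients, then backpropagate from the loss.
    for p in model.parameters():
        p.grad = 0.0
    loss.backward()

    # Gradient descent: step each parameter against its gradient.
    for p in model.parameters():
        p.data -= 0.05 * p.grad

print(loss.data)   # typically much smaller than at step 0
```

Zeroing gradients before each backward pass matters: the `_backward` closures accumulate with `+=`, so stale gradients from the previous step would otherwise leak into the update.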

The concepts behind `micrograd` translate into production-grade libraries like PyTorch. Building `micrograd` from scratch helps in understanding fundamental concepts that are otherwise abstracted away in high-level libraries.