Understanding Neural Networks in Digital Recognition

Feb 24, 2025

Lecture Notes: Introduction to Neural Networks

Overview of Recognition Tasks

  • Recognition of digits: Example of recognizing the digit "3" in various formats (e.g., low resolution, different pixel arrangements).
  • Human brain vs. Machine Learning: Humans can effortlessly recognize patterns, while programming a machine to do the same is complex.

Importance of Machine Learning

  • Relevance of machine learning and neural networks in modern technology.
  • Aim: Understand the structure of a neural network and how it functions, using only basic math.

Neural Network Structure

  • Input Layer: Consists of 784 neurons for a 28x28 pixel image (grayscale values from 0 to 1).
  • Output Layer: 10 neurons represent digits (0-9), with activations indicating the likelihood of each digit.
  • Hidden Layers: Intermediate layers that process features. In this case, 2 hidden layers with 16 neurons each (see the sketch after this list).
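
A minimal sketch of this architecture in Python (NumPy and random initialization assumed; `layer_sizes`, `weights`, and `biases` are illustrative names, not the lecture's code):

```python
import numpy as np

# Layer sizes described above: 784 input pixels -> 16 -> 16 -> 10 output digits.
layer_sizes = [784, 16, 16, 10]

# One weight matrix and one bias vector per pair of consecutive layers.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((n_out, n_in))
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n_out) for n_out in layer_sizes[1:]]

# Total number of adjustable parameters:
# 784*16 + 16*16 + 16*10 weights, plus 16 + 16 + 10 biases = 13,002.
n_params = sum(W.size for W in weights) + sum(b.size for b in biases)
print(n_params)  # 13002
```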

Neurons and Activations

  • Each neuron holds a number (activation) between 0 and 1.
  • Activations in one layer influence the activations in the next layer (see the formula after this list).
  • Neurons aim to detect patterns (e.g., edges, shapes) that contribute to digit recognition.
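
Written out, the activation of neuron j in layer l is a weighted sum of the previous layer's activations, shifted by a bias and squashed into the range (0, 1) by the sigmoid function (the weights w and biases b are introduced in the next section):

```latex
a^{(l)}_j = \sigma\!\Big(\sum_k w^{(l)}_{jk}\, a^{(l-1)}_k + b^{(l)}_j\Big),
\qquad \sigma(x) = \frac{1}{1 + e^{-x}}
```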

Learning Process

  • Machine Learning Goal: Find weights and biases that make the network reliably recognize digits.
  • Weight Assignment: Neurons in one layer are connected to neurons in the next layer through weights; each neuron's weighted sum of the previous layer's activations determines its own activation.
  • Bias: An additional parameter that shifts the weighted sum before the activation function (e.g., sigmoid) is applied (see the sketch after this list).
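
A minimal sketch of this layer-by-layer computation in vectorized form (NumPy assumed; `feedforward` and the `weights`/`biases` lists from the earlier sketch are illustrative, not the lecture's code):

```python
import numpy as np

def sigmoid(z):
    # Squash each weighted sum into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def feedforward(x, weights, biases):
    # x: 784 grayscale pixel values in [0, 1].
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)  # weighted sum, plus bias, then squashed
    return a  # 10 output activations, one per digit 0-9

# Example (using the weights/biases from the earlier sketch):
# output = feedforward(np.zeros(784), weights, biases)
```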

Activation Functions

  • Sigmoid Function: Squashes any input value into the range between 0 and 1.
    • Useful for binary classification.
  • ReLU (Rectified Linear Unit): A modern alternative that outputs the maximum of 0 and the input value; it tends to make deep networks easier to train (see the comparison after this list).
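
A small comparison of the two functions (NumPy assumed; the sample inputs are only for illustration):

```python
import numpy as np

def sigmoid(z):
    # Smoothly maps any real number into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # max(0, z): negative inputs become 0, positive inputs pass through.
    return np.maximum(0.0, z)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # approx. [0.119, 0.5, 0.881]
print(relu(z))     # [0. 0. 2.]
```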

Final Thoughts

  • The network's complexity is aligned with the task's difficulty (digit recognition).
  • Understanding the structure and operations of neural networks is crucial for further studies in machine learning.
  • Next steps will cover how networks learn and what happens during training.

Acknowledgments

  • Special thanks to contributors and supporters, including Leisha Lee, who discussed the evolution of activation functions in neural networks.