
Introduction to Digital Image Processing Concepts

Aug 21, 2024

Digital Image Processing Lecture 1

Course Introduction

  • Course offered as a special topics course for undergraduate and graduate students in engineering and computer science.
  • Based on "Digital Image Processing" by Gonzalez and Woods (4th edition).

Topics Covered

  • Introduction to Digital Image Processing (DIP)
  • Origins of DIP
  • Applications of DIP
  • Fundamental steps in DIP
  • Elements of visual perception
  • Light and electromagnetic spectrum
  • Image sensing and acquisition
  • Sampling and quantization

What is Digital Image Processing?

  • Image Definition: A two-dimensional function f(x,y) giving the intensity or gray level at spatial coordinates (x,y).
    • Values may be continuous or discrete.
  • Digital Image: An image with finite, discrete values.
  • DIP: Processes involving digital images via digital computers.
  • Levels of DIP:
    • Low-level: Input/output are images (e.g., noise reduction).
    • Mid-level: Input images, output attributes (e.g., segmentation).
    • High-level: Semantic understanding (part of computer vision).
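The definition above can be made concrete with a small sketch: a digital image stored as a 2D array of discrete values, plus a low-level process (a simple averaging filter for noise reduction). The specific array values and filter are illustrative, not from the lecture.

```python
import numpy as np

# A digital image is a 2D function f(x, y) with finite, discrete values.
# Here: a tiny 4x4 8-bit grayscale image (intensities 0-255).
f = np.array([
    [  0,  50, 100, 150],
    [ 50, 100, 150, 200],
    [100, 150, 200, 250],
    [150, 200, 250, 255],
], dtype=np.uint8)

# f(x, y): the gray level at row x = 1, column y = 2.
print(f[1, 2])  # 150

# A low-level process (images in, images out): 3x3 box-filter noise
# reduction, computed on the interior pixels for brevity.
g = f.astype(np.float64)
smoothed = (g[:-2, :-2] + g[:-2, 1:-1] + g[:-2, 2:] +
            g[1:-1, :-2] + g[1:-1, 1:-1] + g[1:-1, 2:] +
            g[2:,  :-2] + g[2:,  1:-1] + g[2:,  2:]) / 9.0
print(smoothed)
```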

Origins of Digital Image Processing

  • Early application in the newspaper industry for image transmission (the Bartlane cable picture system, early 1920s).
  • Digital computers' foundation in the 1940s.
  • Advancements in computers parallel DIP advancements.
  • Early applications in space exploration and medical imaging (e.g., CT scans).

Applications of Digital Image Processing

  • Numerous applications categorized by image sources:
    • Electromagnetic: Gamma rays, X-rays, UV, visible, infrared, microwaves, radio waves.
    • Acoustic and Ultrasonic
    • Electronic: Scanning electron microscopes.
    • Computer-generated images

Electromagnetic Spectrum

  • Defined by sinusoidal waves or streams of particles (photons).
  • Energy equation: E = hf, where h is Planck's constant, f is frequency.
  • Regions: Gamma rays, X-rays, UV, visible, infrared, microwaves, radio waves.
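The energy equation explains the ordering of the spectral regions: higher-frequency bands carry more energy per photon. A quick numeric check, with illustrative wavelengths chosen for each band:

```python
# Photon energy E = h * f, using c = f * wavelength to go from
# wavelength to frequency. Constants rounded for illustration.
h = 6.626e-34   # Planck's constant (J*s)
c = 3.0e8       # speed of light (m/s)

def photon_energy(wavelength_m):
    """Energy in joules of one photon at the given wavelength."""
    freq = c / wavelength_m   # frequency in Hz
    return h * freq

gamma   = photon_energy(1e-12)    # gamma-ray wavelength
visible = photon_energy(550e-9)   # mid visible band (green)
radio   = photon_energy(1.0)      # 1 m radio wave

print(gamma > visible > radio)    # True: gamma photons are most energetic
```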

Image Processing Steps

  • Image Acquisition: Specialized tools depending on the spectrum used.
  • Image Filtering and Enhancement: Noise reduction, contrast improvement.
  • Image Restoration: Advanced noise reduction, blur artifact reduction.
  • Color Image Processing: Pseudo-color to enhance perception.
  • Transforms: Fourier, wavelets, etc., for advanced processing.
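As a small taste of transform-domain processing, the 2D discrete Fourier transform is available directly in NumPy. The synthetic test image below is an assumption for illustration:

```python
import numpy as np

# 2D DFT of a small synthetic image: a bright 4x4 square on a dark
# 8x8 background.
img = np.zeros((8, 8))
img[2:6, 2:6] = 1.0

F = np.fft.fft2(img)            # frequency-domain representation
F_centered = np.fft.fftshift(F) # shift zero frequency to the center

# The DC (zero-frequency) term equals the sum of all pixel values.
print(F[0, 0].real)  # 16.0
```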

Components of DIP System

  • Problem domain, image sensors, specialized hardware, computer processing, displays, storage, software, networking/cloud.

Visual Perception

  • Human Eye Structure: Cornea, iris, lens, retina, etc.
  • Cell Types:
    • Cones: Color-sensitive, concentrated around fovea.
    • Rods: Low-light sensitive, no color, spread across retina.
  • Adaptation: Eye's sensitivity changes based on brightness.

Light and Electromagnetic Spectrum

  • Visible spectrum: 400-700 nm.
  • Colors perceived by light reflection/absorption.

Image Sensing and Acquisition

  • Defined by illumination source and scene elements.
  • Sensing Element: Converts energy to electric signals.
  • Types: Single sensor, line sensor, array sensor.

Sampling and Quantization

  • Sampling: Digitizes the spatial coordinates (the x and y axes).
  • Quantization: Digitizes the amplitude, i.e., the intensity values of f(x,y).
  • Resolution:
    • Spatial: Pixels per unit distance.
    • Intensity: Smallest discernible change in intensity level, set by the number of gray levels.
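The two digitization steps can be sketched on a 1D "scanline" standing in for a continuous signal; the sample spacing and level count below are arbitrary choices for the demo:

```python
import numpy as np

# Sampling digitizes the spatial coordinate; quantization digitizes the
# intensity values. A dense array stands in for the continuous signal.
x = np.linspace(0, 1, 1000)
signal = 0.5 * (1 + np.sin(2 * np.pi * 2 * x))  # intensities in [0, 1]

# Sampling: keep every 100th point -> 10 samples over the same interval.
samples = signal[::100]

# Quantization: snap each sample to one of 4 discrete intensity levels.
levels = 4
quantized = np.round(samples * (levels - 1)) / (levels - 1)

print(len(samples))              # 10 spatial samples
print(np.unique(quantized))      # at most 4 distinct intensity values
```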

Image Representation

  • Images can be visualized as 3D surface plots, displayed as 2D intensity maps, or stored as numeric matrices.
  • Spatial resolution governs how faithfully fine detail and shape variations are rendered.
  • Intensity resolution governs how smoothly gradual lighting variations are rendered; too few levels causes visible false contouring.
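The two resolution effects can be demonstrated separately on a synthetic gradient image (sizes and level counts here are illustrative):

```python
import numpy as np

# A 256x256 horizontal intensity ramp, 256 gray levels.
img = np.tile(np.arange(256, dtype=np.uint8), (256, 1))

# Lower spatial resolution: keep every 4th pixel in each direction.
low_spatial = img[::4, ::4]        # 64x64, shapes become blockier

# Lower intensity resolution: 256 levels -> 8 levels (3 bits per pixel),
# which introduces false contouring in smooth gradients.
low_intensity = (img // 32) * 32

print(low_spatial.shape)                 # (64, 64)
print(len(np.unique(low_intensity)))     # 8 distinct gray levels
```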

Conclusion

  • Sampling and intensity levels affect perceived image quality.
  • Next topics: Intensity transformation and mathematics in DIP.