Digital Image Processing - Lecture 1
Course Introduction
- Offered as a special topics course for undergraduate and graduate students in engineering and computer science.
- Material primarily sourced from "Digital Image Processing," 4th edition by Gonzalez and Woods.
- Topics covered in this lecture:
  - Introduction to Digital Image Processing (DIP)
  - Origins and applications of DIP
  - Fundamental steps in DIP systems
  - Elements of visual perception
  - Image sensing and acquisition
  - Sampling and quantization
What is Digital Image Processing?
- Image Definition: A two-dimensional function, f(x, y), where x and y are spatial coordinates and the amplitude of f at any pair of coordinates (x, y) is the intensity or gray level of the image at that point.
- Digital Image: An image in which x, y, and the intensity values of f are all finite, discrete quantities.
- Digital Image Processing: Processing images with digital computers.
- Elements of a digital image are pixels.
- Levels of Processing:
  - Low-Level: Input and output are images (e.g., noise reduction, contrast enhancement).
  - Mid-Level: Input is images, output is attributes (e.g., segmentation, classification).
  - High-Level: Images processed for semantic understanding (e.g., object recognition).
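The definition of f(x, y) above maps directly onto a 2-D array. A minimal sketch with NumPy (the array values are chosen purely for illustration):

```python
import numpy as np

# A digital image is a finite 2-D array: rows index y, columns index x,
# and each element (pixel) holds a discrete intensity.
# Here: 8-bit gray levels in the range 0-255.
image = np.array([
    [  0,  64, 128],
    [ 64, 128, 192],
    [128, 192, 255],
], dtype=np.uint8)

# A pixel is one element of this array: f(x=2, y=0) has intensity 128.
print(image[0, 2])    # row y=0, column x=2
print(image.shape)    # (3, 3): spatial extent in pixels
print(image.dtype)    # uint8: 256 possible gray levels
```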
Origins of Digital Image Processing
- First applications were in the newspaper industry, where digitized pictures were transmitted between London and New York by submarine cable in the early 1920s.
- Evolution of digital technologies from the 1940s (transistors) to modern ultra-large-scale integration.
Applications of Digital Image Processing
- Electromagnetic Spectrum: Imaging spans gamma rays, X-rays, ultraviolet, visible light, infrared, microwaves, and radio waves.
- Applications in medical imaging, astronomy, satellite imaging, and more.
Fundamental Steps in Digital Image Processing
- Image Acquisition: Capturing images using various devices.
- Image Filtering and Enhancement: Noise reduction, contrast enhancement.
- Color Image Processing: Exploiting the human eye's sensitivity to color.
- Transformations: Using methods like Fourier Transform and Wavelet Transform for processing.
- Compression and Watermarking: Reducing size and protecting intellectual property.
- Morphological Processing: Tools for extracting image components useful for representing and describing shape.
- Segmentation and Classification: Dividing images into meaningful parts.
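Of the steps above, filtering and enhancement is the easiest to make concrete. The sketch below shows one common enhancement technique, linear contrast stretching; the function name and sample data are illustrative, not from the lecture:

```python
import numpy as np

def contrast_stretch(img: np.ndarray) -> np.ndarray:
    """Linearly map the image's occupied [min, max] range onto [0, 255]."""
    lo, hi = int(img.min()), int(img.max())
    if hi == lo:                      # flat image: nothing to stretch
        return np.zeros_like(img, dtype=np.uint8)
    stretched = (img.astype(np.float64) - lo) * 255.0 / (hi - lo)
    return stretched.round().astype(np.uint8)

# A low-contrast image occupying only gray levels 100-150.
dull = np.array([[100, 125], [140, 150]], dtype=np.uint8)
print(contrast_stretch(dull))   # 100->0, 125->128, 140->204, 150->255
```

This is a low-level operation in the terminology above: an image goes in, an image comes out.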
Components of a DIP System
- Problem domain, image sensors, processing hardware, computer systems, hard copies, image displays, mass storage, software, and networks.
Elements of Visual Perception
- Human eye structure and function.
- Cones in the retina provide color (photopic) vision; rods provide low-light (scotopic) vision.
- Visual sensitivity to intensity changes and simultaneous contrast.
Light and Electromagnetic Spectrum
- Visible spectrum ranges from 400 to 700 nanometers.
- Radiance, luminance, and brightness defined.
Image Sensing and Acquisition
- Images are formed by combining an illumination source with the reflection or absorption of that energy by the objects in the scene.
- Sensor arrangements: single sensor, sensor strips (line sensors), and sensor arrays.
- Imaging systems in digital cameras and other devices.
Sampling and Quantization
- Sampling: Digitizing the spatial domain by capturing discrete data points.
- Quantization: Digitizing the intensity domain by mapping intensity values to discrete levels.
- Resolution:
  - Spatial Resolution: Determines smallest perceptible detail.
  - Intensity Resolution: Smallest discernible change in intensity.
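Spatial sampling can be sketched directly as array subsampling; the array size and sampling factor below are chosen only for illustration:

```python
import numpy as np

# Spatial sampling: keep every 2nd sample along each axis,
# halving the spatial resolution in both x and y.
fine = np.arange(16, dtype=np.uint8).reshape(4, 4)   # a 4x4 "image"
coarse = fine[::2, ::2]                              # 2x2 after subsampling

print(fine.shape, coarse.shape)   # (4, 4) (2, 2)
print(coarse)                     # only samples 0, 2, 8, 10 survive
```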
Effects of Resolution on Image Quality
- Spatial Resolution: Affects detail perception; measured in pixels (or dots) per unit distance.
- Intensity Resolution: Measured in bits, affects perceived smoothness of intensity gradients.
- Isopreference experiments show that images with large amounts of detail can tolerate fewer intensity levels without a perceived loss of quality, and vice versa.
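Since k bits give 2^k intensity levels, reducing intensity resolution can be sketched as requantizing 8-bit values to k bits; with too few levels, smooth gradients develop visible false contours. The function name, k, and data below are illustrative:

```python
import numpy as np

def requantize(img: np.ndarray, k: int) -> np.ndarray:
    """Map 8-bit intensities onto 2**k uniformly spaced levels."""
    levels = 2 ** k
    step = 256 // levels
    return (img // step) * step   # floor each value to its level's base

# A smooth 8-bit ramp: 0, 32, 64, ..., 224.
gradient = np.arange(0, 256, 32, dtype=np.uint8)
print(requantize(gradient, 2))   # only 4 levels remain: 0, 64, 128, 192
```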
Conclusion
- Overview of topics for further lectures, including intensity transformations and basic mathematics in DIP.
This concludes the first lecture on Digital Image Processing, providing a foundational understanding of key concepts and systems involved in the field.