CEC366 – Image Processing
UNIT I: DIGITAL IMAGE FUNDAMENTALS
1. Steps in Digital Image Processing
Digital Image Processing manipulates image data with digital computers and algorithms to improve image quality or to extract useful information. The general processing pipeline includes the stages below (a minimal code sketch follows the list):
a. Image Acquisition – Capturing the image using a sensor and converting it to digital form.
b. Preprocessing – Enhancing image quality using noise removal, contrast adjustment, etc.
c. Segmentation – Dividing the image into meaningful regions or objects.
d. Representation and Description – Extracting and describing features like shape and
texture.
e. Recognition – Assigning labels (e.g., car, person) to image objects.
f. Knowledge Base – Contains domain-specific information for interpretation.
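A minimal sketch of stages a–d using OpenCV in Python. The file name 'scene.png', the Gaussian blur, and Otsu thresholding are illustrative assumptions, not the only way to realize each stage:

    import cv2

    img = cv2.imread('scene.png')                          # a. acquisition (from disk here)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    denoised = cv2.GaussianBlur(gray, (5, 5), 0)           # b. preprocessing (noise removal)
    _, mask = cv2.threshold(denoised, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)   # c. segmentation
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)        # d. representation
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)   # simple descriptors a recognizer (e) would label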
2. Components of an Image Processing System
1. Image Sensors – Convert optical images into electrical signals.
2. A/D Converters – Convert analog signals to digital form.
3. Image Processing Hardware – Special processors for real-time operations.
4. Computer – Executes algorithms and handles I/O operations.
5. Software – Tools like MATLAB, OpenCV, Python.
6. Display Devices – Used to visualize processed images.
7. Mass Storage – For saving image data and results.
3. Elements of Visual Perception
Human visual perception is nonlinear. It influences how images should be processed:
- Brightness: Perceived light intensity.
- Contrast: Difference between dark and bright areas.
- Mach Band Effect: The eye exaggerates contrast near boundaries between uniform regions, perceiving brightness overshoot and undershoot that are not present in the signal (see the sketch after this list).
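A small sketch, assuming NumPy and Matplotlib, that builds a staircase of uniform gray bands; when displayed, the eye sees Mach bands at the boundaries even though each band is constant:

    import numpy as np
    import matplotlib.pyplot as plt

    levels = np.linspace(50, 200, 8).astype(np.uint8)   # 8 uniform gray levels
    band = np.repeat(levels, 64)                        # each band 64 pixels wide
    img = np.tile(band, (256, 1))                       # stack rows into a 256 x 512 image
    plt.imshow(img, cmap='gray', vmin=0, vmax=255)      # bands look scalloped near edges
    plt.axis('off')
    plt.show()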
4. Image Sensing and Acquisition
Image acquisition senses light from a scene using sensors and converts it to digital form (a one-frame capture sketch follows the bullets):
- Sensors (CCD/CMOS) convert optical input to electrical signals.
- ADC converts analog signals to digital images.
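A minimal acquisition sketch with OpenCV, assuming a camera is available at device index 0:

    import cv2

    cap = cv2.VideoCapture(0)       # open the default camera (sensor + A/D on board)
    ok, frame = cap.read()          # one digitized frame: an 8-bit BGR NumPy array
    cap.release()
    if ok:
        cv2.imwrite('frame.png', frame)   # save the sampled, quantized image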
5. Image Sampling and Quantization
Sampling: Selecting discrete pixels from a continuous image. Affects spatial resolution.
Quantization: Assigning intensity levels to sampled pixels. Affects gray-level resolution.
Finer sampling and more quantization levels give better image quality but require more storage.
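Both operations sketched in NumPy; 'input.png' is a hypothetical 8-bit grayscale file, and the factor of 4 and the 8 levels are arbitrary choices:

    import cv2

    img = cv2.imread('input.png', cv2.IMREAD_GRAYSCALE)   # hypothetical 8-bit image

    # Sampling: keep every 4th pixel in each direction (coarser spatial resolution)
    sampled = img[::4, ::4]

    # Quantization: map 256 gray levels onto 8 (coarser gray-level resolution)
    levels = 8
    step = 256 // levels
    quantized = (img // step) * step + step // 2   # reconstruct at bin mid-points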
6. Relationships Between Pixels
- Neighbourhood: 4-neighbourhood (up, down, left, right) and 8-neighbourhood (adds the four diagonals).
- Connectivity: 4-, 8-, or m-connectivity (mixed connectivity, which removes the ambiguous multiple paths of 8-connectivity).
- Adjacency: Two pixels are adjacent if they are neighbours and their intensities belong to the same similarity set V.
- Distance Measures (see the sketch after this list): Euclidean D_e = sqrt((x1-x2)^2 + (y1-y2)^2); Manhattan (city-block) D_4 = |x1-x2| + |y1-y2|; Chessboard D_8 = max(|x1-x2|, |y1-y2|).
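A sketch of the three distance measures in NumPy for two pixel coordinates p and q:

    import numpy as np

    def pixel_distances(p, q):
        # p, q: (row, col) pixel coordinates
        p, q = np.asarray(p), np.asarray(q)
        d_e = np.sqrt(((p - q) ** 2).sum())   # Euclidean
        d_4 = np.abs(p - q).sum()             # Manhattan / city-block (D4)
        d_8 = np.abs(p - q).max()             # Chessboard (D8)
        return d_e, d_4, d_8

    print(pixel_distances((2, 3), (5, 7)))    # -> (5.0, 7, 4)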
7. Color Image Fundamentals
a. RGB Model – Additive color model used in displays.
b. HSI Model – Represents Hue, Saturation, and Intensity.
c. CMYK – Subtractive model (cyan, magenta, yellow, plus black) used in printing.
True-color images use 24 bits (8 bits per channel), giving 2^24 ≈ 16.7 million colors.
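A sketch of the standard RGB-to-HSI formulas in NumPy, assuming the input is a float array scaled to [0, 1] with shape (H, W, 3):

    import numpy as np

    def rgb_to_hsi(rgb):
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        eps = 1e-8                                     # guards divisions at black pixels
        i = (r + g + b) / 3.0                          # intensity
        s = 1.0 - np.minimum(np.minimum(r, g), b) / (i + eps)   # saturation
        num = 0.5 * ((r - g) + (r - b))
        den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
        theta = np.arccos(np.clip(num / den, -1.0, 1.0))
        h = np.where(b > g, 2 * np.pi - theta, theta) / (2 * np.pi)   # hue in [0, 1)
        return np.stack([h, s, i], axis=-1)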
8. Two-Dimensional Mathematical Preliminaries
- Image as a function: f(x, y), where x, y are coordinates and f is intensity.
- Array operations: element-wise addition, subtraction, and multiplication underlie image arithmetic and masking.
- Convolution: Slides a kernel over the image and computes a weighted sum at every pixel; this is the basis of linear filtering.
- Derivatives: The gradient (1st order) and the Laplacian (2nd order) are used for edge detection (see the sketch after this list).
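A sketch of first- and second-order derivative filters via convolution, assuming SciPy; the random array stands in for a real grayscale image:

    import numpy as np
    from scipy.ndimage import convolve

    img = np.random.rand(64, 64)                  # stand-in grayscale image

    laplacian = np.array([[0,  1, 0],             # 2nd-order: responds to changes
                          [1, -4, 1],             # in all directions at once
                          [0,  1, 0]])
    sobel_x = np.array([[-1, 0, 1],               # 1st-order: horizontal gradient
                        [-2, 0, 2],
                        [-1, 0, 1]])

    edges_2nd = convolve(img, laplacian)
    gx, gy = convolve(img, sobel_x), convolve(img, sobel_x.T)
    grad_mag = np.hypot(gx, gy)                   # gradient magnitude for edge maps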
9. 2D Transforms
a. DFT – Converts an image from the spatial to the frequency domain. For an M x N image:
F(u,v) = Σ_{x=0}^{M-1} Σ_{y=0}^{N-1} f(x,y) e^(-j2π(ux/M + vy/N)),  u = 0,...,M-1, v = 0,...,N-1
b. DCT – Used in image compression (e.g., JPEG); it concentrates most of the signal energy in a few low-frequency coefficients (see the sketch below).
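Both transforms sketched with NumPy and SciPy; the 8 x 8 random block mirrors the block size JPEG uses:

    import numpy as np
    from scipy.fft import dctn

    block = np.random.rand(8, 8)              # stand-in 8 x 8 image block

    F = np.fft.fft2(block)                    # 2-D DFT (the summation above)
    spectrum = np.abs(np.fft.fftshift(F))     # magnitude, zero frequency centred

    C = dctn(block, norm='ortho')             # 2-D DCT-II, as used in JPEG;
                                              # energy concentrates near C[0, 0]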
Conclusion
This unit introduces the building blocks of digital image processing, focusing on acquisition,
representation, color models, and transforms. These concepts form the foundation for
advanced image processing techniques.