DIGITAL IMAGE PROCESSING
R. Anusha Padmavathy, M.E.,
Assistant Professor (T),
GCE-TVL.
What Is Digital Image Processing?
An image may be defined as a two-dimensional function, f(x, y), where x and y are spatial (plane) coordinates, and the amplitude of f at any pair of coordinates (x, y) is called the intensity or gray level of the image at that point.
Digital Image:
When x, y and the intensity values of f are all finite,
discrete quantities, we call the image a digital image.
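As an illustration (not part of the original slides), a minimal Python/NumPy sketch of a digital image as a finite, discrete 2-D array, where evaluating f(x, y) is simply an array lookup:

import numpy as np

# A tiny 3x3 digital image: coordinates and intensities are all finite,
# discrete quantities (8-bit gray levels in 0..255).
f = np.array([[ 12,  50, 200],
              [ 34, 128, 255],
              [  0,  77,  99]], dtype=np.uint8)

x, y = 1, 2            # spatial (plane) coordinates
print(f[x, y])         # intensity (gray level) at that point -> 255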
Color Image:
A color image is formed by combining individual component images, for example the red, green, and blue component images in the RGB color system.
What Is Image Processing?
Why Digital Image Processing?
Interest in digital image processing methods stems from two principal application areas:
1. Improvement of pictorial information for human interpretation
2. Processing of image data for storage, transmission, and representation for autonomous machine perception
DIP Definition:
A discipline in which both the input and output of a process are images.
The Origins of Digital Image
Processing
One of the first applications of digital images was in the newspaper industry, when pictures were first sent by submarine cable.
Specialized printing equipment coded pictures
for cable transmission and then reconstructed
them at the receiving end.
The initial problems in improving the visual quality of these early digital pictures were related to the selection of printing procedures and the distribution of intensity levels.
The early printing technique was based on photographic reproduction made from tapes perforated at the telegraph receiving terminal.
Later improvements were in tonal quality and in resolution.
The early systems were capable of coding images in
five distinct levels of gray.
This capability was increased to 15 levels in 1929.
Fields that Use Digital Image
Processing
Today, there is almost no area of technical endeavor
that is not impacted in some way by digital image
processing.
Gamma-Ray Imaging
X-Ray Imaging
Imaging in the Ultraviolet Band
Imaging in the Visible and Infrared Bands
Imaging in the Microwave Band
Imaging in the Radio Band
Gamma-Ray Imaging
Major uses of imaging based
on gamma rays include
nuclear medicine.
In nuclear medicine, the
approach is to inject a patient
with a radioactive isotope that
emits gamma rays as it decays.
Images are produced from the
emissions collected by gamma
ray detectors.
X-Ray Imaging
Applications
Document Handling
Signature Verification
Biometrics
Fingerprint Verification / Identification
Object Recognition
Target Recognition: Department of Defense (Army, Air Force, Navy)
Aerial Photography
Traffic Monitoring
Face Detection/Recognition
Medical Applications
Morphing
Inserting Artificial Objects into a Scene
Fundamental Steps in Digital Image
Processing
Image acquisition is the first process. Generally, the
image acquisition stage involves preprocessing, such as
scaling.
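A minimal sketch of acquisition followed by a scaling preprocessing step, assuming the Pillow library is available; the file name "scene.png" and the target size are hypothetical:

import numpy as np
from PIL import Image

img = Image.open("scene.png").convert("L")   # acquire and convert to grayscale
img = img.resize((256, 256))                 # preprocessing: scaling
f = np.asarray(img, dtype=np.uint8)          # the digital image as a 2-D array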
Image enhancement is the process of manipulating an image
so that the result is more suitable than the original for a
specific application.
There is no general “theory” of image enhancement.
When an image is processed for visual interpretation, the viewer is the ultimate judge of how well a particular method works.
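A minimal sketch of one common enhancement technique, linear contrast stretching with NumPy; the slides do not prescribe any particular method, so this is only an illustrative example:

import numpy as np

def contrast_stretch(img):
    # Linearly rescale the gray levels so they span the full 0..255 range.
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    if hi == lo:                      # flat image: nothing to stretch
        return img.astype(np.uint8)
    return ((img - lo) / (hi - lo) * 255.0).astype(np.uint8)

# Example: a low-contrast image occupying only gray levels 100..150.
dull = np.random.randint(100, 151, size=(64, 64), dtype=np.uint8)
enhanced = contrast_stretch(dull)     # now spans 0..255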
Image Restoration is an area that also deals with
improving the appearance of an image.
Color Image Processing is an area that has been
gaining in importance because of the significant
increase in the use of digital images over the Internet.
Wavelets are the foundation for representing images in
various degrees of resolution.
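True wavelet transforms are beyond a short sketch, but the idea of representing an image at several degrees of resolution can be conveyed by a simple image pyramid built by averaging 2x2 blocks (a stand-in for illustration, not a wavelet decomposition):

import numpy as np

def pyramid(img, levels=3):
    # Successively coarser versions of the image via 2x2 block averaging.
    out = [img]
    for _ in range(levels):
        h, w = out[-1].shape
        h, w = h - h % 2, w - w % 2
        coarse = out[-1][:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        out.append(coarse.astype(img.dtype))
    return out

img = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
levels = pyramid(img)      # resolutions: 64x64, 32x32, 16x16, 8x8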
Compression, as the name implies, deals with
techniques for reducing the storage required to save an
image, or the bandwidth required to transmit it. This is
true particularly in uses of the Internet.
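As an illustrative lossless example of removing redundancy, a simple run-length encoder for one image row in plain Python; real compression standards such as JPEG are far more sophisticated:

import numpy as np

def run_length_encode(row):
    # Encode a 1-D sequence of pixel values as (value, run_length) pairs.
    values = row.tolist()
    runs = []
    current, count = values[0], 1
    for v in values[1:]:
        if v == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = v, 1
    runs.append((current, count))
    return runs

row = np.array([0, 0, 0, 255, 255, 0, 0, 0, 0], dtype=np.uint8)
print(run_length_encode(row))   # [(0, 3), (255, 2), (0, 4)]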
Morphological processing deals with tools for
extracting image components that are useful in the
representation and description of shape.
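A small sketch of morphological operations on a binary image, assuming SciPy is available; erosion, dilation, and a boundary extracted as the object minus its erosion:

import numpy as np
from scipy.ndimage import binary_erosion, binary_dilation

obj = np.zeros((7, 7), dtype=bool)         # True = object pixel, False = background
obj[2:5, 2:5] = True

structure = np.ones((3, 3), dtype=bool)    # 3x3 structuring element
eroded  = binary_erosion(obj, structure)   # shrinks the object
dilated = binary_dilation(obj, structure)  # grows the object

# Morphological boundary: object pixels removed by the erosion.
boundary = obj & ~eroded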
Segmentation procedures partition an image into its
constituent parts or objects.
A segmentation procedure brings the process a long
way toward successful solution of imaging problems
that require objects to be identified individually.
In general, the more accurate the segmentation, the
more likely recognition is to succeed
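One of the simplest segmentation procedures is global thresholding; a minimal NumPy sketch with a hypothetical fixed threshold, shown for illustration only:

import numpy as np

def threshold_segment(img, t=128):
    # Partition the image into object vs. background by a global threshold.
    return img > t                    # True = object, False = background

img = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
mask = threshold_segment(img, t=128)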
Representation and description almost always follow the
output of a segmentation stage, which usually is raw pixel
data.
Boundary representation is appropriate when the focus is on
external shape characteristics, such as corners and inflections.
Regional representation is appropriate when the focus is on
internal properties, such as texture or skeletal shape.
Description, also called feature selection, deals with extracting
attributes that result in some quantitative information of
interest or are basic for differentiating one class of objects
from another.
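A minimal sketch of description (feature extraction) for one segmented binary region, computing a few simple quantitative descriptors with NumPy; the specific descriptors chosen here are only examples:

import numpy as np

def describe_region(mask):
    # Simple quantitative descriptors of a binary region.
    ys, xs = np.nonzero(mask)
    return {
        "area": int(mask.sum()),                           # number of object pixels
        "centroid": (float(ys.mean()), float(xs.mean())),  # (row, col)
        "bbox": (int(ys.min()), int(xs.min()),
                 int(ys.max()), int(xs.max())),            # bounding box
    }

mask = np.zeros((10, 10), dtype=bool)
mask[3:7, 4:9] = True                      # a 4x5 rectangular region
print(describe_region(mask))
# {'area': 20, 'centroid': (4.5, 6.0), 'bbox': (3, 4, 6, 8)}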
Recognition is the process that assigns a label (e.g., “vehicle”) to an object based on its descriptors. Digital image processing concludes with the development of methods for the recognition of individual objects.
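As a hedged sketch of recognition from descriptors, a minimum-distance classifier that assigns the label of the nearest class prototype; the prototypes and feature values below are invented for illustration:

import numpy as np

# Hypothetical class prototypes: mean descriptor vectors (e.g., area, elongation).
prototypes = {
    "vehicle":    np.array([500.0, 2.5]),
    "pedestrian": np.array([120.0, 4.0]),
}

def recognize(descriptor):
    # Assign the label of the closest prototype (minimum-distance rule).
    return min(prototypes,
               key=lambda label: np.linalg.norm(descriptor - prototypes[label]))

print(recognize(np.array([480.0, 2.2])))   # -> 'vehicle'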
General Purpose Image Processing
System
Specialized image processing hardware usually
consists of the digitizer, plus hardware that performs
other primitive operations, such as an arithmetic logic
unit (ALU), that performs arithmetic and logical
operations in parallel on entire images.
This type of hardware sometimes is called a front-end
subsystem, and its most distinguishing characteristic is
speed.
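As a software analogy only (not the dedicated front-end hardware the slides describe), vectorized array libraries likewise apply arithmetic and logical operations to entire images at once:

import numpy as np

a = np.random.randint(0, 256, size=(512, 512), dtype=np.uint8)
b = np.random.randint(0, 256, size=(512, 512), dtype=np.uint8)

# Arithmetic on whole images: average two frames
# (a 16-bit intermediate avoids uint8 overflow).
avg = ((a.astype(np.uint16) + b) // 2).astype(np.uint8)

# Logical operation on whole images: AND of two binary masks.
mask = (a > 128) & (b > 128)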
The Computer in an image processing system is a
general-purpose computer and can range from a PC to a
supercomputer.
In dedicated applications, sometimes custom computers are used to achieve a required level of performance, but our interest here is in general-purpose image processing systems.
In these systems, almost any well-equipped PC-type machine is suitable for off-line image processing tasks.
Software for image processing consists of specialized
modules that perform specific tasks.
More sophisticated software packages allow the integration of those modules and general-purpose software commands from at least one computer language.
Mass storage capability is a must in image processing
applications.
An image of size 1024 * 1024 pixels, in which the
intensity of each pixel is an 8-bit quantity, requires one
megabyte of storage space if the image is not compressed.
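The figure follows directly from the image dimensions; a one-line check in Python:

M, N, bytes_per_pixel = 1024, 1024, 1        # 8-bit intensity = 1 byte per pixel
size_bytes = M * N * bytes_per_pixel
print(size_bytes)                            # 1048576 bytes = 1024 KB = 1 MB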
Digital storage for image processing applications falls
into three principal categories:
Short-term storage for use during processing,
On-line storage for relatively fast recall, and
Archival storage, characterized by infrequent access.
Storage is measured in bytes, Kbytes, Mbytes, and Gbytes.
Image displays in use today are mainly color (preferably
flat screen) TV monitors.
Monitors are driven by the outputs of image and
graphics display cards that are an integral part of the
computer system.
In some cases, it is necessary to have stereo displays, and
these are implemented in the form of headgear
containing two small displays embedded in goggles worn
by the user
Hardcopy devices for recording images include laser printers, film cameras, heat-sensitive devices, inkjet units, and digital units, such as optical and CD-ROM disks.
Networking is almost a default function in any computer system in use today.
Because of the large amount of data inherent in image processing applications, the key consideration in image transmission is bandwidth.
In dedicated networks, this typically is not a problem, but communications with remote sites via the Internet are not always as efficient.
Image Representation
A digital image is composed of M rows and N columns of pixels, each storing a value.
Pixel values are most often grey levels in the range 0 to 255 (black to white).
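A closing sketch of this representation in NumPy: an M x N array of 8-bit pixels whose values span 0 (black) to 255 (white); the dimensions below are arbitrary:

import numpy as np

M, N = 480, 640                                    # M rows, N columns
img = np.random.randint(0, 256, size=(M, N), dtype=np.uint8)

print(img.shape)            # (480, 640): M rows of N pixels each
print(img.dtype)            # uint8: gray levels 0 (black) .. 255 (white)
print(img.min(), img.max()) # values fall within 0..255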
Thank you