EDGE DETECTION
Presented by: Vaddi Manikanta
B212053
ETC
INTRODUCTION
Edges are significant local changes of intensity in an image.
Edge detection is the process of identifying and locating sharp discontinuities in an image.
Abrupt changes in pixel intensity characterize the boundary of an object; edges usually occur on the boundary between two regions.
[Figures: the Tulips image and its edge map; a part of the image, the edges of that part, and the intensity matrix generated from that part.]
CAUSES OF INTENSITY CHANGE
Geometric events: discontinuities in depth and in surface colour and texture.
Non-geometric events: reflection of light, illumination, shadows.
[Figure: edge formation due to discontinuities in surface reflectance, illumination, and shadows.]
APPLICATIONS
Enhancement of noisy images such as satellite images, x-rays, and medical images like CT scans.
Text detection.
Traffic management.
Mapping of roads.
Video surveillance.
DIFFERENT TYPES OF EDGES OR
INTENSITY CHANGES
Step edge: the image intensity abruptly changes from one value on one side of the discontinuity to a different value on the opposite side.
Ramp edge: a step edge where the intensity change is not instantaneous but occurs over a finite distance.
Ridge edge: the image intensity abruptly changes value but then returns to the starting value within some short distance (usually generated by lines).
Roof edge: a ridge edge where the intensity change is not instantaneous but occurs over a finite distance (usually generated by the intersection of two surfaces).
MAIN STEPS IN EDGE DETECTION
Smoothing: Suppress as much noise as possible,
without destroying true edges.
Enhancement: Apply differentiation to enhance the
quality of edges (i.e., sharpening).
Thresholding: Determine which edge pixels should be
discarded as noise and which should be retained (i.e.,
threshold edge magnitude).
Localization: Determine the exact edge location. Edge
thinning and linking are usually required in this step.
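A minimal, self-contained 1-D sketch of the smoothing, enhancement and thresholding steps (the signal values and the threshold of 10 below are made up for illustration; localization would then reduce neighbouring candidates to a single edge position):

#include <stdio.h>
#include <math.h>

/* 1-D illustration of the edge-detection pipeline:
   smoothing, derivative (enhancement), thresholding. */
int main(void)
{
    /* Hypothetical noisy 1-D signal with a step around index 5. */
    double f[12]      = { 10, 12, 9, 11, 10, 10, 60, 62, 59, 61, 60, 60 };
    double smooth[12] = { 0 };
    double deriv[12]  = { 0 };
    int i;

    /* Smoothing: 3-tap average suppresses noise without destroying the step. */
    for (i = 1; i < 11; i++)
        smooth[i] = (f[i - 1] + f[i] + f[i + 1]) / 3.0;

    /* Enhancement: central difference approximates the first derivative. */
    for (i = 2; i < 10; i++)
        deriv[i] = (smooth[i + 1] - smooth[i - 1]) / 2.0;

    /* Thresholding: keep only strong responses. Indices 5 and 6 both pass,
       which is exactly why the localization/thinning step is needed. */
    for (i = 2; i < 10; i++)
        if (fabs(deriv[i]) > 10.0)
            printf("edge candidate at index %d (response %.1f)\n", i, deriv[i]);

    return 0;
}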
EDGE DETECTION USING
DERIVATIVE (GRADIENT)
The first derivative of an image f(x, y) can be computed using the gradient:
∇f = [ ∂f/∂x, ∂f/∂y ]  (or ∇f = (fx, fy))
GRADIENT REPRESENTATION
The gradient is a vector that has magnitude and direction:
∇f = [ ∂f/∂x, ∂f/∂y ]
Magnitude: |∇f| = sqrt( (∂f/∂x)² + (∂f/∂y)² ), which indicates edge strength.
Direction: θ = tan⁻¹( (∂f/∂y) / (∂f/∂x) ), which indicates edge direction.
Approximation: |∇f| ≈ |∂f/∂x| + |∂f/∂y|.
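A small sketch of these quantities at a single pixel (the sample values of fx and fy are made up):

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Hypothetical partial derivatives at one pixel. */
    double fx = 30.0;   /* df/dx */
    double fy = 40.0;   /* df/dy */

    double magnitude = sqrt(fx * fx + fy * fy);   /* edge strength (50 here) */
    double direction = atan2(fy, fx);             /* edge direction, in radians */
    double approx    = fabs(fx) + fabs(fy);       /* cheap approximation (70 here) */

    printf("magnitude = %.1f, direction = %.2f rad, approx = %.1f\n",
           magnitude, direction, approx);
    return 0;
}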
EDGE DETECTION STEPS USING
GRADIENT
Smooth the image, compute the partial derivatives fx and fy at each pixel, form the gradient magnitude, and threshold the magnitude to keep only strong edge pixels. In practice the magnitude is often approximated by |fx| + |fy| (i.e., sqrt is costly!).
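When the magnitude is needed only for thresholding, the square root can be skipped entirely by comparing the squared magnitude against the squared threshold; a minimal sketch (the threshold value is arbitrary):

#include <stdio.h>

/* Edge test without sqrt: compare the squared gradient magnitude
   against the squared threshold. */
static int is_edge(double fx, double fy, double threshold)
{
    return fx * fx + fy * fy > threshold * threshold;
}

int main(void)
{
    printf("%d\n", is_edge(30.0, 40.0, 45.0));  /* 1: magnitude 50 > 45  */
    printf("%d\n", is_edge(10.0, 10.0, 45.0));  /* 0: magnitude ~14 < 45 */
    return 0;
}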
GENERAL APPROXIMATION
Consider the 3 x 3 arrangement of pixels about the pixel [i, j].
The partial derivatives ∂f/∂x and ∂f/∂y can be computed by convolving this neighborhood with the masks
Mx = [ -1  0  1 ;  -c  0  c ;  -1  0  1 ]
My = [  1  c  1 ;   0  0  0 ;  -1  -c  -1 ]
The constant c reflects the emphasis given to pixels closer to the centre of the mask.
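A small sketch that builds these masks for a given c, so that c = 1 reproduces the Prewitt masks and c = 2 the Sobel masks shown next:

#include <stdio.h>

/* Fill the 3x3 x- and y-direction masks for a given emphasis constant c. */
static void build_masks(int c, int mx[3][3], int my[3][3])
{
    /* x-direction mask */
    mx[0][0] = -1; mx[0][1] = 0; mx[0][2] = 1;
    mx[1][0] = -c; mx[1][1] = 0; mx[1][2] = c;
    mx[2][0] = -1; mx[2][1] = 0; mx[2][2] = 1;
    /* y-direction mask */
    my[0][0] =  1; my[0][1] =  c; my[0][2] =  1;
    my[1][0] =  0; my[1][1] =  0; my[1][2] =  0;
    my[2][0] = -1; my[2][1] = -c; my[2][2] = -1;
}

int main(void)
{
    int mx[3][3], my[3][3];
    build_masks(2, mx, my);   /* c = 2 gives the Sobel masks */
    printf("centre row of mx: %d %d %d\n", mx[1][0], mx[1][1], mx[1][2]);
    return 0;
}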
SOBEL OPERATOR
3×3 convolution masks.
Setting c = 2, we get the Sobel operator:
Gx = [ -1  0  1 ;  -2  0  2 ;  -1  0  1 ]
Gy = [  1  2  1 ;   0  0  0 ;  -1  -2  -1 ]
EDGE DETECTOR USING SOBEL
OPERATOR IN C LANGUAGE
#include <stdio.h>
#include <stdlib.h>
#include <float.h>
#include "mypgm.h"

void sobel_filtering(void)
/* Spatial filtering of image data */
/* Sobel filter (horizontal differentiation) */
/* Input: image1[y][x] ---- Output: image2[y][x] */
{
  /* Definition of the Sobel filter in the horizontal direction */
  int weight[3][3] = { { -1, 0, 1 },
                       { -2, 0, 2 },
                       { -1, 0, 1 } };
  double pixel_value;
  double min, max;
  int x, y, i, j;  /* Loop variables */

  /* Minimum and maximum value calculation after filtering */
  printf("Now, filtering of input image is performed\n\n");
  min = DBL_MAX;
  max = -DBL_MAX;
  for (y = 1; y < y_size1 - 1; y++) {
    for (x = 1; x < x_size1 - 1; x++) {
      pixel_value = 0.0;
      for (j = -1; j <= 1; j++) {
        for (i = -1; i <= 1; i++) {
          pixel_value += weight[j + 1][i + 1] * image1[y + j][x + i];
        }
      }
      if (pixel_value < min) min = pixel_value;
      if (pixel_value > max) max = pixel_value;
    }
  }
  if ((int)(max - min) == 0) {
    printf("Nothing exists!!!\n\n");
    exit(1);
  }

  /* Initialization of image2[y][x] */
  x_size2 = x_size1;
  y_size2 = y_size1;
  for (y = 0; y < y_size2; y++) {
    for (x = 0; x < x_size2; x++) {
      image2[y][x] = 0;
    }
  }

  /* Generation of image2 after linear transformation
     of the filtered values into [0, MAX_BRIGHTNESS] */
  for (y = 1; y < y_size1 - 1; y++) {
    for (x = 1; x < x_size1 - 1; x++) {
      pixel_value = 0.0;
      for (j = -1; j <= 1; j++) {
        for (i = -1; i <= 1; i++) {
          pixel_value += weight[j + 1][i + 1] * image1[y + j][x + i];
        }
      }
      pixel_value = MAX_BRIGHTNESS * (pixel_value - min) / (max - min);
      image2[y][x] = (unsigned char)pixel_value;
    }
  }
}

int main(void)
{
  load_image_data();    /* Input of image1 */
  sobel_filtering();    /* Sobel filter is applied to image1 */
  save_image_data();    /* Output of image2 */
  return 0;
}
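The routine above applies only the horizontal mask. A hedged sketch of how both directions could be combined into a gradient-magnitude image, assuming the same image1/image2 buffers, size globals and MAX_BRIGHTNESS from mypgm.h (the function name sobel_magnitude and the clamping choice are mine):

#include <math.h>
#include "mypgm.h"

/* Sketch: combine horizontal and vertical Sobel responses into a
   gradient magnitude (assumes the same globals as sobel_filtering). */
void sobel_magnitude(void)
{
  int gx[3][3] = { { -1, 0, 1 }, { -2, 0, 2 }, { -1,  0,  1 } };
  int gy[3][3] = { {  1, 2, 1 }, {  0, 0, 0 }, { -1, -2, -1 } };
  double sx, sy, mag;
  int x, y, i, j;

  x_size2 = x_size1;
  y_size2 = y_size1;
  for (y = 1; y < y_size1 - 1; y++) {
    for (x = 1; x < x_size1 - 1; x++) {
      sx = 0.0;
      sy = 0.0;
      for (j = -1; j <= 1; j++) {
        for (i = -1; i <= 1; i++) {
          sx += gx[j + 1][i + 1] * image1[y + j][x + i];
          sy += gy[j + 1][i + 1] * image1[y + j][x + i];
        }
      }
      mag = sqrt(sx * sx + sy * sy);
      if (mag > MAX_BRIGHTNESS) mag = MAX_BRIGHTNESS;  /* clamp instead of rescaling */
      image2[y][x] = (unsigned char)mag;
    }
  }
}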
SOBEL EDGE DETECTOR
PREWITT OPERATOR
3×3 convolution masks.
Setting c = 1, we get the Prewitt operator:
Gx = [ -1  0  1 ;  -1  0  1 ;  -1  0  1 ]
Gy = [  1  1  1 ;   0  0  0 ;  -1  -1  -1 ]
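The Prewitt masks can be dropped straight into the filtering routine above; only the weight array changes (a sketch, assuming the rest of sobel_filtering stays as written):

/* Prewitt masks (c = 1), as drop-in replacements for the Sobel
   weight[3][3] array used in sobel_filtering() above. */
int prewitt_x[3][3] = { { -1, 0, 1 },
                        { -1, 0, 1 },
                        { -1, 0, 1 } };
int prewitt_y[3][3] = { {  1,  1,  1 },
                        {  0,  0,  0 },
                        { -1, -1, -1 } };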
PREWITT EDGE DETECTOR
GENERAL EXAMPLE
Starting from an input image I, compute the horizontal derivative ∂I/∂x and the vertical derivative ∂I/∂y, then form the gradient magnitude
|∇I| = sqrt( (∂I/∂x)² + (∂I/∂y)² ),
and finally threshold the magnitude (here, Threshold = 100) to obtain the edge map.
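The final thresholding step of this example could look like the following sketch (the mag array, its layout and the function name are hypothetical; only the threshold of 100 comes from the example):

/* Sketch: binarize a gradient-magnitude image with the example's
   threshold of 100. mag, width and height are hypothetical. */
void threshold_edges(const double *mag, unsigned char *edges,
                     int width, int height)
{
    int k;
    for (k = 0; k < width * height; k++)
        edges[k] = (mag[k] > 100.0) ? 255 : 0;   /* 255 marks an edge pixel */
}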
PRACTICAL ISSUES
Choice of threshold. [Figure: the gradient-magnitude image thresholded with a low threshold versus a high threshold.]
Edge thinning and linking.
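The slide does not name a thinning method; one common choice is non-maximum suppression along the gradient direction, sketched very roughly below for a single pixel with the direction coarsely split into horizontal and vertical cases (the array names and WIDTH are hypothetical):

#include <math.h>

#define WIDTH 512   /* hypothetical image width */

/* Rough non-maximum suppression at pixel (x, y): keep the pixel only if
   its gradient magnitude is a local maximum along the (coarsely
   quantized) gradient direction. */
static int is_local_maximum(double mag[][WIDTH], double fx[][WIDTH],
                            double fy[][WIDTH], int x, int y)
{
    double m = mag[y][x];
    if (fabs(fx[y][x]) >= fabs(fy[y][x]))        /* gradient mostly horizontal */
        return m >= mag[y][x - 1] && m >= mag[y][x + 1];
    else                                         /* gradient mostly vertical   */
        return m >= mag[y - 1][x] && m >= mag[y + 1][x];
}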
CONCLUSION
Reduces unnecessary information in the image while preserving its structure.
Extracts important features of an image such as corners, lines and curves.
Supports object recognition, boundary detection and segmentation.
Edge detection is a core part of computer vision and recognition.
The Sobel and Prewitt operators are similar.
REFERENCES
Machine Vision – Ramesh Jain, Rangachar Kasturi, Brian
G Schunck, McGraw-Hill, 1995
A Computational Approach to Edge Detection – John Canny, IEEE Transactions on Pattern Analysis and Machine Intelligence, 1986
CS485/685 Computer Vision – Dr. George Bebis
THANK YOU!