Edge Detection: Digital Image Processing
Abstract:
Edge detection is an important operation in computer vision and image processing. In this paper, we discuss several digital image processing techniques for edge feature extraction. First, we define key terms such as image, digital image, edge and image noise, leading to methods of image de-noising and edge detection. Edge detection includes operators such as Sobel, Prewitt and Roberts. Second, a comparative study shows that the Sobel operator gives the best results. Finally, edge extraction using an edge histogram is considered. The edge extraction method proposed in this paper is feasible.
Index terms: digital image, edge detection, operators, edge histogram.
Introduction:
An edge is a set of pixels whose grey levels exhibit step changes or roof-top changes; it lies between object and background, between object and object, between region and region, and between element and element. An edge always occurs between two neighbouring areas with different grey levels and is the result of the grey level being discontinuous. Edge detection is therefore a method of image segmentation based on grey-level discontinuity. Image edge detection is one of the fundamental topics in image processing and analysis, and it is also a problem that has not yet been completely solved. When an image is acquired, factors such as projection, mixing, aberration and noise are introduced. These factors blur and distort image features, so extracting image features becomes very difficult, and for the same reasons edge detection is also difficult. Methods for detecting and extracting image edges and outline characteristics have consequently been a research focus in the domain of image processing and analysis techniques.
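Because the paper describes an edge as a step change in grey level, a short sketch can make this concrete. The snippet below is an illustration assumed by the editor, not code from the paper: it applies the Sobel, Prewitt and Roberts kernels named in the abstract to a grey-level image and returns the gradient magnitude, which is large exactly where the grey level steps from one region to another.

```python
# Illustrative sketch (not the paper's code): gradient-magnitude edge
# response using the Sobel, Prewitt and Roberts kernels.
import numpy as np
from scipy import ndimage

# Each pair of kernels approximates the grey-level change along x and y.
KERNELS = {
    "sobel":   (np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]),
                np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]])),
    "prewitt": (np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]),
                np.array([[-1, -1, -1], [0, 0, 0], [1, 1, 1]])),
    "roberts": (np.array([[1, 0], [0, -1]]),
                np.array([[0, 1], [-1, 0]])),
}

def gradient_magnitude(image, operator="sobel"):
    """Return the gradient-magnitude map of a grey-level image."""
    kx, ky = KERNELS[operator]
    img = image.astype(float)
    gx = ndimage.convolve(img, kx)  # response to horizontal grey-level change
    gy = ndimage.convolve(img, ky)  # response to vertical grey-level change
    return np.hypot(gx, gy)         # large at step changes, small in flat regions

# Synthetic example: a step edge between two constant-grey regions.
step = np.zeros((8, 8))
step[:, 4:] = 255.0
print(gradient_magnitude(step, "sobel")[4])  # peaks at the step columns
```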
Edge feature extraction has been widely applied in many areas. This paper mainly discusses the advantages and disadvantages of several edge detection operators. In order to obtain a more legible image outline, the acquired image is first filtered and de-noised, and then the edge detection operators are applied.
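As a hedged sketch of that filter-then-detect pipeline (the median filter, the Sobel operator and the threshold value are assumptions chosen for illustration, not values prescribed by the paper), de-noising followed by thresholding the gradient magnitude might look like this:

```python
# Illustrative de-noise-then-detect pipeline; filter choice and threshold
# are assumptions, not values taken from the paper.
import numpy as np
from scipy import ndimage

def detect_edges(image, threshold=100.0):
    """Median-filter a grey-level image, then threshold its Sobel magnitude."""
    denoised = ndimage.median_filter(image.astype(float), size=3)
    gx = ndimage.sobel(denoised, axis=1)  # horizontal gradient
    gy = ndimage.sobel(denoised, axis=0)  # vertical gradient
    return np.hypot(gx, gy) > threshold   # binary edge map

# Usage on noisy synthetic data standing in for an acquired image.
rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[:, 32:] = 200.0                       # two regions separated by an edge
img += rng.normal(0.0, 10.0, img.shape)   # acquisition noise
edges = detect_edges(img)
print(int(edges.sum()), "pixels marked as edge")
```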



