JP4213209B2 - Machine vision method and apparatus for edge-based image histogram analysis - Google Patents


Info

Publication number
JP4213209B2
JP4213209B2 JP52461897A JP52461897A JP4213209B2 JP 4213209 B2 JP4213209 B2 JP 4213209B2 JP 52461897 A JP52461897 A JP 52461897A JP 52461897 A JP52461897 A JP 52461897A JP 4213209 B2 JP4213209 B2 JP 4213209B2
Authority
JP
Japan
Prior art keywords
mask
image
value
edge
pixels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
JP52461897A
Other languages
Japanese (ja)
Other versions
JP2000503145A5 (en)
JP2000503145A (en)
Inventor
Ohashi, Yoshikazu
Weinzimmer, Russ
Original Assignee
Cognex Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US08/581,975 priority Critical
Priority to US08/581,975 priority patent/US5845007A/en
Application filed by Cognex Corporation
Priority to PCT/US1996/020900 priority patent/WO1997024693A1/en
Publication of JP2000503145A publication Critical patent/JP2000503145A/en
Publication of JP2000503145A5 publication Critical patent/JP2000503145A5/ja
Application granted granted Critical
Publication of JP4213209B2 publication Critical patent/JP4213209B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation

Description

Copyright protection
A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to facsimile reproduction of the patent document or the patent disclosure as it appears in the US Patent and Trademark Office patent file or records, but otherwise reserves all rights under the Copyright Act.
Background of the Invention
The present invention relates to machine vision (also known as computer vision), and more particularly to methods and apparatus for image analysis using histograms.
In automated manufacturing, it is often important to determine the position, shape, size, and angular orientation of objects being processed or assembled. For example, in automated circuit assembly the exact location of a printed circuit board must be determined before its conductive leads can be soldered.
Many automated production systems rely on the machine vision technique of image segmentation to find the position and orientation of objects. Image segmentation identifies the outline of an object in an image relative to other features, such as the object's background.
One conventional approach to image segmentation is intensity thresholding based on a histogram of the pixel intensity values in the image. The histogram records the number of pixels at each possible intensity value, so prominent intensities in the image, such as those of the object and of the background, appear as peaks. From those peaks, the intermediate intensity of the boundary, or edge, separating the object from the background is estimated.
For example, in the histogram of a back-lit object, a peak appears at the dark end of the intensity spectrum. This peak represents the object, more specifically the portion of the image where the object blocks the back light from reaching the camera. The histogram also has a peak at the bright end of the intensity spectrum, representing the background, that is, the portion of the image where the back light is not blocked. The image intensity at the edge of the object's contour is usually estimated to lie approximately halfway between the dark and bright peaks in the histogram.
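For illustration only, the conventional bimodal approach just described might be sketched as follows in C (the language used for the preferred embodiment later in this document). The 8-bit grayscale format and the half-range split between the dark and bright peaks are assumptions, and this background sketch is not the method of the present invention.

```c
/* Background illustration only (not the method of the present invention):
 * a minimal sketch of the conventional bimodal approach, assuming an
 * 8-bit grayscale image and taking the two most populated bins in the
 * lower and upper halves of the range as the dark and bright peaks. */
#include <stddef.h>

int conventional_threshold(const unsigned char *img, size_t n_pixels)
{
    unsigned long hist[256] = {0};
    for (size_t i = 0; i < n_pixels; i++)
        hist[img[i]]++;

    int dark_peak = 0;                    /* dark end of the spectrum */
    for (int v = 1; v < 128; v++)
        if (hist[v] > hist[dark_peak]) dark_peak = v;

    int bright_peak = 128;                /* bright end of the spectrum */
    for (int v = 129; v < 256; v++)
        if (hist[v] > hist[bright_peak]) bright_peak = v;

    /* The edge intensity is estimated roughly halfway between the peaks. */
    return (dark_peak + bright_peak) / 2;
}
```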
One problem with conventional image segmentation techniques is that their effectiveness is mainly limited to analyzing images of back-lit objects and other "bimodal" images, that is, images dominated by two intensity values. When an object is illuminated from the front, for example when the edges of a surface feature such as a solder pad on a circuit board must be identified, the resulting image is not bimodal, but rather exhibits a potentially complex pattern of many image intensities. It is then difficult to estimate the image intensity along the edges of the object of interest from the many peaks in the histogram, and hence difficult to perform image segmentation.
In the prior art, edge-related image features of the type produced by the image segmentation techniques described above, such as image segmentation thresholds, are used by other machine vision tools to determine the position or orientation of an object. Two such tools are the contour angle finder and the "blob" analyzer. The contour angle finder uses an image segmentation threshold to track the edge of an object and determine its angular orientation. The blob analyzer likewise uses the image segmentation threshold to determine the position and orientation of the object. Both tools are described, for example, in US Pat. No. 5,371,060, assigned to Cognex Corporation, the assignee of the present invention.
Although contour angle finding tools and blob analysis tools of the type sold by the assignee of the present invention have proven quite effective at determining the angular orientation of objects, there remains a goal of providing additional tools that facilitate such determinations.
It is an object of the present invention to provide improved methods and apparatus for machine vision analysis, and particularly improved methods and apparatus for determining the characteristics of edges and edge regions in an image.
It is a further object of the present invention to provide improved methods and apparatus for determining edge characteristics such as salient edge intensity and salient edge orientation.
Yet another object is to provide such methods and apparatus that can determine edge characteristics from a wide variety of images, regardless of whether the object shown in the image is illuminated from the front or from behind.
Another object of the present invention is to provide such methods and apparatus that can readily be applied to automated vision applications, such as automated inspection and assembly operations, and to other machine vision applications concerned with object position.
A further object of the present invention is to provide such methods and apparatus that can execute quickly on a variety of machine vision analysis equipment without consuming excessive resources.
Yet another object of the present invention is to provide an article of manufacture embodied in program code on a computer readable medium for performing such an improved method.
Summary of the Invention
The above objective is accomplished by the present invention which provides an apparatus and method for edge-based image histogram analysis.
In one aspect, the present invention provides an apparatus for identifying salient edge characteristics of an input image composed of a plurality of image values (i.e., "pixels") whose numerical values represent intensities (e.g., color or brightness). The apparatus includes an edge detector that generates a plurality of edge magnitude values from the input image values. The edge detector can be a Sobel operator, which generates edge magnitudes by taking the derivative of the input image values, that is, the rate of intensity change between image pixels.
A mask generator creates a mask based on the values output by the edge detector. For example, the mask generator can generate an array of mask values (e.g., 0's) and non-mask values (e.g., 1's), each depending on the corresponding edge magnitude value: if an edge magnitude exceeds a threshold, the corresponding array element is 1, and otherwise it is 0.
In the aspect of the invention for determining salient edge intensity, that is, an image segmentation threshold, the mask is used to mask out input image values that do not lie in regions of sufficiently high edge magnitude. In the aspect of the invention for determining salient edge direction, the mask is used to mask out edge direction values that do not lie in such regions.
A mask applicator applies the pixel mask array to a selected image to produce a masked image. The selected image is the input image or an image generated from it, for example an edge direction image of the type produced by applying a Sobel operator to the input image.
A histogram generator generates a histogram of the pixel values in the masked image, e.g., a count, at each intensity value or edge direction, of the image pixels that passed through the mask, that is, pixels corresponding to non-mask values in the pixel mask array. A peak detector identifies histogram peaks that, in the image-segmentation-threshold aspect of the invention, indicate salient image intensity values associated with the edges found by the edge detector, and, in the object-orientation aspect of the invention, indicate salient edge directions associated with those edges.
Further aspects of the invention provide methods for image segmentation threshold identification and for object orientation determination paralleling the operation of the apparatus described above.
Yet another aspect of the present invention is an article of manufacture comprising a computer-usable medium embodying program code for causing a digital data processor to perform the edge-based image histogram analysis methods described above.
The present invention has wide application in industry and research. Aspects of the invention that determine a salient edge intensity threshold provide an image segmentation threshold for use in other machine vision tasks, such as contour angle finding and blob analysis, that determine object position and orientation. Aspects of the invention that determine salient edge direction are likewise broadly applicable, for example by facilitating the determination of object orientation without requiring further machine vision operations.
These and other aspects of the invention are clearly shown in the following figures and description.
[Brief description of the drawings]
By referring to the figures, a more complete understanding of the present invention can be achieved.
FIG. 1 is a diagram illustrating a digital data processor including an apparatus for edge-based image histogram analysis according to the present invention.
FIG. 2A is a diagram illustrating the steps in the method for edge-based image histogram analysis according to the present invention for use in identifying image segmentation thresholds.
FIG. 2B is a diagram illustrating the steps in the method for edge-based image histogram analysis according to the present invention intended for use in determining the prominent edge direction in an image.
FIGS. 3A-3F are diagrams illustrating, in one-dimensional form (e.g., as scan lines), a series of images generated by the method and apparatus according to the present invention.
FIGS. 4A-4F are diagrams illustrating, in two-dimensional form, a series of images generated by the method and apparatus according to the present invention.
FIG. 5 is a diagram illustrating pixel values of a sample input image processed by the method and apparatus according to the present invention.
FIGS. 6-8 are diagrams showing the pixel values of the intermediate images generated by application of the Sobel operator, and the pixel values of the edge intensity image, in the operation of the method and apparatus according to the present invention.
FIG. 9 is a diagram showing pixel values of a sharpened edge intensity image generated by the method and apparatus according to the present invention.
FIG. 10 shows a pixel mask array generated by the method and apparatus according to the present invention.
FIG. 11 is a diagram illustrating pixel values of a masked input image generated by the method and apparatus according to the present invention.
FIG. 12 shows a histogram created from the masked image values of FIG. 11.
Detailed Description of Illustrated Embodiment
FIG. 1 shows a system for edge-based image histogram analysis. As shown, an image capture device 10, such as a conventional video camera or scanner, generates an image of a scene containing an object 1. Digital image data (pixels) generated by the image capture device 10 represent, in a conventional manner, the image intensity (e.g., color or brightness) of each point in the scene at the resolution of the capture device.
The digital image data is transferred from the image capture device 10 to the image analysis system 12 over a communication path 11. The image analysis system may be a conventional digital data processor, or a vision processing system of the type sold by Cognex Corporation, the assignee of the present invention, programmed in accordance with the teachings herein to perform edge-based image histogram analysis. The image analysis system 12 may comprise one or more central processing units 13, a main memory 14, an input/output system 15, and a disk drive (or other static mass storage device) 16, all of conventional type.
The system 12, and particularly the central processing unit 13, is configured by programming instructions according to the teachings herein to operate as an edge detector 2, a mask generator 3, a mask applicator 4, a histogram generator 5, and a peak detector 6, which are described in detail below. Those skilled in the art will appreciate that the methods and apparatus taught herein can be implemented in special-purpose hardware as well as on a programmable digital data processor.
FIG. 2A illustrates the steps of an edge-based image histogram analysis method according to the present invention for use in identifying an image segmentation threshold, that is, the salient intensity of edges in an image. To better explain the processing performed in each step, the graphical and numerical examples of FIGS. 3-12 are referred to throughout the description that follows. FIGS. 3A-3F and 4A-4F illustrate a series of images generated by the method of FIG. 2A; the depictions in FIGS. 4A-4F are two-dimensional, while those in FIGS. 3A-3F are one-dimensional, as along a scan line. FIGS. 5-12 illustrate, in numerical form, the pixel values of a similar series of images and of the intermediate arrays generated by the method of FIG. 2A.
Although the examples are simple, those skilled in the art will appreciate that the teachings herein can be applied to determine the salient edge characteristics of a wide variety of images, including images far more complex than these examples. Likewise, while the images and intermediate arrays of FIGS. 5-11 are small, the teachings apply to the processing of images of any size.
Referring to FIG. 2A, a scene containing the object of interest is captured by the image capture device 10 (FIG. 1). As described above, a digital image representing the scene is generated by the image capture device 10 in a conventional manner, with pixels indicating the image intensity (e.g., color or brightness) of each point in the scene at the resolution of the device. The captured image is referred to herein as the "input" or "original" image.
For simplicity, in the example embodiment shown in the figures and described below, the input image is assumed to show a dark rectangle against a white background. This is illustrated graphically in FIGS. 3A and 4A, and numerically in FIG. 5, where the dark rectangle is represented by a grid of pixels of image intensity 10 against a background of image intensity 0.
In step 22, the illustrated method operates as an edge detector, generating an edge intensity image whose pixels correspond to the input image pixels and reflect the rate of change (or derivative) of the image intensity represented by those pixels. In the illustrated example, such edge intensities are shown in FIG. 3B (graphically, one-dimensional), FIG. 4B (graphically, two-dimensional), and FIG. 8 (numerically, as pixel values).
In one preferred embodiment, edge detector step 22 uses a Sobel operator to determine the magnitude of these rates of change, and more particularly the rates of change along the X-axis and Y-axis directions of the input image. The use of Sobel operators to determine the rate of change of image intensity is known in the art, as described, for example, in Sobel, "Camera Models and Machine Perception", doctoral dissertation, Stanford University, 1970, and in Prewitt, "Object Enhancement and Extraction", in B. Lipkin and A. Rosenfeld (eds.), "Picture Processing and Psychopictorics", Academic Press, 1970, the teachings of which are incorporated herein by reference. A further preferred technique for applying the Sobel operator is described below and is implemented in the machine vision tool sold by the assignee of the present invention under the trade name CIP_SOBEL.
The pixel change rate of the input image along the X axis is preferably determined by convolving the image with the following matrix.
FIG. 6 illustrates an intermediate image resulting from the convolution of the input image of FIG. 5 using the matrix described above.
The pixel change rate of the input image along the Y axis is preferably determined by convolving the image with the following matrix.
FIG. 7 illustrates an intermediate image resulting from convolution of the input image of FIG. 5 using the matrix described above.
Each pixel of the edge intensity image of FIG. 8 is determined as the square root of the sum of the squares of the corresponding pixels of the intermediate images of FIGS. 6 and 7. Thus, for example, the intensity indicated by the pixel in the first row, first column of the edge intensity image of FIG. 8 is the square root of the sum of (1) the square of the value in the first row, first column of the intermediate image of FIG. 6, and (2) the square of the value in the first row, first column of the intermediate image of FIG. 7.
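To make the computation concrete, the following C sketch combines the two convolutions and the root-sum-of-squares combination described above. The 3x3 kernels shown are the standard Sobel kernels and are an assumption here, since the matrices referenced in the text are not reproduced; buffer allocation and border handling are also simplified.

```c
/* A minimal sketch, in C, of edge detection step 22: convolve the input
 * image with X and Y kernels and combine the results as the square root
 * of the sum of squares. The 3x3 kernels below are the standard Sobel
 * kernels (an assumption). The caller supplies a w*h output buffer;
 * border pixels are left unmodified for simplicity. */
#include <math.h>

void sobel_magnitude(const unsigned char *in, float *mag, int w, int h)
{
    static const int kx[3][3] = { {-1, 0, 1}, {-2, 0, 2}, {-1, 0, 1} };
    static const int ky[3][3] = { {-1,-2,-1}, { 0, 0, 0}, { 1, 2, 1} };

    for (int y = 1; y < h - 1; y++) {
        for (int x = 1; x < w - 1; x++) {
            int gx = 0, gy = 0;   /* X and Y intermediate-image values */
            for (int j = -1; j <= 1; j++)
                for (int i = -1; i <= 1; i++) {
                    int v = in[(y + j) * w + (x + i)];
                    gx += kx[j + 1][i + 1] * v;
                    gy += ky[j + 1][i + 1] * v;
                }
            /* Edge intensity: sqrt(gx^2 + gy^2), as described above. */
            mag[y * w + x] = sqrtf((float)(gx * gx + gy * gy));
        }
    }
}
```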
Returning to FIG. 2A, the means illustrated in step 23 acts as a peak sharpener, producing a sharpened edge intensity image, that is, a copy of the edge intensity image in which the peaks have been sharpened. In particular, peak sharpening step 23 removes from each peak in the edge intensity image all values other than the maximum edge intensity value. In the illustrated example, the resulting sharpened edge intensity image is shown in FIG. 3C (graphically, one-dimensional), FIG. 4C (graphically, two-dimensional), and FIG. 9 (numerically, as pixel values).
In the illustrated embodiment, the sharpened edge intensity image is generated by applying the cross-shaped neighborhood operator shown below to the edge intensity image. Only those edge intensity values that lie at the center of a neighborhood and are the largest in the entire neighborhood are assigned to the sharpened edge intensity image, so that every peak of edge intensity values is narrowed.
For a better understanding of the sharpening operation, refer to FIG. 9, which illustrates the sharpened edge intensity image generated from the edge intensity image of FIG. 8.
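A minimal sketch of the peak sharpening step follows. The cross-shaped neighborhood is assumed to be the center pixel plus its four 4-connected neighbors, since the operator figure referenced above is not reproduced here.

```c
/* A minimal sketch of peak sharpening step 23, assuming a cross-shaped
 * neighborhood consisting of the center pixel and its four 4-connected
 * neighbors. Only values that are maximal over that neighborhood are
 * copied into the sharpened image; all other pixels are set to zero. */
void sharpen_peaks(const float *mag, float *sharp, int w, int h)
{
    for (int i = 0; i < w * h; i++)
        sharp[i] = 0.0f;

    for (int y = 1; y < h - 1; y++) {
        for (int x = 1; x < w - 1; x++) {
            float c = mag[y * w + x];
            if (c >= mag[(y - 1) * w + x] && c >= mag[(y + 1) * w + x] &&
                c >= mag[y * w + (x - 1)] && c >= mag[y * w + (x + 1)])
                sharp[y * w + x] = c;   /* local maximum survives */
        }
    }
}
```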
In step 24 of FIG. 2A, the illustrated means acts as a mask generator, generating a pixel mask array from the sharpened edge intensity image. In one preferred embodiment, mask generator 24 generates an array of mask values (e.g., 0's) and non-mask values (e.g., 1's), each depending on the corresponding pixel value in the sharpened edge intensity image. Thus, where a pixel value of the sharpened edge intensity image exceeds a threshold (indicated by element 33a in FIG. 3C), the corresponding element of the mask array is assigned the non-mask value 1; otherwise it is assigned the mask value 0.
Those skilled in the art will appreciate that mask generation step 24 amounts to binarizing the sharpened edge intensity image. In the illustrated example, the resulting mask, or binarized sharpened edge intensity image, is shown in FIG. 3D, FIG. 4D, and, in numerical form, FIG. 10.
Furthermore, those skilled in the art will appreciate that the peak sharpening step 23 is optional and that this mask can be generated by directly binarizing this edge intensity image.
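The mask generation (binarization) step might be sketched as follows; the threshold, corresponding to element 33a of FIG. 3C, is assumed to be chosen by the caller.

```c
/* A minimal sketch of mask generation (binarization) step 24: elements of
 * the pixel mask array receive the non-mask value 1 where the sharpened
 * edge intensity exceeds the threshold, and the mask value 0 elsewhere. */
void make_pixel_mask(const float *sharp, unsigned char *mask,
                     int w, int h, float threshold)
{
    for (int i = 0; i < w * h; i++)
        mask[i] = (sharp[i] > threshold) ? 1 : 0;
}
```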
In step 25 of FIG. 2A, the illustrated means acts as a mask applicator, generating a masked image (FIG. 11) by applying the pixel mask array (FIG. 10) to the input image (FIG. 5) so that only those input image pixels (value 10) corresponding to non-mask values (value 1) in the pixel mask array are included in the masked image. It will be appreciated that, in the image segmentation embodiment of FIG. 2A, the pixel mask array has the effect of passing into the masked image only those input image pixels that lie in regions of sufficiently high edge intensity. In the illustrated example, the masked input image is shown in FIG. 3E (graphically, one-dimensional), FIG. 4E (graphically, two-dimensional), and FIG. 11 (numerically, as pixel values). FIG. 11 in particular makes clear that the masked image includes portions of both the dark rectangle (value 10) and the background (value 0) of the input image.
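A sketch of the mask application step follows; representing masked-out pixels as zero is a simplification made here for illustration, not a requirement of the method.

```c
/* A minimal sketch of mask application step 25: only pixels of the
 * selected image (the input image, in the FIG. 2A form) that correspond
 * to non-mask values pass into the masked image. Masked-out pixels are
 * represented as zero for simplicity. */
void apply_pixel_mask(const unsigned char *selected, const unsigned char *mask,
                      unsigned char *masked, int w, int h)
{
    for (int i = 0; i < w * h; i++)
        masked[i] = mask[i] ? selected[i] : 0;
}
```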
In step 26 of FIG. 2A, the illustrated means acts as a histogram generator, generating a histogram of the values in the masked image, that is, a count, at each possible intensity value, of the non-zero pixels that passed from the input image through the pixel mask array. In the illustrated example, such histograms are shown in FIGS. 3F, 4F and 12.
In step 27 of FIG. 2A, the illustrated means acts as a peak detector, producing an output signal indicative of the peaks in the histogram. As those skilled in the art will appreciate, these peaks indicate the salient intensities of the edges in the masked input image. This information may be used with other machine vision tools, such as contour angle finders and blob analyzers, to determine the position and orientation of the object.
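The histogram generation and peak detection steps of the FIG. 2A form might be combined as in the following sketch. Consulting the mask directly (rather than relying on zeroed-out pixels) and reporting only the single most populated bin are simplifications of the peak detector described above, which may report several peaks.

```c
/* A minimal sketch of histogram generation step 26 and peak detection
 * step 27 for the FIG. 2A form, assuming an 8-bit input image. Only
 * pixels that passed through the mask are counted, and the single most
 * populated bin is reported. */
int salient_intensity(const unsigned char *image, const unsigned char *mask,
                      int w, int h)
{
    unsigned long hist[256] = {0};

    for (int i = 0; i < w * h; i++)
        if (mask[i])                      /* only unmasked pixels count */
            hist[image[i]]++;

    int peak = 0;                         /* bin with the highest count */
    for (int v = 1; v < 256; v++)
        if (hist[v] > hist[peak])
            peak = v;
    return peak;                          /* salient edge intensity */
}
```

In use, the value returned would serve as the image segmentation threshold passed to a tool such as a contour angle finder or blob analyzer.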
A software listing of a preferred embodiment for identifying image segmentation thresholds according to the present invention is set forth below. The program is implemented in the C programming language on the UNIX operating system.
The method and apparatus for edge-based image histogram analysis according to the present invention can be used to determine various salient edge characteristics, including the salient image intensity (or image segmentation threshold) described above. In this regard, FIG. 2B illustrates the steps of a method for edge-based image histogram analysis according to the present invention intended for object orientation determination, in which one or more salient directions of the edges in an image are determined. Where indicated by like reference numbers, the steps of the method of FIG. 2B are similar to those of FIG. 2A.
As indicated by new reference numeral 28, the method of FIG. 2B includes an additional step of generating an edge direction image whose pixels correspond to the input image pixels and reflect the direction of the rate of change (or derivative) of the image intensity represented by those pixels. In a preferred embodiment, step 28 uses a Sobel operator (also called a Sobel filter) of the type described above to determine the direction of the rate of change. As noted above, the use of Sobel operators for this purpose is known in the art.
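A sketch of edge direction step 28 follows; it computes the same Sobel X and Y responses as in step 22 and takes the arctangent of their ratio as the direction of the rate of change. Expressing the result in degrees folded into the range [0, 180), so that opposite gradients map to the same edge direction, is an assumption of this illustration.

```c
/* A minimal sketch of edge direction step 28 of FIG. 2B, using the same
 * (assumed) Sobel kernels as before and reporting each pixel's direction
 * of intensity change in degrees, folded into [0, 180). */
#include <math.h>

void sobel_direction(const unsigned char *in, float *dir, int w, int h)
{
    static const int kx[3][3] = { {-1, 0, 1}, {-2, 0, 2}, {-1, 0, 1} };
    static const int ky[3][3] = { {-1,-2,-1}, { 0, 0, 0}, { 1, 2, 1} };

    for (int y = 1; y < h - 1; y++) {
        for (int x = 1; x < w - 1; x++) {
            int gx = 0, gy = 0;
            for (int j = -1; j <= 1; j++)
                for (int i = -1; i <= 1; i++) {
                    int v = in[(y + j) * w + (x + i)];
                    gx += kx[j + 1][i + 1] * v;
                    gy += ky[j + 1][i + 1] * v;
                }
            float deg = atan2f((float)gy, (float)gx) * (180.0f / 3.14159265f);
            dir[y * w + x] = fmodf(deg + 360.0f, 180.0f);  /* fold to [0,180) */
        }
    }
}
```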
Further, whereas mask applicator step 25 of FIG. 2A generates a masked image by applying the pixel mask array to the input image, mask applicator step 25' of FIG. 2B generates a masked image by applying the pixel mask array to the edge direction image. As a result, the output of mask applicator step 25' is a masked edge direction image (not a masked input image), and histogram generation step 26 and peak finding step 27 of FIG. 2B have the effect of extracting one or more salient edge directions (rather than salient image intensities) of the input image.
From the salient edge direction information, the orientation of an object whose contour forms the edges can be inferred. For example, if the object is rectangular, the salient edge direction value is interpreted modulo 90 degrees to determine the angular rotation, as sketched below. To further illustrate, if the object is generally circular but has a straight or notched edge (e.g., like a semiconductor wafer), the salient edge direction value can readily be determined and used to determine the orientation of the object. Those skilled in the art will appreciate that the orientations of other stationary objects can likewise be inferred from the salient edge direction information provided by the method of FIG. 2B.
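For a rectangular object, the modulo-90-degree interpretation reduces to a one-line computation; degree units and the [0, 90) result range are assumptions of this sketch.

```c
/* A minimal sketch of the modulo-90-degree interpretation for a
 * rectangular object: fold the salient edge direction reported by the
 * peak detector into [0, 90) degrees, since a rectangle's edge
 * directions repeat every 90 degrees. */
#include <math.h>

float rectangle_rotation(float salient_direction_deg)
{
    return fmodf(fmodf(salient_direction_deg, 90.0f) + 90.0f, 90.0f);
}
```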
A further exemplary embodiment of the present invention provides an article of manufacture, namely a computer-readable medium such as a magnetic diskette (not shown), storing a computer program for configuring the image analysis system 12 (FIG. 1) to operate as, and in accordance with, the edge-based histogram analysis apparatus and methods described herein. Such a diskette is of conventional construction and stores the computer program on conventional magnetic media readable through a read/write head of the diskette drive of the image analysis system 12. It will be appreciated that other articles of manufacture comprising computer-usable media storing programs intended to cause a computer to operate in accordance with the teachings herein are also within the scope of the invention, the diskette being described only by way of example.
Described herein are an apparatus and methods for edge-based image histogram analysis that meet the objects of the invention set forth above. It will be understood that the embodiments described herein are not intended to be limiting, and that other embodiments incorporating additions, deletions, or other modifications within the ken of those of ordinary skill in the art fall within the scope of the invention. In view of the foregoing, what we claim is as follows.

Claims (20)

  1. An apparatus for determining the characteristics of edges appearing in an input image comprising a plurality of input image pixels, the apparatus comprising:
    edge intensity detection means for generating an edge intensity image composed of a plurality of edge intensity pixels, each of the edge intensity pixels having a value representing the degree of change of the values of respective ones of the plurality of input image pixels;
    mask generating means, coupled to the edge intensity detection means, for generating a pixel mask array, each pixel mask of which has a mask value or a non-mask value determined by one or more of the edge intensity pixel values;
    mask applying means, coupled to the mask generating means, for generating a masked image that includes only those pixels of the input image, or of an image generated from the input image, that correspond to non-mask values in the array;
    histogram means, coupled to the mask applying means, for generating a histogram of the pixels of the masked image; and
    peak detecting means, coupled to the histogram means, for identifying at least one value in the histogram indicative of a salient characteristic of the edges represented in the input image and for outputting a signal representing that value.
  2. The apparatus of claim 1, wherein:
    the input image pixels have numerical values indicating image intensity;
    the mask applying means comprises means for generating the masked image so as to include only those pixels of the input image that correspond to non-mask values in the pixel mask array; and
    the peak detecting means comprises means for identifying a peak value in the histogram indicative of a salient image intensity value of the edges represented in the input image and for outputting a signal representing that value.
  3. The apparatus of claim 2, wherein the edge intensity detection means includes means for applying a Sobel operator to the input image to generate the edge intensity image.
  4. The apparatus of claim 1, comprising:
    edge direction detection means for generating an edge direction image including a plurality of edge direction pixels, each of the edge direction pixels having a value indicating the direction of the degree of change of the values of respective ones of the input image pixels;
    wherein the mask applying means comprises means for generating the masked image so as to include only those pixels of the edge direction image that correspond to non-mask values in the pixel mask array; and
    wherein the peak detecting means comprises means for identifying a peak value in the histogram indicative of a salient edge direction value of the edges represented in the input image and for outputting a signal representing that value.
  5. The apparatus of claim 4, wherein the edge direction detection means includes means for applying a Sobel operator to the input image to generate the edge direction image.
  6. The apparatus according to claim 2 or 4, wherein the mask generating means comprises:
    means for sharpening peaks in the edge intensity image to generate a sharpened edge intensity image comprising a plurality of sharpened edge intensity pixels indicative of the sharpened peaks; and
    means for generating the pixel mask array determined by the sharpened edge intensity pixels.
  7. The apparatus according to claim 6, wherein the mask generating means comprises binarization means for generating respective pixel masks having mask values for edge intensity values within a predetermined first numerical range, and respective pixel masks having non-mask values for edge intensity values within a predetermined second numerical range.
  8. The apparatus according to claim 7, wherein the binarization means comprises means for generating respective pixel masks having mask values for edge intensity values lower than a predetermined threshold, and respective pixel masks having non-mask values for edge intensity values higher than the threshold.
  9. A method for determining the characteristics of edges appearing in an input image comprising a plurality of input image pixels, the method comprising:
    an edge intensity detection step of generating an edge intensity image composed of a plurality of edge intensity pixels, each of the edge intensity pixels having a value indicating the degree of change of the values of respective ones of the plurality of input image pixels;
    a mask generation step of generating a pixel mask array, each pixel mask of which has a mask value or a non-mask value determined by one or more of the edge intensity pixel values;
    a mask applying step of generating a masked image that includes only those pixels of the input image, or of an image generated from the input image, that correspond to non-mask values in the array;
    a histogram step of generating a histogram of the pixels of the masked image; and
    a peak detection step of identifying at least one value in the histogram indicative of a salient characteristic of the edges represented in the input image and of outputting a signal representing that value.
  10. The method of claim 9, wherein:
    the input image pixels have numerical values indicating image intensity;
    the mask applying step includes generating the masked image so as to include only those pixels of the input image that correspond to non-mask values in the array; and
    the peak detection step includes identifying a peak value in the histogram indicative of a salient image intensity value of the edges represented in the input image, and outputting a signal representing that value.
  11. The method of claim 10, wherein the edge intensity detection step includes applying a Sobel operator to the input image to generate the edge intensity image.
  12. The method of claim 9, comprising:
    an edge direction detection step of generating an edge direction image including a plurality of edge direction pixels, each of the edge direction pixels having a value indicating the direction of the degree of change of the values of respective ones of the input image pixels;
    wherein the mask applying step includes generating the masked image so as to include only those pixels of the edge direction image that correspond to non-mask values in the pixel mask array; and
    wherein the peak detection step includes identifying a peak value in the histogram indicative of a salient edge direction value of the edges represented in the input image, and outputting a signal representing that value.
  13. The method of claim 12, wherein the edge direction detection step includes applying a Sobel operator to the input image to generate the edge direction image.
  14. The method according to claim 10 or 12, wherein the mask generation step includes:
    sharpening peaks in the edge intensity image to generate a sharpened edge intensity image including a plurality of sharpened edge intensity pixels indicative of the sharpened peaks; and
    generating the pixel mask array determined by the sharpened edge intensity pixels.
  15. The method of claim 14, wherein the mask generation step comprises a binarization step of generating respective pixel masks having mask values for edge intensity values within a predetermined first numerical range, and respective pixel masks having non-mask values for edge intensity values within a predetermined second numerical range.
  16. The method of claim 15, wherein the binarization step includes generating respective pixel masks having mask values for edge intensity values lower than a predetermined threshold, and respective pixel masks having non-mask values for edge intensity values higher than the threshold.
  17. A computer readable recording medium comprising program code for causing a digital data processor to perform a method for determining a characteristic of an edge represented in an input image comprising a plurality of input image pixels, the method comprising:
    an edge intensity detection step of generating an edge intensity image composed of a plurality of edge intensity pixels, each of the edge intensity pixels having a value indicating the degree of change of the values of respective ones of the plurality of input image pixels;
    a mask generation step of generating a pixel mask array, each pixel mask of which has a mask value or a non-mask value determined by one or more of the edge intensity pixel values;
    a mask applying step of generating a masked image that includes only those pixels of the input image, or of an image generated from the input image, that correspond to non-mask values in the array;
    a histogram step of generating a histogram of the pixels of the masked image; and
    a peak detection step of identifying at least one value in the histogram indicative of a salient characteristic of the edges represented in the input image and of outputting a signal representing that value.
  18. An apparatus for determining the characteristics of edges appearing in an input image comprising a plurality of input image pixels,
    edge direction detection means for generating an edge direction image composed of a plurality of edge direction pixels, each of the edge direction pixels having a value indicating the direction of the degree of change of the values of respective ones of the plurality of input image pixels;
    mask generating means, coupled to the edge direction detection means, for generating a pixel mask array, each pixel mask of which has a mask value or a non-mask value determined by one or more of the edge direction pixels;
    mask applying means, coupled to the mask generating means, for generating a masked image that includes only those pixels of the edge direction image that correspond to non-mask values in the array;
    histogram means, coupled to the mask applying means, for generating a histogram of the pixels of the masked image; and
    peak detecting means, coupled to the histogram means, for identifying a peak value in the histogram indicative of a salient edge direction value of the edges represented in the input image and for outputting a signal representing that value.
  19. A method for determining the characteristics of an edge appearing in an input image comprising a plurality of input image pixels, comprising:
    an edge direction detection step of generating an edge direction image composed of a plurality of edge direction pixels, each of the edge direction pixels having a value indicating the direction of the degree of change of the values of respective ones of the plurality of input image pixels;
    a mask generation step of generating a pixel mask array, each pixel mask of which has a mask value or a non-mask value determined by one or more of the edge direction pixels;
    a mask applying step of generating a masked image that includes only those pixels of the edge direction image that correspond to non-mask values in the array;
    a histogram step of generating a histogram of the pixels of the masked image; and
    a peak detection step of identifying a peak value in the histogram indicative of a salient edge direction value of the edges represented in the input image, and outputting a signal representing that value.
  20. A computer readable recording medium comprising program code for causing a digital data processor to perform a method for determining a characteristic of an edge represented in an input image comprising a plurality of input image pixels, the method comprising:
    an edge direction detection step of generating an edge direction image composed of a plurality of edge direction pixels, each of the edge direction pixels having a value indicating the direction of the degree of change of the values of respective ones of the plurality of input image pixels;
    a mask generation step of generating a pixel mask array, each pixel mask of which has a mask value or a non-mask value determined by one or more of the edge direction pixels;
    a mask applying step of generating a masked image that includes only those pixels of the edge direction image that correspond to non-mask values in the array;
    a histogram step of generating a histogram of the pixels of the masked image; and
    a peak detection step of identifying a peak value in the histogram indicative of a salient edge direction value of the edges represented in the input image, and outputting a signal representing that value.
JP52461897A 1996-01-02 1996-12-31 Machine vision method and apparatus for edge-based image histogram analysis Expired - Lifetime JP4213209B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US08/581,975 1996-01-02
US08/581,975 US5845007A (en) 1996-01-02 1996-01-02 Machine vision method and apparatus for edge-based image histogram analysis
PCT/US1996/020900 WO1997024693A1 (en) 1996-01-02 1996-12-31 Machine vision method and apparatus for edge-based image histogram analysis

Publications (3)

Publication Number Publication Date
JP2000503145A JP2000503145A (en) 2000-03-14
JP2000503145A5 JP2000503145A5 (en) 2000-03-14
JP4213209B2 true JP4213209B2 (en) 2009-01-21

Family

ID=24327330

Family Applications (1)

Application Number Title Priority Date Filing Date
JP52461897A Expired - Lifetime JP4213209B2 (en) 1996-01-02 1996-12-31 Machine vision method and apparatus for edge-based image histogram analysis

Country Status (3)

Country Link
US (1) US5845007A (en)
JP (1) JP4213209B2 (en)
WO (1) WO1997024693A1 (en)

Families Citing this family (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5631734A (en) 1994-02-10 1997-05-20 Affymetrix, Inc. Method and apparatus for detection of fluorescently labeled materials
US6259827B1 (en) 1996-03-21 2001-07-10 Cognex Corporation Machine vision methods for enhancing the contrast between an object and its background using multiple on-axis images
US5933523A (en) * 1997-03-18 1999-08-03 Cognex Corporation Machine vision method and apparatus for determining the position of generally rectangular devices using boundary extracting features
US6075881A (en) 1997-03-18 2000-06-13 Cognex Corporation Machine vision methods for identifying collinear sets of points from an image
US6608647B1 (en) 1997-06-24 2003-08-19 Cognex Corporation Methods and apparatus for charge coupled device image acquisition with independent integration and readout
US6011558A (en) * 1997-09-23 2000-01-04 Industrial Technology Research Institute Intelligent stitcher for panoramic image-based virtual worlds
US6094508A (en) * 1997-12-08 2000-07-25 Intel Corporation Perceptual thresholding for gradient-based local edge detection
US6434271B1 (en) * 1998-02-06 2002-08-13 Compaq Computer Corporation Technique for locating objects within an image
US6381375B1 (en) 1998-02-20 2002-04-30 Cognex Corporation Methods and apparatus for generating a projection of an image
US6516092B1 (en) 1998-05-29 2003-02-04 Cognex Corporation Robust sub-model shape-finder
US7016539B1 (en) 1998-07-13 2006-03-21 Cognex Corporation Method for fast, robust, multi-dimensional pattern recognition
US6687402B1 (en) 1998-12-18 2004-02-03 Cognex Corporation Machine vision methods and systems for boundary feature comparison of patterns and images
US6381366B1 (en) 1998-12-18 2002-04-30 Cognex Corporation Machine vision methods and system for boundary point-based comparison of patterns and images
US6807298B1 (en) * 1999-03-12 2004-10-19 Electronics And Telecommunications Research Institute Method for generating a block-based image histogram
US6684402B1 (en) 1999-12-01 2004-01-27 Cognex Technology And Investment Corporation Control methods and apparatus for coupling multiple image acquisition devices to a digital data processor
GB2358702A (en) * 2000-01-26 2001-08-01 Robotic Technology Systems Plc Providing information of objects by imaging
US6748104B1 (en) 2000-03-24 2004-06-08 Cognex Corporation Methods and apparatus for machine vision inspection using single and multiple templates or patterns
US6999619B2 (en) * 2000-07-12 2006-02-14 Canon Kabushiki Kaisha Processing for accurate reproduction of symbols and other high-frequency areas in a color image
AU7973901A (en) * 2000-07-18 2002-01-30 Pamgene Bv Method for locating areas of interest of a substrate
WO2002025928A2 (en) * 2000-09-21 2002-03-28 Applied Science Fiction Dynamic image correction and imaging systems
US6748110B1 (en) * 2000-11-09 2004-06-08 Cognex Technology And Investment Object and object feature detector system and method
US8682077B1 (en) 2000-11-28 2014-03-25 Hand Held Products, Inc. Method for omnidirectional processing of 2D images including recognizable characters
US6681151B1 (en) * 2000-12-15 2004-01-20 Cognex Technology And Investment Corporation System and method for servoing robots based upon workpieces with fiducial marks using machine vision
US6987875B1 (en) 2001-05-22 2006-01-17 Cognex Technology And Investment Corporation Probe mark inspection method and apparatus
US6879389B2 (en) * 2002-06-03 2005-04-12 Innoventor Engineering, Inc. Methods and systems for small parts inspection
US7502525B2 (en) * 2003-01-27 2009-03-10 Boston Scientific Scimed, Inc. System and method for edge detection of an image
JP4068596B2 (en) * 2003-06-27 2008-03-26 株式会社東芝 Graphic processing method, graphic processing apparatus, and computer-readable graphic processing program
US7190834B2 (en) 2003-07-22 2007-03-13 Cognex Technology And Investment Corporation Methods for finding and characterizing a deformed pattern in an image
US8081820B2 (en) 2003-07-22 2011-12-20 Cognex Technology And Investment Corporation Method for partitioning a pattern into optimized sub-patterns
US8437502B1 (en) 2004-09-25 2013-05-07 Cognex Technology And Investment Corporation General pose refinement and tracking tool
US7416125B2 (en) * 2005-03-24 2008-08-26 Hand Held Products, Inc. Synthesis decoding and methods of use thereof
US8111904B2 (en) 2005-10-07 2012-02-07 Cognex Technology And Investment Corp. Methods and apparatus for practical 3D vision system
US9445025B2 (en) 2006-01-27 2016-09-13 Affymetrix, Inc. System, method, and product for imaging probe arrays with small feature sizes
US8055098B2 (en) 2006-01-27 2011-11-08 Affymetrix, Inc. System, method, and product for imaging probe arrays with small feature sizes
JP4973008B2 (en) * 2006-05-26 2012-07-11 富士通株式会社 Vehicle discrimination device and program thereof
US8162584B2 (en) 2006-08-23 2012-04-24 Cognex Corporation Method and apparatus for semiconductor wafer alignment
US8036468B2 (en) * 2007-12-24 2011-10-11 Microsoft Corporation Invariant visual scene and object recognition
US20090202175A1 (en) * 2008-02-12 2009-08-13 Michael Guerzhoy Methods And Apparatus For Object Detection Within An Image
WO2010044745A1 (en) * 2008-10-16 2010-04-22 Nanyang Polytechnic Process and device for automated grading of bio-specimens
US8675060B2 (en) * 2009-08-28 2014-03-18 Indian Institute Of Science Machine vision based obstacle avoidance system
JP5732217B2 (en) * 2010-09-17 2015-06-10 グローリー株式会社 Image binarization method and image binarization apparatus
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
CN105210115A (en) * 2013-06-10 2015-12-30 英特尔公司 Performing hand gesture recognition using 2D image data
US9679224B2 (en) 2013-06-28 2017-06-13 Cognex Corporation Semi-supervised method for training multiple pattern recognition and registration tool models
CN103675588B (en) * 2013-11-20 2016-01-20 中国矿业大学 The machine vision detection method of printed component part polarity and equipment
US9704057B1 (en) * 2014-03-03 2017-07-11 Accusoft Corporation Methods and apparatus relating to image binarization
KR20170139725A (en) * 2016-06-09 2017-12-20 엘지디스플레이 주식회사 Method For Compressing Data And Organic Light Emitting Diode Display Device Using The Same

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5425782B2 (en) * 1973-03-28 1979-08-30
CH611017A5 (en) * 1976-05-05 1979-05-15 Zumbach Electronic Ag
US4183013A (en) * 1976-11-29 1980-01-08 Coulter Electronics, Inc. System for extracting shape features from an image
JPS562284B2 (en) * 1976-12-01 1981-01-19
US4200861A (en) * 1978-09-01 1980-04-29 View Engineering, Inc. Pattern recognition apparatus and method
JPS6236635B2 (en) * 1980-12-17 1987-08-07 Hitachi Ltd
DE3267548D1 (en) * 1982-05-28 1986-01-02 Ibm Deutschland Process and device for an automatic optical inspection
US4736437A (en) * 1982-11-22 1988-04-05 View Engineering, Inc. High speed pattern recognizer
US4860374A (en) * 1984-04-19 1989-08-22 Nikon Corporation Apparatus for detecting position of reference pattern
US4688088A (en) * 1984-04-20 1987-08-18 Canon Kabushiki Kaisha Position detecting device and method
US4922543A (en) * 1984-12-14 1990-05-01 Sten Hugo Nils Ahlbom Image processing device
US4685143A (en) * 1985-03-21 1987-08-04 Texas Instruments Incorporated Method and apparatus for detecting edge spectral features
US4876728A (en) * 1985-06-04 1989-10-24 Adept Technology, Inc. Vision system for distinguishing touching parts
US4783826A (en) * 1986-08-18 1988-11-08 The Gerber Scientific Company, Inc. Pattern inspection system
US4955062A (en) * 1986-12-10 1990-09-04 Canon Kabushiki Kaisha Pattern detecting method and apparatus
US5081656A (en) * 1987-10-30 1992-01-14 Four Pi Systems Corporation Automated laminography system for inspection of electronics
DE3806305A1 (en) * 1988-02-27 1989-09-07 Basf Ag Method for producing octadienols
US4876457A (en) * 1988-10-31 1989-10-24 American Telephone And Telegraph Company Method and apparatus for differentiating a planar textured surface from a surrounding background
US5081689A (en) * 1989-03-27 1992-01-14 Hughes Aircraft Company Apparatus and method for extracting edges and lines
US5153925A (en) * 1989-04-27 1992-10-06 Canon Kabushiki Kaisha Image processing apparatus
DE3923449A1 (en) * 1989-07-15 1991-01-24 Philips Patentverwaltung Method for determining edges in images
JP3092809B2 (en) * 1989-12-21 2000-09-25 株式会社日立製作所 Inspection method and inspection apparatus having automatic creation function of inspection program data
US4959898A (en) * 1990-05-22 1990-10-02 Emhart Industries, Inc. Surface mount machine with lead coplanarity verifier
US5113565A (en) * 1990-07-06 1992-05-19 International Business Machines Corp. Apparatus and method for inspection and alignment of semiconductor chips and conductive lead frames
US5206820A (en) * 1990-08-31 1993-04-27 At&T Bell Laboratories Metrology system for analyzing panel misregistration in a panel manufacturing process and providing appropriate information for adjusting panel manufacturing processes
US5086478A (en) * 1990-12-27 1992-02-04 International Business Machines Corporation Finding fiducials on printed circuit boards to sub pixel accuracy
US5133022A (en) * 1991-02-06 1992-07-21 Recognition Equipment Incorporated Normalizing correlator for video processing
JP3175175B2 (en) * 1991-03-01 2001-06-11 ミノルタ株式会社 Focus detection device
US5265173A (en) * 1991-03-20 1993-11-23 Hughes Aircraft Company Rectilinear object image matcher
US5273040A (en) * 1991-11-14 1993-12-28 Picker International, Inc. Measurement of vetricle volumes with cardiac MRI
US5371690A (en) * 1992-01-17 1994-12-06 Cognex Corporation Method and apparatus for inspection of surface mounted devices
JP3073599B2 (en) * 1992-04-22 2000-08-07 本田技研工業株式会社 Image edge detection device
US5525883A (en) * 1994-07-08 1996-06-11 Sara Avitzour Mobile robot location determination employing error-correcting distributed landmarks

Also Published As

Publication number Publication date
US5845007A (en) 1998-12-01
WO1997024693A1 (en) 1997-07-10
JP2000503145A (en) 2000-03-14

Similar Documents

Publication Publication Date Title
Bhardwaj et al. A survey on various edge detector techniques
O'Gorman et al. Practical algorithms for image analysis with CD-ROM
US8073286B2 (en) Detection and correction of flash artifacts from airborne particulates
KR100421221B1 (en) Illumination invariant object tracking method and image editing system adopting the method
US6421458B2 (en) Automated inspection of objects undergoing general affine transformation
JP2642215B2 (en) Edge and line extraction method and apparatus
JP2986383B2 (en) Method and apparatus for correcting skew for line scan images
EP0669593B1 (en) Two-dimensional code recognition method
US7058233B2 (en) Systems and methods for constructing an image having an extended depth of field
US7653238B2 (en) Image filtering based on comparison of pixel groups
JP4160258B2 (en) A new perceptual threshold determination for gradient-based local contour detection
US6748104B1 (en) Methods and apparatus for machine vision inspection using single and multiple templates or patterns
Avcibas et al. A classifier design for detecting image manipulations
JP4351911B2 (en) Method and apparatus for evaluating photographic quality of captured images in a digital still camera
US6674904B1 (en) Contour tracing and boundary detection for object identification in a digital image
US7702131B2 (en) Segmenting images and simulating motion blur using an image sequence
JP4695239B2 (en) Defect detection method and apparatus based on shape feature
JP4199939B2 (en) Semiconductor inspection system
Seul et al. Practical Algorithms for Image Analysis with CD-ROM: Description, Examples, and Code
JP5699788B2 (en) Screen area detection method and system
Nayar et al. Shape from focus: An effective approach for rough surfaces
EP0014857B1 (en) Method for the automatic marking of cells and for feature determination of cells from cytological smears
US8731306B2 (en) Increasing interest point coverage in an image
US7162073B1 (en) Methods and apparatuses for detecting classifying and measuring spot defects in an image of an object
US6061476A (en) Method and apparatus using image subtraction and dynamic thresholding

Legal Events

Date Code Title Description
A621  Written request for application examination (JAPANESE INTERMEDIATE CODE: A621) - Effective date: 20031218
A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523) - Effective date: 20031218
A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131) - Effective date: 20060912
RD04  Notification of resignation of power of attorney (JAPANESE INTERMEDIATE CODE: A7424) - Effective date: 20061212
A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523) - Effective date: 20061212
A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131) - Effective date: 20070508
A601  Written request for extension of time (JAPANESE INTERMEDIATE CODE: A601) - Effective date: 20070808
A602  Written permission of extension of time (JAPANESE INTERMEDIATE CODE: A602) - Effective date: 20070914
A601  Written request for extension of time (JAPANESE INTERMEDIATE CODE: A601) - Effective date: 20070910
A602  Written permission of extension of time (JAPANESE INTERMEDIATE CODE: A602) - Effective date: 20071015
A601  Written request for extension of time (JAPANESE INTERMEDIATE CODE: A601) - Effective date: 20071009
A602  Written permission of extension of time (JAPANESE INTERMEDIATE CODE: A602) - Effective date: 20071119
A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523) - Effective date: 20071108
A02   Decision of refusal (JAPANESE INTERMEDIATE CODE: A02) - Effective date: 20080325
RD13  Notification of appointment of power of sub attorney (JAPANESE INTERMEDIATE CODE: A7433) - Effective date: 20080627
A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523) - Effective date: 20080723
A521  Written amendment (JAPANESE INTERMEDIATE CODE: A821) - Effective date: 20080627
A911  Transfer of reconsideration by examiner before appeal (zenchi) (JAPANESE INTERMEDIATE CODE: A911) - Effective date: 20080814
TRDD  Decision of grant or rejection written
A01   Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01) - Effective date: 20081003
A61   First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61) - Effective date: 20081030
FPAY  Renewal fee payment (event date is renewal date of database) - Payment until: 20111107 (Year of fee payment: 3)
R150  Certificate of patent or registration of utility model (JAPANESE INTERMEDIATE CODE: R150)
FPAY  Renewal fee payment (event date is renewal date of database) - Payment until: 20121107 (Year of fee payment: 4)
FPAY  Renewal fee payment (event date is renewal date of database) - Payment until: 20131107 (Year of fee payment: 5)
R250  Receipt of annual fees (JAPANESE INTERMEDIATE CODE: R250)
R250  Receipt of annual fees (JAPANESE INTERMEDIATE CODE: R250)
R250  Receipt of annual fees (JAPANESE INTERMEDIATE CODE: R250)
EXPY  Cancellation because of completion of term