WO1997024693A1 - Machine vision method and apparatus for edge-based image histogram analysis - Google Patents

Machine vision method and apparatus for edge-based image histogram analysis

Info

Publication number
WO1997024693A1
WO1997024693A1 PCT/US1996/020900 US9620900W WO9724693A1 WO 1997024693 A1 WO1997024693 A1 WO 1997024693A1 US 9620900 W US9620900 W US 9620900W WO 9724693 A1 WO9724693 A1 WO 9724693A1
Authority
WO
WIPO (PCT)
Prior art keywords
edge
image
generating
magnitude
pixels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US1996/020900
Other languages
English (en)
French (fr)
Inventor
Yoshikazu Ohashi
Russ Weinzimmer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cognex Corp
Original Assignee
Cognex Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cognex Corp filed Critical Cognex Corp
Priority to JP52461897A priority Critical patent/JP4213209B2/ja
Publication of WO1997024693A1 publication Critical patent/WO1997024693A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/12 - Edge-based segmentation

Definitions

  • the invention pertains to machine vision and, more particularly, to methods and apparatus for histogram-based analysis of images.
  • Image segmentation is a technique for determining the boundary of an object with respect to other features (e.g., the object's background) in an image.
  • One traditional method for image segmentation involves determining a pixel intensity threshold by constructing a histogram of the intensity values of each pixel in the image.
  • The histogram, which tallies the number of pixels at each possible intensity value, has peaks at the various predominant intensities in the image, e.g., the predominant intensities of the object and of the background. From this, an intermediate intensity of the border or edge separating the object from the background can be inferred.
  • the histogram of a back-lit object has a peak at the dark end of the intensity spectrum which represents the object or, more particularly, portions of the image where the object prevents the back-lighting from reaching the camera.
  • the image intensity at the edges bounding the object is typically inferred to lie approximately half-way between the light and dark peaks in the histogram.
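  • By way of illustration only (this sketch is not part of the patent and its names are hypothetical), a minimal C routine implementing this traditional technique might histogram an 8-bit image, take the most populated bin as one peak, pick a second peak by weighting bin counts by their squared distance from the first (a simple heuristic to avoid choosing a neighbor of the first peak), and return the midpoint:

      #include <stddef.h>

      /* Traditional bimodal segmentation: histogram the image, find two peaks,
       * and infer the threshold as the intensity half-way between them. */
      static int bimodal_threshold(const unsigned char *img, size_t npixels)
      {
          unsigned long hist[256] = {0};
          size_t i;
          int v, p1 = 0, p2 = 0;
          double best = -1.0;

          for (i = 0; i < npixels; i++)
              hist[img[i]]++;

          for (v = 1; v < 256; v++)              /* first peak: most populated bin */
              if (hist[v] > hist[p1]) p1 = v;

          for (v = 0; v < 256; v++) {            /* second peak: count weighted by */
              double score = (double)hist[v]     /* squared distance from the first */
                           * (v - p1) * (v - p1);
              if (score > best) { best = score; p2 = v; }
          }
          return (p1 + p2) / 2;                  /* intermediate "edge" intensity */
      }
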
  • a problem with this traditional image segmentation technique is that its effectiveness is principally limited to use in analysis of images of back-lit objects, or other "bimodal" images (i.e. , images with only two intensity values, such as dark and light).
  • Where the object of interest is front-lit, however, the resulting images are not bimodal but, rather, show a potentially complex pattern of several image intensities.
  • Performing image segmentation by inferring image intensities along the edge of an object of interest from the multitude of resulting histogram peaks is problematic.
  • image characteristics along object edges are used by other machine vision tools to determine an object's location or orientation.
  • Two such tools are the contour angle finder and the "blob" analyzer.
  • the contour angle finder tracks the edge of an object to determine its angular orientation.
  • The blob analyzer also uses an image segmentation threshold to determine both the location and orientation of the object. Both tools are discussed, by way of example, in United States Patent No. 5,371,060, which is assigned to the assignee hereof, Cognex Corporation.
  • An object of this invention is to provide improved methods and apparatus for machine vision analysis and, particularly, improved methods for determining characteristics of edges and edge regions in an image.
  • A further object of the invention is to provide improved methods and apparatus for determining such characteristics as predominant intensity and predominant edge direction.
  • Still another object is to provide such methods and apparatus that can determine edge characteristics from a wide variety of images, regardless of whether the images represent front-lit or back-lit objects.
  • Yet another object of the invention is to provide such methods and apparatus as can be readily adapted for use in a range of automated vision applications, such as automated inspection and assembly, as well as in other machine vision applications involving object location.
  • Yet still another object of the invention is to provide such methods and apparatus that can execute quickly, and without consumption of excessive resources, on a wide range of machine vision analysis equipment.
  • Still yet another object of the invention is to provide articles of manufacture comprising a computer readable medium embodying program code for carrying out such improved methods.
  • The invention provides an apparatus for identifying a predominant edge characteristic of an input image that is made up of a plurality of image values (i.e., "pixels"), each containing a numerical value representing intensity (e.g., color or brightness).
  • the apparatus includes an edge detector that generates a plurality of edge magnitude values based on the input image values.
  • The edge detector can be a Sobel operator that generates the magnitude values by taking the derivative of the input image values, i.e., the rate of change of intensity over a plurality of image pixels.
  • a mask generator creates a mask based upon the values output by the edge detector.
  • The mask generator can create an array of masking values (e.g., 0s) and non-masking values (e.g., 1s), each of which depends upon the value of the corresponding edge magnitude value. For example, if an edge magnitude value exceeds a threshold, the corresponding mask value is set to 1; otherwise it is set to 0.
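  • A minimal sketch of such a mask generator, assuming an 8-bit edge magnitude buffer and a caller-supplied threshold (the names and types here are illustrative, not those of the patent's appended listing):

      #include <stddef.h>

      /* Pixel mask: non-masking (1) where the edge magnitude exceeds the
       * threshold, masking (0) everywhere else. */
      static void make_mask(const unsigned char *edge_mag, unsigned char *mask,
                            size_t npixels, unsigned char threshold)
      {
          size_t i;
          for (i = 0; i < npixels; i++)
              mask[i] = (edge_mag[i] > threshold) ? 1 : 0;
      }
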
  • the mask is utilized for masking input image values that are not in a region for which there is a sufficiently high edge magnitude.
  • the mask is utilized for masking edge direction values that are not in such a region.
  • a mask applicator applies the pixel mask array to a selected image to generate a masked image.
  • That selected image can be an input image or an image generated therefrom (e.g., an edge direction image of the type resulting from application of a Sobel operator to the input image).
  • a histogram generator generates a histogram of the pixel values in the masked image, e.g. , a count of the number of image pixels that pass through the mask at each intensity value or edge direction and that correspond with non-masking values of the pixel mask.
  • a peak detector identifies peaks in the histogram representing, in an image segmentation thresholding aspect of the invention, the predominant image intensity value associated with the edge detected by the edge detector - and, in an object orientation determination aspect of the invention, the predominant edge direction(s) associated with the edge detected by the edge detector.
  • Still further aspects of the invention provide methods for identifying an image segmentation threshold and for object orientation determination paralleling the operations of the apparatus described above.
  • Still another aspect of the invention provides an article of manufacture comprising a computer usable medium embodying program code for causing a digital data processor to carry out the above methods of edge-based image histogram analysis.
  • the invention has wide application in industry and research applications.
  • Those aspects of the invention for determining a predominant edge intensity threshold provide, among other things, an image segmentation threshold for use in other machine vision operations, e.g. , contour angle finding and blob analysis, for determining the location and orientation of an object.
  • Those aspects of the invention for determining a predominant edge direction also have wide application in the industry, e.g., for facilitating determination of object orientation, without requiring the use of additional machine vision operations.
  • Figure 1 depicts a digital data processor including apparatus for edge-based image histogram analysis according to the invention
  • Figure 2A illustrates the steps in a method for edge-based image histogram analysis according to the invention for use in identifying an image segmentation threshold
  • Figure 2B illustrates the steps in a method for edge-based image histogram analysis according to the invention for use in determining a predominant edge direction in an image
  • Figures 3A - 3F graphically illustrate, in one dimension (e.g., a "scan line"), the sequence of images generated by a method and apparatus according to the invention
  • Figures 4A - 4F graphically illustrate, in two dimensions, the sequence of images generated by a method and apparatus according to the invention
  • Figure 5 shows pixel values of a sample input image to be processed by a method and apparatus according to the invention
  • Figures 6 - 8 show pixel values of intermediate images and of an edge magnitude image generated by application of a Sobel operator during a method and apparatus according to the invention
  • Figure 9 shows pixel values of a sharpened edge magnitude image generated by a method and apparatus according to the invention
  • Figure 10 shows a pixel mask array generated by a method and apparatus according to the invention
  • Figure 11 shows pixel values of a masked input image generated by a method and apparatus according to the invention.
  • Figure 12 illustrates a histogram created from the masked image values of Fig. 11.
  • Figure 1 illustrates a system for edge-based image histogram analysis.
  • A capturing device 10, such as a conventional video camera or scanner, generates an image of a scene including object 1.
  • Digital image data (or pixels) generated by the capturing device 10 represent, in the conventional manner, the image intensity (e.g., color or brightness) of each point in the scene at the resolution of the capturing device.
  • That digital image data is transmitted from capturing device 10 via a communications path 11 to an image analysis system 12.
  • This can be a conventional digital data processor, or a vision processing system of the type commercially available from the assignee hereof, Cognex Corporation, as programmed in accord with the teachings hereof to perform edge-based image histogram analysis.
  • The image analysis system 12 may have one or more central processing units 13, main memory 14, input-output system 15, and disk drive (or other static mass storage device) 16, all of the conventional type.
  • the system 12 and, more particularly, central processing unit 13, is configured by programming instructions according to teachings hereof for operation as an edge detector 2, mask generator 3, mask applicator 4, histogram generator 5 and peak detector 6, as described in further detail below.
  • Figure 2A illustrates the steps in a method for edge-based image histogram analysis according to the invention for use in identifying an image segmentation threshold, that is, for use in determining a predominant intensity of an edge in an image.
  • Figures 3A - 3F and 4A - 4F graphically illustrate the sequence of images generated by the method of Figure 2A.
  • the depiction in Figures 4A - 4F is in two dimensions; that of Figures 3A - 3F is in "one dimension," e.g., in the form of a scan line.
  • Figures 5 - 12 illustrate in numerical format the values of pixels of a similar sequence of images, as well as intermediate arrays generated by the method of Figure 2A.
  • a scene including an object of interest is captured by image capture device 10 (of Figure 1).
  • a digital image representing the scene is generated by the capturing device 10 in the conventional manner, with pixels representing the image intensity (e.g. , color or brightness) of each point in the scene per the resolution of the capturing device 10.
  • The captured image is referred to herein as the "input" or "original" image.
  • The input image is assumed to show a dark rectangle against a white background. This is depicted graphically in Figures 3A and 4A. Likewise, it is depicted numerically in Figure 5, where the dark rectangle is represented by a grid of image intensity 10 against a background of image intensity 0.
  • In step 22, the illustrated method operates as an edge detector, generating an edge magnitude image with pixels that correspond to input image pixels, but which reflect the rate of change (or derivative) of the image intensities represented by the input image pixels.
  • The edge detector step 22 utilizes a Sobel operator to determine the magnitudes of those rates of change and, more particularly, to determine the rates of change along each of the x-axis and y-axis directions of the input image. Use of the Sobel operator to determine rates of change of image intensities is well known in the art.
  • the rate of change of the pixels of the input image along the x-axis is preferably determined by convolving that image with the matrix:
  • Figure 6 illustrates the intermediate image resulting from convolution of the input image of Figure 5 with the foregoing matrix.
  • the rate of change of the pixels of the input image along the y-axis is preferably determined by convolving that image with the matrix:
  • Figure 7 illustrates the intermediate image resulting from convolution of the input image of Figure 5 with the foregoing matrix.
  • The pixels of the edge magnitude image, e.g., of Figure 8, are determined as the square root of the sum of the squares of the corresponding pixels of the intermediate images, e.g., of Figures 6 and 7.
  • For example, the magnitude represented by the pixel in row 1/column 1 of the edge magnitude image of Figure 8 is the square root of the sum of (1) the square of the value in row 1/column 1 of the intermediate image of Figure 6, and (2) the square of the value in row 1/column 1 of the intermediate image of Figure 7.
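  • The following C sketch illustrates this computation under the assumption that the matrices referred to above are the conventional 3x3 Sobel kernels (the patent's own matrices are not reproduced in this text); the one-pixel image border is left untouched:

      #include <math.h>

      /* Edge magnitude: convolve with the x and y Sobel kernels and take the
       * square root of the sum of the squares of the two responses. */
      static void sobel_magnitude(const unsigned char *in, double *mag, int w, int h)
      {
          int x, y;
          for (y = 1; y < h - 1; y++) {
              for (x = 1; x < w - 1; x++) {
                  int gx = -  in[(y-1)*w + x-1] +   in[(y-1)*w + x+1]
                           - 2*in[ y   *w + x-1] + 2*in[ y   *w + x+1]
                           -   in[(y+1)*w + x-1] +   in[(y+1)*w + x+1];
                  int gy = -  in[(y-1)*w + x-1] - 2*in[(y-1)*w + x] - in[(y-1)*w + x+1]
                           +  in[(y+1)*w + x-1] + 2*in[(y+1)*w + x] + in[(y+1)*w + x+1];
                  mag[y*w + x] = sqrt((double)(gx*gx + gy*gy));
              }
          }
      }
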
  • In step 23, the illustrated method operates as a peak sharpener, generating a sharpened edge magnitude image that duplicates the edge magnitude image, but with sharper peaks.
  • Specifically, the peak sharpening step 23 strips all but the largest edge magnitude values from any peaks in the edge magnitude image.
  • Sharpened edge magnitude images are shown in Figure 3C (graphic, one dimension), Figure 4C (graphic, two dimensions) and Figure 9 (numerical, by pixel).
  • The sharpened edge magnitude image is generated by applying the cross-shaped neighborhood operator shown below to the edge magnitude image. Only those edge magnitude values in the center of each neighborhood that are the largest values for the entire neighborhood are assigned to the sharpened edge magnitude image, thereby narrowing any edge magnitude value peaks.
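  • The neighborhood operator itself is depicted in the published application but not reproduced in this text; the sketch below therefore assumes a 3x3 cross-shaped (plus-shaped) neighborhood, i.e., the center pixel and its four edge-adjacent neighbors, and keeps a center value only when it is the maximum over that neighborhood:

      /* Peak sharpening: retain an edge magnitude only if it is the largest value
       * in its cross-shaped neighborhood (center, up, down, left, right). */
      static void sharpen_peaks(const double *mag, double *sharp, int w, int h)
      {
          int x, y;
          for (y = 0; y < h; y++) {
              for (x = 0; x < w; x++) {
                  double c     = mag[y*w + x];
                  double up    = (y > 0)     ? mag[(y-1)*w + x] : 0.0;
                  double down  = (y < h - 1) ? mag[(y+1)*w + x] : 0.0;
                  double left  = (x > 0)     ? mag[y*w + x - 1] : 0.0;
                  double right = (x < w - 1) ? mag[y*w + x + 1] : 0.0;
                  int is_peak  = c >= up && c >= down && c >= left && c >= right;
                  sharp[y*w + x] = is_peak ? c : 0.0;
              }
          }
      }
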
  • In step 24, the illustrated method operates as a mask generator, generating a pixel mask array from the sharpened edge magnitude image.
  • The mask generator step 24 generates the array with masking values (e.g., 0s) and non-masking values (e.g., 1s), each of which depends upon the value of a corresponding pixel in the sharpened edge magnitude image.
  • The mask generator step 24 is tantamount to binarization of the sharpened edge magnitude image.
  • The examples shown in the drawings are labelled accordingly. See the mask, or binarized sharpened edge magnitude image, shown in Figures 3D, 4D and 10.
  • In the illustrated example, the threshold value is 42.
  • The peak sharpening step 23 is optional; the mask can, alternatively, be generated by directly "binarizing" the edge magnitude image.
  • In step 25, the illustrated method operates as a mask applicator, generating a masked image by applying the pixel mask array to the input image so as to include in the masked image only those pixels from the input image that correspond to non-masking values in the pixel mask array.
  • The pixel mask array thus has the effect of passing through to the masked image only those pixels of the input image that are in a region for which there is a sufficiently high edge magnitude.
  • a masked input image is shown in Figure 3E (graphic, one dimension), Figure 4E (graphic, two dimensions) and Figure 11 (numerical, by pixel).
  • Figure 11, in particular, reveals that the masked image contains a portion of the dark square (the value 10) and a portion of the background (the value 0) of the input image.
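  • A sketch of the mask application step, assuming mask values of 0 and 1 and, as one representational choice, writing masked-out positions as zero (an actual implementation might instead carry the mask alongside the image, as the histogram sketch later in this text does):

      #include <stddef.h>

      /* Masked image: pass a pixel through only where the mask is non-masking. */
      static void apply_mask(const unsigned char *selected, const unsigned char *mask,
                             unsigned char *masked, size_t npixels)
      {
          size_t i;
          for (i = 0; i < npixels; i++)
              masked[i] = mask[i] ? selected[i] : 0;
      }
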
  • In step 26, the illustrated method operates as a histogram generator, generating a histogram of the values in the masked image, i.e., a tally of the number of non-zero pixels at each possible intensity value that passed from the input image through the pixel mask array.
  • a histogram is shown in Figures 3F, 4F and 12.
  • In step 27, the illustrated method operates as a peak detector, generating an output signal representing peaks in the histogram.
  • those peaks represent various predominant edge intensities in the masked input image.
  • This information may be utilized in connection with other machine vision tools, e.g. , contour angle finders and blob analyzers, to determine the location and/or orientation of an object.
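  • A combined sketch of the histogram generation and peak detection steps (illustrative only; the smoothing and minimum-count rules a production peak finder would apply are omitted). It tallies only pixels at non-masking positions and reports local maxima of the tally, which correspond to the predominant values, intensities or edge directions, along the detected edges:

      #include <stddef.h>

      /* Histogram only the pixels that correspond to non-masking mask values. */
      static void masked_histogram(const unsigned char *img, const unsigned char *mask,
                                   size_t npixels, unsigned long hist[256])
      {
          size_t i;
          int v;
          for (v = 0; v < 256; v++) hist[v] = 0;
          for (i = 0; i < npixels; i++)
              if (mask[i]) hist[img[i]]++;
      }

      /* Report non-empty bins that are local maxima of the histogram. */
      static int find_peaks(const unsigned long hist[256], int peaks[256])
      {
          int v, n = 0;
          for (v = 0; v < 256; v++) {
              unsigned long left  = (v > 0)   ? hist[v-1] : 0;
              unsigned long right = (v < 255) ? hist[v+1] : 0;
              if (hist[v] > 0 && hist[v] >= left && hist[v] >= right)
                  peaks[n++] = v;
          }
          return n;
      }
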
  • a software listing for a preferred embodiment for identifying an image segmentation threshold according to the invention is reprinted below.
  • the program is implemented in the C programming language on the UNIX operating system.
  • Figure 2B illustrates the steps in a method for edge-based image histogram analysis according to the invention for use in determining object orientation by determining one or more predominant directions of an edge in an image. Where indicated by like reference numbers, the steps of the method shown in Figure 2B are identical to those of Figure 2A.
  • the method of Figure 2B includes the additional step of generating an edge direction image with pixels that correspond to input image pixels, but which reflect the direction of the rate of change (or derivative) of image intensities represented by the input image pixels.
  • Step 28 utilizes a Sobel operator of the type described above to determine the direction of those rates of change. As before, use of the Sobel operator in this manner is well known in the art.
  • Whereas the mask applicator step 25 of Figure 2A generates a masked image by applying the pixel mask array to the input image, the mask applicator step 25' of Figure 2B generates the masked image by applying the pixel mask array to the edge direction image.
  • the output of mask applicator step 25' is a masked edge direction image (as opposed to a masked input image). Consequently, the histogram generating step 26 and the peak finding step 27 of Figure 2B have the effect of calling out one or more predominant edge directions (as opposed to the predominant image intensities) of the input image.
  • From the predominant edge direction(s), the orientation of the object bounded by the edge can be inferred.
  • For a generally rectangular object, for example, the values of the predominant edge directions can be interpreted modulo 90 degrees to determine angular rotation.
  • the values of the predominant edge directions can also be readily determined and, in turn, used to determine the orientation of the object.
  • the orientation of still other regularly shaped objects can be inferred from the predominant edge direction information supplied by the method of Figure 2B.
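  • As an illustration of this use of the direction information (again, not taken from the patent's listing), a per-pixel edge direction can be obtained with atan2 from the Sobel x and y responses, and, for a generally rectangular object, the predominant direction can be folded modulo 90 degrees to estimate its rotation:

      #include <math.h>

      #ifndef M_PI
      #define M_PI 3.14159265358979323846
      #endif

      /* Edge direction in degrees, [0, 360), from the Sobel x and y responses. */
      static double edge_direction_deg(int gx, int gy)
      {
          double deg = atan2((double)gy, (double)gx) * 180.0 / M_PI;
          return (deg < 0.0) ? deg + 360.0 : deg;
      }

      /* For a generally rectangular object, the predominant edge directions differ
       * by multiples of 90 degrees, so rotation can be read modulo 90. */
      static double rectangle_rotation_deg(double predominant_direction_deg)
      {
          return fmod(predominant_direction_deg, 90.0);
      }
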
  • A further embodiment of the invention provides an article of manufacture, to wit, a magnetic diskette (not illustrated), composed of a computer readable medium, to wit, a magnetic disk, embodying a computer program that causes image analysis system 12 (Figure 1) to be configured as, and operate in accord with, the edge-based histogram analysis apparatus and method described herein.
  • The diskette is of conventional construction and has the computer program stored on the magnetic media therein in a conventional manner, readable, e.g., via a read/write head contained in a diskette drive of image analyzer 12.
  • [The software listing in the published application is reproduced here only in fragmentary, garbled form and is therefore summarized rather than reprinted. The recoverable portions show a C program (with a demonstration driver, ceet_demo.c) built on Cognex CIP library routines: it defines a ceet_data structure holding pointers to cip_edge_params, cip_peak_params, cip_edge_data and cip_edge_results; provides ceet_create() and ceet_delete() routines; initializes the Sobel operator via cip_sobel_init(); histograms the resulting edge magnitude image; and derives the segmentation ("Sobel") threshold from that histogram by way of a threshold map table (ttable), using further library calls such as cip_create(), cip_minmax() and the cgr_zoom graphics routines in the demonstration code.]

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
PCT/US1996/020900 1996-01-02 1996-12-31 Machine vision method and apparatus for edge-based image histogram analysis Ceased WO1997024693A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP52461897A JP4213209B2 (ja) 1996-01-02 1996-12-31 エッジベースのイメージ・ヒストグラム解析用マシン・ビジョンの方法および装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/581,975 US5845007A (en) 1996-01-02 1996-01-02 Machine vision method and apparatus for edge-based image histogram analysis
US08/581,975 1996-01-02

Publications (1)

Publication Number Publication Date
WO1997024693A1 true WO1997024693A1 (en) 1997-07-10

Family

ID=24327330

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1996/020900 Ceased WO1997024693A1 (en) 1996-01-02 1996-12-31 Machine vision method and apparatus for edge-based image histogram analysis

Country Status (3)

Country Link
US (1) US5845007A (en)
JP (1) JP4213209B2 (en)
WO (1) WO1997024693A1 (en)

Families Citing this family (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5631734A (en) 1994-02-10 1997-05-20 Affymetrix, Inc. Method and apparatus for detection of fluorescently labeled materials
US6259827B1 (en) 1996-03-21 2001-07-10 Cognex Corporation Machine vision methods for enhancing the contrast between an object and its background using multiple on-axis images
US5933523A (en) * 1997-03-18 1999-08-03 Cognex Corporation Machine vision method and apparatus for determining the position of generally rectangular devices using boundary extracting features
US6075881A (en) 1997-03-18 2000-06-13 Cognex Corporation Machine vision methods for identifying collinear sets of points from an image
US6608647B1 (en) 1997-06-24 2003-08-19 Cognex Corporation Methods and apparatus for charge coupled device image acquisition with independent integration and readout
US6011558A (en) * 1997-09-23 2000-01-04 Industrial Technology Research Institute Intelligent stitcher for panoramic image-based virtual worlds
US6094508A (en) * 1997-12-08 2000-07-25 Intel Corporation Perceptual thresholding for gradient-based local edge detection
US6434271B1 (en) * 1998-02-06 2002-08-13 Compaq Computer Corporation Technique for locating objects within an image
US6381375B1 (en) 1998-02-20 2002-04-30 Cognex Corporation Methods and apparatus for generating a projection of an image
US6687402B1 (en) 1998-12-18 2004-02-03 Cognex Corporation Machine vision methods and systems for boundary feature comparison of patterns and images
US6381366B1 (en) 1998-12-18 2002-04-30 Cognex Corporation Machine vision methods and system for boundary point-based comparison of patterns and images
US6807298B1 (en) 1999-03-12 2004-10-19 Electronics And Telecommunications Research Institute Method for generating a block-based image histogram
US6684402B1 (en) 1999-12-01 2004-01-27 Cognex Technology And Investment Corporation Control methods and apparatus for coupling multiple image acquisition devices to a digital data processor
US6748104B1 (en) 2000-03-24 2004-06-08 Cognex Corporation Methods and apparatus for machine vision inspection using single and multiple templates or patterns
US6999619B2 (en) * 2000-07-12 2006-02-14 Canon Kabushiki Kaisha Processing for accurate reproduction of symbols and other high-frequency areas in a color image
EP1301806A1 (en) * 2000-07-18 2003-04-16 PamGene B.V. Method for locating areas of interest on a substrate
US7522745B2 (en) 2000-08-31 2009-04-21 Grasso Donald P Sensor and imaging system
EP1323292A2 (en) * 2000-09-21 2003-07-02 Applied Science Fiction Dynamic image correction and imaging systems
US6748110B1 (en) * 2000-11-09 2004-06-08 Cognex Technology And Investment Object and object feature detector system and method
US8682077B1 (en) 2000-11-28 2014-03-25 Hand Held Products, Inc. Method for omnidirectional processing of 2D images including recognizable characters
US6681151B1 (en) * 2000-12-15 2004-01-20 Cognex Technology And Investment Corporation System and method for servoing robots based upon workpieces with fiducial marks using machine vision
US6987875B1 (en) 2001-05-22 2006-01-17 Cognex Technology And Investment Corporation Probe mark inspection method and apparatus
US6879389B2 (en) * 2002-06-03 2005-04-12 Innoventor Engineering, Inc. Methods and systems for small parts inspection
AU2003270386A1 (en) 2002-09-06 2004-03-29 Rytec Corporation Signal intensity range transformation apparatus and method
US7502525B2 (en) * 2003-01-27 2009-03-10 Boston Scientific Scimed, Inc. System and method for edge detection of an image
JP4068596B2 (ja) * 2003-06-27 2008-03-26 株式会社東芝 図形処理方法、図形処理装置およびコンピュータ読取り可能な図形処理プログラム
US7190834B2 (en) 2003-07-22 2007-03-13 Cognex Technology And Investment Corporation Methods for finding and characterizing a deformed pattern in an image
US8437502B1 (en) 2004-09-25 2013-05-07 Cognex Technology And Investment Corporation General pose refinement and tracking tool
US7416125B2 (en) * 2005-03-24 2008-08-26 Hand Held Products, Inc. Synthesis decoding and methods of use thereof
US8111904B2 (en) 2005-10-07 2012-02-07 Cognex Technology And Investment Corp. Methods and apparatus for practical 3D vision system
US8055098B2 (en) 2006-01-27 2011-11-08 Affymetrix, Inc. System, method, and product for imaging probe arrays with small feature sizes
US9445025B2 (en) 2006-01-27 2016-09-13 Affymetrix, Inc. System, method, and product for imaging probe arrays with small feature sizes
JP4973008B2 (ja) * 2006-05-26 2012-07-11 富士通株式会社 車両判別装置及びそのプログラム
US8162584B2 (en) 2006-08-23 2012-04-24 Cognex Corporation Method and apparatus for semiconductor wafer alignment
US20090202175A1 (en) * 2008-02-12 2009-08-13 Michael Guerzhoy Methods And Apparatus For Object Detection Within An Image
US8675060B2 (en) * 2009-08-28 2014-03-18 Indian Institute Of Science Machine vision based obstacle avoidance system
JP5732217B2 (ja) * 2010-09-17 2015-06-10 グローリー株式会社 画像2値化方法および画像2値化装置
US9390320B2 (en) 2013-06-10 2016-07-12 Intel Corporation Performing hand gesture recognition using 2D image data
US9679224B2 (en) 2013-06-28 2017-06-13 Cognex Corporation Semi-supervised method for training multiple pattern recognition and registration tool models
CN103675588B (zh) * 2013-11-20 2016-01-20 中国矿业大学 印刷电路元件极性的机器视觉检测方法及设备
US9704057B1 (en) * 2014-03-03 2017-07-11 Accusoft Corporation Methods and apparatus relating to image binarization
KR102555096B1 (ko) * 2016-06-09 2023-07-13 엘지디스플레이 주식회사 데이터 압축 방법 및 이를 이용한 유기 발광 다이오드 표시 장치

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5425782B2 (en) * 1973-03-28 1979-08-30
CH611017A5 (en) * 1976-05-05 1979-05-15 Zumbach Electronic Ag
US4183013A (en) * 1976-11-29 1980-01-08 Coulter Electronics, Inc. System for extracting shape features from an image
JPS5369063A (en) * 1976-12-01 1978-06-20 Hitachi Ltd Detector of position alignment patterns
US4200861A (en) * 1978-09-01 1980-04-29 View Engineering, Inc. Pattern recognition apparatus and method
JPS57102017A (en) * 1980-12-17 1982-06-24 Hitachi Ltd Pattern detector
EP0095517B1 (de) * 1982-05-28 1985-11-21 Ibm Deutschland Gmbh Verfahren und Einrichtung zur automatischen optischen Inspektion
US4736437A (en) * 1982-11-22 1988-04-05 View Engineering, Inc. High speed pattern recognizer
US4860374A (en) * 1984-04-19 1989-08-22 Nikon Corporation Apparatus for detecting position of reference pattern
US4688088A (en) * 1984-04-20 1987-08-18 Canon Kabushiki Kaisha Position detecting device and method
DE3580918D1 (de) * 1984-12-14 1991-01-24 Sten Hugo Nils Ahlbom Anordnung zur behandlung von bildern.
US4685143A (en) * 1985-03-21 1987-08-04 Texas Instruments Incorporated Method and apparatus for detecting edge spectral features
US4876728A (en) * 1985-06-04 1989-10-24 Adept Technology, Inc. Vision system for distinguishing touching parts
US4783826A (en) * 1986-08-18 1988-11-08 The Gerber Scientific Company, Inc. Pattern inspection system
US4955062A (en) * 1986-12-10 1990-09-04 Canon Kabushiki Kaisha Pattern detecting method and apparatus
US5081656A (en) * 1987-10-30 1992-01-14 Four Pi Systems Corporation Automated laminography system for inspection of electronics
DE3806305A1 (de) * 1988-02-27 1989-09-07 Basf Ag Verfahren zur herstellung von octadienolen
US5081689A (en) * 1989-03-27 1992-01-14 Hughes Aircraft Company Apparatus and method for extracting edges and lines
DE3923449A1 (de) * 1989-07-15 1991-01-24 Philips Patentverwaltung Verfahren zum bestimmen von kanten in bildern
JP3092809B2 (ja) * 1989-12-21 2000-09-25 株式会社日立製作所 検査方法、並びに検査プログラムデータの自動作成機能を有する検査装置
US4959898A (en) * 1990-05-22 1990-10-02 Emhart Industries, Inc. Surface mount machine with lead coplanarity verifier
US5113565A (en) * 1990-07-06 1992-05-19 International Business Machines Corp. Apparatus and method for inspection and alignment of semiconductor chips and conductive lead frames
US5206820A (en) * 1990-08-31 1993-04-27 At&T Bell Laboratories Metrology system for analyzing panel misregistration in a panel manufacturing process and providing appropriate information for adjusting panel manufacturing processes
US5086478A (en) * 1990-12-27 1992-02-04 International Business Machines Corporation Finding fiducials on printed circuit boards to sub pixel accuracy
US5133022A (en) * 1991-02-06 1992-07-21 Recognition Equipment Incorporated Normalizing correlator for video processing
US5265173A (en) * 1991-03-20 1993-11-23 Hughes Aircraft Company Rectilinear object image matcher
US5273040A (en) * 1991-11-14 1993-12-28 Picker International, Inc. Measurement of vetricle volumes with cardiac MRI
US5371690A (en) * 1992-01-17 1994-12-06 Cognex Corporation Method and apparatus for inspection of surface mounted devices
JP3073599B2 (ja) * 1992-04-22 2000-08-07 本田技研工業株式会社 画像のエッジ検出装置
US5525883A (en) * 1994-07-08 1996-06-11 Sara Avitzour Mobile robot location determination employing error-correcting distributed landmarks

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4876457A (en) * 1988-10-31 1989-10-24 American Telephone And Telegraph Company Method and apparatus for differentiating a planar textured surface from a surrounding background
US5153925A (en) * 1989-04-27 1992-10-06 Canon Kabushiki Kaisha Image processing apparatus
US5225940A (en) * 1991-03-01 1993-07-06 Minolta Camera Kabushiki Kaisha In-focus detection apparatus using video signal

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6516092B1 (en) 1998-05-29 2003-02-04 Cognex Corporation Robust sub-model shape-finder
US8867847B2 (en) 1998-07-13 2014-10-21 Cognex Technology And Investment Corporation Method for fast, robust, multi-dimensional pattern recognition
GB2358702A (en) * 2000-01-26 2001-08-01 Robotic Technology Systems Plc Providing information of objects by imaging
US7006669B1 (en) 2000-12-31 2006-02-28 Cognex Corporation Machine vision method and apparatus for thresholding images of non-uniform materials
US9147252B2 (en) 2003-07-22 2015-09-29 Cognex Technology And Investment Llc Method for partitioning a pattern into optimized sub-patterns
US7639861B2 (en) 2005-09-14 2009-12-29 Cognex Technology And Investment Corporation Method and apparatus for backlighting a wafer during alignment
WO2009082719A1 (en) * 2007-12-24 2009-07-02 Microsoft Corporation Invariant visual scene and object recognition
CN101911116B (zh) * 2007-12-24 2013-01-02 微软公司 不变视觉场景和对象识别
US8406535B2 (en) 2007-12-24 2013-03-26 Microsoft Corporation Invariant visual scene and object recognition
US8036468B2 (en) 2007-12-24 2011-10-11 Microsoft Corporation Invariant visual scene and object recognition
WO2010044745A1 (en) * 2008-10-16 2010-04-22 Nanyang Polytechnic Process and device for automated grading of bio-specimens
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
US9619561B2 (en) 2011-02-14 2017-04-11 Microsoft Technology Licensing, Llc Change invariant scene recognition by an agent
CN108335701A (zh) * 2018-01-24 2018-07-27 青岛海信移动通信技术股份有限公司 一种进行声音降噪的方法及设备

Also Published As

Publication number Publication date
JP4213209B2 (ja) 2009-01-21
US5845007A (en) 1998-12-01
JP2000503145A (ja) 2000-03-14

Similar Documents

Publication Publication Date Title
WO1997024693A1 (en) Machine vision method and apparatus for edge-based image histogram analysis
US5706416A (en) Method and apparatus for relating and combining multiple images of the same scene or object(s)
Bradley et al. Adaptive thresholding using the integral image
Fathy et al. An image detection technique based on morphological edge detection and background differencing for real-time traffic analysis
US7254268B2 (en) Object extraction
US6748104B1 (en) Methods and apparatus for machine vision inspection using single and multiple templates or patterns
US6342917B1 (en) Image recording apparatus and method using light fields to track position and orientation
US6548800B2 (en) Image blur detection methods and arrangements
US10853990B2 (en) System and method for processing a graphic object
US10930059B2 (en) Method and apparatus for processing virtual object lighting inserted into a 3D real scene
US20020196252A1 (en) Method and apparatus for rendering three-dimensional images with tile-based visibility preprocessing
US11580634B2 (en) System and method for automated surface assessment
US5978502A (en) Machine vision methods for determining characteristics of three-dimensional objects
US7095893B2 (en) System and method for determining an image decimation range for use in a machine vision system
CN113240672A (zh) 镜头污染物的检测方法、装置、设备及存储介质
Shostko et al. Optical-electronic system of automatic detection and higt-precise tracking of aerial objects in real-time.
CN114022843A (zh) 图片处理方法、装置、计算机设备和存储介质
CN115880365A (zh) 一种双工位自动螺丝拧装检测方法、系统及装置
US20240029423A1 (en) Method for detecting defect in image and device for detecting defect in image
AU2011265340A1 (en) Method, apparatus and system for determining motion of one or more pixels in an image
US20030185431A1 (en) Method and system for golden template image extraction
US7298918B2 (en) Image processing apparatus capable of highly precise edge extraction
Loomis et al. Performance development of a real-time vision system
CN115880327B (zh) 抠图方法、摄像设备、会议系统、电子设备、装置、介质
CN113256482B (zh) 一种拍照背景虚化方法、移动终端及存储介质

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CA JP KR SG

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase