US20190056333A1 - Method and system for detecting defects on surface of object - Google Patents

Method and system for detecting defects on surface of object

Info

Publication number
US20190056333A1
US20190056333A1 (application US15/678,335)
Authority
US
United States
Prior art keywords
image
images
defect
operations
potential defect
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US15/678,335
Other versions
US10215714B1 (en)
Inventor
Ziyan Wu
Rameswar Panda
Jan Ernst
Kevin P. Bailey
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens Energy Inc
Original Assignee
Siemens Energy Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Energy Inc
Priority to US15/678,335 (granted as US10215714B1)
Assigned to SIEMENS ENERGY, INC. (assignment of assignors interest). Assignors: BAILEY, KEVIN P.
Assigned to SIEMENS CORPORATION (assignment of assignors interest). Assignors: ERNST, JAN; WU, ZIYAN; PANDA, RAMESWAR
Assigned to SIEMENS ENERGY, INC. (assignment of assignors interest). Assignors: SIEMENS CORPORATION
Priority to DE102018211453.6A (DE102018211453B4)
Publication of US20190056333A1
Application granted
Publication of US10215714B1
Legal status: Active (adjusted expiration)

Classifications

    • G06T 7/0006 - Industrial image inspection using a design-rule based approach
    • G01N 21/8806 - Investigating the presence of flaws or contamination; specially adapted optical and illumination features
    • G01N 21/8851 - Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 21/956 - Inspecting patterns on the surface of objects
    • G06K 9/4661
    • G06T 7/001 - Industrial image inspection using an image reference approach
    • G06T 7/33 - Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06V 10/145 - Illumination specially adapted for pattern recognition, e.g. using gratings
    • G06V 10/60 - Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
    • G06V 10/993 - Evaluation of the quality of the acquired pattern
    • G06T 2207/10152 - Varying illumination
    • G06T 2207/20021 - Dividing image into blocks, subimages or windows
    • G06T 2207/20076 - Probabilistic image processing
    • G06T 2207/20081 - Training; Learning
    • G06T 2207/30108 - Industrial image inspection
    • G06T 2207/30164 - Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Biochemistry (AREA)
  • Analytical Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Signal Processing (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Image Processing (AREA)

Abstract

A method and a system for detecting defects on the surface of an object are presented. An imaging device captures images of the surface of the object under ambient and dark field illumination conditions. The images are processed with a plurality of image operations to detect an area of a potential defect at a location on the surface of the object based on a predictable pattern consisting of a bright region and a shadow region. Kernels are defined corresponding to the configurations of the dark field illumination sources to enhance detection of the potential defect. Areas of potential defects are cut from the processed images into sub images. The sub images are stitched together to generate a hypothesis of the potential defect at the location on the surface of the object. The hypothesis is classified with a classifier to determine whether the potential defect is a true defect. The classifier is trained with training data having characteristics of the true defect. The method provides efficient automated detection of micro defects on the surface of an object.

Description

    TECHNICAL FIELD
  • This invention relates generally to a method and a system for detecting a defect on a surface of an object.
  • DESCRIPTION OF RELATED ART
  • Defect detection is an important aspect of the industrial production quality assurance process and may provide an important guarantee of product quality. Defects may be cracks on a surface of an object. Defects may be very small; the scale of the defects may be as small as micrometers.
  • Traditional methods of micro defect detection involve significant manual processes that may not guarantee consistent quality of detection. Due to the small scale of the micro defects, a considerably long detection time may be required to achieve a certain accuracy using traditional methods. Some traditional detection methods may require applying certain chemicals, which may damage the object. Methods currently used in industry for detecting defects on the surface of an object are time-consuming and may not provide consistent quality of detection.
  • SUMMARY OF INVENTION
  • Briefly described, aspects of the present invention relate to a method and a system for detecting a defect on a surface of an object.
  • According to an aspect, a method for detecting a defect on a surface of an object is presented. The method comprises supporting the object on a platform. The method comprises illuminating the surface of the object with a plurality of illumination sources comprising at least one ambient illumination source and at least one dark field illumination source. The method comprises capturing images of the surface of the object under illumination conditions with the illumination sources using an imaging device. The method comprises processing the captured images with a plurality of image operations using an image processor to detect an area of a potential defect at a location on the surface of the object. The method comprises cutting the area of the potential defect from the processed images to sub images using the image processor. The method comprises stitching the sub images together to generate a hypothesis of the potential defect at the location on the surface of the object using the image processor. The method comprises classifying the hypothesis in the stitched image with a classifier to determine whether the potential defect is a true defect using the image processor. The classifier is trained with training data having characteristics of the true defect. The method comprises generating an output of the classification comprising the detected true defect and the location on the surface of the object.
  • According to an aspect, a system for detecting a defect on a surface of an object is presented. The system comprises a platform for supporting the object. The system comprises a plurality of illumination sources comprising at least one ambient illumination source and at least one dark field illumination source for illuminating the surface of the object. The system comprises an imaging device for capturing images of the surface of the object under illumination conditions with the illumination sources. The system comprises an image processor. The image processor processes the captured images with a plurality of image operations to detect an area of a potential defect at a location on the surface of the object. The image processor cuts the area of the potential defect from the processed images to sub images. The image processor stitches the sub images together to generate a hypothesis of the potential defect at the location on the surface of the object. The image processor classifies the hypothesis in the stitched image with a classifier to determine whether the potential defect is a true defect. The classifier is trained with training data having characteristics of the true defect. The image processor generates an output of the classification comprising the detected true defect and the location on the surface of the object.
  • According to an aspect, a computer program executable in a computer for performing a method of detecting a defect on a surface of an object is presented. The computer stores images of the surface of the object under illumination conditions with illumination sources comprising at least one ambient illumination source and at least one dark field illumination source. The method comprises step of processing the images with a plurality of image operations to detect an area of a potential defect at a location on the surface of the object. The method comprises step of cutting the area of the potential defect from the processed images to sub images. The method comprises step of stitching the sub images together to generate a hypothesis of the potential defect at the location on the surface of the object. The method comprises step of classifying the hypothesis in the stitched image with a classifier to determine whether the potential defect is a true defect. The classifier is trained with training data having characteristics of the true defect. The method comprises step of generating an output of the classification comprising the detected true defect and the location on the surface of the object.
  • Various aspects and embodiments of the application as described above and hereinafter may not only be used in the combinations explicitly described, but also in other combinations. Modifications will occur to the skilled person upon reading and understanding of the description.
  • BRIEF DESCRIPTION OF DRAWINGS
  • Exemplary embodiments of the application are explained in further detail with respect to the accompanying drawings. In the drawings:
  • FIG. 1 illustrates a schematic side view of a system for detecting a defect at a surface of an object according to an embodiment of the invention;
  • FIG. 2 illustrates a schematic top view of a system for detecting a defect at a surface of an object according to an embodiment of the invention;
  • FIG. 3 illustrates a schematic diagram of a pattern consisting of a bright region and a shadow region of a defect on a surface under a dark field illumination source according to an embodiment of the invention;
  • FIG. 4 illustrates a schematic flow chart of a method for detecting a defect at a surface of an object according to an embodiment of the invention; and
  • FIG. 5 illustrates a schematic flow chart of a step for processing images of the method as illustrated in FIG. 4 according to an embodiment of the invention.
  • To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures.
  • DETAILED DESCRIPTION OF INVENTION
  • A detailed description related to aspects of the present invention is described hereafter with respect to the accompanying figures.
  • FIGS. 1 and 2 respectively illustrate a schematic side view and top view of a system 100 for detecting a defect at a surface 112 of an object 110 according to an embodiment of the invention. The system 100 may include a platform 120 that supports the object 110. The system 100 may include a motor 122. The platform 120 may be movable along at least one direction by the motor 122. The motor 122 may have a motor controller 124 that controls a movement of the platform 120.
  • The system 100 may include a hood 130 arranged above the platform 120. The hood 130 may have a hollow cone shape. The system 100 may have an imaging device 140. The imaging device 140 may be arranged inside the hood 130. The imaging device 140 may be located at the top of the hood 130. The imaging device 140 may include, for example, a camera. The imaging device 140 may include a lens 142. The lens 142 may be arranged at a location relative to the surface 112 of the object 110 such that the imaging device 140 may have a desirable field of view 144 on the surface 112 of the object 110. The imaging device 140 may pan and tilt relative to the surface 112 of the object 110 to achieve a desired field of view 144 on the surface 112 of the object 110. The motor 122 may move the platform 120 along with the object 110 so that the field of view 144 of the imaging device 140 may cover different areas of the surface 112 of the object 110.
  • The system 100 may have at least one ambient illumination source 150. The ambient illumination source 150 may be arranged inside the hood 130. The ambient illumination source 150 may be located at the top of the hood 130. The ambient illumination source 150 may provide an ambient illumination condition at the surface 112 of the object 110. The ambient illumination source 150 may include, for example, a light emitting diode (LED) strobe light. The ambient illumination source 150 may have a ring shape. According to an exemplary embodiment as illustrated in FIG. 1, the lens 142 may extend through the ring-shaped ambient illumination source 150.
  • The system 100 may include at least one dark field illumination source 160. The dark field illumination source 160 may be arranged at the bottom of the hood 130. The dark field illumination source 160 may provide a dark field illumination condition on the surface 112 of the object 110. The dark field illumination source 160 may include, for example, an LED strobe light. The dark field illumination source 160 may be oriented at a location relative to the surface 112 of the object 110 such that a dark field illumination condition may be provided within the field of view 144 of the lens 142 at the surface 112 of the object 110. Under a dark field illumination condition, a defect may have a predictable pattern consisting of a bright region that is illuminated by the dark field illumination source 160 and a dark or shadow region that is not illuminated by the dark field illumination source 160.
  • With reference to FIG. 2, which illustrates a schematic top view of the system 100, four dark field illumination sources 160 are arranged at the bottom of the hood 130. Two dark field illumination sources 160 are arranged along an x-axis in a plane of the field of view 144. These two dark field illumination sources 160 are located at two sides of the field of view 144 respectively, denoted x-positive and x-negative. Two other dark field illumination sources 160 are arranged along a y-axis in the plane of the field of view 144. These two dark field illumination sources 160 are located at two sides of the field of view 144 respectively, denoted y-positive and y-negative. Different numbers of dark field illumination sources 160 may be arranged at the bottom of the hood 130.
  • With reference to FIG. 1, the system 100 may include a trigger controller 170. The trigger controller 170 may functionally connect to the imaging device 140, the ambient illumination source 150 and the dark field illumination sources 160. The trigger controller 170 may trigger the ambient illumination source 150 and the dark field illumination sources 160 in a defined pattern, in sequence, or simultaneously. The trigger controller 170 may trigger the imaging device 140 to capture images of the surface 112 of the object 110 under the triggered illumination conditions respectively. The trigger controller 170 may control configurations of the dark field illumination sources 160. The configurations of the dark field illumination sources 160 may include orientations of the dark field illumination sources 160 relative to the surface 112 of the object 110, illumination intensities of the dark field illumination sources 160, etc. According to an embodiment, the trigger controller 170 may be a computer on which a computer program is implemented.
  • The system 100 may include an image processor 180. The image processor 180 may functionally connect to the imaging device 140. The images captured by the imaging device 140 may be stored in the image processor 180. The image processor 180 may also process the images captured by the imaging device 140 to detect defects on the surface 112 of the object 110. According to an embodiment, the image processor 180 may be a computer. A computer program is executable in the image processor 180. According to an embodiment, the trigger controller 170 and the image processor 180 may be integrated parts of one computer. The system 100 may include a display device 190 functionally connected to the image processor 180. The display device 190 may display the captured images. The display device 190 may display the processed images. The display device 190 may display an output including information of a detected defect. The display device 190 may be a monitor. The display device 190 may be an integrated part of the image processor 180.
  • According to an embodiment, a shape of a potential defect, such as the size or length of the defect, may have a specific and predictable pattern consisting of a bright region and a shadow region in an image under different configurations of the dark field illumination sources 160. FIG. 3 illustrates a schematic diagram of a pattern consisting of a bright region 115 and a shadow region 117 of a defect 113 on a surface 112 under a dark field illumination source 160 according to an embodiment. As illustrated in FIG. 3, a surface 112 may have a V-shaped defect 113. A first portion 114 of the V-shaped defect 113 is illuminated by the dark field illumination source 160 and forms a bright region 115 in an image. A second portion 116 of the V-shaped defect 113 is not illuminated by the dark field illumination source 160 and forms a shadow region 117 in the image. The bright region 115 and the shadow region 117 are within a field of view 144 of an imaging device 140. The V-shaped defect 113 may be a micro defect; for example, the scale of the V-shaped defect 113 may be as small as micrometers.
  • FIG. 4 illustrates a schematic flow chart of a method 200 for detecting a defect at a surface 112 of an object 110 using an image processor 180 according to an embodiment of the invention. In step 210, the imaging device 140 may be calibrated relative to the surface 112 of the object 110 to be inspected. The calibration may estimate parameters of the lens 142 of the imaging device 140 to correct distortion of the lens 142 of the imaging device 140 when capturing images of the surface 112 of the object 110.
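  • The patent does not prescribe how the calibration of step 210 is carried out. The following minimal Python/OpenCV sketch shows one common way to estimate and correct lens distortion with a checkerboard target; the pattern size, the file pattern calib_*.png and the undistort helper are illustrative assumptions, not part of the disclosure.
```python
import glob
import cv2
import numpy as np

PATTERN_SIZE = (9, 6)   # inner corners of an assumed checkerboard target

# Object points of the checkerboard in its own plane (unit squares).
objp = np.zeros((PATTERN_SIZE[0] * PATTERN_SIZE[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN_SIZE[0], 0:PATTERN_SIZE[1]].T.reshape(-1, 2)

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib_*.png"):             # hypothetical calibration images
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN_SIZE)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

# Estimate the camera matrix K and distortion coefficients (step 210).
_, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)

def undistort(image):
    """Correct lens distortion of a captured image using the estimated parameters."""
    return cv2.undistort(image, K, dist)
```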
  • In step 220, the ambient illumination source 150 and the dark field illumination sources 160 may be triggered by the trigger controller 170 to illuminate the surface 112 of the object 110 with an ambient illumination condition and dark field illumination conditions. The imaging device 140 may be triggered by the trigger controller 170 to capture images of the surface 112 of the object 110 under the ambient illumination condition and the dark field illumination conditions respectively. Each image captures a field of view 144 of the surface 112 of the object 110 under a different illumination condition. Each image may contain potential defects having specific shapes on the surface 112 of the object 110. The shapes of the potential defects have predictable patterns consisting of a bright region and a shadow region based on the configurations of the dark field illumination sources 160.
  • In step 230, the captured images of the surface 112 of the object 110 are processed by the image processor 180. The processing step 230 may apply a plurality of image operations to the captured images to detect areas of potential defects at locations on the surface 112 of the object 110. A plurality of areas having potential defects may be detected. Each area may have a potential defect. The locations of the plurality of areas may be represented by, for example, x and y locations on a plane of the surface 112. The potential defects may be detected by patterns consisting of a bright region and a shadow region in the processed images. The plurality of image operations may enhance the shapes of potential defects and reduce the false detection rate.
  • In step 240, the areas showing the potential defects are cut from the processed images into sub images. The size of each area to be cut may be small enough to detect a micro defect. For example, the size of each area may be less than 100 by 100 pixels, depending on the resolution of the image. Areas that do not show indications of potential defects may be pruned out and do not need further processing.
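  • A minimal sketch of the cropping in step 240 is given below, assuming candidate (x, y) locations have already been found in a processed image; the 100-by-100-pixel patch size follows the example in the text, while the function name and data structure are illustrative.
```python
def cut_sub_images(processed, candidate_locations, patch=100):
    """Step 240: cut the areas showing potential defects out of a processed
    image into sub images. `candidate_locations` is a list of (x, y) centres
    of candidate areas; areas without indications of defects are simply not
    passed in, which corresponds to pruning them out."""
    h, w = processed.shape[:2]
    half = patch // 2
    sub_images = {}
    for (x, y) in candidate_locations:
        x0, x1 = max(0, x - half), min(w, x + half)
        y0, y1 = max(0, y - half), min(h, y + half)
        sub_images[(x, y)] = processed[y0:y1, x0:x1]
    return sub_images
```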
  • In step 250, sub images having a potential defect at the same location on the surface 112 of the object 110 are stitched together to generate a hypothesis of the potential defect at the location. A plurality of hypotheses of potential defects may be generated in a plurality of stitched images from sub images having potential defects at the same locations on the surface 112.
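  • The sketch below illustrates one possible reading of the stitching in step 250, combining the sub images of the same location from the three processed images side by side; the patent only states that the sub images are stitched together, so the stacking layout is an assumption.
```python
import numpy as np

def stitch_hypothesis(sub_x, sub_y, sub_xy):
    """Step 250: stitch the sub images of the same (x, y) location taken from
    image_x, image_y and image_xy into one hypothesis image. The sub images
    are cropped to a common square size and stacked side by side."""
    size = min(min(s.shape[:2]) for s in (sub_x, sub_y, sub_xy))
    crops = [s[:size, :size] for s in (sub_x, sub_y, sub_xy)]
    return np.hstack(crops)
```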
  • In step 260, the stitched images are classified with a classifier to determine whether the potential defects are true defects on the surface 112 of the object 110. The classifier may be trained with training data having characteristics of a true defect. According to an embodiment, a random forest classifier may be used to classify the potential defects. The random forest classifier may classify hypotheses with high efficiency and scalability in large-scale applications.
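  • A minimal sketch of the classification in step 260 using scikit-learn's RandomForestClassifier is shown below; flattened pixel intensities are used as features for illustration only, since the patent does not specify the feature representation or the classifier parameters.
```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_classifier(hypothesis_images, labels):
    """Step 260 (training): fit a random forest on stitched hypothesis images
    of a fixed size; labels are 1 for a true defect and 0 for a false alarm."""
    X = np.array([img.ravel() for img in hypothesis_images], dtype=np.float32)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, np.asarray(labels))
    return clf

def is_true_defect(clf, hypothesis_image):
    """Step 260 (inference): classify one stitched hypothesis image."""
    features = hypothesis_image.ravel().reshape(1, -1).astype(np.float32)
    return bool(clf.predict(features)[0])
```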
  • In step 270, an output of the classification is generated. The output may include the detected true defects and their locations. The output may be in the form of a report. The output may be an image with the detected true defects marked at their locations. The image may be one of the captured images or one of the processed images. The output may be stored in the image processor 180, displayed on the display device 190, or printed out by a printer.
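  • The sketch below shows one way to generate the output of step 270 by marking the detected true defects on an image and writing a simple text report; the box size, color and report format are illustrative assumptions.
```python
import cv2

def generate_output(image, true_defects, patch=100, report_path="defect_report.txt"):
    """Step 270: mark the detected true defects at their (x, y) locations on a
    captured or processed image and write a simple text report. A 3-channel
    image is assumed so the boxes can be drawn in red."""
    marked = image.copy()
    half = patch // 2
    with open(report_path, "w") as report:
        for (x, y) in true_defects:
            cv2.rectangle(marked, (x - half, y - half), (x + half, y + half),
                          color=(0, 0, 255), thickness=2)
            report.write(f"true defect at x={x}, y={y}\n")
    return marked
```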
  • FIG. 5 illustrates a schematic flow chart of the step 230 for processing images of the method 200 as illustrated in FIG. 4 according to an embodiment of the invention. Referring to FIG. 2 and step 220 of FIG. 4, the trigger controller 170 may sequentially turn the ambient illumination source 150 and the dark field illumination sources 160 on and off. The trigger controller 170 may trigger the imaging device 140 to sequentially capture images of the surface 112 of the object 110 under the ambient illumination condition and the dark field illumination conditions respectively. The dark field illumination sources 160 may be sequentially turned on and off by the trigger controller 170 so that images are sequentially captured by the imaging device 140 under sequential dark field illumination conditions. For example, the image captured with the ambient illumination source 150 turned on is denoted as image_amb. The image captured with the dark field illumination source 160 located at the x-positive position turned on is denoted as image_xpos. The image captured with the dark field illumination source 160 located at the x-negative position turned on is denoted as image_xneg. The image captured with the dark field illumination source 160 located at the y-positive position turned on is denoted as image_ypos. The image captured with the dark field illumination source 160 located at the y-negative position turned on is denoted as image_yneg.
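  • The capture sequence of step 220 could be driven as sketched below; camera.grab() and the lights controller are hypothetical placeholders for whatever camera SDK and strobe controller are actually used, and only the on/capture/off sequencing reflects the text.
```python
ILLUMINATIONS = ("amb", "xpos", "xneg", "ypos", "yneg")

def capture_sequence(camera, lights):
    """Step 220: turn each illumination source on in turn, capture an image,
    then turn the source off again. `camera.grab()`, `lights.on()` and
    `lights.off()` are hypothetical placeholders for the actual hardware API."""
    images = {}
    for name in ILLUMINATIONS:
        lights.on(name)                           # hypothetical strobe-controller call
        images[f"image_{name}"] = camera.grab()   # hypothetical camera call
        lights.off(name)
    return images
```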
  • In step 231 of the step 230, convolution operations are applied to the captured images with corresponding kernels. The convolution operations may filter noise in the captured images. The kernels are defined corresponding to predefined configurations of the dark field illumination sources 160 to enhance detection of a potential defect in the captured images based on a pattern consisting of a bright region and a shadow region under the predefined configurations of the dark field illumination sources 160. The shape of a potential defect, such as the size or length of the potential defect, may have a specific and predictable pattern consisting of a bright region and a shadow region in an image under different configurations of the dark field illumination sources 160.
  • Different kernels affect the filtered output images. For example, assuming a color code of black = 1, white = 0, grey = -1, the kernels for the five different images captured with certain predefined configurations of the dark field illumination sources 160 may be defined as follows:
  • kernel_xpos = [-1 -1 -1 -1 -1 1 1 0 0 0 0]
    kernel_xneg = [0 0 0 0 1 1 -1 -1 -1 -1 -1]
    kernel_ambx = [1 1 1 -1 -1 -1 -1 -1 1 1 1]
    kernel_ypos = [kernel_xpos]^T
    kernel_yneg = [kernel_xneg]^T
    kernel_amby = [kernel_ambx]^T
    kernel_ambx_y = a 6-by-6 matrix with every entry equal to -0.1667
    wherein kernel_xpos, kernel_xneg, kernel_ypos and kernel_yneg are the convolution operators for image_xpos, image_xneg, image_ypos and image_yneg respectively, and kernel_ambx, kernel_amby and kernel_ambx_y are the convolution operators for image_amb.
  • The kernels may be redefined to detect potential defects in the captured images once the configurations of the dark field illumination sources 160 change, such as orientations, intensities, etc. The kernels are redefined based on the different patterns of potential defects, consisting of a bright region and a shadow region, in the captured images under the different configurations of the dark field illumination sources 160.
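  • A minimal sketch of the convolution of step 231 is given below, using the kernels defined above with OpenCV's filter2D; note that filter2D computes correlation (the kernel is not flipped), so for the asymmetric kernels the arrays would need to be flipped if a strict convolution is intended.
```python
import cv2
import numpy as np

# Kernels as defined in the text: row vectors for the x-axis images, their
# transposes for the y-axis images, and a 6x6 kernel of -0.1667 for the
# ambient image used later at the magnitude stage.
kernel_xpos = np.array([[-1, -1, -1, -1, -1, 1, 1, 0, 0, 0, 0]], dtype=np.float32)
kernel_xneg = np.array([[0, 0, 0, 0, 1, 1, -1, -1, -1, -1, -1]], dtype=np.float32)
kernel_ambx = np.array([[1, 1, 1, -1, -1, -1, -1, -1, 1, 1, 1]], dtype=np.float32)
kernel_ypos = kernel_xpos.T
kernel_yneg = kernel_xneg.T
kernel_amby = kernel_ambx.T
kernel_ambx_y = np.full((6, 6), -0.1667, dtype=np.float32)

def convolve(images):
    """Step 231: filter each captured image with its corresponding kernel."""
    f = lambda img, k: cv2.filter2D(img.astype(np.float32), cv2.CV_32F, k)
    return {
        "xpos":   f(images["image_xpos"], kernel_xpos),
        "xneg":   f(images["image_xneg"], kernel_xneg),
        "ypos":   f(images["image_ypos"], kernel_ypos),
        "yneg":   f(images["image_yneg"], kernel_yneg),
        "ambx":   f(images["image_amb"],  kernel_ambx),
        "amby":   f(images["image_amb"],  kernel_amby),
        "ambx_y": f(images["image_amb"],  kernel_ambx_y),
    }
```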
  • In step 232, dilation operations are applied to the images filtered by the convolutions. Dilation operations are morphological operations that may probe and expand the shapes of potential defects in the filtered images using structuring elements. According to an embodiment, three-by-three flat structuring elements may be used in the dilation operations at step 232.
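  • The dilation of step 232 with a three-by-three flat structuring element may be sketched as follows; the dictionary-based bookkeeping is an illustrative convention carried over from the convolution sketch above.
```python
import cv2
import numpy as np

def dilate_all(filtered):
    """Step 232: dilate every convolved image with a three-by-three flat
    structuring element to probe and expand the shapes of potential defects."""
    selem = np.ones((3, 3), np.uint8)
    return {name: cv2.dilate(img, selem) for name, img in filtered.items()}
```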
  • In step 233, multiply operations are applied to the convolved and dilated images. The multiply operations may further filter noise in the images. With reference to FIG. 4, the images captured with the dark field illumination sources 160 located on the x-axis, namely image_xpos and image_xneg, after convolution with kernel_xpos and kernel_xneg respectively and dilation, are multiplied with image_amb, after convolution with kernel_ambx and dilation, into one image. The images captured with the dark field illumination sources 160 located on the y-axis, namely image_ypos and image_yneg, after convolution with kernel_ypos and kernel_yneg respectively and dilation, are multiplied with image_amb, after convolution with kernel_amby and dilation, into another image.
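  • One possible implementation of the multiply operations of step 233 is sketched below; the text states only that the x-axis images and the processed ambient image are multiplied into one image (and likewise for the y-axis images), so the element-wise combination shown here is an assumption.
```python
def multiply_axes(d):
    """Step 233: multiply the x-axis images with the processed ambient image
    into one image, and the y-axis images with the processed ambient image
    into another image. `d` holds the convolved-and-dilated images keyed as
    in the sketches above; plain element-wise products are assumed."""
    x_combined = d["xpos"] * d["xneg"] * d["ambx"]
    y_combined = d["ypos"] * d["yneg"] * d["amby"]
    return x_combined, y_combined
```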
  • In step 234, median filtering operations are applied to the multiplied images to further filter the images. Median filtering may preserve potential defects in the image while removing noise. The output images after the median filtering operations may be denoted as image_x and image_y and may be output as two output images of the image processing step 230.
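  • A minimal sketch of the median filtering of step 234 is shown below; the three-by-three window size is an assumed parameter, as the patent does not give the filter aperture.
```python
import cv2
import numpy as np

def median_filter(x_combined, y_combined, ksize=3):
    """Step 234: median-filter the multiplied images to remove remaining noise
    while preserving potential defects, yielding image_x and image_y."""
    image_x = cv2.medianBlur(x_combined.astype(np.float32), ksize)
    image_y = cv2.medianBlur(y_combined.astype(np.float32), ksize)
    return image_x, image_y
```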
  • In step 235, a magnitude operation is applied to the two images image_x and image_y after the median filtering operations. The magnitude operation may maximize the signal-to-noise ratio. The output image after the magnitude operation is multiplied with image_amb, after convolution using kernel_ambx_y and dilation, into one image denoted as image_xy.
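  • The magnitude operation of step 235 and the final multiplication with the processed ambient image may be sketched as follows; the Euclidean magnitude of image_x and image_y is an assumption, since the patent only refers to a magnitude operation.
```python
import numpy as np

def magnitude_and_combine(image_x, image_y, amb_xy_dilated):
    """Step 235: combine image_x and image_y by a magnitude operation and
    multiply the result with image_amb processed with kernel_ambx_y and
    dilation (the "ambx_y" entry in the earlier sketches), yielding image_xy."""
    magnitude = np.hypot(image_x, image_y)   # sqrt(image_x**2 + image_y**2)
    return magnitude * amb_xy_dilated
```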
  • In step 236, three processed images are output from the image processing step 230. The three output processed images may include image_x, image_y, and image_xy. The three output processed images are enhanced from the captured images to detect areas of potential defects at locations on the surface 112 of the object 110. Areas of potential defects in each of the three output processed images are cut into sub images in step 240. With reference to FIG. 4, step 240 is then followed by step 250 and step 260.
  • According to an aspect, the proposed system 100 and method use a plurality of image processing techniques, such as image enhancement and morphological operations, together with machine learning tools including hypothesis generation and classification, to accurately detect and quantify defects on any type of surface 112 of any object 110 without relying on strong assumptions about the characteristics of the defects. The proposed system 100 and method iteratively prune false defects and detect true defects by focusing on smaller areas of a surface 112 of an object 110. The proposed system 100 and method may be used in the power generation industry to accurately detect and quantify micro defects, such as micro cracks, on surfaces of generator wedges. The micro defects may be on the order of micrometers in size, which makes them difficult to detect using traditional inspection methods.
  • According to an aspect, the proposed system 100 and method may be operated automatically by a computer to detect micro defects on a surface 112 of an object 110, providing efficient automated micro defect detection. The proposed system 100 and method may provide a plurality of advantages in detecting micro defects on a surface 112 of an object 110, such as higher detection accuracy, cost reduction, and consistent detection performance.
  • Although various embodiments that incorporate the teachings of the present invention have been shown and described in detail herein, those skilled in the art can readily devise many other varied embodiments that still incorporate these teachings. The invention is not limited in its application to the exemplary embodiment details of construction and the arrangement of components set forth in the description or illustrated in the drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless specified or limited otherwise, the terms “mounted,” “connected,” “supported,” and “coupled” and variations thereof are used broadly and encompass direct and indirect mountings, connections, supports, and couplings. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings.
  • REFERENCE LIST
    • 100: System
    • 110: Object
    • 112: Surface of the Object
    • 113: V-shaped Defect
    • 114: First Portion of the V-shaped Defect
    • 115: Bright Region
    • 116: Second Portion of the V-shaped Defect
    • 117: Shadow Region
    • 120: Platform
    • 122: Motor
    • 124: Motor Controller
    • 130: Hood
    • 140: Imaging Device
    • 142: Lens
    • 144: Field of View
    • 150: Ambient Illumination Source
    • 160: Dark Field Illumination Source
    • 170: Trigger Controller
    • 180: Image Processor
    • 190: Display Device
    • 200: Method

Claims (20)

1. A method for detecting a defect on a surface of an object comprising:
supporting the object on a platform;
illuminating the surface of the object with a plurality of illumination sources comprising at least one ambient illumination source and at least one dark field illumination source;
capturing images of the surface of the object under illumination conditions with the illumination sources using an imaging device;
processing the captured images with a plurality of image operations using the image processor to detect an area of a potential defect at a location on the surface of the object;
cutting the area of the potential defect from the processed images to sub images using the image processor;
stitching the sub images together to generate a hypothesis of the potential defect at the location on the surface of the object using the image processor;
classifying the hypothesis in the stitched image with a classifier to determine whether the potential defect is a true defect using the image processor, wherein the classifier is trained with training data having characteristics of the true defect; and
generating an output of the classification comprising the detected true defect and the location on the surface of the object.
2. The method as claimed in claim 1, wherein the potential defect comprises a pattern consisting of bright region and shadow region in the images.
3. The method as claimed in claim 1, wherein the processing step further comprises implementing convolution operations to the captured images using corresponding kernels.
4. The method as claimed in claim 3, wherein the kernels are defined corresponding to configuration of the dark field illumination source to detect the potential defect.
5. The method as claimed in claim 3, wherein the processing step further comprises implementing dilation operations to the convoluted images.
6. The method as claimed in claim 5, wherein the processing step further comprises multiplying the convoluted and dilated images to one image.
7. The method as claimed in claim 6, wherein the processing step further comprises implementing median filtering operations to the multiplied image, and wherein the median filtered image is the output processed image.
8. The method as claimed in claim 7, wherein the processing step further comprises implementing magnitude operation to the median filtered image.
9. The method as claimed in claim 8, wherein the processing step further comprises multiplying the magnitude image with one of the captured images under ambient illumination condition and processed with convolution and dilation operations, and wherein the multiplied image is the output processed image.
10. The method as claimed in claim 1, wherein the illumination sources are sequentially turned on and off.
11. A system for detecting a defect on a surface of an object comprising:
a platform for supporting the object;
a plurality of illumination sources comprising at least one ambient illumination source and at least one dark field illumination source for illuminating the surface of the object;
an imaging device for capturing images of the surface of the object under illumination conditions with the illumination sources; and
an image processor for:
processing the captured images with a plurality of image operations to detect an area of a potential defect at a location on the surface of the object;
cutting the area of the potential defect from the processed images to sub images;
stitching the sub images together to generate a hypothesis of the potential defect at the location on the surface of the object;
classifying the hypothesis in the stitched image with a classifier to determine whether the potential defect is a true defect, wherein the classifier is trained with training data having characteristics of the true defect; and
generating an output of the classification comprising the detected true defect and the location on the surface of the object.
12. The system as claimed in claim 11, wherein the potential defect comprises a pattern consisting of bright region and shadow region in the images.
13. The system as claimed in claim 11, wherein the processing operations comprise convolution operations to the captured images using corresponding kernels.
14. The system as claimed in claim 13, wherein the kernels are defined corresponding to configuration of the dark field illumination source to detect the potential defect.
15. The system as claimed in claim 13, wherein the processing operations comprise dilation operations to the convoluted images.
16. The system as claimed in claim 15, wherein the processing operations comprise multiplying the convoluted and dilated images to one image.
17. The system as claimed in claim 16, wherein the processing operations comprise median filtering operations to the multiplied image, and wherein the median filtered image is the output processed image.
18. The system as claimed in claim 17, wherein the processing operations comprise magnitude operation to the median filtered image.
19. The system as claimed in claim 18, wherein the processing step further comprises multiplying the magnitude image with one of the captured images under ambient illumination condition and processed with convolution and dilation operations, and wherein the multiplied image is the output processed image.
20. A computer program stored in a non-transitory computer readable medium and executable in a computer for performing a method of detecting a defect on a surface of an object, wherein the computer stores images of the surface of the object under illumination conditions with illumination sources comprising at least one ambient illumination source and at least one dark field illumination source, wherein the method comprises steps of:
processing the captured images with a plurality of image operations to detect an area of a potential defect at a location on the surface of the object;
cutting the area of the potential defect from the processed images to sub images;
stitching the sub images together to generate a hypothesis of the potential defect at the location on the surface of the object;
classifying the hypothesis in the stitched image with a classifier to determine whether the potential defect is a true defect, wherein the classifier is trained with training data having characteristics of the true defect; and
generating an output of the classification comprising the detected true defect and the location on the surface of the object.
US15/678,335 2017-08-16 2017-08-16 Method and system for detecting defects on surface of object Active 2037-08-22 US10215714B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/678,335 US10215714B1 (en) 2017-08-16 2017-08-16 Method and system for detecting defects on surface of object
DE102018211453.6A DE102018211453B4 (en) 2017-08-16 2018-07-11 Method and system for detecting defects on a surface of an object


Publications (2)

Publication Number Publication Date
US20190056333A1 true US20190056333A1 (en) 2019-02-21
US10215714B1 US10215714B1 (en) 2019-02-26

Family

ID=65235419

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/678,335 Active 2037-08-22 US10215714B1 (en) 2017-08-16 2017-08-16 Method and system for detecting defects on surface of object

Country Status (2)

Country Link
US (1) US10215714B1 (en)
DE (1) DE102018211453B4 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11703460B2 (en) 2019-07-09 2023-07-18 Kla Corporation Methods and systems for optical surface defect material characterization

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5917935A (en) * 1995-06-13 1999-06-29 Photon Dynamics, Inc. Mura detection apparatus and method
US5828778A (en) * 1995-07-13 1998-10-27 Matsushita Electric Industrial Co., Ltd. Method and apparatus for analyzing failure of semiconductor wafer
US6780656B2 (en) * 2000-09-18 2004-08-24 Hpl Technologies, Inc. Correction of overlay offset between inspection layers
US7525659B2 (en) * 2003-01-15 2009-04-28 Negevtech Ltd. System for detection of water defects
US20060192949A1 (en) * 2004-12-19 2006-08-31 Bills Richard E System and method for inspecting a workpiece surface by analyzing scattered light in a back quartersphere region above the workpiece
EP2147296A1 (en) * 2007-04-18 2010-01-27 Micronic Laser Systems Ab Method and apparatus for mura detection and metrology
JP5579588B2 (en) * 2010-12-16 2014-08-27 株式会社日立ハイテクノロジーズ Defect observation method and apparatus
US9885934B2 (en) * 2011-09-14 2018-02-06 View, Inc. Portable defect mitigators for electrochromic windows
US9838583B2 (en) 2015-09-21 2017-12-05 Siemens Energy, Inc. Method and apparatus for verifying lighting setup used for visual inspection
US10234402B2 (en) * 2017-01-05 2019-03-19 Kla-Tencor Corporation Systems and methods for defect material classification

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110827249A (en) * 2019-10-28 2020-02-21 上海悦易网络信息技术有限公司 Electronic equipment backboard appearance flaw detection method and equipment
WO2021082921A1 (en) * 2019-10-28 2021-05-06 上海悦易网络信息技术有限公司 Back cover appearance defect detection method for electronic apparatus, and apparatus
CN110956627A (en) * 2019-12-13 2020-04-03 智泰科技股份有限公司 Intelligent optical detection sample characteristic and flaw intelligent lighting image capturing method and device
US20210295165A1 (en) * 2020-03-18 2021-09-23 Donghua University Method for constructing efficient product surface defect detection model based on network collaborative pruning
US11966847B2 (en) * 2020-03-18 2024-04-23 Donghua University Method for constructing efficient product surface defect detection model based on network collaborative pruning
CN114119343A (en) * 2021-10-21 2022-03-01 浙江大学 Dark field image storage method utilizing sparse matrix characteristic
CN114723748A (en) * 2022-06-06 2022-07-08 深圳硅山技术有限公司 Detection method, device and equipment of motor controller and storage medium

Also Published As

Publication number Publication date
DE102018211453A1 (en) 2019-02-21
US10215714B1 (en) 2019-02-26
DE102018211453B4 (en) 2023-11-02

Similar Documents

Publication Publication Date Title
US10192301B1 (en) Method and system for detecting line defects on surface of object
US10215714B1 (en) Method and system for detecting defects on surface of object
CN108445007B (en) Detection method and detection device based on image fusion
EP3531114B1 (en) Visual inspection device and illumination condition setting method of visual inspection device
JP6917781B2 (en) Image inspection equipment
US10890537B2 (en) Appearance inspection device, lighting device, and imaging lighting device
EP3243166B1 (en) Structural masking for progressive health monitoring
JP6967373B2 (en) Image inspection equipment
JP7188870B2 (en) Image inspection equipment
CN114136975A (en) Intelligent detection system and method for surface defects of microwave bare chip
JP5417197B2 (en) Inspection apparatus and inspection method
JP2008170256A (en) Flaw detection method, flaw detection program and inspection device
JP6917780B2 (en) Image inspection equipment
CN110596118A (en) Print pattern detection method and print pattern detection device
JP2011174896A (en) Imaging apparatus and method
CN211403010U (en) Foreign body positioning device for display panel
KR101375213B1 (en) Method for eccentricity measuring of lens type led module using concentric type
JP2008026149A (en) Visual examination device
US6758384B2 (en) Three-dimensional soldering inspection apparatus and method
JP6792283B2 (en) Visual inspection equipment
JP2019066222A (en) Visual inspection device and visual inspection method
JP2003329428A (en) Device and method for surface inspection
JP2018205001A (en) Image inspection device
JP2018205000A (en) Image inspection device
CN116685444A (en) Tool inspection device, tool inspection program, and tool inspection method for robot arm

Legal Events

Date Code Title Description
AS Assignment

Owner name: SIEMENS ENERGY, INC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BAILEY, KEVIN P.;REEL/FRAME:043602/0387

Effective date: 20170801

Owner name: SIEMENS CORPORATION, NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WU, ZIYAN;PANDA, RAMESWAR;ERNST, JAN;SIGNING DATES FROM 20170803 TO 20170804;REEL/FRAME:043602/0606

Owner name: SIEMENS ENERGY, INC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SIEMENS CORPORATION;REEL/FRAME:043602/0729

Effective date: 20170828

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4