CN106959293B - System and method for detecting defects on reflective surface through vision system - Google Patents

System and method for detecting defects on reflective surface through vision system

Info

Publication number
CN106959293B
CN106959293B (application CN201611000662.4A)
Authority
CN
China
Prior art keywords
camera
light
illumination
optic
illuminator
Prior art date
Legal status
Active
Application number
CN201611000662.4A
Other languages
Chinese (zh)
Other versions
CN106959293A (en
Inventor
F·罗斯塔米
J·F·菲尔哈伯
F·钱
Current Assignee
Cognex Corp
Original Assignee
Cognex Corp
Priority date
Filing date
Publication date
Priority claimed from US 15/349,131 (US11493454B2)
Application filed by Cognex Corp filed Critical Cognex Corp
Publication of CN106959293A
Application granted
Publication of CN106959293B


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/01: Arrangements or apparatus for facilitating the optical investigation
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/21: Polarisation-affecting properties
    • G01N21/84: Systems specially adapted for particular applications
    • G01N21/88: Investigating the presence of flaws or contamination
    • G01N21/8806: Specially adapted optical and illumination features
    • G01N21/8851: Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N21/89: Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
    • G01N21/8901: Optical details; Scanning details
    • G01N21/8903: Optical details; Scanning details using a multiple detector array
    • G01N21/8914: Investigating the presence of flaws or contamination in moving material, characterised by the material examined
    • G01N21/892: Investigating the presence of flaws or contamination in moving material, characterised by the flaw, defect or object feature examined
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10: Beam splitting or combining systems
    • G02B27/14: Beam splitting or combining systems operating by reflection only
    • G02B27/149: Beam splitting or combining systems operating by reflection only using crossed beamsplitting surfaces, e.g. cross-dichroic cubes or X-cubes
    • G01N2021/0106: General arrangement of respective parts
    • G01N2021/0112: Apparatus in one mechanical, optical or electronic block
    • G01N2021/8829: Shadow projection or structured background, e.g. for deflectometry
    • G01N2021/8841: Illumination and detection on two sides of object
    • G01N2021/8848: Polarisation of light
    • G01N2021/8887: Scan or image signal processing based on image processing techniques
    • G01N2021/8908: Strip illuminator, e.g. light tube
    • G01N2021/8924: Dents; Relief flaws

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Pathology (AREA)
  • Immunology (AREA)
  • Engineering & Computer Science (AREA)
  • Textile Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention relates to a system and method for detecting and imaging defects on a reflective (specular) surface using a knife-edge technique, wherein a camera aperture or an external device forms a physical knife-edge structure within the optical path. The knife edge advantageously masks light reflected from the illuminated reflective surface at a predetermined slope value and allows light deflected at a different slope to reach the vision system camera sensor. Light reflected from the flat portions of the surface is mostly blocked by the knife edge, while most of the light reflected from the sloped portions of a defect passes into the entrance aperture. The illumination beam is angled relative to the optical axis of the camera so as to strike the inspected surface at a suitable angle of incidence. The surface may be stationary or moving relative to the camera.

Description

System and method for detecting defects on reflective surface through vision system
Cross reference to related patent applications
The present application claims the benefit of the following applications: co-pending U.S. provisional application Serial No. 62/255,360, entitled "SYSTEM AND METHOD FOR DETECTING DEFECTS ON A SPECULAR SURFACE WITH A VISION SYSTEM," filed November 13, 2015; U.S. provisional application Serial No. 62/274,094, entitled "SYSTEM AND METHOD FOR DETECTING DEFECTS ON A SPECULAR SURFACE WITH A VISION SYSTEM," filed December 31, 2015; and U.S. provisional application Serial No. 62/404,431, entitled "SYSTEM AND METHOD FOR DETECTING DEFECTS ON A SPECULAR SURFACE WITH A VISION SYSTEM," filed October 5, 2016, the teachings of each of which are expressly incorporated herein by reference.
Technical Field
The present invention relates to machine vision systems for inspecting the surface of an object, and more particularly, to vision systems for inspecting reflective surfaces.
Background
Machine vision systems, also referred to herein as "vision systems," are used to perform a variety of tasks in manufacturing environments. Typically, a vision system consists of one or more cameras with image sensors (or "imagers") that acquire grayscale or color images of a scene containing an object being produced. Images of the object can be analyzed to provide data/information to users and to associated production processes. The data generated from the image is typically analyzed and processed by the vision system in one or more vision system processors, which can be purpose-built or instantiated as one or more software applications within a general-purpose computer (e.g., a PC, laptop, tablet, or smartphone). Some types of tasks performed by vision systems include inspection of objects and surfaces that are either stationary or in motion, such as on a conveyor or motion stage.
Surface inspection of objects with polished, reflective surfaces has proven challenging for vision systems. Typically, reflections from the surface cause defects and surface imperfections (e.g., small pits/valleys and/or bumps/peaks), which represent small differences in slope over small areas relative to the surrounding surface, to appear washed out due to the large amount of reflected light entering the camera. One approach for finding surface defects on reflective surfaces is dark-field illumination, in which the illumination light projected onto the object is not collected by the objective lens; this serves to highlight any surface defects that scatter light. However, this technique has limitations with respect to installation and use in environments that include relative movement between the object and the camera assembly.
Disclosure of Invention
The shortcomings of the prior art are overcome by the present invention, which provides a system and method for detecting and imaging defects on a reflective surface using a knife-edge technique, wherein a camera aperture or an external device forms a physical knife-edge structure within the optical path that advantageously masks light reflected from the illuminated reflective surface at a predetermined slope value and causes light deflected at a different slope to reach the vision system camera sensor. In one embodiment, the illuminator is focused so that its light emerges from an area of the illuminator larger than the inspected area of the surface and converges onto that area. The rays are reflected by the (reflective) surface area and continue converging to a spot either near the entrance aperture of the camera or on an aperture stop (e.g., an adjustable aperture) inside the camera. At either location, most of the light reflected from the flat portions of the surface is blocked by the knife edge or aperture stop. Conversely, most of the light reflected from the sloped portions of a defect passes into the entrance aperture. The illumination beam is angled relative to the optical axis of the camera so as to strike the inspected surface at a suitable angle of incidence. Illustratively, the illuminator can include a linear polarizer that transmits polarized light to the object surface. The object can be multilayered and can comprise, for example, a polarizing layer. The polarized light reflects from the surface and enters the camera optics/camera sensor through a crossed polarizer. Illustratively, the surface can be stationary and imaged by a 2D sensor array, or the surface can move relative to the camera, which can be defined as a line scan camera having a line scan sensor.
In one illustrative embodiment, a system and method for imaging defects on a reflective surface of an object is provided. The surface is imaged by a vision system camera having an image sensor and optics and defining an optical axis. An illuminator assembly projects a structured light beam onto the surface at a predetermined angle that is not parallel to the optical axis. A knife-edge element is associated with the optics and variably blocks a portion of the maximum field of view of the optics. The knife-edge element and the predetermined angle are each arranged such that light passed by the optics to the sensor substantially represents the sloped peaks and valleys, or waves and ripples, of defect features on the surface, while reflected light from around the sloped defect features is blocked by the knife-edge element. Illustratively, the knife-edge element comprises a variable aperture in the optics, and the predetermined angle is related to the slope of deformations from a planar surface. In an embodiment, the sensor is a 2D sensor and the object is stationary relative to the camera. Alternatively, the sensor defines a line scan camera arrangement, the object moves relative to the camera, and the illuminator assembly projects a line of illumination. The use of an illumination line projected onto the surface allows inspection of moving parts and of parts much larger than the area covered by a single image. In an embodiment, the illuminator can define an infrared or near-infrared wavelength range in addition to visible light, and the object can define layers including an anti-reflective coating and/or a polarizing layer, in which case the illumination can be polarized and the optics can include a polarizing filter. By way of non-limiting example, the object can be an AMOLED display screen, in which the polarizing layer is a 1/4λ retarder and the polarizing filter is defined as a crossed polarizing filter. The illuminator can include a polarizer for polarizing the illumination light, and the optics can include a polarizing filter. The illumination source can project a converging beam that converges toward a point near the knife-edge structure. The knife-edge structure can be defined as an external structure (located between the optics and the object) positioned in the light path in front of the optics. Illustratively, the illuminator assembly projects light through a beam splitter located on the optical axis of the vision system camera, such that off-axis illumination from the illuminator assembly is projected onto the object surface coincident with the optical axis. In another embodiment, the illuminator assembly defines a plurality of illumination sources, each projecting light onto a respective beam splitter located on the optical axis of the vision system camera, such that off-axis illumination from the illuminator assembly is projected by each beam splitter onto the object surface coincident with the optical axis.
Illustratively, the knife-edge element can be defined as a blocking structure within the optics that is located on the optical axis. The blocking structure can be disposed on a mask member adjacent to the front face of the optics. The blocking structure can be arranged to selectively enhance or suppress scattered light associated with the features. The blocking structure can define a line extending across the optics along an elongation direction and can have a width, transverse to the elongation direction, related to the size of the focused illumination spot on the optics. The elongation direction can be defined by the orientation of the features. The mask member can include a peripheral opaque region on each opposing side of the line, with a linear aperture between the line and the opaque region. Alternatively, the blocking structure can comprise a disk having a center located approximately on the optical axis and a diameter related to the size of one or more of the features. An annular structure can surround the disk, defining an annular aperture between the disk and the inner periphery of the annular region. The annular region can be arranged to suppress scattered light. Illustratively, the mask member defines at least one of: a snap-on or screw-on lens cover, a decal applied over the front of the optics, or a variable-mode electro-optical mechanism located on the optics. In an embodiment, the arrangement can include a first polarizer disposed in conjunction with the optics and a second polarizer disposed in conjunction with the illuminator assembly.
Drawings
The invention is described below with reference to the accompanying drawings, in which:
FIG. 1 is a diagram of an exemplary vision system having a camera that includes a 2D pixel array for acquiring images of an exemplary stationary object surface having defects, the vision system including an illuminator and an aperture control configured to resolve surface deformation defect features on the reflective surface of the stationary object;
FIG. 2 is a diagram of an exemplary vision system having a camera that includes a line-scan (1D) pixel array for acquiring images of an exemplary moving object surface having defects, the vision system including an illuminator and an aperture control configured to resolve surface deformation defect features on the reflective surface of the moving object;
FIG. 3 is a graphical representation of the application of the knife-edge effect to resolve surface deformation defect features in the arrangements of FIG. 1 or FIG. 2;
FIG. 4 is an illustration of the vision system arrangement of FIG. 2 scanning an exemplary object in the form of an AMOLED display screen configured as a plurality of layers including an exemplary polarizing layer, wherein the object includes a peak and valley defect feature within at least one of the layers;
FIG. 5 is a side view of the vision system arrangement and exemplary scanned object of FIG. 4, showing the optical path of the illumination light and the collected reflected rays for a peak defect feature;
FIG. 6 is a side view of the vision system arrangement and exemplary scanned object of FIG. 4, showing the optical path of the illumination light and the collected reflected rays about valley defect features;
FIG. 7 is a side view of a vision system arrangement showing the use of a knife edge structure associated with a camera and optics, according to an exemplary embodiment;
FIG. 8 is a side view of a vision system arrangement and an exemplary scanned object showing multiple illumination sources and associated knife edge assemblies, in accordance with another embodiment;
FIGS. 9 and 10 are illustrations of vision systems operating in accordance with the description of FIG. 1, each including a beam splitter that provides off-axis illumination light coincident with the optical axis of the camera, using one illuminator and two illuminators, respectively;
FIG. 11 shows an image of an exemplary object surface having visible defect features, each displayed as a shadowgraph relative to the surrounding surface, acquired using a vision system according to embodiments herein;
FIG. 12 is a flow diagram of a process for determining waviness on a light-reflective surface of an object using off-axis illumination and a knife edge device, according to an embodiment;
FIGS. 13-15 are exemplary histograms of image intensity showing the response of combined smooth and rippled surface features, of rippled surface features, and of smooth surface features, respectively;
FIG. 16 is an exemplary optical arrangement of an illuminator and an image sensor for illuminating and scanning a row of a surface, according to an embodiment;
FIG. 17 is a perspective view of the illustration of FIG. 16;
FIG. 18 is an illustration of an exemplary vision system camera and illuminator in which a knife edge element in the form of a mask is used that includes fixed or variable filter elements in front of or within the optical components of the camera, in accordance with an exemplary embodiment;
FIG. 19 is a front view of a mask for defining an exemplary non-transparent centerline within a linear aperture in conjunction with the exemplary camera of FIG. 18;
FIG. 20 is a front view of a mask used to define an exemplary disk element located at the center of an optical assembly;
FIGS. 21-25 are front views of exemplary non-transparent disk elements and outer annular elements, separated by an annular aperture therebetween, each defining a predetermined diameter;
FIG. 26 is an image of an exemplary object (e.g., a touch screen surface) imaged by the vision system camera and illuminator device with mask of FIG. 18, showing details of a corrugated surface on the exemplary object;
FIG. 27 is an image of the exemplary object of FIG. 26 imaged by the vision system camera and illuminator arrangement of FIG. 18, further illustrating details (e.g., a sensor array) on the touch screen surface; and
FIG. 28 is an image of the exemplary object of FIG. 26 imaged by the vision system camera and illuminator arrangement of FIG. 18, showing highly precise details of the sensor array of FIG. 27.
Detailed Description
FIG. 1 is an illustration of an exemplary vision system arrangement 100 in which, in accordance with an exemplary embodiment, the scene includes a reflective object 110 disposed stationarily relative to a stationary vision system camera 120. In this embodiment, the vision system camera 120 includes a two-dimensional (2D) image sensor S comprising an N × M pixel array in, for example, a rectangular arrangement. The camera includes an optics package OC, which can comprise any acceptable lens assembly (e.g., a C-mount, F-mount, or M12-mount lens). In this embodiment, the lens includes a manual or automatic aperture control (e.g., an iris diaphragm), into which a user or other external controller can input appropriate aperture settings 124 (as described further below). The sensor S and optics package OC collectively define an optical axis OA that is generally perpendicular to the generalized surface plane of the object 110. The arrangement 100 includes an illuminator 130, which projects a parallel beam 132 of light (e.g., through illuminator optics OI) onto the surface of the object 110. In one embodiment, the beam 132 is sized so as to remain largely on the object and avoid extending onto the rest of the scene. The beam 132 is oriented at an angle A to the general plane of the reflective face of the object 110; this angle is non-parallel (typically acute) with respect to the plane normal N. The normal N is generally parallel to the camera optical axis OA.
As shown, the surface of the object 110 includes defect features 140, which can define downwardly opening valleys (also termed "pits") or upwardly projecting peaks (also termed "bumps"); these can be imaged effectively using the arrangements and techniques described below. In an exemplary embodiment, image data from the illuminated scene and object 110 are transmitted to the vision system processor 150. The processor 150 can be integrated directly into one or more camera assemblies or, as depicted, can reside on a separate computing device 160 having suitable user interfaces (e.g., mouse 162, keyboard 164) and display functionality (screen and/or touchscreen 166). The computing device 160 can comprise a server, a PC, a laptop, a smartphone, a dedicated processing device, or another type of processor with associated memory, network interfaces, data storage, and the like, as should be apparent to those skilled in the art.
The vision system processor 150 can include various functional software processes and modules. These processes/modules can include a controller 152 for the various parameters that control the camera/sensor and the illuminator 130 (via illumination control information 170). The vision system processor 150 also includes various vision system tools 154, such as feature detectors (e.g., edge detectors, corner detectors, blob tools, etc.). These tools are used to analyze surface features in the image and, for example, to locate the defect features 140 under the illumination and optical conditions described below. The vision system processor further includes a defect finder/finding module 156, which employs the various tools 154 to locate and identify defects on the surface. The defects can be quantified, and appropriate information 172 can be sent to a downstream process (e.g., a part-reject and alert process) 174.
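As a purely illustrative sketch of how such a defect finder/finding module might be prototyped outside the patented system, the following Python fragment thresholds the bright defect responses produced by the knife-edge arrangement and groups them into candidate defects. The function and parameter names (find_defects, bright_thresh, min_defect_area) are hypothetical, and the numeric values are assumptions rather than values taken from this disclosure.

    import numpy as np
    from scipy import ndimage

    def find_defects(image, bright_thresh=200, min_defect_area=25):
        """Locate candidate defect features in a knife-edge image.

        With the knife-edge arrangement, flat regions image as dark/mid gray while
        sloped defect facets appear as bright patches.  This sketch thresholds the
        bright responses and groups connected pixels into candidate defects.
        """
        labels, num = ndimage.label(image >= bright_thresh)   # connected bright regions
        defects = []
        for idx in range(1, num + 1):
            ys, xs = np.nonzero(labels == idx)
            if ys.size < min_defect_area:                     # ignore isolated noise pixels
                continue
            defects.append({"centroid_yx": (float(ys.mean()), float(xs.mean())),
                            "area_px": int(ys.size)})
        return defects

    # Synthetic example: an 8-bit image with one bright defect facet
    img = np.full((200, 200), 30, np.uint8)
    img[90:110, 90:110] = 230
    print(find_defects(img))    # one defect of 400 px centered near (99.5, 99.5)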
As described further below, in various embodiments, the camera 120 may include a polarizing filter P (within the optical path). Another filter PI may be provided on the illuminator to facilitate delivery of the polarized light beam onto the surface.
Referring to FIG. 2, a vision system arrangement 200 is shown in which an exemplary object 210 having a reflective surface is directed along a direction of movement (arrow M) through the imaged scene. Illustratively, the object is moved by a conveyance 212, which can comprise a motion stage or a conveyor. In this embodiment, the camera 220 includes a sensor S1 and optics OC1. Illustratively, the sensor S1 is provided as a 1D pixel array (or a 2D array in which one row of pixels is processed), thereby defining a line scan camera. In one embodiment, the camera is operable to read out one or more rows of a 2D pixel array. In such arrangements, temporal techniques, which will be clear to those skilled in the art, are used to combine the pixel information from successive rows into a single image. Thus, the object can be mechanically scanned while the imager is continuously read out to form a progressively scanned image of the part, and the imaged part can be much larger than the high-contrast area of the imaging/illumination system. It should be noted that, in an alternative embodiment, the imaging/illumination system can instead be scanned as a unit while the part remains stationary; a line scan can thereby provide high contrast over an area of essentially unlimited size, as opposed to the stationary arrangement of FIG. 1, in which the high contrast is concentrated on a point or small area. Both moving-object and stationary-object arrangements have advantages in particular applications.
As described above, the optics OC1 include an aperture control 224. The scene is illuminated by an illuminator 230, which illustratively projects a line 232 of light onto the scene and the surface of the object 210. Notably, the line extends parallel to the direction of the sensor pixel array and orthogonal to the movement direction M. The optical axis OA1 of the camera sensor S1 and optics OC1 is generally perpendicular/normal to the generalized surface plane of the object, and the projected "fan" of light rays is oriented at a non-parallel (acute) angle A1 to the surface plane normal N1. The camera sends image data 240, in the form of a series of scan lines, to a vision system processor 250, which operates in a manner similar to the processor 150 (FIG. 1) described above. In this embodiment, the conveyance also sends movement information 242 (e.g., in the form of encoder ticks or pulses) to the processor 250. This information is used to register each scan line relative to the physical coordinate space of the object (e.g., based on a predetermined physical movement increment associated with each pulse along the movement direction M). This enables the processor 250 to construct a 2D image of the object from a series of 1D pixel rows. The processor 250 can be part of a suitable processing device (not shown, but similar to the device 160 above). The processor 250 also provides illumination control 270 and delivers appropriate defect information relating to the object surface based on the operation of the imager and illumination control 252, the vision system tools 254, and the defect finder 256. These operations are similar to those performed by the processor 150 (FIG. 1) described above. The camera optics can include a polarizer P1, and the illuminator 230 can likewise include a polarizer; the function of these is described below.
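The registration of scan lines against encoder pulses described above can be illustrated with a short, hedged sketch; the line-grabbing and encoder-waiting callables (grab_line, wait_for_encoder_pulse) are hypothetical placeholders for whatever camera/transport interface is actually used.

    import numpy as np

    def build_scanned_image(grab_line, wait_for_encoder_pulse, num_lines, line_width):
        """Assemble a 2D image from a 1D line scan sensor.

        Each encoder pulse corresponds to a fixed physical movement increment along
        direction M, so stacking one scan line per pulse yields rows registered to
        the object's physical coordinate space.
        """
        image = np.zeros((num_lines, line_width), dtype=np.uint8)
        for row in range(num_lines):
            wait_for_encoder_pulse()        # block until the object has advanced one increment
            image[row, :] = grab_line()     # read one row of pixels from sensor S1
        return image

    # Minimal simulated usage with stand-in camera/encoder callables
    rng = np.random.default_rng(0)
    img = build_scanned_image(lambda: rng.integers(0, 256, 512, dtype=np.uint8),
                              lambda: None, num_lines=100, line_width=512)
    print(img.shape)    # (100, 512)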
In one embodiment, the illuminator can be defined as an LED-based illuminator, a fiber-optic illuminator, or any other acceptable illuminator. In either arrangement, the light can be provided as visible, infrared, or near-infrared light, as well as other wavelengths. It should be noted that, in various embodiments, the relative movement between the object and the camera can be achieved by moving the object, moving the camera, or moving both. The movement can be linear or arcuate.
Optical relationship
Having described two exemplary arrangements 100 and 200 for acquiring images of objects having a reflective surface containing surface imperfections, such as pit and peak defects, the operation of the system with various exemplary surfaces is now described in further detail. The following description applies to both arrangements. As shown in the exemplary (schematic) arrangement 300 of FIG. 3, an exemplary surface 310 comprising peaks 312 and valleys 314 is illuminated by a light source oriented at a non-right angle. The illuminator 320 emits light from an area (represented by width IA) that is generally larger than the illuminated spot on the surface 310 (represented by width IS). The illuminated spot is the area to be inspected, which can include peaks and valleys. The illumination rays 322 converge on the spot based on suitable focusing optics (which can be conventional) in the illuminator 320, and the reflected rays 324 continue converging toward a spot 326, either near the entrance aperture of the camera or on an aperture stop inside the camera. At either location, light reflected from the flat portions of the surface is largely blocked by the knife-edge structure 330 and/or the aperture stop (e.g., an iris diaphragm). Based on the relative tilt of the illumination beam and the camera optical axis 342 with respect to the surface 310, light (rays 344) reflected from the facing sloped portions of the peak and valley defects mostly passes the knife-edge structure 330 and enters the entrance aperture of the optics 340 to reach the sensor 350. Light from the opposite slope of each defect (rays 360) is reflected entirely away from the entrance aperture/optics 340 and/or into the knife-edge structure 330.
The resulting image 370 of the surface 310 has the form of a shadowgraph, in which an imaged peak 372 is half bright (based on incident rays 344 from the facing slope) and half dark (based on blocked rays 360 from the opposing slope), while an imaged valley is half dark (based on blocked rays 360) and half bright (based on rays 344 from the facing slope). The system can distinguish peaks from valleys based on which half is dark and which is bright; as shown, a bright left half represents a peak and a bright right half represents a valley. The areas around the peaks and valleys appear dark, or less bright, relative to the reflective sloped areas. The overall effect is that slopes facing toward the camera optical axis reflect light that reaches the sensor, so a change in slope (the first derivative of the surface) produces a high-contrast intensity change at the defect, while light from the surrounding flat area of the defect is effectively attenuated by the combined tilt and blocking effect of the knife edge.
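The slope-dependent blocking described above can be checked with a simple 2D calculation: a local slope of s degrees rotates the surface normal and deflects the specular reflection by 2*s degrees, moving the ray either into the open portion of the aperture or farther away from it. The angles in the sketch below are arbitrary assumptions chosen only to illustrate the effect; they are not design values from this disclosure.

    BEAM_TILT_DEG = 6.0     # assumed illumination tilt from the camera optical axis
    KNIFE_EDGE_DEG = 5.5    # rays arriving farther off-axis than this are blocked (assumed)

    def reflected_offset_deg(slope_deg):
        """Angular offset of the reflected ray from the camera axis.

        A flat surface (slope 0) returns the beam at the full specular offset; a
        local slope of s degrees rotates the surface normal and therefore deflects
        the reflected ray by 2 * s degrees.
        """
        return BEAM_TILT_DEG - 2.0 * slope_deg

    def reaches_sensor(slope_deg):
        """True if the reflected ray clears the knife edge and enters the open aperture band."""
        offset = reflected_offset_deg(slope_deg)
        return 0.0 <= offset <= KNIFE_EDGE_DEG

    for s in (0.0, 0.5, 1.5, -0.5):
        print(f"slope {s:+.1f} deg -> offset {reflected_offset_deg(s):+.1f} deg, "
              f"reaches sensor: {reaches_sensor(s)}")

Running the loop shows the flat surface (slope 0) and the opposing slope blocked, while the facing slopes pass, which is what produces the half-bright/half-dark shadowgraph described above.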
It should be clear to those skilled in the art that setting up the arrangement 300 entails directing the illumination beam toward the surface at the appropriate tilt and aiming the camera at the surface. The knife edge is then set, either by positioning an external structure or by adjusting an adjustable aperture, to obtain the level of light blockage needed to enhance defects in the image field.
Further use
The vision system arrangements described above can operate on a variety of objects and surfaces. FIG. 4 shows a line scan vision arrangement 400 in which the line scan vision system camera 220 (described in FIG. 2) is paired with a moving (arrow M) object 420, which passes under the line illuminator 230 described above. In one exemplary embodiment, the illuminator 230 can include a linear polarizer PI1, and the camera optics OC1 can include a crossed polarizing filter P1. The object can be defined as a reflective, layered surface, for example an AMOLED display screen. This example includes a top anti-reflective coating 422 on a top glass or sapphire layer 424, which overlies a polarizer and/or other filters and a coating 426 located on the active display layer 428. The active layer 428 includes an exemplary upwardly projecting peak defect 430 and a downwardly recessed valley defect 440.
Referring also to FIGS. 5 and 6, which show the illuminator beam 510 oriented at an oblique angle AP1 relative to the camera optical axis OA1, the exemplary (active) AMOLED layer 428 can define a conventional polarization-rotating layer, such as a 1/4λ retarder. Thus, by transmitting a polarized illumination beam 510, the arrangement 400 can take advantage of this inherent property of the object. The upper surface reflects some of the illumination light 510, typically by Fresnel reflection. This light is largely blocked by the edge of the entrance aperture of the camera optics OC1. The remaining light that does enter the aperture is blocked by the crossed polarizer P1 at the entrance aperture, which is oriented at 90 degrees to the illumination polarizer PI1. Illumination light that penetrates the top layers 422, 424 passes through the 1/4λ retarder (converting from linear to circular polarization on the first pass), reflects off the active layer 428, and then passes through the 1/4λ retarder a second time, returning to linear polarization rotated by 90 degrees. The reflected beam 520 exits the surface, passes through the polarizer P1, and enters the entrance aperture of the camera optics OC1. In this manner, only light reaching the layer containing the defects (a peak in FIG. 5 and a valley in FIG. 6) is received by the image sensor, and the received (filtered) light is then resolved by the knife edge to identify the sloped defect features.
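The polarization bookkeeping in the preceding paragraph can be verified with standard Jones calculus. The numpy sketch below uses a simplified single-frame treatment with the retarder fast axis assumed at 45 degrees; it is a generic optics check rather than code or parameters from this disclosure.

    import numpy as np

    # Simplified single-frame Jones treatment; global phases ignored.
    H = np.array([1.0, 0.0], dtype=complex)              # light after illumination polarizer PI1 (x-polarized)
    QWP_45 = (1 / np.sqrt(2)) * np.array([[1, -1j],
                                          [-1j, 1]])     # 1/4-wave retarder, fast axis assumed at 45 degrees
    ANALYZER_Y = np.array([[0, 0],
                           [0, 1]], dtype=complex)       # crossed polarizer P1 (passes y only)

    def intensity(vec):
        return float(np.sum(np.abs(vec) ** 2))

    # Glare reflected at the top surface never enters the retarder and stays x-polarized:
    print("top-surface glare passed by P1:", intensity(ANALYZER_Y @ H))              # ~0 (blocked)

    # Light reaching the active layer traverses the retarder twice (down and back up),
    # which acts like a half-wave plate at 45 degrees and rotates the polarization 90 degrees:
    double_pass = QWP_45 @ QWP_45 @ H
    print("defect-layer signal passed by P1:", intensity(ANALYZER_Y @ double_pass))  # ~1 (transmitted)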
Due to the presence of different thin-film layers and coatings (e.g., the anti-reflective coating 422) on the surface of the object, it is desirable to provide the illumination beam 510 in the infrared or near-infrared wavelength range/band. Most coatings and thin films on reflective surfaces (e.g., AMOLED display screens) are designed to filter out light in the visible spectrum. Thus, the use of an infrared or near-infrared illuminator overcomes the filtering effect of these coatings or film layers because of the longer wavelength of the transmitted illumination light. It should be noted that a knife-edge structure KE1 of any acceptable arrangement can be provided in conjunction with the camera optics OC1. In one embodiment, it can be located between the lens and the polarizer. In some embodiments, the knife edge can be integrated with a polarizer, as described below.
Referring to FIG. 7, it is contemplated that the knife-edge structure can be applied in a variety of ways in the optical path of the camera and optics. A basic knife-edge structure 710, with blade 712 and bracket 714, is shown mounted in front of the lens optics 720 of the vision system camera 730. The knife-edge structure obscures a portion of the full aperture AC so that it interacts with the oblique illumination beam 740 of the illuminator 750 as that beam is reflected off the surface 760.
FIG. 8 illustrates another embodiment of an arrangement 800 in which a pair of illumination assemblies 810 and 812 project light beams 820 and 822, respectively, onto a reflective surface 830 containing surface defects. Each light beam 820, 822 is tilted along a different orientation (angles 840 and 842, respectively) with respect to the optical axis of the optics 852 and sensor 854 of the vision system camera 850. Thus, the light rays are reflected differently (potentially off opposite slopes of the peaks and valleys). A pair of corresponding knife-edge structures 860, 862 are positioned in front of the optics entrance to block the respective reflected beams 820 and 822. Alternatively, a knife edge can be provided for both beams by an adjustable (double arrow 870) aperture 872 of the optics (lens) assembly 852. It should be noted that additional (more than two) illuminators can illuminate the surface at other oblique angles, with suitable knife-edge configurations.
In general, the adjustment of the lens aperture can be performed in various ways. Where an adjustment ring is provided on the lens barrel, the user can rotate the ring while viewing a display of the imaged exemplary object until a suitably high-contrast image of the defect is obtained. This process can be performed automatically when the lens and/or camera assembly includes an electromechanically (or otherwise) driven aperture. The vision system processor can optimize the aperture setting by determining which setting gives the defects in the acquired image the highest contrast difference.
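A hedged sketch of such an automatic aperture search follows: step the driven aperture through a set of stops, grab an image at each, and keep the setting that maximizes a simple defect-contrast metric. The control and acquisition callables (set_aperture, grab_image) and the percentile-based metric are hypothetical stand-ins for whatever camera interface and contrast measure are actually used.

    import numpy as np

    def defect_contrast(image):
        """Simple contrast metric: spread between low and high intensity percentiles.

        A well-set knife edge renders sloped defect facets bright against a dark
        background, which widens this spread.
        """
        lo, hi = np.percentile(image, [5, 99])
        return float(hi - lo)

    def optimize_aperture(set_aperture, grab_image, f_stops):
        """Return the aperture setting that maximizes the defect-contrast metric."""
        best_stop, best_score = None, -np.inf
        for stop in f_stops:
            set_aperture(stop)                       # drive the iris to this setting
            score = defect_contrast(grab_image())    # acquire and score an image
            if score > best_score:
                best_stop, best_score = stop, score
        return best_stop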
Referring now to FIGS. 9 and 10, exemplary vision system arrangements 900 and 1000 are shown, each having a vision system camera 910 and optics 912 with a 2D pixel array for acquiring 2D images of an exemplary stationary object 920 having a reflective surface, and including one illuminator 930 (FIG. 9) or multiple (two) illuminators 1030 and 1032 (FIG. 10), each providing off-axis illumination to illuminate the above-described peak and valley defects on or below the reflective surface of the object 920. An aperture stop associated with the optics 912 provides the knife edge (generally represented by element KE2). The illuminators 930, 1030, and 1032 can each be an LED illuminator (such as the depicted exemplary off-axis LED 940 and optics 942), a fiber-bundle illuminator, or any other acceptable front-light illuminator. Beam splitters 950, 1050, and 1052 of conventional design are disposed on the respective optical axes 960 and 1060, with the illuminators projecting their beams at a 90-degree angle relative to the axes 960 and 1060. Each beam splitter can be defined as a plate, cube, prism, or any other device that splits an incident beam into two or more beams, which may or may not have the same optical power and may or may not be oriented at a 90-degree angle. In this way, the off-axis illumination becomes coincident with the optical axis of the imager. This makes the design more compact and potentially enables the illuminator to be integrated with the camera optics. Compared with the single illuminator 930 of FIG. 9, the use of two illuminators 1030, 1032 (FIG. 10) providing illumination from opposite sides produces a more uniform image of defects on the surface. It should be noted that the beam splitters can include various polarizing filters and other light-conditioning components, including lenses and the like. For example, the camera can include a polarizer P2 in conjunction with the optics 912. The illuminators 930 and 1030 can include a respective polarizer PI2 in their optical paths, and the illuminator 1032 includes a respective polarizer PI3 along its optical path. The polarizers are arranged and function as described above (see FIG. 5).
IV. Results
FIG. 11 illustrates a representation of a displayed image 1100 generated by a vision system according to the embodiments herein. The image depicts the surface of an object in which a plurality of surface (or subsurface) defects 1110, 1120, 1130, 1140, 1150, and 1160 have been identified. These exemplary defects are either valleys (1110, 1120, 1130, and 1140) or peaks (1150 and 1160), and the orientation of their bright and dark halves differs according to whether a peak or a valley is imaged. Because the illumination tilt is fixed, every peak shows its bright and dark halves in the same orientation, and every valley shows the opposite orientation, regardless of size or shape. Other vision system processes can use the image regions associated with the defects to determine whether the defects exceed potentially unacceptable dimensions.
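Because the illumination tilt fixes the bright/dark orientation, a detected defect can in principle be classified by comparing the mean intensity of the two halves of its patch along the illumination direction. The sketch below assumes the illumination tilt lies along the image x axis and that a bright left half indicates a peak; the actual mapping depends on the illuminator position and is not specified here.

    import numpy as np

    def classify_defect(patch):
        """Classify a defect patch as 'peak' or 'valley' from its bright/dark halves.

        patch: grayscale crop around one defect, with the illumination tilt assumed
        to lie along the x axis.  In this sketch a bright left half and dark right
        half is taken to indicate a peak, and the reverse a valley; the actual
        mapping depends on which side the illuminator sits.
        """
        h, w = patch.shape
        left_mean = float(patch[:, : w // 2].mean())
        right_mean = float(patch[:, w // 2 :].mean())
        return "peak" if left_mean > right_mean else "valley"

    # Synthetic example: bright-left / dark-right patch is reported as a peak
    patch = np.hstack([np.full((20, 10), 220, np.uint8), np.full((20, 10), 20, np.uint8)])
    print(classify_defect(patch))    # 'peak'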
V. Detection and evaluation of wavy surface features
The above-described systems and methods can be used to determine flaws/defects in the form of undulating, rippled, or wavy surface features on a reflective object. For example, a flat screen can define (somewhat continuous) wave-shaped (wavy) regions rather than discrete peaks or valleys. While some waviness is acceptable, it is contemplated that features that are excessive in area or amplitude can exceed acceptable thresholds, causing the object to be considered defective.
FIG. 12 details a process 1200 in which the waviness of a reflective surface is determined and evaluated. In step 1210 of the process 1200, the vision system acquires an image of a reflective object surface that may be wavy, using off-axis illumination and a knife-edge structure as generally described above. An image of the entire object can be acquired using a suitable lens, or the image can be acquired in a line-scan fashion (described further below). The illumination and knife-edge arrangement causes most of the light projected onto the surface to reflect away from the image sensor optics or into the knife-edge structure, with a portion of the reflected light entering the image sensor according to the slope of the wavy or wave-shaped regions. This causes bright ripples (e.g., lines) to be surrounded by dark areas, so that a series of such ripples, which appear as bright lines, is defined in the acquired image.
In one exemplary embodiment, the acquired image data can be subjected to various image-processing operations, such as Gaussian smoothing.
In step 1220 of the process 1200, a statistical analysis can be performed on the intensity map of the pixels in the acquired image, for example a histogram of pixel (grayscale) intensity versus the frequency of pixels at that intensity. Referring to FIG. 13, a histogram of an image having both smooth and wavy surface features is shown. Typically, smooth areas show a dense intensity distribution at high frequency, whereas wavy areas show a broad spread of intensities at low frequency (histogram regions 1310 and 1320). Thus, a wavy region can be represented by the relatively wide histogram of FIG. 14, while a smooth region can be represented by the relatively narrow histogram 1500 of FIG. 15. It should be noted that this is one of a variety of statistical approaches, generally concerned with the extent to which certain pixel intensity values occur, for distinguishing smooth regions of the acquired image from wavy regions.
Referring again to the process 1200 in FIG. 12, the distribution of intensity values, for example in a histogram, is evaluated (step 1230). The evaluation can comprise a histogram analysis in which, for example, the gray-level distribution of pixel values is computed and the histogram tails are derived. The process 1200 then determines whether waviness or another defect is present by, for example, computing whether the histogram tails fall within an acceptable range of the mean (decision step 1250). If a ripple/defect is present (e.g., a histogram tail is outside the acceptable range of the mean), the process 1200 branches to step 1260. For example, each image whose histogram has out-of-range tails can be thresholded; the threshold can be user-defined or determined automatically. The size and location of all defects within the thresholded image are then measured. If the measurement of any defect (or aggregation of defects) exceeds a predetermined index (which can be user-defined or set automatically), the process 1200 indicates the specific defects on the object surface and/or the locations of such defects. The process can also take other actions, such as rejecting the part or issuing a warning signal (step 1270). Conversely, if the histogram tails do not indicate waviness and/or a defect, the object is considered acceptable and generally free of substantial defects. This condition can be indicated, and/or no action need be taken (step 1280).
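The histogram-tail evaluation and thresholding/measurement steps of process 1200 can be prototyped roughly as follows. All numeric limits (tail percentile, allowed tail spread, brightness threshold, area index) are illustrative assumptions and would be user-defined or automatically determined in practice, as noted above.

    import numpy as np
    from scipy import ndimage

    def evaluate_waviness(image,
                          tail_pct=1.0,          # percentile defining the histogram "tails" (assumed)
                          max_tail_spread=60.0,  # acceptable tail distance from the mean (assumed)
                          bright_thresh=180,     # threshold isolating bright ripple lines (assumed)
                          max_defect_area=500):  # acceptance index on ripple size, in pixels (assumed)
        """Rough sketch of process 1200: histogram tail analysis, then defect measurement."""
        pixels = image.astype(np.float32).ravel()
        mean = pixels.mean()
        lo_tail = np.percentile(pixels, tail_pct)
        hi_tail = np.percentile(pixels, 100.0 - tail_pct)
        tail_spread = max(mean - lo_tail, hi_tail - mean)

        if tail_spread <= max_tail_spread:                      # step 1250 -> step 1280
            return {"status": "acceptable", "tail_spread": tail_spread}

        # Step 1260: threshold the image and measure the bright ripple regions
        labels, num = ndimage.label(image >= bright_thresh)
        areas = np.bincount(labels.ravel())[1:]                 # pixel count of each ripple region
        largest = float(areas.max()) if areas.size else 0.0
        status = "reject/warn" if largest > max_defect_area else "acceptable"   # step 1270 vs. 1280
        return {"status": status, "tail_spread": tail_spread, "largest_ripple_px": largest}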
The above-described process (1200) for evaluating wavy surface features on a reflective object can be performed using a line-scanning arrangement. FIGS. 16 and 17 illustrate arrangements 1600 and 1700, respectively, in which an object 1610 having exemplary wavy surface features 1720 (FIG. 17) is moved (double arrow MO) at a known rate (e.g., tracked via an encoder) past the field of view (line 1630) of an image sensor LS. Note that the scanning movement MO can be along either of the opposite directions, or along both, as required. The image sensor LS is located within the camera LSC and is configured as a line scan sensor having a single line of pixels, so that it acquires an image of light emitted from the object surface. The region of the camera field of view 1630 is illuminated by off-axis illumination provided by a line-scan illumination source LI within a housing LIH. The illumination source LI can be any acceptable light arrangement, for example a row of LEDs. This light is focused by a cylindrical lens 1650 of conventional design and shape, which provides an illumination line (transverse to the direction of movement MO across the object surface) of a desired width W1 and extended length. Note that in one exemplary embodiment the illumination line width W1 can be a few millimeters or less, but it can be narrower or wider depending on the resolution of the scan. The length is determined by the light source LI and the corresponding length of the cylindrical lens 1650. The cylindrical lens is spaced from the illumination source LI by an enclosed lens holder 1640, which provides the desired focal distance between the light source LI and the surface of the object 1610. In one exemplary embodiment, the cylindrical lens 1650 can be defined as a semi-cylinder spaced a distance from the lens holder 1640, focusing the line on the surface. The off-axis projection of the light rays is such that a majority of the projected rays 1652 (the projected plane or fan is depicted in FIG. 17) are reflected (line 1654) away from the image sensor optics (e.g., the lens aperture LA) and/or into any external knife-edge structure. As shown, the rays 1656 reflected by sloped surfaces are received by the line scan camera LSC. In one exemplary embodiment, another cylindrical lens 1660, located at the end of an enclosed lens holder 1670, focuses the received light into the camera optics (with the knife-edge function provided at the aperture LA) and onto the line scan sensor LS. Various camera optics arrangements other than the described cylindrical lens will be apparent to those skilled in the art. As shown in FIG. 16, a polarizer PI4 can be placed in the optical path of the illumination source LI (at various locations along the path). Likewise, a polarizer P3 can be arranged in the received light path of the sensor LS. These elements are provided, but are omitted from the depiction of the arrangement 1700 in FIG. 17 for clarity.
It should be noted that, although a cylindrical lens shape is used, various cross-sectional shapes, such as parabolic profiles, can be used in alternative arrangements. Likewise, mirrors can be substituted for lenses to focus the illumination light. Advantageously, the illumination arrangement ensures that the entire surface is consistently and strongly illuminated and that each scanned line fully represents the local slope of the surface. The arrangement advantageously enables surfaces of essentially any size to be imaged and analyzed for pits, peaks, and ripples. For example, the screen of a tablet or laptop computer, or a larger flat-panel television, can be analyzed by providing a sufficiently long line illumination assembly and one or more line scan cameras across the object surface. Each camera can image a portion of the overall object and provide a separate, or stitched-together, evaluation of the surface.
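As a minimal sketch of the stitched-together evaluation mentioned above, the strips from several line scan cameras can be concatenated across the object width once they are registered to the same encoder rows; overlap correction and alignment, which a real system would need, are omitted here as assumptions outside this disclosure.

    import numpy as np

    def stitch_strips(strips):
        """Concatenate per-camera image strips across the object width.

        strips: list of 2D arrays, one per line scan camera, already registered to
        the same encoder rows and trimmed so that they do not overlap.
        """
        rows = min(s.shape[0] for s in strips)              # clip to the shortest strip
        return np.hstack([s[:rows, :] for s in strips])

    # Example with two simulated 100 x 256 strips from adjacent cameras
    a = np.zeros((100, 256), np.uint8)
    b = np.full((100, 256), 128, np.uint8)
    print(stitch_strips([a, b]).shape)    # (100, 512)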
It should also be noted that it is expressly contemplated that, in alternative embodiments, an illuminator combined with, for example, a Fresnel lens or other optical arrangement can be used to illuminate and image a larger area of the object.
VI. Line, disk and circular optical masks
FIG. 18 shows a diagram of an exemplary embodiment of a generalized vision system arrangement 1800 including an optics assembly 1830 and a vision system camera assembly 1810 having an image sensor 1820. The sensor 1820 is interconnected (as shown) to a vision system processor in the manner generally described above, which performs appropriate vision system tasks on the images acquired by the sensor 1820. The optics assembly 1830 can be any acceptable variable- or fixed-focal-length and/or variable- or fixed-aperture lens unit, or combination of lens units, such as a conventional M12-mount, F-mount, or C-mount lens.
According to an exemplary embodiment, the front face 1832 of the optics/lens assembly 1830 can be covered by a fixed or movable mask assembly 1840. The mask assembly 1840 can be of the screw-on or snap-on type, or can be mounted via a bracket (not shown) on the front of the lens assembly 1830. The mask assembly 1840 can also be applied as an adhesive decal, or coated directly on the front (or another) surface of the lens assembly. In the case of a screw-on attachment, the mask assembly 1840 can operate in a manner similar to other conventional filters used with various lens arrangements and can be adapted to thread onto the end of a conventional lens filter mount.
Alternatively, the mask assembly 1840 can be moved manually or automatically (e.g., via solenoids, servos, steppers, etc.) into or out of the optical path of the lens, as desired. The mask assembly can also be defined as, for example, an electro-optical mechanism that can be varied between fully transparent and partially opaque, with a desired size and shape, via optional control circuitry 1850. By way of non-limiting example, the mask assembly 1840 can include a (typically circular) window that comprises an LCD shutter or another form of configurable window.
Apparatus 1800 includes an illuminator 1860 as described above, oriented to project light at a non-perpendicular angle (as shown) relative to the overall plane of surface 1870. In this example, surface 1870 defines a waviness in at least one direction, comprising a series of peaks 1872 with troughs 1874 between the peaks. The angled light rays strike and are scattered by the peaks and troughs, and a portion of the rays enters the camera optics 1830. The mask assembly 1840, in its various forms, defines a knife-edge element that attenuates most of the scattered light and passes only light within a limited angular range toward the sensor 1820. In this embodiment, the applied/coated mask is represented by dashed line 1880, which includes a central blocked region 1882 and an outer blocked region 1884, with an open aperture between them through which light rays 1890 reflected from surface 1870 pass. In various embodiments, a polarizer PI5 is integrated with illuminator 1860, and a corresponding polarizer P4 may be provided in combination with the optical/lens assembly. The polarizers are arranged, and function, as generally described above (e.g., fig. 5).
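The angular selection performed by the knife-edge element can be illustrated numerically. In the sketch below, which assumes small angles and a purely specular surface, a local slope tilts the surface normal and deviates the reflected ray by twice that tilt; the ray reaches the sensor only if the deviation falls inside an assumed annular pass band (the inner_block_deg and outer_block_deg values are illustrative, not dimensions taken from the embodiments).

import numpy as np

def passes_mask(local_slope, inner_block_deg, outer_block_deg):
    """Decide whether light reflected at a point of given local surface slope
    clears an annular mask (central blocking disc plus outer blocking ring).

    local_slope      -> dz/dx of the surface at the point (dimensionless)
    inner_block_deg  -> half-angle blocked by the central disc (assumed value)
    outer_block_deg  -> half-angle where the outer ring begins (assumed value)
    """
    # For a specular surface, tilting the normal by arctan(slope) deviates the
    # reflected ray by twice that angle.
    deviation_deg = np.degrees(2.0 * np.arctan(local_slope))
    return inner_block_deg < abs(deviation_deg) < outer_block_deg

# Example: a flat background (slope ~0) is blocked by the central disc,
# while a pit wall with slope 0.01 (about 1.1 deg of ray deviation) passes.
print(passes_mask(0.0, 0.5, 5.0))    # False -> background suppressed
print(passes_mask(0.01, 0.5, 5.0))   # True  -> defect light reaches the sensor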
Fig. 19 shows a more detailed example of a vision system arrangement 1900 in accordance with an illustrative embodiment. This embodiment includes a beam splitter and polarizer arrangement similar to that shown and described above with reference to fig. 9. Specifically, the apparatus 1900 includes a camera assembly 1910 and a lens/optic 1920. According to embodiments herein, the lens/optic 1920 includes a mask assembly 1930. In front of the mask assembly 1930 is a polarizer P5 that operates according to the principles described above, and a beam splitter 1950 is provided, through which light reflected from the object 1960 under inspection passes to the camera 1910. An illumination assembly 1970 is provided, which includes an illumination source 1972 and a condenser lens 1974. A polarizer PI6 is provided in front of the condenser lens. It should be noted that the polarizer P5 may carry a mask pattern on its surface, and the assembly may be provided as a screw-on or snap-on attachment to the front of the lens 1920.
Fig. 20 shows another vision system arrangement 2000 according to an exemplary embodiment. The apparatus 2000 includes a camera assembly 2010 and a lens/optic 2020. According to embodiments herein, the lens/optic 2020 includes a mask assembly 2030. In front of mask assembly 2030 is a polarizer P6 that operates according to the principles described above, and a beam splitter 2050 is provided, through which light reflected from the object 2060 under inspection is transmitted to the camera 2010. In this embodiment, a condenser lens 2070 is provided between the beam splitter 2050 and the object 2060. The condenser operates in conjunction with an illumination assembly 2080, which includes an illumination source 2082, a focusing lens 2084, and a polarizer PI7. It should be noted that the focusing lens 2084, the condenser lens 2070 and the other optical components may be sized and arranged according to well-known optical principles that will be apparent to those skilled in the art; a simple thin-lens sizing example is sketched below.
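As a minimal illustration of such sizing, the Gaussian thin-lens relation 1/f = 1/s_o + 1/s_i can be used to place a condenser or focusing lens. The focal length and distances below are assumed example values, not dimensions of the depicted arrangement.

def image_distance(focal_length_mm, object_distance_mm):
    """Gaussian thin-lens equation, 1/f = 1/s_o + 1/s_i, solved for s_i.
    Illustrative only; real condenser/focusing lens placement also depends on
    apertures, beam size and aberrations."""
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)

# Example (assumed values): a 50 mm condenser imaging a source 150 mm away
# forms its image 75 mm behind the lens.
print(image_distance(50.0, 150.0))   # 75.0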
The central and outer blocked regions of the various mask assemblies described above may define a variety of overall shapes, sizes and relationships. An appropriate mask may be selected empirically, or by trial and error, to achieve the best image for a given surface under inspection. This is described in further detail with reference to figs. 21-27, which depict various types/sizes of mask patterns.
Referring to fig. 21, a mask 2100 that produces one form of knife edge defines a central opaque line 2110 located within a transparent linear aperture 2120. The remaining outer region 2130 of the mask, surrounding the transparent aperture 2120, is also opaque. The lines 2110, 2120 are oriented substantially parallel to the direction of elongation of the characteristic undulations of the surface (if any), and this form of mask is most effective in such situations. More generally, the direction of elongation is selected (e.g., by rotating the mask) to enhance or suppress surface features, as desired. By way of non-limiting example, and to better illustrate the operation of the arrangement, the width WLA of the aperture may vary, for example, between 5 and 10 millimeters for a lens with a diameter D0 of 50-55 millimeters, with the central opaque line width WL between 1 and 5 millimeters. Typically, the line width WL is dimensioned to match the width of the focused spot from the illuminator. It should be noted that each of the following mask arrangements (figs. 22-27) is assumed to have a similar lens diameter D0. For larger or smaller diameter lenses, the overall dimensions may be scaled proportionally.
Fig. 22 shows a mask 2200 comprising a central, opaque, circular (blocking) disc 2210 of diameter DD. The disc provides the desired knife-edge element for the arrangement. Typically, the dimensions of the disc are selected to match the dimensions of the surface features (e.g., defects) to be enhanced or suppressed. It should be noted that the exemplary mask arrangement 2200 has no outer opaque region and is transparent out to the edge of the lens (dashed circle 2230). This basic knife-edge element enables light to be received over a given angular range from peaks and troughs that may be oriented in various directions on the surface.
Fig. 23 shows a central opaque (blocking) disc 2310 having a diameter DD1 (approximately 9 millimeters) and an opaque annular outer region 2330 having an inner diameter DA (approximately 14 millimeters). The difference between the disc diameter DD1 and the inner diameter of the outer region 2330 creates a transparent annular window 2320 through which light rays reflected from the surface may pass. It should be noted that the diameter of the central blocking disc defines the degree of light attenuation in the manner of a knife-edge element, while the inner diameter of the annular outer region defines a confocal effect of the optical system for enhanced clarity.
Further examples of mask configurations 2400, 2500, 2600 and 2700 are depicted in respective figs. 24, 25, 26 and 27, each with a central blocking disc and an outer annular region defining an annular aperture therebetween. By way of non-limiting example, the disc diameter DD2 of the mask 2400 is approximately 5 to 6 millimeters and the inner diameter DA1 of its outer annular region is approximately 8 to 9 millimeters. The disc diameter DD3 of the mask 2500 is approximately 3 to 4 millimeters and the inner diameter DA2 of its outer annular region is approximately 5 to 6 millimeters. The disc diameter DD4 of the mask 2600 is approximately 3 to 4 millimeters and the inner diameter DA3 of its outer annular region is approximately 8 to 9 millimeters. The disc diameter DD5 of the mask 2700 is approximately 5 to 6 millimeters and the inner diameter DA4 of its outer annular region is approximately 10 to 12 millimeters. These dimensions are merely examples of a wide range of possible values that may be adjusted for the characteristics of the surface being inspected and of the vision system arrangement, such as angles, illumination intensities and/or wavelengths.
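For reference, the line and annular mask geometries of figs. 21-27 can be expressed as simple binary patterns. The sketch below assumes an arbitrary pixel pitch for the mask plane and is only a geometric illustration; the physical masks described above are coatings or films, not necessarily pixelated.

import numpy as np

def line_mask(shape, line_w_px, aperture_w_px):
    """Binary mask in the style of fig. 21: an opaque center line inside a
    transparent linear aperture, opaque elsewhere (True = opaque)."""
    h, w = shape
    x = np.abs(np.arange(w) - w / 2.0)
    col = (x <= line_w_px / 2.0) | (x > aperture_w_px / 2.0)
    return np.tile(col, (h, 1))

def annular_mask(shape, disc_d_px, ring_inner_d_px):
    """Binary mask in the style of figs. 23-27: an opaque central disc plus an
    opaque outer region, with a transparent annular aperture between them."""
    h, w = shape
    yy, xx = np.indices(shape)
    r = np.hypot(yy - h / 2.0, xx - w / 2.0)
    return (r <= disc_d_px / 2.0) | (r > ring_inner_d_px / 2.0)

# Example with an assumed pitch of 0.1 mm/px on a 550 px wide element:
# the 9 mm disc and 14 mm ring inner diameter of fig. 23 become 90 px and 140 px.
m = annular_mask((550, 550), 90, 140)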
As generally described above, a mask can be constructed by applying a coating having a suitable pattern to a filter-like glass surface using various techniques (e.g., printing, photolithography, application of a transparent film carrying a printed or molded pattern, etc.). It should be clear to those skilled in the art that a variety of techniques can be used to apply a fixed mask pattern to the camera optics. Likewise, as described above, the mask may be defined as an active component comprising, for example, a pixelated surface. A controller, either separate from or part of the vision system processor, selectively addresses individual pixels of the active mask to generate a mask pattern having a desired shape and size. It should be noted that the controller may step through various configurations until the user, or an automated vision system process (e.g., one based on contrast), determines the optimal pattern settings; a contrast-based search of this kind is sketched below. The pattern shapes may be similar to those described in figs. 21-27, or may be more complex so as to better conform to unique surface features and/or wave patterns.
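A contrast-driven search over candidate patterns might look like the following sketch. The acquire_with_disc() hook is hypothetical (it stands in for driving the active mask and grabbing an image), and RMS contrast is just one of many possible figures of merit.

import numpy as np

def rms_contrast(img):
    """Simple contrast metric: standard deviation over mean intensity."""
    img = img.astype(float)
    return img.std() / (img.mean() + 1e-9)

def find_best_disc(acquire_with_disc, candidate_diameters_px):
    """Step an active mask through candidate blocking-disc diameters and keep
    the one that maximizes image contrast.

    acquire_with_disc(d) -> 2D image acquired with a disc of diameter d applied
                            (hypothetical hook into the mask controller and camera)
    """
    scores = {d: rms_contrast(acquire_with_disc(d)) for d in candidate_diameters_px}
    best = max(scores, key=scores.get)
    return best, scores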
It should be noted that in some embodiments, multiple cameras interconnected with one or more vision system processors may be used. Each camera may acquire images of the object surface with a differently sized and/or configured mask (e.g., a different blocking disc diameter), from the same or a different angle, and the multiple images of the surface may then be analyzed to ensure that ripple features having different sizes, shapes and/or orientations are properly imaged. Similarly, where the mask is variable (either by placing a different mask in front of the optics or by changing the pattern of an active mask), multiple images may be acquired and analyzed, as sketched below.
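One simple way to combine such a set of images, assuming they are registered to one another, is a per-pixel fusion of their deviations from the local background, as in the sketch below; this is only one plausible analysis, not a method prescribed by the embodiments.

import numpy as np

def fuse_defect_maps(images):
    """Combine registered images of the same surface, taken with different mask
    configurations, into one defect map: normalize each image's deviation from
    its own median background, then take the per-pixel maximum."""
    maps = []
    for img in images:
        img = img.astype(float)
        dev = np.abs(img - np.median(img))
        maps.append(dev / (dev.max() + 1e-9))
    return np.maximum.reduce(maps)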
Referring to image 2800 of fig. 28, a conventional touch screen of a handheld device is shown, imaged using a mask according to the above-described embodiments. In this image the surface waviness is clearly discernible, on a surface that would appear relatively flat and featureless under visual inspection or with many conventional vision system arrangements. In fig. 29, the image 2900 also reveals details that are not normally visible, in this example the sensor matrix/array 2910 of the touch screen. The level of detail achieved using the masking and imaging techniques described herein is further illustrated by the image of fig. 30, in which the individual wires 3010 of the array 2910 of fig. 29 can be clearly discerned in a close-up view of a region of the touch screen.
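The evaluation of waviness in such images can follow the histogram-based analysis described earlier (and recited in claims 18 and 19). The sketch below builds a pixel-intensity histogram, treats the most frequent intensity as background, and compares a simple spread score against an empirically chosen threshold; the specific score is an assumption for illustration.

import numpy as np

def ripple_score(img, n_bins=64):
    """Histogram-based evaluation: treat the most frequent intensity as the
    background and measure how far, on average, the image departs from it."""
    img = img.astype(float)
    hist, edges = np.histogram(img, bins=n_bins)
    background_bin = np.argmax(hist)                  # most frequent intensity bin
    background = 0.5 * (edges[background_bin] + edges[background_bin + 1])
    spread = np.abs(img - background)
    return spread.mean()                              # larger value = more waviness

def surface_passes(img, threshold):
    """Compare the ripple score against an empirically chosen threshold."""
    return ripple_score(img) <= threshold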
VII. Conclusion
It should be apparent that the above-described systems and methods provide an effective technique for identifying both slope defects, including hills and valleys, and waveform/ripple defects on a variety of layered and non-layered reflective (specular) surfaces. By employing illumination and filters of appropriate wavelengths (e.g., in combination with various polarizers), the systems and methods can effectively image surfaces having various coatings and layers. Illustratively, the knife-edge arrangement discriminates the slope (first derivative) of a defect, which causes light reflected from a peak or valley to appear either bright or dark relative to the background, depending on which side of the defect it is reflected from. The size of a defect may be proportional to its slope; smaller defects have smaller slopes, which deflect the illumination light less from the background. The small spatial extent of the light source allows it to remain well focused after reflection from the inspected surface, making it easier to block the background light without blocking the light from the defect. A more extended light source, conversely, reduces the negative effect of random tilt of the test surface, as would be encountered in a production environment, at the expense of reduced defect contrast. Thus, the knife edge increases contrast by blocking background light. Furthermore, slope, shape and polarization discrimination are illustratively exploited so that most of the background light is reflected and filtered outside the camera aperture, while rays from a sloped defect reach the camera with high contrast. Additionally, the exemplary arrangements can be applied to reflective surfaces of a wide range of sizes, typically through the use of line scan cameras and focused illumination lines. Also embodied herein is a mask comprising a knife-edge member and other elements (e.g., confocal elements) that provide a highly refined image of surface detail. The foregoing has been a detailed description of exemplary embodiments of the invention. Various modifications and additions can be made without departing from the spirit and scope of the invention. Features of each of the various embodiments described above may be combined with features of other described embodiments as desired to provide a multiplicity of feature combinations in new embodiments. Furthermore, while various separate embodiments of the apparatus and method of the present invention have been described, the description herein is merely illustrative of the application of the principles of the invention. For example, the terms "process" and/or "processor" as used herein should be taken broadly to include a variety of electronic hardware- and/or software-based functions and components (and may alternatively be termed "modules" or "elements"). Moreover, a described process or processor may be combined with other processes and/or processors, or divided into various sub-processes or sub-processors. Such sub-processes and/or sub-processors may be variously combined according to embodiments herein. Likewise, it is expressly contemplated that any function, process and/or processor herein can be implemented using electronic hardware, software consisting of a non-transitory computer-readable medium of program instructions, or a combination of hardware and software.
Furthermore, the various directional and positional terms used herein, such as "vertical," "horizontal," "upper," "lower," "bottom," "top," "side," "front," "rear," "left," "right," and the like, are used only as relative conventions, and not as absolute orientations with respect to a fixed coordinate system or the acting direction of gravity. Further, where the terms "substantially" or "approximately" are employed with respect to a given measurement, value or characteristic, they refer to a quantity that is within a normal operating range to achieve desired results, but that includes some variability (e.g., 1%-5%) due to inherent inaccuracy and error within the allowed tolerances of the system. Accordingly, this description is to be taken only by way of example and not to otherwise limit the scope of the invention.

Claims (26)

1. A system for imaging a defect on a reflective surface of an object, the system comprising:
a vision system camera having an image sensor and optics and defining an optical axis, the vision system camera oriented to image the surface;
an illuminator assembly projecting a structured light beam onto the surface at a predetermined angle that is not parallel to the optical axis; and
a knife edge element associated with the optic and variably obscuring a portion of a maximum field of view of the optic, wherein the knife edge element and the predetermined angle are each arranged such that: light reflected from sloped peaks and valleys of features on the surface passes through the optic into the sensor, while light reflected from the surface around the sloped peaks and valleys is obscured by the knife edge element.
2. The system of claim 1, wherein the knife edge element comprises a variable aperture in the optic.
3. The system of claim 1 wherein the predetermined angle is associated with a slope of the peaks and valleys.
4. The system of claim 1, wherein the sensor is a 2D sensor and the object is stationary relative to the camera.
5. The system of claim 1, wherein the sensor defines an arrangement of line scan cameras and the object moves relative to the cameras.
6. The system of claim 5, wherein the illuminator assembly projects illumination lines onto the surface.
7. The system of claim 6, wherein the illumination defines an IR or near IR wavelength range.
8. The system of claim 7, wherein the object defines a layer comprising an anti-reflective coating.
9. The system of claim 8, wherein the layer comprises a polarizing layer, the illumination is polarized and the optics comprise a polarizing filter.
10. The system of claim 9, wherein the object is an AMOLED display screen, and the polarizing layer is a 1/4 λ retarder, and the polarizing filter is defined as an orthogonal polarizing filter.
11. The system of claim 6, wherein the illuminator comprises a polarizer for polarized illumination and the optics comprise a polarizing filter.
12. The system of claim 1, wherein the illuminator assembly defines a concentrated beam that converges toward a point near the knife edge structure.
13. The system of claim 1, wherein the knife edge structure defines an outer structure that is located in the light path in front of the optics.
14. The system of claim 1, wherein the illuminator assembly projects light rays through a beam splitter located on an optical axis of the vision system camera such that off-axis illumination from the illuminator assembly is projected by the beam splitter onto the object surface coincident with the optical axis.
15. The system of claim 1, wherein the illuminator assembly defines a plurality of illumination sources, wherein each illumination source projects light onto a respective beam splitter, each beam splitter being located on an optical axis of the vision system camera such that: off-axis illumination from the illuminator assembly is projected by the beam splitter onto the object surface coincident with the optical axis.
16. The system of claim 5, wherein the vision system camera includes an imaging lens, and the illuminator assembly projects light onto the surface that, after reflection, is focused into a focal line that falls outside of the entrance aperture of the imaging lens.
17. The system of claim 16, wherein the illuminator assembly comprises a cylindrical lens to focus the focal line.
18. The system of claim 1, wherein a feature is defined as a ripple on a region of the surface, and further comprising an analysis and evaluation process that determines a distribution of pixel intensity values in an image acquired by the image sensor and compares the distribution to a threshold.
19. The system of claim 18, wherein the distribution is defined by at least one histogram of pixel intensity versus frequency in the image.
20. The system of claim 1, wherein the knife edge element defines a blocking structure located on the optical axis within the optical path of the optic, on a mask member disposed adjacent a front face of the optic, the blocking structure being arranged to selectively enhance or suppress scattered light associated with the feature.
21. The system of claim 20, wherein the blocking structure defines a line extending through the optic along an elongated direction and having a width along a direction transverse to the elongated direction that is related to a size of a focused illumination spot on the optic, and wherein the elongated direction is defined by an orientation of the feature.
22. The system of claim 21, wherein the mask member includes a surrounding opaque region on each opposing side of the line with a linear aperture between the line and the opaque region.
23. The system of claim 20, wherein the blocking structure comprises a disk centered on the optical axis and having a diameter related to a size of one or more of the features.
24. The system of claim 23, further comprising an annular region located around the disk and defining an annular aperture between the annular region and the disk, the annular region configured to suppress scattered light.
25. The system of claim 20, wherein the mask member defines at least one of: a snap-on or screw-on lens cover, a decal located over the front face of the optic, and a variable mode electro-optic mechanism located on the optic.
26. The system of claim 20, further comprising a first polarizer disposed in conjunction with the optics and a second polarizer disposed in conjunction with the illuminator assembly.
CN201611000662.4A 2015-11-13 2016-11-14 System and method for detecting defects on reflective surface through vision system Active CN106959293B (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US201562255360P 2015-11-13 2015-11-13
US62/255,360 2015-11-13
US201562274094P 2015-12-31 2015-12-31
US62/274,094 2015-12-31
US201662404431P 2016-10-05 2016-10-05
US62/404,431 2016-10-05
US15/349,131 2016-11-11
US15/349,131 US11493454B2 (en) 2015-11-13 2016-11-11 System and method for detecting defects on a specular surface with a vision system

Publications (2)

Publication Number Publication Date
CN106959293A CN106959293A (en) 2017-07-18
CN106959293B true CN106959293B (en) 2020-09-25

Family

ID=59050504

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611000662.4A Active CN106959293B (en) 2015-11-13 2016-11-14 System and method for detecting defects on reflective surface through vision system

Country Status (3)

Country Link
JP (1) JP6568672B2 (en)
KR (1) KR101945927B1 (en)
CN (1) CN106959293B (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107179323B (en) * 2017-07-25 2019-11-08 常州工程职业技术学院 A kind of suspension disc insulator visible detection method
US10502695B2 (en) * 2017-08-16 2019-12-10 The Boeing Company Automated inspection of foreign materials, cracks and other surface anomalies
CN108106549B (en) * 2017-11-24 2019-10-18 清华大学 A kind of motor rotor magnetic steel piece assembles the structure light detection method of radial defect
CN108180826B (en) * 2017-12-20 2023-12-22 深圳湾新科技有限公司 Detection equipment and detection method for boundary of lithium battery winding layer
CN108519064B (en) * 2018-04-20 2019-12-03 天津工业大学 A kind of reflective suppressing method applied to multi-frequency three-dimensional measurement
WO2020020130A1 (en) * 2018-07-27 2020-01-30 深圳中科飞测科技有限公司 Light emitting device, optical detection system, optical detection device and optical detection method
CN110828294A (en) * 2018-08-14 2020-02-21 合肥晶合集成电路有限公司 Grinding performance detection method of chemical mechanical grinding equipment
US10755403B2 (en) 2018-08-17 2020-08-25 The Boeing Company Apparatus and methods for shot peening evaluation
CN111183351A (en) * 2018-09-11 2020-05-19 合刃科技(深圳)有限公司 Image sensor surface defect detection method and detection system
KR20200050497A (en) 2018-11-01 2020-05-12 에스케이하이닉스 주식회사 Method of detecting printing defects on photoresist pattern
CN110057825B (en) * 2019-04-30 2024-02-09 中国地质大学(武汉) Jade egg surface transparency effect grading instrument and grading method thereof
CN110095472B (en) * 2019-05-08 2019-12-13 湖北工业大学 HDRI-based high-reflectivity metal surface defect detection method and system
CN112683925A (en) * 2019-10-17 2021-04-20 神讯电脑(昆山)有限公司 Image detection scanning method and system for possible defects on surface of object
CN111112130A (en) * 2019-12-23 2020-05-08 中国电子科技集团公司第四十一研究所 Hose tail sealing quality detection device and method based on machine vision technology
US11740356B2 (en) * 2020-06-05 2023-08-29 Honeywell International Inc. Dual-optical displacement sensor alignment using knife edges
CN112391731B (en) * 2020-10-20 2022-05-03 天津大学 Online detection method for yarn breakage during weaving of warp knitting machine
CN112697808B (en) * 2020-12-14 2023-02-21 苏州祥盈升精密实业有限公司 Multi-angle automatic detection device for injection molding piece
CN113092486B (en) * 2021-04-07 2023-04-07 沧澜智能科技(昆山)有限公司 Method suitable for detecting water ripples on surface of high-gloss surface product
CN112798608B (en) * 2021-04-14 2021-07-23 常州微亿智造科技有限公司 Optical detection device and optical detection method for side wall of inner cavity of mobile phone camera support
CN113125449B (en) * 2021-04-20 2022-10-18 江苏善果缘智能科技有限公司 Scanning device for detecting surface of integrated product and assembling method thereof
CN113686881A (en) * 2021-09-23 2021-11-23 云智汇(深圳)高新科技服务有限公司 Visual all-angle imaging device in high-reflection mirror surface environment

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61176805A (en) * 1985-01-31 1986-08-08 Nec Corp Instrument for measuring defect size of photomask
JPH03115844A (en) * 1989-09-29 1991-05-16 Asahi Glass Co Ltd Detection of surface defect
US7126699B1 (en) * 2002-10-18 2006-10-24 Kla-Tencor Technologies Corp. Systems and methods for multi-dimensional metrology and/or inspection of a specimen
WO2005072265A2 (en) * 2004-01-22 2005-08-11 Wintriss Engineering Corporation Illumination system for material inspection
JP2009092426A (en) * 2007-10-04 2009-04-30 Nippon Steel Corp Surface inspection method and surface inspection device
CN101576509B (en) * 2009-06-16 2011-04-27 华南理工大学 Method and device for automatically detecting surface defects of spherules based on machine vision
US8750596B2 (en) * 2011-08-19 2014-06-10 Cognex Corporation System and method for identifying defects in a material
US10365298B2 (en) * 2013-11-25 2019-07-30 Technion Research & Development Foundation Limited Optical knife-edge detector with large dynamic range

Also Published As

Publication number Publication date
JP6568672B2 (en) 2019-08-28
JP2017111121A (en) 2017-06-22
KR101945927B1 (en) 2019-02-08
KR20170056472A (en) 2017-05-23
CN106959293A (en) 2017-07-18

Similar Documents

Publication Publication Date Title
CN106959293B (en) System and method for detecting defects on reflective surface through vision system
US11493454B2 (en) System and method for detecting defects on a specular surface with a vision system
KR100756099B1 (en) Apparatus and method for fabricating flat workpieces
US7551274B1 (en) Defect detection lighting system and methods for large glass sheets
US7382457B2 (en) Illumination system for material inspection
US20130242083A1 (en) Retro-reflective imaging
TWI422814B (en) An apparatus and method for inspecting inner defect of substrate
CN110073203A (en) The method and apparatus for checking the defect in transparent substrate
KR102633672B1 (en) Methods and apparatus for detecting surface defects on glass sheets
CN107782732B (en) Automatic focusing system, method and image detection instrument
KR101416860B1 (en) Particle inspecting system for camera lens module
JP5589423B2 (en) Transparent flat plate detection system
JP2008157788A (en) Surface inspection method and device
US20230020684A1 (en) Laser based inclusion detection system and methods
JP2006258778A (en) Method and device for inspecting surface defect
US20030117616A1 (en) Wafer external inspection apparatus
JP2005528593A (en) Imaging method and apparatus
KR101447857B1 (en) Particle inspectiing apparatus for lens module
JPH0961291A (en) Apparatus for testing optical parts
JP3078784B2 (en) Defect inspection equipment
KR101403926B1 (en) Apparatus for inspecting curved surface
US11415528B2 (en) Method and apparatus for automated in-line inspection of optically transparent materials
KR20140144170A (en) inspecting machine for flat panel
US9322776B2 (en) Method and system for imaging a target
KR20110133183A (en) Inspecting machine for flat panel

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant