WO2009123901A1 - Method and apparatus for multiplexed image acquisition and processing - Google Patents
Method and apparatus for multiplexed image acquisition and processing
- Publication number
- WO2009123901A1 (PCT/US2009/038299)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- color
- illumination
- article
- bayer filter
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
- G01N2021/8838—Stroboscopic illumination; synchronised illumination
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2201/00—Features of devices classified in G01N21/00
- G01N2201/06—Illumination; Optics
- G01N2201/062—LED's
Definitions
- The present invention relates to using machine vision to inspect manufactured articles at high speed.
- More particularly, it relates to acquiring and processing multiple images of an article captured simultaneously by a single camera while the article is in motion, using differently colored lights that illuminate the article from different directions and color filters on the sensor to separate the images.
- 2D vision systems typically consist of a camera, optics and illumination. Camera technology is generally limited to commercially available sensors, so the optics and illumination are typically customized to provide the correct magnification, field of view and acquisition speed.
- An example of a machine vision system built from an off-the-shelf image sensor and custom optics and illumination is the ESI Bullet™ wafer ID reader manufactured by Electro Scientific Industries, the assignee of the instant patent application.
- This system contains a video sensor, optics and illumination designed to inspect identification marks on semiconductor wafers, among other articles. Although it is designed to image very subtle marks formed on a mirror-like surface, this system was unable to image the small defects that are the subject of the instant invention, because the defects in question do not appear any different to the camera from their surroundings. The defects may be visible only as slight depressions in the surface of the article and therefore cannot be imaged with conventional 2D imaging.
- One possible method of imaging the defects would be to image the article while illuminated by light shone at a shallow grazing angle to the article from at least two different directions.
- The light would have to be shone from at least two directions because not all of the defects are visible from a single light angle. Processing all of these images would likely find the defects but would require a separate frame time for each acquisition, slowing the process unacceptably.
- The images could be acquired in parallel, but that would require a sensor for each direction of light, along with filters so that each light would be visible only to the appropriate camera, introducing unacceptable expense.
- Another potential solution to the problem of imaging subtle defects would be to use 3D imaging.
- 3D vision systems create image data that correspond to the true 3D shape of an article rather than the intensity or wavelength of the light reflected from it.
- 3D systems can be classified into several groups according to the technology involved. The first is passive 3D systems, which construct images with height information from cues in ordinary 2D images. An example is stereo image reconstruction, where two or more 2D images are acquired, features are detected in each image, and a correspondence problem is then solved to reconcile the detected features across images. Discrepancies are attributed to parallax between the images, and height is thereby inferred. This approach has the same issues as the multiple 2D images above, namely the time needed to take multiple images and the equipment cost to acquire them in parallel. It also would not work on the defects in question here, since they do not exhibit features from which height could be inferred.
- Active 3D imaging projects a pattern, typically a laser line or other shape, onto the article, which is then imaged; the displacement of the projected lines indicates the 3D contours. This method also requires multiple acquisitions and is therefore unacceptably slow for this application.
- Other types of 3D imaging depend on detecting focus quality over a series of imaging steps (confocal imaging) or on acquiring images of a projected grid (Moiré imaging). Both of these methods require multiple image acquisitions to construct a 3D image and are for this reason unacceptable solutions to this problem.
- Moreover, these methods typically require that the article be held motionless for the duration of the scan, meaning that the article has to index, stop, and settle prior to scanning, all of which slows the manufacturing process.
- Two or more images of an article are acquired in a single image frame by illuminating the article with two or more different colors of light and acquiring the data with an image sensor that has an attached Bayer filter structure.
- A Bayer filter is a color filter array constructed with red, green and blue filter elements. It is attached to the image sensor, and image data are acquired through it.
- Once the image data are acquired into the controller, either in the camera itself or in an attached controller, the pixels are sorted according to which color of the filter they correspond to: all the red pixels are put into one image, all the blue pixels into a separate image, and all the green pixels into a further separate image.
- In this way a monochrome image sensor can be made to simulate a color imaging system with a single sensor, albeit at somewhat lower spatial resolution.
- The color images are processed to be in registration with each other, so that a feature in one image will occur at the same location in the other images.
- Three separate images can thus be derived from the single image acquired. If, for example, the article is illuminated from different directions with two or more of the three colors, the derived images can be processed to reveal discrepancies between them and thereby show subtle differences in 3D morphology that indicate defects.
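The pixel sorting described above can be sketched as a short NumPy routine. This is a minimal illustration assuming an RGGB cell layout; the actual sensor's layout, bit depth, and any interpolation step may differ.

```python
import numpy as np

def demux_bayer(raw):
    """Split a raw Bayer-mosaic frame into three half-resolution images.

    Assumes an RGGB cell layout: R at (0,0), G at (0,1) and (1,0),
    B at (1,1) of every 2x2 cell. Other layouts just permute the offsets.
    The two green sites per cell are averaged into one green image.
    """
    r = raw[0::2, 0::2].astype(np.float32)
    g = (raw[0::2, 1::2].astype(np.float32) + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2].astype(np.float32)
    return r, g, b

# Tiny 4x4 mosaic: R sites read 200, G sites 100, B sites 50.
raw = np.array([[200, 100, 200, 100],
                [100,  50, 100,  50],
                [200, 100, 200, 100],
                [100,  50, 100,  50]], dtype=np.uint8)
red, green, blue = demux_bayer(raw)
# Each derived image is 2x2 and uniform in its own channel's value.
```

Because the three derived images come from interleaved sites of the same exposure, they are inherently registered up to the half-pixel offsets of the mosaic.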
- The lighting can be strobed or the image sensor gated to acquire the data in a very short period of time, thereby not only acquiring the three images simultaneously, and hence in registration, but also allowing them to be acquired without stopping the article.
- An article is illuminated from two different directions with light matched to two of the filter colors of the Bayer filter.
- A third image is acquired using diffuse illumination matched to the third color of the Bayer filter.
- The image data are acquired and separated into three images corresponding to the three colors of the Bayer filter.
- The first two images are then subtracted to highlight the differences between them, since features that occur in one image but not the other are very likely defects.
- The resulting image is then processed using conventional machine vision techniques to identify and classify the defects.
- The diffuse image is also processed using conventional machine vision techniques to determine whether other defects, such as deviations of the article's outline from nominal values, are present that might not be detected in the directionally illuminated images.
- The three illumination sources are strobed, or flashed, for a very short period of time. This allows the sensor to acquire data from a moving part without motion blur.
- The sensor is set to integrate light, and the illumination sources are strobed while it is integrating. Following the strobe, the sensor is directed to read out the image data created by the strobed illumination to the controller, where the data are separated into three images corresponding to the three color filters.
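The integrate-strobe-read sequence can be sketched as an event timeline. The 2 ms exposure matches the cycle figure given later in the description; the strobe duration and the centered placement of the strobe are illustrative assumptions.

```python
def plan_strobed_frame(exposure_ms=2.0, strobe_ms=0.1):
    """Event timeline for one strobed acquisition.

    The sensor integrates for the whole exposure window and the strobe
    fires inside that window (centered here for simplicity), so the short
    strobe rather than the longer exposure freezes the part's motion.
    The 0.1 ms strobe duration is an illustrative assumption.
    """
    events = [
        ("integrate_start", 0.0),
        ("strobe_on", (exposure_ms - strobe_ms) / 2.0),
        ("strobe_off", (exposure_ms + strobe_ms) / 2.0),
        ("integrate_end", exposure_ms),
        ("read_out", exposure_ms),  # data go to the controller for sorting
    ]
    # Sanity check: the strobe window lies inside the integration window.
    assert 0.0 <= events[1][1] and events[2][1] <= events[3][1]
    return events

timeline = plan_strobed_frame()
```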
- The motion of the article is slowed as it passes the field of view of the sensor by implementing a sinusoidal velocity profile in the transport mechanism that moves the article.
- A controller is operatively connected to the transport mechanism to control the speed at which articles are moved.
- Fig. 1 is a schematic diagram showing how an RGB Bayer filter structure is applied to an image sensor.
- Fig. 2 is a chart showing the spectral response of a typical RGB Bayer filter structure.
- Figs. 3a and 3b show two views of a Bayer filter-based directional image acquisition system.
- Figs. 4a, 4b and 4c show directional illumination image processing.
- Fig. 5 shows diffuse illumination image processing.
- This invention is directed to the acquisition of image data to support automated inspection of electronic components.
- One such component is the chip capacitor, which is made of alternating layers of metal conductors and ceramic dielectric. These components are subject to defects that are difficult or impossible to image with conventional 2D or 3D systems. In addition, they are manufactured at very high speed: installations that make these parts can produce them at rates of hundreds of thousands per hour. Finally, these components are assembled into circuits that go into virtually all electronic devices currently sold; it is therefore advantageous to inspect them before they are assembled into circuits.
- This invention can reduce the image capture cycle to a single image exposure (2 ms) combined with a faster move profile of 18 ms. The resulting 20 ms cycle enables a throughput of 180,000 units per hour.
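The throughput figure follows directly from the cycle arithmetic:

```python
# Cycle arithmetic behind the 180,000 units-per-hour figure above.
exposure_ms = 2.0    # single image exposure
move_ms = 18.0       # move profile between acquisitions
cycle_s = (exposure_ms + move_ms) / 1000.0   # 0.020 s per unit
units_per_hour = 3600.0 / cycle_s
# 3600 s/h divided by 0.020 s/unit = 180,000 units per hour
```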
- FIG. 1 is a schematic diagram of a semiconductor image sensor 10 with an applied Bayer filter 12.
- The Bayer filter is composed of three differently colored filters: red 14, green 16 and blue 18.
- The Bayer filter 12 is attached to the sensor 10 so that each color segment of the Bayer filter 12 is aligned with a picture element, or pixel, one of which is indicated at 20, of the image sensor 10.
- Each pixel 20 integrates light of only one color, so that when the resulting image data are read out from the sensor 10 by a controller (not shown), the data can be sorted into three separate images, each one representing pixels illuminated by only one color.
- Conventionally, this function is used to create full color images using a monochrome, wide-bandwidth sensor; in an embodiment of the instant invention, it is used instead to form separate images, each corresponding to a separate illumination source.
- The three images can be made to conform in size, bit depth and registration to facilitate further processing.
- Fig. 2 is a graph 30 showing the quantum efficiency 32 of the three types of color filters found in a typical Bayer filter plotted against wavelength 34. Quantum efficiency indicates the percentage of photons of a particular wavelength that are captured and converted to electrical charge by the sensor and corresponds to spectral response.
- The three color peaks correspond to the blue 36, green 38 and red 40 filters. As shown, the blue filter peaks at about 450 nm, the green at about 550 nm and the red at about 625 nm. Note that all three filters transmit energy in the infrared (IR) region of the spectrum (approximately 700 nm and greater); most sensors of this type therefore also contain an IR-blocking filter that keeps this extraneous energy from reaching the sensor.
- An exemplary image sensor with Bayer filter attached is part # ICX445 EXview HAD CCD sold by Sony Corporation, Tokyo, Japan.
- An embodiment of the instant invention uses technology developed for color image sensing to capture three separate images in one frame time by using the Bayer filter. The image capture is reduced to a single sensor exposure by multiplexing multiple views of the component into a single image by illuminating the component from at least two different directions. Each illumination source uses a unique color or wavelength of light matched to the colors of the Bayer filter. In one embodiment of the instant invention the following color LEDs are used for image illumination:
- Exemplary LEDs that could be used for this application include: 470 nm - HLMP- CB30-M0000, Avago Technologies, San Jose, CA; 525 nm - LTST-C190TGKT, Lite- on Semiconductor Corporation, Taipei, Taiwan; and SML-LX0402IC-TR, Lumex, Palatine, IL
- Figs. 3a and 3b show an embodiment of the instant invention.
- Fig. 3a shows a top perspective view and Fig. 3b shows a side elevation of an embodiment in which an article 50 is carried past a camera 60 by a transport mechanism 51 while being illuminated by a first illumination source 52 projecting a first color of collimated light 54 from a first direction and a second illumination source 56 projecting a second color of collimated light 58 from a second direction.
- A third illumination source 62 illuminates the article 50 with a third color of diffuse light 64.
- A camera 60 containing a sensor with an attached Bayer filter (not shown) acquires image data from the illuminated article 50.
- Fig. 4a shows image data 70 acquired from an article illuminated by a first illumination source 72 using a first color light from a first direction 74.
- Image data 70 acquired with this illumination scheme shows a defect 76, which shows up as a lighter colored area.
- Fig. 4b shows image data 78 acquired from an article illuminated by a second illumination source 80 using a second color light from a second direction 82. Note that in this case, the defect is not visible in the image data 78.
- Fig. 4c shows composite image data 84 calculated by subtracting image data 70 from image data 78. In the composite image data 84, the defect 86 is clearly visible.
- The purpose of combining image data from the two different directional images 70, 78 is to suppress information that occurs in both images and enhance information that occurs in only one of them.
- The premise is that the defects to be detected are visible when illuminated from only one direction.
- Features that are visible from both directions are typically not classified as defects.
- Subtraction is one exemplary way to combine the images to enhance defect detection, but other operations are also possible, including other arithmetic operations, logical operations, order statistic-based operations such as min/max operators, or combinations of them.
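The subtraction approach can be sketched as follows. The image size, pixel values, and threshold are illustrative, not from the patent.

```python
import numpy as np

def defect_map(img_a, img_b, threshold=40):
    """Suppress structure common to two directional images and keep what
    appears in only one of them.

    Clipped subtraction zeroes features present in both images; a fixed
    threshold (an illustrative value) then gives a binary defect mask.
    Absolute difference or per-pixel min/max combiners slot into the
    same place.
    """
    diff = np.clip(img_a.astype(np.int16) - img_b.astype(np.int16), 0, 255)
    diff = diff.astype(np.uint8)
    return diff, diff > threshold

# Synthetic example: flat background, one bright spot lit only from
# the first direction (as in Figs. 4a and 4b).
img_a = np.full((8, 8), 90, dtype=np.uint8)
img_a[3, 4] = 200
img_b = np.full((8, 8), 90, dtype=np.uint8)
diff, mask = defect_map(img_a, img_b)
# mask flags only the pixel visible in a single directional image
```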
- Standard machine vision techniques are then used to identify the defect in the combined image.
- A diffuse image of the article can also be used to detect defects.
- Fig. 5 shows a grayscale image 90 of an article acquired with diffuse lighting from above.
- A defect 92 is shown circled.
- Multiplexing three images onto a single sensor allows all of the data to be acquired in a single frame time.
- By limiting acquisition time to a single frame time, the time required to stop the part, acquire the data and then restart the part motion is minimized.
- Performing electronic shuttering with the sensor, in which the sensor is allowed to integrate light only for a limited period, can allow the image data to be acquired without stopping the part.
- The illumination sources can also be strobed to freeze the motion of the part as it passes through the field of view of the sensor.
- The part can either keep moving at its normal speed past the camera or, in cases where the image data contain motion blur despite electronic shuttering or strobed illumination, the part motion can be programmed to be sinusoidal, slowing as the part moves through the field of view of the sensor and speeding up when no image is being acquired.
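One way to realize such a sinusoidal profile is sketched below. The patent gives no formula or speeds for the transport; the functional form and all numbers here are assumptions.

```python
import math

def transport_velocity(t, period_s, v_nominal, v_min):
    """Transport speed at time t under a sinusoidal velocity profile.

    The part runs at v_nominal between acquisitions and slows to v_min at
    mid-cycle, when it crosses the sensor's field of view. Parameter values
    used below are hypothetical.
    """
    phase = 2.0 * math.pi * (t % period_s) / period_s
    # cos swings from +1 (cycle start, full speed) to -1 (midpoint, slowest)
    return v_min + (v_nominal - v_min) * (1.0 + math.cos(phase)) / 2.0

period = 0.020               # 20 ms cycle, matching the throughput figure
v_fast, v_slow = 0.5, 0.05   # m/s, hypothetical
v_start = transport_velocity(0.0, period, v_fast, v_slow)
v_mid = transport_velocity(period / 2.0, period, v_fast, v_slow)
# v_start equals v_fast; v_mid equals v_slow
```

A smooth profile of this kind avoids the index-stop-settle delays of a stop-and-go transport while still giving the camera a slow-moving part during exposure.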
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011503042A JP2011516860A (en) | 2008-03-31 | 2009-03-25 | Method and apparatus for acquiring and processing multiplexed images |
CN2009801110186A CN101981411A (en) | 2008-03-31 | 2009-03-25 | Method and apparatus for multiplexed image acquisition and processing |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US4110508P | 2008-03-31 | 2008-03-31 | |
US61/041,105 | 2008-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009123901A1 (en) | 2009-10-08 |
Family
ID=41135905
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2009/038299 WO2009123901A1 (en) | 2008-03-31 | 2009-03-25 | Method and apparatus for multiplexed image acquisition and processing |
Country Status (5)
Country | Link |
---|---|
JP (1) | JP2011516860A (en) |
KR (1) | KR20100138985A (en) |
CN (1) | CN101981411A (en) |
TW (1) | TW201003038A (en) |
WO (1) | WO2009123901A1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5168049B2 (en) * | 2008-09-24 | 2013-03-21 | オムロン株式会社 | Image processing method and image processing apparatus |
CN103516962A (en) * | 2012-06-19 | 2014-01-15 | 全友电脑股份有限公司 | Image capturing system and method |
DE102014201144A1 (en) * | 2014-01-22 | 2015-07-23 | Zumtobel Lighting Gmbh | Method for controlling an adaptive lighting device and lighting system for carrying out the method |
JP6260354B2 (en) * | 2014-03-04 | 2018-01-17 | 株式会社リコー | Imaging device, adjustment device, and adjustment method |
KR101663518B1 (en) * | 2014-07-14 | 2016-10-10 | 주식회사 제이에스티 | reel tape inspection device |
CN110793472B (en) * | 2019-11-11 | 2021-07-27 | 桂林理工大学 | Grinding surface roughness detection method based on quaternion singular value entropy index |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5000569A (en) * | 1988-12-28 | 1991-03-19 | Lamb-Weston, Inc. | Light reflection defect detection apparatus and method using pulsed light-emitting semiconductor devices of different wavelengths |
US5298963A (en) * | 1992-02-26 | 1994-03-29 | Mitsui Mining & Smelting Co., Ltd. | Apparatus for inspecting the surface of materials |
US6064472A (en) * | 1996-10-22 | 2000-05-16 | Lap Gmbh Laser Applikationen | Method for speed measurement according to the laser-doppler-principle |
US6091488A (en) * | 1999-03-22 | 2000-07-18 | Beltronics, Inc. | Method of and apparatus for automatic high-speed optical inspection of semi-conductor structures and the like through fluorescent photoresist inspection |
- 2009
- 2009-03-25 WO PCT/US2009/038299 patent/WO2009123901A1/en active Application Filing
- 2009-03-25 KR KR1020107021795A patent/KR20100138985A/en not_active Application Discontinuation
- 2009-03-25 CN CN2009801110186A patent/CN101981411A/en active Pending
- 2009-03-25 JP JP2011503042A patent/JP2011516860A/en not_active Withdrawn
- 2009-03-27 TW TW098110047A patent/TW201003038A/en unknown
Also Published As
Publication number | Publication date |
---|---|
KR20100138985A (en) | 2010-12-31 |
CN101981411A (en) | 2011-02-23 |
JP2011516860A (en) | 2011-05-26 |
TW201003038A (en) | 2010-01-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980111018.6 Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09728976 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 20107021795 Country of ref document: KR Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 2011503042 Country of ref document: JP |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 09728976 Country of ref document: EP Kind code of ref document: A1 |