WO2009123901A1 - Method and apparatus for multiplexed image acquisition and processing - Google Patents

Method and apparatus for multiplexed image acquisition and processing Download PDF

Info

Publication number
WO2009123901A1
WO2009123901A1 (application PCT/US2009/038299)
Authority
WO
WIPO (PCT)
Prior art keywords
image
color
illumination
article
bayer filter
Prior art date
Application number
PCT/US2009/038299
Other languages
French (fr)
Inventor
Spencer B. Barrett
Original Assignee
Electro Scientific Industries, Inc.
Priority date
Filing date
Publication date
Application filed by Electro Scientific Industries, Inc.
Priority to JP2011503042A (published as JP2011516860A)
Priority to CN2009801110186A (published as CN101981411A)
Publication of WO2009123901A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8806 Specially adapted optical and illumination features
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8806 Specially adapted optical and illumination features
    • G01N2021/8838 Stroboscopic illumination; synchronised illumination
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2201/00 Features of devices classified in G01N21/00
    • G01N2201/06 Illumination; Optics
    • G01N2201/062 LED's

Abstract

A method and an apparatus for inspecting articles entail illuminating an article (50) from two or more directions with light sources (52, 56) matched to the spectral response of a Bayer filter (12) attached to an image sensor (10). The image data are separated by color and derivative images are formed, then processed to detect defects.

Description

METHOD AND APPARATUS FOR MULTIPLEXED IMAGE ACQUISITION AND PROCESSING
Technical Field
[0001] The present invention relates to using machine vision to inspect manufactured articles at high speed. In particular it relates to acquiring and processing multiple images of an article acquired simultaneously by a single camera while the article is in motion. More particularly it relates to using differently colored lights illuminating an article from different directions and color filters on a sensor to separate the images in order to acquire multiple images simultaneously.
Background of the Invention
[0002] Many articles of manufacture are inspected for small defects during the manufacturing process by machine vision systems. Typical inspection systems use 2D grayscale cameras to find defects in articles based on differences in grayscale reflectivity or 2D morphology. In some cases where defects exist that are not detectable by grayscale difference, color cameras are used to detect defects distinguished by differences in spectral reflectivity. In cases where defects are not distinguished by any differences in grayscale or color imagery, 3D vision systems can be used to detect defects characterized by differences in 3D morphology.
[0003] 2D vision systems typically consist of a camera, optics and illumination. Camera technology is generally limited to commercially available sensors, so typically the optics and illumination are customized to provide the correct magnification, field of view and acquisition speed. An example of a machine vision system manufactured with an off-the-shelf image sensor and custom optics and illumination is the ESI Bullet™ wafer ID reader manufactured by Electro Scientific Industries, the assignee of the instant patent application. This system contains a video sensor, optics and illumination designed to inspect identification marks on semiconductor wafers, among other articles. Although it is designed to image very subtle marks formed on a mirror-like surface, this system is unable to image the small defects that are the subject of the instant invention. This is so because the defects in question do not appear any different to the camera from their surroundings. The defects may be visible only as slight depressions in the surface of the article and therefore cannot be imaged with conventional 2D imaging.
[0004] One possible method of imaging the defects would be to image the article while it is illuminated by light shone at a shallow grazing angle from at least two different directions. The light would have to be shone from at least two directions because not all of the defects are visible from a single light angle. Processing all of these images would likely find the defects but would require a separate frame time for each acquisition, thereby slowing the process unacceptably. The images could be acquired in parallel, but that would require a sensor for each direction of light, along with filters so that each light would be visible only to the appropriate camera. This would introduce unacceptable expense to the process.
[0005] Another potential solution to the problem of imaging subtle defects would be to use 3D imaging. 3D vision systems create image data that correspond to the true 3D shape of an article rather than the intensity or wavelength of the light reflected from the article. 3D systems can be classified into several groups according to the technology involved. The first is passive 3D systems that construct images with height information from cues in regular 2D images. An example of this is stereo image reconstruction, where two or more 2D images are acquired, features are detected in each image, and the correspondence problem is then solved to reconcile the detected features across the images. Discrepancies are attributed to differences in apparent altitude between the images, and height is thereby inferred. This has the same issues as multiple 2D images above, namely the time to take multiple images and the equipment costs to acquire them in parallel. Also, this approach would not work on the defects in question here, since they do not exhibit features that could be used to infer altitude.
[0006] Another type of 3D imaging requires projecting lines, typically a laser line or other shape, onto the article and then imaging them. The displacement of the lines indicates 3D contours. This method also requires multiple acquisitions and is therefore unacceptably slow for this application. Other types of 3D imaging depend on detecting focus quality over a series of image steps (confocal imaging) or acquiring images of a projected special grid (Moiré imaging). Both of these methods require multiple image acquisitions in order to construct a 3D image and for this reason are unacceptable solutions to this problem. In addition, these methods typically require that the article be held motionless for the duration of the scan, meaning that the article has to index, stop, and settle prior to scanning, all of which can slow the manufacturing process.
[0007] One example of this method is described in "Technique for Phase Measurement and Surface Reconstruction by Use of Colored Structured Light," by Oleksandr et al. (Applied Optics, Vol. 41, Issue 29, pp. 6104-6117 (2002)), which projects multiple colored patterns to distinguish the projected light and thereby attempts to improve the accuracy of the 3D measurements. The authors discuss using structured light to determine the topography of an automotive windshield, using differential equations to extract 3D information from the structured light images.
[0008] What many of these methods have in common is the requirement to acquire more than one image in order to detect defects in an article. The problem with acquiring multiple images of a moving part with a single camera is that multiple exposures are needed. Since the part is supposed to be moving past the camera, either the part will have to be stopped in place while the multiple images are acquired, or the camera will have to move with the part. Neither solution is acceptable, since stopping the part detracts from system throughput and moving the camera is difficult and expensive. Another possible solution is to use multiple cameras, but that is expensive and can require special efforts to align the optics and sensors.
[0009] What is needed, therefore, is an image acquisition scheme that is capable of imaging defects not readily apparent to 2D inspection methods. Further, this scheme should not require multiple cameras or other expensive additional equipment, and it should operate in a fraction of a single frame time in order to minimize or eliminate the requirement that articles be held stationary during acquisition.
Summary of the Invention
[0010] An object of the instant invention is to provide a method and an apparatus in the form of an image acquisition system with improved ability to detect defects in articles not apparent to 2D inspection means. Another object of the instant invention is to perform the acquisition without requiring the articles to be held motionless during acquisition. To achieve the foregoing and other objects in accordance with the purposes of the present invention, as embodied and broadly described herein, a method and an apparatus are disclosed.
[0011] Two or more images of an article are acquired in a single image frame by illuminating the article with two or more different colors of light and acquiring the data with an image sensor with an attached Bayer filter structure. A Bayer filter is a color filter attached to an image sensor and constructed with red, green and blue color filters. The Bayer filter is attached to the image sensor and image data are acquired. When the image data are acquired by the controller, either in the camera itself or in an attached controller, the pixels are sorted depending upon which color of the filter the pixels correspond to, i.e., all the red pixels are put into one image, all the blue pixels are put into a separate image and all the green pixels are put into a further separate image. In this way a monochrome image acquisition sensor can be made to simulate a color imaging system with a single sensor, albeit at somewhat lower spatial resolution. The color images are processed to be in registration with each other so that a feature in one image will occur at the same location in the other images.
[0012] By matching the illumination wavelengths with the Bayer filter, three separate images can be derived from the single image acquired. If, for example, the article is illuminated from different directions with two or more of the three colors, the derived images can be processed to show discrepancies between the images and therefore show subtle differences in 3D morphology that indicate defects. Furthermore, the lighting can be strobed or the image sensor gated to acquire the data in a very short period of time, thereby not only acquiring the three images simultaneously and hence in registration, but also allowing them to be acquired without stopping the article.
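For illustration only (not part of the patent disclosure), the pixel-sorting step described above can be sketched in a few lines of Python. The sketch assumes an RGGB mosaic layout, an 8-bit raw frame, and a simple average of the two green sites in each 2x2 cell; the actual controller implementation is not specified here.

    import numpy as np

    def split_bayer_rggb(raw):
        """Sort a raw Bayer-mosaic frame (RGGB layout assumed) into three
        half-resolution, mutually registered images, one per filter color."""
        red = raw[0::2, 0::2]                                  # R sites
        green = ((raw[0::2, 1::2].astype(np.uint16) +
                  raw[1::2, 0::2]) // 2).astype(raw.dtype)     # average the two G sites
        blue = raw[1::2, 1::2]                                  # B sites
        return red, green, blue

    # Example with a simulated 8-bit raw frame
    raw = np.random.randint(0, 256, (964, 1288), dtype=np.uint8)
    r, g, b = split_bayer_rggb(raw)
    print(r.shape, g.shape, b.shape)   # three 482 x 644 single-color images

Because all three sub-images come from the same exposure, they are inherently in registration, which is what makes the later image-to-image arithmetic possible.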
[0013] In an embodiment of the instant invention, an article is illuminated from two different directions with light matched to two of the filters in the Bayer filter. A third image is acquired using diffuse illumination matched to the third color of the Bayer filter. The image data are acquired and separated into the three separate images corresponding to the three colors of the Bayer filter. The first two images are then subtracted to highlight the differences between them, since features that occur in one image and not another are very likely defects. The resulting image is then processed using conventional machine vision techniques to identify and classify the defects. The diffuse image is also processed using conventional machine vision techniques to determine whether other defects, such as deviations of the outline from nominal values, are present that might not be detected in the directionally illuminated images.
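A minimal sketch of the subtraction step, again for illustration only: it assumes the separated directional images are registered arrays of equal size and uses a fixed threshold as a stand-in for the conventional machine vision processing mentioned above (the threshold value and array sizes are illustrative assumptions).

    import numpy as np

    def directional_difference(img_a, img_b, threshold=40):
        """Difference two registered directional images and flag pixels whose
        absolute difference exceeds a threshold as candidate defect pixels."""
        diff = np.abs(img_a.astype(np.int16) - img_b.astype(np.int16))
        mask = diff > threshold     # True where a feature appears in only one image
        return diff.astype(np.uint8), mask

    # img_north / img_east stand for the separated first- and second-color images
    img_north = np.random.randint(0, 256, (482, 644), dtype=np.uint8)
    img_east = np.random.randint(0, 256, (482, 644), dtype=np.uint8)
    diff, mask = directional_difference(img_north, img_east)
    print("candidate defect pixels:", int(mask.sum()))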
[0014] In an embodiment of the instant invention, the three illumination sources are strobed, or flashed, for a very short period of time. This allows the sensor to acquire data from a moving part without blurring caused by motion. The sensor is set to integrate light and the illumination sources are then set to strobe while the sensor is integrating light. Following the strobe, the sensor is directed to read out the image data created in the sensor by the strobed illumination sources to the controller, where the image data are separated into three images corresponding to the three color filters.
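The benefit of a short strobe can be quantified with a simple blur estimate. The sketch below is illustrative only; the part speed, strobe duration, pixel size and magnification are assumed values, not figures taken from the patent.

    def motion_blur_pixels(part_speed_mm_s, strobe_s, pixel_size_mm, magnification=1.0):
        """Approximate image-plane blur (in pixels) accumulated during one strobe flash."""
        travel_mm = part_speed_mm_s * strobe_s   # distance the part moves during the flash
        return travel_mm * magnification / pixel_size_mm

    # e.g. a part moving at 100 mm/s, a 50 microsecond flash, 3.75 micrometre pixels, 1:1 optics
    print(motion_blur_pixels(100.0, 50e-6, 0.00375))   # about 1.3 pixels of blur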
[0015] In the case where an article is moving very quickly past the sensor field of view, it may not be practical to attempt to freeze the article's motion solely by strobing the illumination. In this embodiment, the motion of the article is slowed as it passes the field of view of the sensor by implementing a sinusoidal velocity profile in the transport mechanism that moves the article. In this embodiment a controller is operatively connected to the transport mechanism to control the speed at which articles are moved. By speeding up and slowing down the transport mechanism according to a sinusoidal velocity profile, with the lowest velocity timed to coincide with the strobing of the illumination sources, the articles can be slowed down sufficiently to eliminate blur while minimizing the reduction in throughput caused by the slowing of the transport mechanism.
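A sketch of such a velocity profile follows, for illustration only. The cycle time, nominal speed and modulation depth are assumed values; the only property taken from the description is that the minimum velocity is timed to coincide with the strobe.

    import numpy as np

    def sinusoidal_velocity(t, period_s, v_mean_mm_s, depth=0.8):
        """Sinusoidal transport velocity; the minimum, at mid-cycle, is timed to
        coincide with the strobe so the article is at its slowest during exposure."""
        return v_mean_mm_s * (1.0 + depth * np.cos(2.0 * np.pi * t / period_s))

    t = np.linspace(0.0, 0.020, 201)                 # one 20 ms transport cycle
    v = sinusoidal_velocity(t, period_s=0.020, v_mean_mm_s=250.0)
    print(f"minimum {v.min():.0f} mm/s at t = {t[np.argmin(v)] * 1e3:.1f} ms")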
Brief Description of the Drawings
[0016] Fig. 1 is a schematic diagram showing how an RGB Bayer filter structure is applied to an image sensor.
[0017] Fig. 2 is a chart showing the spectral response of a typical RGB Bayer filter structure.
[0018] Figs. 3a and b show two views of a Bayer filter-based directional image acquisition system.
[0019] Figs. 4a, b, and c show directional illumination image processing.
[0020] Fig. 5 shows diffuse illumination image processing.
Detailed Description of Preferred Embodiments
[0021] This invention is directed to acquisition of image data to support automated inspection of electronic components. One example of this type of component is the chip capacitor, which is made of alternating layers of metal conductors and ceramic dielectrics. These components are subject to defects which are difficult or impossible to image with conventional 2D or 3D systems. In addition, these components are made at a very high rate: installations that manufacture these parts can produce them at rates of hundreds of thousands per hour. Finally, these components are assembled into circuits that go into virtually all electronic devices currently sold; therefore it is advantageous to inspect these components before they are assembled into circuits. This invention can reduce the image capture cycle down to a single image exposure (2 ms) and use a faster move profile of 18 ms. The resulting 20 ms cycle would enable a throughput of 180,000 units per hour.
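The throughput figure follows directly from the stated 20 ms cycle; as a quick check:

    exposure_ms, move_ms = 2.0, 18.0                  # single exposure plus move profile
    cycle_s = (exposure_ms + move_ms) / 1000.0
    print(f"{3600.0 / cycle_s:,.0f} units per hour")  # 180,000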
[0022] Wide spectrum image sensors with Bayer filters applied are commonly used to acquire color image information. Fig. 1 is a schematic diagram of a semiconductor image sensor 10 with an applied Bayer filter 12. The Bayer filter comprises three differently colored filters, red 14, green 16 and blue 18. The Bayer filter 12 is attached to the sensor 10 so that each color segment of the Bayer filter 12 is aligned with a picture element, or pixel, one of which is indicated at 20, of the image sensor 10. In this way each pixel 20 integrates light of only one color, so that when the resulting image data are read out from the sensor 10 by a controller (not shown), these data can be sorted into three separate images, each one representing pixels illuminated by only one color. Normally this function is used to create full color images using a monochrome, wide bandwidth sensor, but in an embodiment of the instant invention this function is used to form separate images, each corresponding to a separate illumination source. The three images can be made to conform in size, bit depth and registration to facilitate further processing.
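For illustration, the alignment of filter colors to pixels shown in Fig. 1 can be represented as a per-pixel color map. The sketch assumes the common RGGB arrangement; it is not a statement about the exact layout drawn in the figure.

    import numpy as np

    def bayer_color_map(rows, cols):
        """Which filter color covers each sensor pixel for an RGGB mosaic
        (0 = red, 1 = green, 2 = blue)."""
        cmap = np.ones((rows, cols), dtype=np.uint8)   # green everywhere to start
        cmap[0::2, 0::2] = 0                           # red on even rows / even columns
        cmap[1::2, 1::2] = 2                           # blue on odd rows / odd columns
        return cmap

    print(bayer_color_map(4, 4))   # half the sites are green, a quarter each red and blue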
[0023] Fig. 2 is a graph 30 showing the quantum efficiency 32 of the three types of color filters found in a typical Bayer filter plotted against wavelength 34. Quantum efficiency indicates the percentage of photons of a particular wavelength that are captured and converted to electrical charge by the sensor and corresponds to spectral response. The three color peaks correspond to blue 36, green 38 and red 40 filters. As shown, the blue filter peaks at about 450 nm, the green filter peaks at about 550 nm and the red filter peaks at about 625 nm. Note that all three filters transmit energy in the infrared (IR) region of the spectrum (approximately 700 nm and greater). Most sensors of this type also contain an IR filter that blocks this extraneous energy from reaching the sensor. Examples of devices with these types of image sensors include cell phone cameras and most moderate to low cost video cameras. An exemplary image sensor with Bayer filter attached is part # ICX445 EXview HAD CCD sold by Sony Corporation, Tokyo, Japan.
[0024] An embodiment of the instant invention uses technology developed for color image sensing to capture three separate images in one frame time by using the Bayer filter. The image capture is reduced to a single sensor exposure by multiplexing multiple views of the component into a single image by illuminating the component from at least two different directions. Each illumination source uses a unique color or wavelength of light matched to the colors of the Bayer filter. In one embodiment of the instant invention the following color LEDs are used for image illumination:
- 470nm Blue - used for the North image illumination
- 525nm Green - used for the Diffuse image illumination (3rd image)
- 636nm Red - used for the East image illumination
Exemplary LEDs that could be used for this application include: 470 nm - HLMP-CB30-M0000, Avago Technologies, San Jose, CA; 525 nm - LTST-C190TGKT, Lite-on Semiconductor Corporation, Taipei, Taiwan; and 636 nm - SML-LX0402IC-TR, Lumex, Palatine, IL.
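The matching of LED wavelengths to filter colors can be illustrated with the peak wavelengths quoted for Fig. 2 and the LED wavelengths listed above; the nearest-peak assignment below is simply a sketch of the matching idea, not a procedure given in the patent.

    # Filter peak wavelengths (from Fig. 2) and candidate LED wavelengths (nm)
    filter_peaks = {"blue": 450, "green": 550, "red": 625}
    led_wavelengths = [470, 525, 636]

    # Assign each LED to the filter whose peak response is closest to its wavelength
    assignment = {led: min(filter_peaks, key=lambda color: abs(filter_peaks[color] - led))
                  for led in led_wavelengths}
    print(assignment)   # {470: 'blue', 525: 'green', 636: 'red'}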
[0025] In a typical Bayer filter, 50% of the pixels are green, 25% are blue and 25% are red. This is so because these sensors were designed to create color images with a monochrome sensor, and the proportions reflect the fact that the human eye is more sensitive to green light than to blue or red light. Also to be noted is that this represents only one particular type of color filter. Other color filters that could be used to separate colors on a monochrome sensor use stripes of color that align with columns of pixels on the sensor, for example. Other filters might use slightly different colors. Any of these other filters could be used by an embodiment of the instant invention without deviating from the basic concepts.
[0026] Figs. 3a and 3b show an embodiment of the instant invention. Fig. 3a shows a top perspective elevation and Fig. 3b shows a side elevation of an embodiment showing an article 50 being carried past the camera 60 by a transport mechanism 51 and being illuminated by a first illumination source 52 projecting a first color of collimated light 54 from a first direction and a second illumination source 56 projecting a second color of collimated light 58 from a second direction. In addition, a third illumination source 62 illuminates the article 50 with a third color of diffuse light 64. A camera 60 containing a sensor with Bayer filter attached (not shown) acquires image data from the illuminated article 50.
[0027] Fig. 4a shows image data 70 acquired from an article illuminated by a first illumination source 72 using a first color of light from a first direction 74. Image data 70 acquired with this illumination scheme show a defect 76, which appears as a lighter colored area. Fig. 4b shows image data 78 acquired from an article illuminated by a second illumination source 80 using a second color of light from a second direction 82. Note that in this case the defect is not visible in the image data 78. Fig. 4c shows composite image data 84 calculated by subtracting image data 70 from image data 78. In the composite image data 84, the defect 86 is clearly visible. The purpose of combining image data from the two different directional images 70, 78 is to suppress information that occurs in both images and enhance information that occurs in only one of the images. The premise is that the defects to be detected are visible when illuminated from only one direction; features which are visible from two directions are typically not classified as defects. Also note that subtraction is an exemplary way to combine the images to enhance defect detection, but other operations are also possible, including other arithmetic operations, logical operations, order statistic-based operations such as min/max operators, or combinations of them. Following image combination, standard machine vision techniques are used to identify the defect in the combined image.
[0028] In cases where a defect appears in both directional images, a diffuse image of the article can be used to detect the defect. Fig. 5 shows a grayscale image 90 of an article acquired with diffuse lighting from above. A defect 92 is shown circled. These image data can be combined with either one or both of the directional images or processed alone using standard machine vision techniques to yield further information regarding possible defects.
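A short sketch of a few of the combining operations named above, for illustration only; which operator (or combination of operators) works best for a given article is an application choice not specified here.

    import numpy as np

    def combine(img_a, img_b, mode="subtract"):
        """Combine two registered directional images so features seen from only
        one direction are enhanced and features common to both are suppressed."""
        a = img_a.astype(np.int16)
        b = img_b.astype(np.int16)
        if mode == "subtract":                 # signed difference, clipped to 8 bits
            out = np.clip(a - b, 0, 255)
        elif mode == "absdiff":                # magnitude of the difference
            out = np.abs(a - b)
        elif mode == "min":                    # order statistic: keep the darker pixel
            out = np.minimum(a, b)
        elif mode == "max":                    # order statistic: keep the brighter pixel
            out = np.maximum(a, b)
        else:
            raise ValueError(f"unknown mode: {mode}")
        return out.astype(np.uint8)

    north = np.random.randint(0, 256, (482, 644), dtype=np.uint8)
    east = np.random.randint(0, 256, (482, 644), dtype=np.uint8)
    print(combine(north, east, "absdiff").shape)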
[0029] In an embodiment of the instant invention, multiplexing three images onto a single sensor is used to acquire the data in a single frame time. By limiting acquisition time to a single frame time, the time required to stop the part, acquire the data and then restart the part motion is minimized. In addition, performing electronic shuttering with the sensor, where the sensor is allowed to integrate light only for a limited period, can allow the image data to be acquired without stopping the part. The illumination sources can also be strobed to freeze the motion of the part as it passes through the field of view of the sensor. The part can either keep moving at its normal speed past the camera or, in cases where the image data contain motion blur in spite of electronic shuttering or strobing of the illumination sources, the part motion can be programmed to be sinusoidal, slowing down as the part moves through the field of view of the sensor and speeding up when no image is being acquired.
[0030] It will be apparent to those of ordinary skill in the art that many changes may be made to the details of the above-described embodiments of this invention without departing from the underlying principles thereof. The scope of the present invention should, therefore, be determined only by the following claims.

Claims

1. An improved method of acquiring image data of an article using first and second illumination sources, an image sensor with a Bayer filter applied, and a controller, the improvement comprising: illuminating said article with said first illumination from a first direction with light spectrally matched to a first color of said Bayer filter; illuminating said article with said second illumination from a second direction with light spectrally matched to a second color of said Bayer filter; acquiring image data from said article, while illuminated by said first illumination and said second illumination, with said image sensor to said controller; separating said image data with said controller into a first color image containing data corresponding to said first illumination and a second color image containing data corresponding to said second illumination; and forming a derived image with said controller by calculation between said first color image and said second color image, thereby acquiring image data of said article.
2. The method of claim 1 where said first direction and said second direction are 180 degrees opposed.
3. The method of claim 1 where said first color of said Bayer filter is about 550 nm.
4. The method of claim 1 where said second color of said Bayer filter is about 635 nm.
5. The method of claim 1 where said calculation between said first color image and said second color image is subtraction.
6. The method of claim 1 where the first illumination and the second illumination are strobed.
7. An improved system for acquiring image data of an article using an image sensor with a Bayer filter applied, including an image sensor with applied Bayer filter and a controller operatively connected to the image sensor, the improvement comprising: a first illumination source operative to illuminate said article from a first direction with light spectrally matched to a first color of said Bayer filter; a second illumination source operative to illuminate said article from a second direction with light spectrally matched to a second color of said Bayer filter; said controller operative to acquire said image data from said image sensor with applied Bayer filter and separate said image data into a first color image corresponding to image data illuminated by said first illumination source and a second color image corresponding to image data illuminated by said second illumination source; and said controller further operative to form a derived image from said first color image and said second color image by calculation, thereby acquiring said image data of said article.
8. The system of claim 7 where said first direction and said second direction are 180 degrees opposed.
9. The system of claim 7 where said first color of said Bayer filter is about 550 nm.
10. The system of claim 7 where said second color of said Bayer filter is about 635 nm.
11. The system of claim 7 where said calculation between said first color image and said second color image is subtraction.
12. The system of claim 7 where the first illumination and the second illumination are strobed.
PCT/US2009/038299 2008-03-31 2009-03-25 Method and apparatus for multiplexed image acquisition and processing WO2009123901A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2011503042A JP2011516860A (en) 2008-03-31 2009-03-25 Method and apparatus for acquiring and processing multiplexed images
CN2009801110186A CN101981411A (en) 2008-03-31 2009-03-25 Method and apparatus for multiplexed image acquisition and processing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US4110508P 2008-03-31 2008-03-31
US61/041,105 2008-03-31

Publications (1)

Publication Number Publication Date
WO2009123901A1 (en) 2009-10-08

Family

ID=41135905

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2009/038299 WO2009123901A1 (en) 2008-03-31 2009-03-25 Method and apparatus for multiplexed image acquisition and processing

Country Status (5)

Country Link
JP (1) JP2011516860A (en)
KR (1) KR20100138985A (en)
CN (1) CN101981411A (en)
TW (1) TW201003038A (en)
WO (1) WO2009123901A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5168049B2 (en) * 2008-09-24 2013-03-21 オムロン株式会社 Image processing method and image processing apparatus
CN103516962A (en) * 2012-06-19 2014-01-15 全友电脑股份有限公司 Image capturing system and method
DE102014201144A1 (en) * 2014-01-22 2015-07-23 Zumtobel Lighting Gmbh Method for controlling an adaptive lighting device and lighting system for carrying out the method
JP6260354B2 (en) * 2014-03-04 2018-01-17 株式会社リコー Imaging device, adjustment device, and adjustment method
KR101663518B1 (en) * 2014-07-14 2016-10-10 주식회사 제이에스티 reel tape inspection device
CN110793472B (en) * 2019-11-11 2021-07-27 桂林理工大学 Grinding surface roughness detection method based on quaternion singular value entropy index

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5000569A (en) * 1988-12-28 1991-03-19 Lamb-Weston, Inc. Light reflection defect detection apparatus and method using pulsed light-emitting semiconductor devices of different wavelengths
US5298963A (en) * 1992-02-26 1994-03-29 Mitsui Mining & Smelting Co., Ltd. Apparatus for inspecting the surface of materials
US6064472A (en) * 1996-10-22 2000-05-16 Lap Gmbh Laser Applikationen Method for speed measurement according to the laser-doppler-principle
US6091488A (en) * 1999-03-22 2000-07-18 Beltronics, Inc. Method of and apparatus for automatic high-speed optical inspection of semi-conductor structures and the like through fluorescent photoresist inspection

Also Published As

Publication number Publication date
KR20100138985A (en) 2010-12-31
CN101981411A (en) 2011-02-23
JP2011516860A (en) 2011-05-26
TW201003038A (en) 2010-01-16

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980111018.6

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09728976

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20107021795

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2011503042

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09728976

Country of ref document: EP

Kind code of ref document: A1