WO2021049326A1 - Surface Defect Discrimination Device, Appearance Inspection Device, and Program - Google Patents
Surface Defect Discrimination Device, Appearance Inspection Device, and Program
- Publication number
- WO2021049326A1 (PCT/JP2020/032574)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- pixel
- inspected
- receiving amount
- sub
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/89—Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
- G01N21/892—Investigating the presence of flaws or contamination in moving material, characterised by the flaw, defect or object feature examined
- G01N21/8901—Optical details; Scanning details
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N2021/8854—Grading and classifying of flaws
- G01N2021/8887—Scan or image signal processing based on image processing techniques
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
Definitions
- The present invention relates to a surface defect discrimination device for discriminating surface defects of an object to be inspected, such as a product or part whose surface has strong specular reflectivity, and to an appearance inspection device and a program provided with this surface defect discrimination device.
- Patent Document 1 discloses a technique in which a component is photographed while light sources in a plurality of directions are switched, and the direction of the illumination light source and the photographed image are analyzed to determine whether a shadow in the image is a defect or dirt.
- However, Patent Document 1 presupposes that the object to be inspected is stationary. For a component such as a drum-driven belt, which is difficult to hold still and must be inspected while the belt is moving, surface defects cannot be determined.
- The present invention has been made in view of this technical background, and its object is to provide a surface defect discrimination device capable of discriminating surface defects while the object to be inspected is moved relative to the illumination devices and the line sensor, as well as an appearance inspection device and a program.
- The above object is achieved by the following means.
- (1) A surface defect discrimination device comprising: an image acquisition means that, while an object to be inspected is moved relative to illumination devices and a line sensor arranged at different positions, switches the illumination light from each illumination device one at a time to irradiate the object and, each time the illumination light is switched, receives the reflected light from the object with the line sensor and photographs it, thereby acquiring a plurality of images displaced from one another by the switching amount of the illumination light; an alignment means for aligning the images corresponding to the respective illumination devices acquired by the image acquisition means; and a discrimination means for discriminating surface defects of the object to be inspected from the images aligned by the alignment means.
- (2) The surface defect discrimination device according to item 1 above, further comprising a sub-pixel image creating means which, where a part of each pixel of the line sensor is an overlapping region in which the shooting range overlaps between the current shooting and the previous shooting under the illumination light of one illumination device, and the portion of the pixel excluding the overlapping region is a sub-pixel, estimates the light-receiving amount of the current sub-pixel by subtracting the light-receiving amount of the overlapping region from the light-receiving amount of the entire pixel in the current shooting and thereby creates a sub-pixel image, and wherein the alignment means aligns the sub-pixel images corresponding to the respective illumination devices created by the sub-pixel image creating means.
- (3) A surface defect discrimination device comprising: an image acquisition means that, while the object to be inspected is moved relative to illumination devices and a line sensor arranged at different positions, switches the illumination light from each illumination device one at a time to irradiate the object and acquires a plurality of images for each illumination light by receiving the reflected light from the object with the line sensor; a sub-pixel image creating means which, where a part of each pixel of the line sensor is an overlapping region in which the shooting range overlaps between the current shooting and the previous shooting under the illumination light of one illumination device, creates a sub-pixel image; and a discrimination means for discriminating surface defects of the object to be inspected based on the sub-pixel image created by the sub-pixel image creating means.
- (4) The surface defect discrimination device according to item 3 above, comprising an alignment means for aligning the sub-pixel images corresponding to the respective illumination devices created by the sub-pixel image creating means.
- (5) The surface defect discrimination device according to any one of items 2 to 4 above, wherein the sub-pixel image creating means subtracts the light-receiving amount of the overlapping region from the light-receiving amount of the entire pixel in a state where the light-receiving amount of the overlapping region has been corrected for each region.
- (6) The surface defect discrimination device according to any one of items 2 to 5 above, wherein the sub-pixel image creating means obtains the light-receiving amount of the overlapping region from the sum of the light-receiving amounts of the sub-pixels estimated in the shootings up to the previous one, and estimates the light-receiving amount of the current sub-pixel by subtracting the obtained light-receiving amount from the light-receiving amount of the entire pixel.
- (7) The above surface defect discrimination device, wherein the sub-pixel image creating means estimates the average value, obtained by dividing the light-receiving amount of the entire first pixel after the start of shooting by the number of sub-pixels per pixel, as the light-receiving amount of the first sub-pixel.
- (8) The surface defect discrimination device according to any one of the preceding items up to item 7, wherein, when the light-receiving amount of the entire pixel does not exceed a predetermined threshold value, the sub-pixel image creating means estimates the average value obtained by dividing the light-receiving amount of the entire pixel by the number of sub-pixels per pixel as the light-receiving amount of the current sub-pixel, and, when the light-receiving amount of the entire pixel exceeds the predetermined threshold value, estimates the light-receiving amount of the current sub-pixel by subtracting the light-receiving amount of the overlapping region from the light-receiving amount of the entire pixel.
- (9) The surface defect discrimination device according to any one of items 2 and 4 to 8 above, wherein the alignment means performs the alignment of the sub-pixel images corresponding to the respective illumination devices created by the sub-pixel image creating means by correcting the luminance value K_ij to a corrected value K'_ij according to the following formula.
- (10) The surface defect discrimination device according to any one of items 1, 2, and 4 to 9 above, wherein the discrimination means determines that a concave defect or a convex defect exists on the surface of the object to be inspected when, in the sub-pixel images aligned by the alignment means, the bright points corresponding to the respective illumination devices do not overlap and each bright point is within a preset range.
- (11) The above surface defect discrimination device, wherein the discrimination means determines that a concave defect exists when, in the sub-pixel images aligned by the alignment means, the position of the bright point corresponding to each illumination device is on the side opposite to the arrangement position of that illumination device, and that a convex defect exists when it is not.
- (12) The surface defect discrimination device according to any one of items 1, 2, and 4 to 11 above, wherein the discrimination means determines that dust or dirt is present on the surface of the object to be inspected when, in the sub-pixel images aligned by the alignment means, the bright points corresponding to the respective illumination devices overlap.
- (13) The surface defect discrimination device according to any one of items 1, 2, and 4 to 12 above, wherein a pixel whose total received light amount exceeds a predetermined threshold value is detected as a defect candidate pixel, the sub-pixel images are aligned by the alignment means for the detected defect candidate pixel, and the surface defect of the object to be inspected is discriminated by the discrimination means.
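As an illustrative sketch only, the bright-point rules of items (10) to (12) can be written out in Python. The coordinate representation, the direction vectors toward each lighting device, the distance measure, and all names below are assumptions for illustration, not part of the disclosure:

```python
def classify_defect(bright_points, light_dirs, max_separation):
    """Classify a defect from per-lighting-device bright points.

    bright_points: dict light_id -> (x, y) of the bright point in the
        aligned sub-pixel image (hypothetical representation).
    light_dirs: dict light_id -> vector pointing from the defect toward
        that lighting device's arranged position (hypothetical).
    max_separation: preset range within which separated bright points
        still count as one concave/convex defect.
    """
    pts = list(bright_points.values())
    # Item (12): bright points of all lights overlap -> dust or dirt.
    if all(pts[0] == p for p in pts[1:]):
        return "dust_or_dirt"
    # Item (10): non-overlapping bright points must lie within the
    # preset range to count as a concave or convex defect.
    span = max(abs(a[0] - b[0]) + abs(a[1] - b[1]) for a in pts for b in pts)
    if span > max_separation:
        return "unknown"  # outside the claimed cases
    # Item (11): bright point on the side opposite the light's arranged
    # position -> concave; otherwise -> convex (sign of dot product).
    opposite = all(
        bright_points[k][0] * light_dirs[k][0]
        + bright_points[k][1] * light_dirs[k][1] < 0
        for k in bright_points
    )
    return "concave" if opposite else "convex"
```

For two lights facing each other along X, a bright point shifted away from each light would classify as concave, while a shift toward each light would classify as convex.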
- (14) The surface defect discrimination device according to any one of items 1 to 13 above, wherein an LED or a visible-light semiconductor laser is used as the light source of each illumination device.
- (15) The surface defect discrimination device according to any one of items 1 to 14 above, wherein the number of illumination devices is three or more, and the illumination devices are arranged on a circumference centered on the line sensor with an angle difference of 360 degrees ÷ the number of illumination devices.
- (16) An appearance inspection device comprising: a plurality of illumination devices arranged at different positions; a line sensor capable of receiving the reflected light of the illumination light emitted from each illumination device onto the object to be inspected; a moving means for moving the object to be inspected relative to the illumination devices and the line sensor; an illumination control means for switching the illumination light from each illumination device one at a time at a predetermined cycle to irradiate the object to be inspected; a line sensor control means for controlling the line sensor so that, each time the illumination light from each illumination device is switched by the illumination control means, the reflected light from the object to be inspected is received and photographed; and the surface defect discrimination device according to any one of items 1 to 15 above.
- (17) A program causing a computer to execute: an image acquisition step of acquiring a plurality of images by switching the illumination light from each illumination device one at a time to irradiate the object to be inspected while the object is moved relative to illumination devices and a line sensor arranged at different positions; an alignment step of aligning the images corresponding to the respective illumination devices acquired in the image acquisition step; and a discrimination step of discriminating surface defects of the object to be inspected from the images aligned in the alignment step.
- The above program further causes the computer to execute a process in which, where a part of each pixel of the line sensor is an overlapping region in which the shooting range overlaps between the current shooting and the previous shooting under the illumination light of one illumination device, the light-receiving amount of the current sub-pixel is estimated by subtracting the light-receiving amount of the overlapping region from the light-receiving amount of the entire pixel in the current shooting.
- The program also causes the computer to execute: a process of estimating the average value, obtained by dividing the light-receiving amount of the entire first pixel after the start of shooting by the number of sub-pixels per pixel, as the light-receiving amount of the first sub-pixel; and a process of estimating, when the light-receiving amount of the entire pixel does not exceed a predetermined threshold value, the average value obtained by dividing the light-receiving amount of the entire pixel by the number of sub-pixels per pixel as the light-receiving amount of the current sub-pixel, and, when the light-receiving amount of the entire pixel exceeds the predetermined threshold value, estimating the light-receiving amount of the current sub-pixel by subtracting the light-receiving amount of the overlapping region from the light-receiving amount of the entire pixel.
- Here, i is the index of the estimated sub-pixel position, and j is the identification number of the lit illumination device.
- The program according to any one of items 17 to 23 above, which causes the computer to execute, in the discrimination step, a process of determining that a concave defect or a convex defect exists on the surface of the object to be inspected when, in the sub-pixel images aligned in the alignment step, the bright points corresponding to the respective illumination devices do not overlap and each bright point is within a preset range.
- The program according to item 24 above, which causes the computer to execute, in the discrimination step, a process of determining that a concave defect exists when, in the sub-pixel images aligned in the alignment step, the position of the bright point corresponding to each illumination device is opposite to the arrangement position of that illumination device, and that a convex defect exists when it is not.
- The above program, which causes the computer to execute, in the discrimination step, a process of determining that dust or dirt is present on the surface of the object to be inspected when the bright points corresponding to the respective illumination devices overlap in the sub-pixel images aligned in the alignment step.
- The program according to any one of items 17 to 26 above, which causes the computer to execute a process of detecting a pixel whose total received light amount exceeds a predetermined threshold value as a defect candidate pixel, aligning the sub-pixel images for the detected defect candidate pixel in the alignment step, and discriminating the surface defect of the object to be inspected in the discrimination step.
- According to the present invention, the illumination light from each illumination device is switched one at a time while the object to be inspected is moved relative to the illumination devices and line sensor arranged at different positions, and the object to be inspected is irradiated.
- Each time the illumination light is switched, the reflected light from the object is received by the line sensor and photographed, so that a plurality of images are acquired displaced from one another by the switching amount of the illumination light.
- The acquired images corresponding to the respective illumination devices are aligned, and the surface defects of the object to be inspected are then discriminated from the aligned images.
- That is, because the object to be inspected moves relative to the illumination devices and the line sensor, the imaged positions on the object are displaced by the switching amount each time the illumination light from each illumination device is switched. The plurality of images corresponding to the respective illumination devices acquired from the line sensor are therefore aligned, and the surface defects of the object are discriminated in this aligned state. Consequently, surface defects of the object to be inspected can be discriminated even while the object is moving relatively.
- Further, where a part of each pixel of the line sensor is an overlapping region in which the shooting range overlaps between the current shooting and the previous shooting under the illumination light of one illumination device, and the portion of the pixel excluding the overlapping region is a sub-pixel, the light-receiving amount of the current sub-pixel is estimated by subtracting the light-receiving amount of the overlapping region from the light-receiving amount of the entire pixel in the current shooting, and a sub-pixel image is created.
- The created sub-pixel images corresponding to the respective illumination devices are aligned, and surface defects of the object to be inspected are detected from the aligned images.
- Since the sub-pixel, being the portion of the pixel excluding the overlapping region, is smaller than one pixel, the resolution of the appearance inspection is improved and finer surface defects can be detected.
- Moreover, the light-receiving amount of the overlapping region is corrected for each region before being subtracted from the light-receiving amount of the entire pixel. A more accurate light-receiving amount of the overlapping region is thus subtracted when estimating the light-receiving amount of the current sub-pixel, and as a result more accurate defect discrimination can be performed.
- Furthermore, since the light-receiving amount of the overlapping region is obtained from the sum of the sub-pixel light-receiving amounts estimated in the shootings up to the previous one, and the obtained amount is subtracted from the light-receiving amount of the entire pixel to estimate the light-receiving amount of the current sub-pixel, the estimation process is simplified.
- Also, since the average value obtained by dividing the light-receiving amount of the entire first pixel after the start of shooting by the number of sub-pixels per pixel is estimated as the light-receiving amount of the first sub-pixel, the light-receiving amounts of the subsequent sub-pixels can be estimated smoothly.
- When the light-receiving amount of the entire pixel does not exceed a predetermined threshold value, in other words when surface defects are unlikely to exist, the average value obtained by dividing the light-receiving amount of the entire pixel by the number of sub-pixels per pixel is estimated as the light-receiving amount of the current sub-pixel.
- When the light-receiving amount of the entire pixel exceeds the predetermined threshold value, in other words when a surface defect is likely to exist, the light-receiving amount of the current sub-pixel is estimated by subtracting the light-receiving amount of the overlapping region from that of the entire pixel.
- The defect discrimination process can thus concentrate on the regions where surface defects are likely to exist.
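The threshold-gated estimation described above can be sketched in a few lines, assuming a pixel of three sub-pixels (a 6A pixel with 2A regions); the function and parameter names are illustrative, not from the disclosure:

```python
def estimate_subpixel(total, prev, prev2, threshold, n_sub=3):
    """Sketch of item (8): choose the estimation route by thresholding.

    total: light-receiving amount of the entire pixel in this shot.
    prev, prev2: sub-pixel estimates from the previous two shots, whose
        regions form the overlap with the current pixel.
    threshold: below it the pixel is assumed to be defect-free.
    """
    if total <= threshold:
        # Defect unlikely: spread the pixel's light evenly.
        return total / n_sub
    # Defect likely: subtract the overlapping region's light amount.
    return total - (prev + prev2)
```

With the values used later in the embodiment (total 3.8, previous estimates 1.3 and 0.5), a pixel over the threshold yields an estimate of 2.0.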
- In addition, the sub-pixel images corresponding to the respective illumination devices can be accurately aligned, and defects can be discriminated with high accuracy.
- Since a pixel whose total received light amount exceeds a predetermined threshold value is detected as a defect candidate pixel, and the sub-pixel images are aligned for the detected defect candidate pixel before the surface defects of the object to be inspected are discriminated, the defect discrimination process can concentrate on the regions where surface defects are likely to exist.
- each lighting device can be switched at a high speed.
- When the number of lighting devices is three or more and they are arranged on a circumference centered on the line sensor with an angle difference of 360 degrees ÷ the number of lighting devices, a lighting device whose illumination light is not perpendicular to a scratch or the like is always available, so surface defects such as scratches can be accurately discriminated.
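The arrangement rule above amounts to spacing the devices 360/n degrees apart around the line sensor; a small sketch with illustrative names:

```python
def lighting_angles(n_lights):
    """Placement angles (degrees) for n_lights lighting devices on a
    circumference centered on the line sensor, spaced 360/n apart as in
    item (15), which assumes three or more devices."""
    if n_lights < 3:
        raise ValueError("item (15) assumes three or more lighting devices")
    step = 360 / n_lights
    return [i * step for i in range(n_lights)]
```

For three devices this reproduces the 120-degree spacing of FIG. 2B.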
- Further, a computer can be caused to execute the surface defect discrimination process for the object to be inspected while the object is moved relative to the lighting devices and line sensor arranged at different positions.
- FIG. 1 is a block diagram of the appearance inspection device according to one embodiment of the present invention.
- FIGS. 2(A) and 2(B) are diagrams for explaining the arrangement of a plurality of lighting devices.
- FIG. 3 is a diagram for explaining the relative positional relationship between the shooting range and the pixels when shooting while switching a plurality of lighting devices.
- FIG. 4 is a diagram for explaining the relative positional relationship between the shooting range and the pixels when the first to fourth shootings are performed by one lighting device.
- FIG. 5 is a diagram for explaining the method of estimating the light-receiving amount of the sub-pixel 23.
- FIG. 6 is a diagram for explaining the alignment of the sub-pixel images for a plurality of lighting devices.
- FIG. 7 is a diagram showing an example of the sensitivity distribution of a pixel.
- FIG. 1 is a configuration diagram of an appearance inspection device according to an embodiment of the present invention.
- The appearance inspection device includes a line sensor 1; two lighting devices 2a and 2b; a lighting control unit 8 that controls the lighting devices 2a and 2b; a line sensor control unit 9 that controls the line sensor 1; transport drums 3 and 3 for transporting the object to be inspected 5; a drum encoder 4 for detecting the imaging position of the object to be inspected 5; a display device 6; a computer 10; and a transport drum control unit 11 that controls the rotation speed of the transport drums 3 to control the transport speed of the object to be inspected 5.
- the computer 10 processes the image captured by the line sensor 1 to determine defects, and synchronously controls the lighting devices 2a and 2b and the line sensor 1.
- the display device 6 displays an image that has been subjected to defect discrimination processing by the computer 10, a processing result, and the like.
- The object to be inspected 5 is belt-shaped with high reflectance, is installed in a roll on the transport drums 3, and is fed in the Y direction by rotation of the transport drums 3 and 3 in the arrow direction.
- the imaging position of the object to be inspected 5 by the line sensor 1 is detected by the drum encoder 4.
- The line sensor 1 extends in the X direction orthogonal to the moving direction Y of the object to be inspected 5. As shown in FIG. 2A, the two lighting devices 2a and 2b, viewed from above, are arranged at symmetric positions centered on the line sensor 1 with an angle difference of 180 degrees, so that illumination from two different directions is possible.
- The facing direction of the lighting devices 2a and 2b may be the X direction, the Y direction, or any other direction. In this embodiment two lighting devices 2a and 2b are used, but three or more may be used. In that case, as shown in FIG. 2B, they are arranged, viewed from above, on a circumference centered on the line sensor 1 with an angle difference of 360 degrees ÷ the number of lighting devices.
- FIG. 2B shows the case of three lighting devices 2a, 2b, and 2c, arranged at an angle difference of 120 degrees from one another.
- Each of the lighting devices 2a and 2b can be switched on and off at an arbitrary timing by the control of the lighting control unit 8.
- In order to photograph the entire object to be inspected 5, a sensor-illumination carrier 12 may integrally move the line sensor 1 together with the lighting devices 2a and 2b in the X direction by the length of the line sensor 1, photograph once more along the Y direction, and repeat this.
- The line sensor 1 receives the reflected light when the object to be inspected 5 is illuminated while the lighting devices 2a and 2b are alternately switched on and off and the object is moved in the Y direction.
- The line sensor 1 and the lighting devices 2a and 2b are not in specularly facing positions; light from the lighting devices 2a and 2b reaches the line sensor 1 only when it is diffusely reflected by the object to be inspected 5. The image captured by the line sensor 1 is therefore a dark-field image.
- Since the surface of the object to be inspected 5 has high reflectance, if a concave defect, convex defect, scratch defect, dust, or dirt exists at the illuminated position, the light diffusely reflected by the defect, dust, or dirt is incident on the line sensor 1.
- The line sensor control unit 9 and the lighting control unit 8 are connected to the computer 10, and the line sensor 1 and lighting device 2a, and the line sensor 1 and lighting device 2b, emit light and photograph in synchronization with each other.
- In this embodiment, the line rate of the line sensor 1 is set to 100 kHz (shutter speed 0.01 ms) based on the specifications of a general line sensor. Accordingly, light sources such as LEDs or LDs (visible-light semiconductor lasers), which can be switched alternately at high speed every 0.01 ms, are preferable for the lighting devices 2a and 2b.
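The stated timing can be checked with simple arithmetic; the variable names below are illustrative:

```python
# A line rate of 100 kHz means one exposure every 1/100_000 s,
# i.e. a shutter period of 0.01 ms, matching the embodiment.
line_rate_hz = 100_000
line_period_ms = 1_000 / line_rate_hz        # 0.01 ms per line

# With two alternately switched light sources (2a and 2b), each source
# is lit on every other line, so each must switch within 0.01 ms and
# yields images at half the line rate.
n_lights = 2
per_light_rate_hz = line_rate_hz / n_lights  # 50 kHz per light source
```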
- the surface defect discrimination process of the object 5 to be inspected by the computer 10 will be described.
- the computer 10 is provided with a CPU, RAM, a storage device, and the like, and the surface defect determination process is executed by operating the CPU according to an operation program stored in the storage device and the like.
- FIG. 3 is a diagram for explaining the relative positional relationship between the shooting range and the pixels 20 when shooting while switching the lighting devices 2a and 2b.
- Assume that the size of the defect 30 to be detected is 12A, that the resolution of the line sensor 1 (the length of one pixel 20) is 6A, and that the lighting devices 2a and 2b are switched and a photograph is taken each time the object to be inspected 5 is moved by A.
- The resolution 6A of the line sensor 1 means that one pixel covers a 6A shooting range in a single shot. As shown in FIG. 3, the first shooting is therefore performed with the illumination light of the lighting device 2a; when the object to be inspected 5 has been moved by A, the illumination is switched and the second shooting is performed with the illumination light of the lighting device 2b. Between the first and second shootings, the shooting area on the object to be inspected 5 moves by A. The same applies to the third and subsequent shootings. In the example of FIG. 3, for convenience of explanation, the pixel 20 is shown as moving by A at each switched shooting.
- FIG. 4 is a diagram for explaining the relative positional relationship between the shooting range and the pixels 20 when the first to fourth shootings are performed by one lighting device 2a.
- As shown in FIG. 4, the lighting device 2a is turned on every time the object to be inspected 5 moves by 2A, irradiation of the object by its illumination light is started, and a photograph is taken by the line sensor 1 each time. That is, every time the object moves by 2A, an image corresponding to the illumination light from the lighting device 2a is captured.
- The irradiation time of the illumination light of the lighting device 2a, in other words the light-receiving time of each pixel 20 of the line sensor 1, corresponds to the moving distance A.
- In this case, over 4A, which is a part of the sensor resolution 6A, the same shooting range of the object to be inspected 5 is photographed in both the current and the previous shooting.
- This is an overlapping region where the shooting ranges overlap. That is, when the pixel 20 is divided lengthwise into three regions, in order the first region 21, the second region 22, and the third region 23, the length per region is 2A, and the second region 22 and third region 23 of the previous shooting and the first region 21 and second region 22 of the current shooting are overlapping regions with the same shooting range.
- For example, the first region 21 and second region 22 of the fourth shooting overlap the second region 22 and third region 23 of the third shooting, respectively.
- In FIG. 4, the overlapping area between the current and the previous shooting is shown in gray.
- The third region 23 does not overlap the previous shooting; its shooting range is newly updated, and this region is used as a sub-pixel.
- Hereinafter, the third region 23 is also referred to as the sub-pixel 23.
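The overlap relation above can be modeled by indexing the 2A regions along the scan direction; this is an illustrative model with assumed names, not the patent's notation:

```python
def pixel_regions(shot):
    """Region indices (units of 2A along the scan direction) covered by
    one 6A pixel at the given shot of one lighting device; the object
    advances 2A between consecutive shots of that device."""
    return {shot, shot + 1, shot + 2}

def overlap(shot):
    """Overlapping region between the current shot and the previous one
    (4A out of the 6A pixel)."""
    return pixel_regions(shot) & pixel_regions(shot - 1)

def subpixel(shot):
    """The newly updated 2A sub-pixel: the one region outside the overlap."""
    (new_region,) = pixel_regions(shot) - pixel_regions(shot - 1)
    return new_region
```

The model reproduces the text's observation that the current overlap consists exactly of the sub-pixels of the previous two shots.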
- FIG. 5 is a diagram for explaining the method of estimating the light-receiving amount of the sub-pixel 23, and shows the relative positional relationship between the shooting range and the pixel 20 when the i-th shooting and several shootings before and after it are performed by the lighting device 2a.
- The light-receiving amount of the sub-pixel 23 in the i-th shooting must be calculated and estimated by subtracting, from the light-receiving amount of the entire pixel 20 (6A) in the i-th shooting, the light-receiving amount of the 4A overlapping region, i.e. the first region 21 and second region 22 that overlap the previous shooting.
- From the (i-2)th shooting onward, the image is updated by the 2A sub-pixel 23 at each shooting, so that from the (i-1)th to the i-th shooting the sub-pixel 23 is sequentially updated every 2A.
- An updated new sub-pixel 23 becomes part of the overlapping region at the next shooting, remains in the overlapping region at the shooting after that, and is excluded from the overlapping region at the shooting after that. That is, the overlapping region between the current and the previous shooting consists of the sub-pixels 23 of the previous shooting and the shooting before it.
- Accordingly, the light-receiving amount of the sub-pixel 23 in the i-th shooting is obtained by subtracting, from the total light-receiving amount of the 6A pixel in the i-th shooting, the estimated light-receiving amounts of the sub-pixels 23 in the (i-1)th and (i-2)th shootings.
- (Estimated value of the received amount of the subpixel in the i-th time) (Total received amount of the i-th time)- ⁇ (Estimated value of the received amount of the received amount of the sub-pixel in the (i-1) time) + ((i-2) time Estimated value of the amount of light received by the subpixel) ⁇ .
- the numerical values written in the first to third regions 21 to 23 of each pixel 20 in FIG. 5 are examples of the estimated light receiving amount in that region, and are the same as the numerical values of the subpixel 23 of the previous time or the time before the previous time.
- the numerical value on the right side of the pixel 20 is the total amount of light received by one pixel.
- the total light receiving amount for one pixel 6A in the i-th shooting is 3.8
- the estimated light-receiving amount of the subpixel 23 in the previous (i-1) shooting is 1.3.
- the estimated light receiving amount of the subpixel 23 at the time of the second shooting (i-2) before the previous shooting is 0.5
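The recursion above can be sketched in a few lines of Python. This is a minimal illustration, not the patented implementation; function and variable names are chosen here for clarity, and the seeding of the first two estimates with the per-shot average follows the suggestion given later in the text for the start of shooting.

```python
def estimate_subpixel(total_i, prev1, prev2):
    """Estimate the light amount of the i-th sub-pixel (2A range).

    total_i: total light amount of the whole pixel (6A range) at shot i
    prev1:   estimated sub-pixel amount at shot i-1
    prev2:   estimated sub-pixel amount at shot i-2
    The 4A overlap of shot i consists of the sub-pixels of the two
    previous shots, so its light amount is prev1 + prev2.
    """
    return total_i - (prev1 + prev2)


def estimate_sequence(totals, n_sub=3):
    """Run the recursion over a sequence of whole-pixel totals, seeding
    the first two estimates with the per-shot average (total / n_sub)."""
    est = []
    for i, total in enumerate(totals):
        if i < 2:
            est.append(total / n_sub)  # no two previous estimates yet
        else:
            est.append(estimate_subpixel(total, est[i - 1], est[i - 2]))
    return est
```

With the figure's values, `estimate_subpixel(3.8, 1.3, 0.5)` yields 2.0, matching the worked example.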
- The process of estimating the light receiving amount of the sub-pixel 23 may be performed only on pixels 20 detected as defect candidate pixels, that is, pixels highly likely to contain a defect.
- In that case, the position of each detected defect candidate pixel is determined, i is set based on the information from the drum encoder 4, and the sub-pixel image may be created by storing the light receiving amount of the sub-pixel 23 at that time in association with its position information.
- The defect discrimination process can then concentrate on the portions where a surface defect is likely to exist, which improves efficiency.
- As the defect candidate pixels, pixels 20 whose total light receiving amount exceeds a predetermined threshold value may be detected.
- For the first shots after shooting starts (for example, the (i−4)-th or (i−3)-th shot in FIG. 5), the light receiving amount of the 2A sub-pixel 23 may be estimated as the average value, i.e. one third of the light receiving amount of the whole pixel.
- More generally, the average value obtained by dividing the light receiving amount of the whole pixel by the number of sub-pixels 23 per pixel is taken as the estimate of the first sub-pixel 23, and the subsequent sub-pixel amounts may be estimated starting from this value.
- In this way a sub-pixel image is created around the defect candidate pixel; it is not an image of whole pixels 20 but an image of the light receiving amounts of 1/3-pixel (2A) regions.
- As a result, the effective resolution of the line sensor 1 is tripled, and fine surface defects can be detected and discriminated with high accuracy. When the object 5 to be inspected, which moves relative to the line sensor 1 and the lighting devices 2a and 2b, is photographed by the line sensor 1, the distance between the line sensor 1 and the imaging surface of the object 5 is not stable, so the depth of field must be made large.
- Increasing the depth of field lowers the resolution, and small defects may then become impossible to inspect; by using an image of sub-pixels smaller than one pixel, however, the resolution is improved without increasing the depth of field, so even finer defects can be inspected.
- During shooting, the sub-pixel images for the lighting device 2a and the lighting device 2b are alternately displaced, as a1, b1, a2, b2, a3, b3, and so on, by the moving distance A corresponding to the switching time of the illumination light.
- Therefore, the sub-pixel positions for the lighting device 2a may be aligned so as to correspond to the positions b1, b2, b3, ... for the lighting device 2b.
- The above correction formula is for two lighting devices; a correction formula applicable regardless of whether there are two, three, or more devices is expressed by the following formula, where:
- i: index of the estimated sub-pixel position
- j: identification number of the lighting device that is lit
<Correction when estimating the light receiving amount of the sub-pixel>
- The estimation of the light receiving amount of the sub-pixel 23 above assumed that all regions of the 6A pixel have the same light receiving sensitivity.
- In reality, however, the light receiving sensitivity differs in each part of the pixel 20: the central part has relatively high sensitivity and both ends have low sensitivity.
- In FIG. 7 the hatched portion indicates high sensitivity; even between the two ends, the third region has higher sensitivity than the first region.
- In the (i−1)-th shooting, the sub-pixel is the third region 23 at the right end of the pixel, where the light receiving sensitivity is low.
- In the i-th shooting, the same 2A area coincides with the second region 22 at the center, where the sensitivity is high. Therefore, the light receiving amount of the central second region 22 in the i-th shooting should be larger than the light receiving amount of the sub-pixel 23 estimated in the (i−1)-th shooting.
- Similarly, the light receiving amount of the first region 21 in the i-th shooting corresponds to the light receiving amount of the sub-pixel 23 in the (i−2)-th shooting,
- but the amount actually received in the first region 21 in the i-th shooting should be smaller than that estimate.
- Therefore, weighting is applied to each of the regions 21 to 23 by setting a weighting coefficient for each region:
- the weighting coefficient of the first region 21 at the left end of the pixel 20 is ε1,
- the weighting coefficient of the second region 22 at the center is ε2,
- and the weighting coefficient of the third region 23 at the right end is ε3.
- The light receiving amount of the sub-pixel 23 in the i-th shooting is then calculated by the following formula:
- (Estimated light receiving amount of the i-th sub-pixel) = (Total light receiving amount of the i-th shooting) − {(Estimated light receiving amount of the (i−1)-th sub-pixel) × ε2/ε3 + (Estimated light receiving amount of the (i−2)-th sub-pixel) × ε1/ε3}
- FIG. 8 shows an example of the corrected estimated light receiving amounts of the regions 21 to 23 calculated with this weighting; in this embodiment ε1 = 1/3, ε2 = 1, and ε3 = 2/3 are set.
- For example, the light receiving amount of the sub-pixel 23 at the (i−2)-th shooting is 0.3;
- this is corrected upward to 0.5 as the light receiving amount of the second region 22 at the (i−1)-th shooting, and corrected downward to 0.2 as the light receiving amount of the first region 21 at the i-th shooting.
- Likewise, the light receiving amount of the sub-pixel 23 at the (i−1)-th shooting is 0.9,
- which is corrected upward to 1.3 as the light receiving amount of the second region 22 at the i-th shooting.
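The weighted variant can be sketched the same way. This is an illustrative sketch using the embodiment's example coefficients (ε1 = 1/3, ε2 = 1, ε3 = 2/3); the function name is chosen here, not taken from the source.

```python
def estimate_subpixel_weighted(total_i, prev1, prev2,
                               eps1=1 / 3, eps2=1.0, eps3=2 / 3):
    """Weighted sub-pixel estimate for shot i.

    The previous sub-pixel estimates prev1 (shot i-1) and prev2
    (shot i-2) were made at the right-end sensitivity eps3; before
    subtraction they are rescaled to the sensitivity of the region
    they occupy in the current shot (center eps2, left end eps1).
    """
    overlap = prev1 * eps2 / eps3 + prev2 * eps1 / eps3
    return total_i - overlap
```

With prev1 = 0.9 and prev2 = 0.3, the rescaled overlap is 0.9 × 1.5 + 0.3 × 0.5 = 1.5, which is subtracted in place of the uncorrected sum 1.2.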
- In either case the light receiving amount of the overlapping area is subtracted from the light receiving amount of the whole pixel; with the correction, a more accurate light receiving amount of the overlapping area is subtracted, so the light receiving amount of the current sub-pixel 23 can be estimated more accurately, which in turn enables more accurate defect discrimination.
<Defect discrimination>
- Surface defects are discriminated on the basis of the mutually aligned sub-pixel images.
- When a spherical concave defect 51 is illuminated, the illumination lights coming from different directions from the mutually opposed lighting devices 2a and 2b cross, as shown in the figure, so the positional relationship between the positions of the lighting devices 2a and 2b and their reflection positions is reversed.
- That is, in the aligned sub-pixel image 61, the bright points 61a and 61b corresponding to the lighting devices 2a and 2b do not overlap, the bright points 61a and 61b are within a preset range, and the positions of the bright points 61a and 61b are in the opposite positional relationship to the arrangement positions of the lighting devices 2a and 2b; in this case a concave defect 51 is determined.
- In contrast, the illumination lights from the lighting devices 2a and 2b to a convex defect 52 do not cross, so the positional relationship between each device position and its reflection position is the same.
- Therefore, in the aligned sub-pixel image 62, the bright points 62a and 62b corresponding to the lighting devices 2a and 2b do not overlap, the bright points 62a and 62b are within the preset range, and the positions of the bright points 62a and 62b are in the same positional relationship as the arrangement positions of the lighting devices 2a and 2b; in this case a convex defect 52 is determined.
- In FIGS. 10 and 11, the bright points 61a and 62a corresponding to the lighting device 2a are shown by double hatching, and the bright points 61b and 62b corresponding to the lighting device 2b are shown by broken-line hatching. The same applies to FIG. 12 and subsequent figures.
- In the case of a scratch defect 53, too, the illumination lights from the mutually opposed lighting devices 2a and 2b cross, and the positional relationship between each device position and its reflection position is reversed.
- However, since the directions of the scratched surfaces are not uniform, reflections of the illumination light of the lighting device 2a and reflections of the illumination light of the lighting device 2b coexist.
- On a flat, high-reflectance surface, by contrast, the illumination lights would not be reflected in such a mixed manner.
- Therefore, in the aligned sub-pixel image 63, when the bright points 63a corresponding to the lighting device 2a and the bright points 63b corresponding to the lighting device 2b are intermingled without overlapping, and the positions of the bright points 63a and 63b are opposite to the arrangement positions of the lighting devices 2a and 2b, it is determined that a scratch defect 53 exists on the surface of the object 5 to be inspected.
- In the case of dust or dirt 54, the illumination lights are mixed and reflected together by the defect 54. Therefore, in the aligned sub-pixel image 64, when the bright points 64a and 64b corresponding to the lighting devices 2a and 2b overlap, it is determined that dust or dirt exists on the surface of the object 5 to be inspected.
- Each sub-pixel image from the lighting devices 2a and 2b is a dark-field image, in which unevenness defects, scratch defects, dust, and dirt appear as white spots. Defect candidates on the image are detected as follows.
- Each sub-pixel image is binarized with a threshold B2, and discrete pixels are aggregated by dilation/erosion processing. Each resulting pixel set is then labeled, for example by color coding.
- W1 does not simply indicate the minimum defect size; it indicates the minimum size that can be regarded as a "defect portion".
- The size of a defect is defined as X or more in terms of the number of pixels.
- When the aligned sub-pixel images for the respective lighting devices 2a and 2b are combined,
- and the bright points 61a and 61b corresponding to the devices 2a and 2b do not overlap, the bright points 61a and 61b are within the range of the coordinates Vi ± X/2, and the positions of the bright points 61a and 61b are in the opposite positional relationship to the arrangement positions of the lighting devices 2a and 2b, a void (concave) defect is determined.
- Similarly, when the images for the respective lighting devices 2a and 2b are combined,
- and the corresponding bright points 62a and 62b do not overlap, the bright points 62a and 62b are within the range of the coordinates Vi ± X/2, and the positions of the bright points 62a and 62b are in the same positional relationship as the arrangement positions of the lighting devices 2a and 2b, a convex defect is determined.
- Again, when the images for the respective lighting devices 2a and 2b are combined,
- and the corresponding bright points 63a and 63b are within the range of the coordinates Vi ± X/2, intermingled without overlapping, and the positions of the bright points 63a and 63b are opposite to the arrangement positions of the lighting devices 2a and 2b, a scratch defect is determined.
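These decision rules, together with the overlap rule for dust and dirt, can be summarized in a small sketch. The `BrightPoints` fields are a hypothetical simplification of the labeled bright-point measurements described above, not a structure from the source.

```python
from dataclasses import dataclass


@dataclass
class BrightPoints:
    """Simplified observations for one defect candidate (hypothetical
    representation; the text works on labeled pixel sets and the
    coordinate range Vi +/- X/2)."""
    overlap: bool          # bright points of devices 2a and 2b overlap
    within_range: bool     # both points lie within Vi +/- X/2
    mixed: bool            # points are intermingled (scratch pattern)
    reversed_layout: bool  # point positions opposite to the device layout


def classify(bp):
    if bp.overlap:
        return "dust or dirt"    # mixed reflection on top of the defect
    if not bp.within_range:
        return "no defect"       # bright points too far apart
    if bp.mixed:
        # intermingled points with reversed layout -> scratch defect
        return "scratch" if bp.reversed_layout else "undetermined"
    # separated points: reversed layout -> concave, same layout -> convex
    return "concave" if bp.reversed_layout else "convex"
```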
- The detection result is displayed on the display device 6.
- Preferably, together with the aligned images of the two sub-pixel images SPa and SPb shown on the right side of FIGS. 14 to 17, the type of the identified defect and the range of the coordinates Vi ± X/2 for each defect are also displayed.
- The present invention is not limited to the above embodiment.
- For example, in the embodiment the line sensor 1 and the lighting devices 2a and 2b are fixed and the image is taken while the object 5 to be inspected is moved,
- but the object 5 may instead be fixed while the line sensor 1 and the lighting devices 2a and 2b are moved.
- It suffices that at least one of the object 5, the line sensor 1, and the lighting devices 2a and 2b moves relative to the others during imaging.
- The embodiment also shows the case where the relative moving distance of the object 5 per shot is A and the length of the sub-pixel 23 is 2A; it suffices that the shooting ranges overlap so that sub-pixels can be formed. The relative moving distance of the object 5 per shot is therefore preferably 1/2 of one pixel or less.
- The present invention can be used to discriminate surface defects of an object to be inspected, such as a product or part having a surface with strong specular reflectivity.
Abstract
Description
(1) A surface defect discrimination apparatus comprising: an image acquisition means for acquiring a plurality of images, each displaced from the others by the switching of the illumination light, by receiving reflected light from an object to be inspected with a line sensor and photographing it each time the illumination light from each lighting device is switched, while the object is moved relative to lighting devices arranged at different positions and to the line sensor and the illumination lights from the respective lighting devices are switched one at a time and irradiated onto the object; an alignment means for aligning the images, acquired by the image acquisition means, corresponding to the respective lighting devices; and a discrimination means for discriminating a surface defect of the object to be inspected from the images aligned by the alignment means.
(2) The surface defect discrimination apparatus according to item (1), wherein a part of each pixel of the line sensor is an overlapping region in which the shooting range of the current shooting and that of the previous shooting overlap when the object is irradiated with illumination light from one of the lighting devices, the apparatus comprising a sub-pixel image creation means which, taking the portion of one pixel excluding the overlapping region as a sub-pixel, estimates the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the whole pixel in the current shooting and creates a sub-pixel image, and wherein the alignment means aligns the sub-pixel images, created by the sub-pixel image creation means, corresponding to the respective lighting devices.
(3) A surface defect discrimination apparatus comprising an image acquisition means for acquiring a plurality of images for each illumination light by receiving reflected light from an object to be inspected with a line sensor and photographing it each time the illumination light from each lighting device is switched, while the object is moved relative to lighting devices arranged at different positions and to the line sensor and the illumination lights are switched one at a time and irradiated onto the object, wherein a part of each pixel of the line sensor is an overlapping region in which the shooting ranges of the current and previous shootings overlap when the object is irradiated with illumination light from one of the lighting devices, the apparatus further comprising: a sub-pixel image creation means which, taking the portion of one pixel excluding the overlapping region as a sub-pixel, estimates the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the whole pixel in the current shooting and creates a sub-pixel image; and a discrimination means for discriminating a surface defect of the object on the basis of the sub-pixel image created by the sub-pixel image creation means.
(4) The surface defect discrimination apparatus according to item (3) above, comprising an alignment means for aligning the sub-pixel images, created by the sub-pixel image creation means, corresponding to the respective lighting devices.
(5) The surface defect discrimination apparatus according to any one of items (2) to (4) above, wherein the sub-pixel image creation means subtracts the light receiving amount of the overlapping region from the light receiving amount of the whole pixel after correcting it for each region.
(6) The surface defect discrimination apparatus according to any one of items (2) to (5) above, wherein the sub-pixel image creation means obtains the light receiving amount of the overlapping region from the sum of the sub-pixel light receiving amounts estimated at and before the previous shooting, and subtracts the obtained amount from the light receiving amount of the whole pixel to estimate the light receiving amount of the current sub-pixel.
(7) The surface defect discrimination apparatus according to item (6) above, wherein the sub-pixel image creation means estimates, as the light receiving amount of the first sub-pixel, the average value obtained by dividing the light receiving amount of the first whole pixel after the start of shooting by the number of sub-pixels per pixel.
(8) The surface defect discrimination apparatus according to any one of items (2) to (7) above, wherein the sub-pixel image creation means estimates, when the light receiving amount of the whole pixel does not exceed a predetermined threshold value, the average value obtained by dividing the light receiving amount of the whole pixel by the number of sub-pixels per pixel as the light receiving amount of the current sub-pixel, and, when the light receiving amount of the whole pixel exceeds the threshold value, estimates the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the whole pixel.
(9) The surface defect discrimination apparatus according to any one of items (2) and (4) to (8) above, wherein the alignment means aligns the sub-pixel images corresponding to the respective lighting devices by correcting the luminance value Ki j to a corrected value K'i j by the following formula, where
j: identification number of the lighting device that is lit
(10) The surface defect discrimination apparatus according to any one of items (1), (2), and (4) to (9) above, wherein, in the sub-pixel images aligned by the alignment means, when the bright points corresponding to the respective lighting devices do not overlap and each bright point is within a preset range, the discrimination means determines that a concave defect or a convex defect exists on the surface of the object to be inspected.
(11) The surface defect discrimination apparatus according to item (10) above, wherein the discrimination means determines that a concave defect exists when, in the aligned sub-pixel images, the positions of the bright points corresponding to the respective lighting devices are opposite to the arrangement positions of the lighting devices, and that a convex defect exists when they are not.
(12) The surface defect discrimination apparatus according to any one of items (1), (2), and (4) to (11) above, wherein the discrimination means determines that dust or dirt exists on the surface of the object to be inspected when, in the sub-pixel images aligned by the alignment means, the bright points corresponding to the respective lighting devices overlap.
(13) The surface defect discrimination apparatus according to any one of items (2) and (4) to (12) above, wherein pixels whose total light receiving amount exceeds a predetermined threshold value are detected as defect candidate pixels, and, for each detected defect candidate pixel, the sub-pixel images are aligned by the alignment means and the surface defect of the object is discriminated by the discrimination means.
(14) The surface defect discrimination apparatus according to any one of items (1) to (13) above, wherein an LED or a visible-light semiconductor laser is used as the light source of the lighting devices.
(15) The surface defect discrimination apparatus according to any one of items (1) to (14) above, wherein there are three or more lighting devices, arranged on a circumference centered on the line sensor at angular intervals of 360 degrees divided by the number of lighting devices.
(16) An appearance inspection apparatus comprising: a plurality of lighting devices arranged at different positions; a line sensor capable of receiving the reflected light of the illumination light irradiated onto an object to be inspected from each lighting device; a moving means for moving the object relative to the lighting devices and the line sensor; an illumination control means for switching the illumination lights from the lighting devices one at a time at a predetermined cycle and irradiating the object with them; a line sensor control means for controlling the line sensor so as to receive the reflected light from the object and perform photographing each time the illumination light from each lighting device is switched by the illumination control means while the object is moved relative to the lighting devices and the line sensor by the moving means; and the surface defect discrimination apparatus according to any one of items (1) to (15) above.
(17) A program for causing a computer to execute: an image acquisition step of acquiring a plurality of images, each displaced from the others by the switching of the illumination light, by receiving reflected light from an object to be inspected with a line sensor and photographing it each time the illumination light is switched, while the object is moved relative to lighting devices arranged at different positions and to the line sensor and the illumination lights are switched one at a time and irradiated onto the object; an alignment step of aligning the images, acquired in the image acquisition step, corresponding to the respective lighting devices; and a discrimination step of discriminating a surface defect of the object from the images aligned in the alignment step.
(18) The program according to item (17) above, wherein a part of each pixel of the line sensor is an overlapping region in which the shooting ranges of the current and previous shootings overlap when the object is irradiated with illumination light from one of the lighting devices; the program causes the computer to execute a sub-pixel image creation step of estimating, taking the portion of one pixel excluding the overlapping region as a sub-pixel, the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the whole pixel in the current shooting and creating a sub-pixel image; and, in the alignment step, causes the computer to execute processing of aligning the sub-pixel images, created in the sub-pixel image creation step, corresponding to the respective lighting devices.
(19) The program according to item (17) above, wherein, in the sub-pixel image creation step, the computer is caused to execute processing of subtracting the light receiving amount of the overlapping region from the light receiving amount of the whole pixel after correcting it for each region.
(20) The program according to item (18) or (19) above, wherein, in the sub-pixel image creation step, the computer is caused to execute processing of obtaining the light receiving amount of the overlapping region from the sum of the sub-pixel light receiving amounts estimated at and before the previous shooting and subtracting the obtained amount from the light receiving amount of the whole pixel to estimate the light receiving amount of the current sub-pixel.
(21) The program according to item (20) above, wherein, in the sub-pixel image creation step, the computer is caused to execute processing of estimating, as the light receiving amount of the first sub-pixel, the average value obtained by dividing the light receiving amount of the first whole pixel after the start of shooting by the number of sub-pixels per pixel.
(22) The program according to any one of items (18) to (21) above, wherein, in the sub-pixel image creation step, the computer is caused to execute processing of estimating, when the light receiving amount of the whole pixel does not exceed a predetermined threshold value, the average value obtained by dividing the light receiving amount of the whole pixel by the number of sub-pixels per pixel as the light receiving amount of the current sub-pixel, and estimating, when it exceeds the threshold value, the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the whole pixel.
(23) The program according to any one of items (18) to (22) above, wherein, in the alignment step, the computer is caused to execute processing of aligning the sub-pixel images corresponding to the respective lighting devices by correcting the luminance value Ki j to a corrected value K'i j by the following formula, where
j: identification number of the lighting device that is lit
(24) The program according to any one of items (17) to (23) above, wherein, in the discrimination step, the computer is caused to execute processing of determining that a concave defect or a convex defect exists on the surface of the object to be inspected when, in the sub-pixel images aligned in the alignment step, the bright points corresponding to the respective lighting devices do not overlap and each bright point is within a preset range.
(25) The program according to item (24) above, wherein, in the discrimination step, the computer is caused to execute processing of determining that a concave defect exists when, in the sub-pixel images aligned in the alignment step, the positions of the bright points corresponding to the respective lighting devices are opposite to the arrangement positions of the lighting devices, and that a convex defect exists when they are not.
(26) The program according to any one of items (17) to (25) above, wherein, in the discrimination step, the computer is caused to execute processing of determining that dust or dirt exists on the surface of the object to be inspected when, in the sub-pixel images aligned in the alignment step, the bright points corresponding to the respective lighting devices overlap.
(27) The program according to any one of items (17) to (26) above, wherein the computer is caused to execute processing of detecting pixels whose total light receiving amount exceeds a predetermined threshold value as defect candidate pixels and, for each detected defect candidate pixel, aligning the sub-pixel images in the alignment step and discriminating the surface defect of the object in the discrimination step.
(28) The program according to any one of items (17) to (27) above, wherein an LED or a visible-light semiconductor laser is used as the light source of the lighting devices.
(29) The program according to any one of items (17) to (28) above, wherein the plurality of lighting devices are arranged on a circumference centered on the line sensor at angular intervals of 360 degrees / (number of lighting devices).
[Configuration of the appearance inspection apparatus]
FIG. 1 is a configuration diagram of an appearance inspection apparatus according to one embodiment of the present invention. As shown in FIG. 1, the appearance inspection apparatus comprises a line sensor 1; two lighting devices 2a and 2b; an illumination controller 8 that controls the lighting devices 2a and 2b; a line sensor controller 9 that controls the line sensor 1; conveyance drums 3, 3 that convey the object 5 to be inspected; a drum encoder 4 that detects the shooting position of the object 5; a display device 6; a computer 10; and a conveyance drum controller 11 that controls the rotation speed of the conveyance drums 3 in order to control the conveyance speed of the object 5.
[Surface defect discrimination processing]
Next, the surface defect discrimination processing of the object 5 by the computer 10 will be described. The computer 10 includes a CPU, a RAM, a storage device, and the like, and the surface defect discrimination processing is executed by the CPU operating in accordance with an operating program stored in the storage device.
<Creation of the sub-pixel image>
FIG. 3 is a diagram for explaining the relative positional relationship between the shooting range and the pixel 20 when photographing while switching between the lighting devices 2a and 2b.
The estimate is computed as: (Estimated light receiving amount of the i-th sub-pixel) = (Total light receiving amount of the i-th shooting) − {(Estimated light receiving amount of the (i−1)-th sub-pixel) + (Estimated light receiving amount of the (i−2)-th sub-pixel)}.
<Alignment of the sub-pixel images for the lighting devices 2a and 2b>
As shown in FIG. 6, let the sub-pixel image positions for one pixel 20 corresponding to the lighting device 2a be a1, a2, a3, ..., and let those corresponding to the lighting device 2b be b1, b2, b3, .... Since the object 5 to be inspected moves relative to the line sensor 1 and the lighting devices 2a and 2b during shooting, the sub-pixel images for the lighting device 2a and those for the lighting device 2b are alternately displaced by the moving distance A corresponding to the switching time of the illumination light, as a1, b1, a2, b2, a3, b3, ....
Light receiving amount (luminance value) at position b2' = (light receiving amount at position b1 + light receiving amount at position b2) / 2
The positional displacement is corrected by correcting the light receiving amount (luminance value) in this way. The same applies to the sub-pixel positions b3', b4', ... of the lighting device 2b corresponding to the positions a3, a4, ... of the lighting device 2a.
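The averaging correction above can be sketched as follows; this is a minimal list-based illustration under the stated rule (each aligned value is the mean of two adjacent samples of device 2b), with names chosen here for clarity.

```python
def align_b_to_a(b_values):
    """Shift the sub-pixel samples of lighting device 2b onto the
    positions of device 2a by averaging adjacent samples:
    b2' = (b1 + b2) / 2, b3' = (b2 + b3) / 2, ...

    b_values: luminance values at positions b1, b2, b3, ...
    Returns the corrected values at positions b2', b3', ...
    """
    return [(b_values[k - 1] + b_values[k]) / 2
            for k in range(1, len(b_values))]
```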
j: identification number of the lighting device that is lit
<Correction when estimating the light receiving amount of the sub-pixel>
The estimation of the light receiving amount of the sub-pixel 23 above was performed on the assumption that all regions of the 6A pixel have the same light receiving sensitivity. In reality, however, as the sensitivity distribution in FIG. 7 shows, the light receiving sensitivity differs in each part of the pixel 20: the central part has relatively high sensitivity and both ends have low sensitivity. In FIG. 7 the hatched portion indicates high sensitivity; even between the two ends, the third region has higher sensitivity than the first region.
(Estimated light receiving amount of the i-th sub-pixel) = (Total light receiving amount of the i-th shooting) − {(Estimated light receiving amount of the (i−1)-th sub-pixel) × ε2/ε3 + (Estimated light receiving amount of the (i−2)-th sub-pixel) × ε1/ε3}
As specific examples of ε1, ε2, and ε3, this embodiment sets ε1 = 1/3, ε2 = 1, and ε3 = 2/3. FIG. 8 shows an example of the corrected estimated light receiving amounts of the regions 21 to 23 calculated with this weighting.
<Defect discrimination>
Surface defects are discriminated on the basis of the mutually aligned sub-pixel images.
2a, 2b Lighting device
4 Drum encoder
5 Object to be inspected
6 Display device
8 Illumination controller
9 Line sensor controller
10 Computer
11 Conveyance drum controller
20 Pixel
21 First region
22 Second region
23 Sub-pixel (third region)
30 Defect
51 Concave defect (void defect)
52 Convex defect
53 Scratch defect
54 Dust or dirt
61a-64a Bright points by the lighting device 2a
61b-64b Bright points by the lighting device 2b
Claims (29)
- A surface defect discrimination apparatus comprising: an image acquisition means for acquiring a plurality of images, each displaced from the others by the switching of the illumination light, by receiving reflected light from an object to be inspected with a line sensor and photographing it each time the illumination light from each lighting device is switched, while the object is moved relative to lighting devices arranged at different positions and to the line sensor and the illumination lights from the respective lighting devices are switched one at a time and irradiated onto the object; an alignment means for aligning the images, acquired by the image acquisition means, corresponding to the respective lighting devices; and a discrimination means for discriminating a surface defect of the object to be inspected from the images aligned by the alignment means.
- The surface defect discrimination apparatus according to claim 1, wherein a part of each pixel of the line sensor is an overlapping region in which the shooting range of the current shooting and that of the previous shooting overlap when the object is irradiated with illumination light from one of the lighting devices, the apparatus comprising a sub-pixel image creation means which, taking the portion of one pixel excluding the overlapping region as a sub-pixel, estimates the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the whole pixel in the current shooting and creates a sub-pixel image, and wherein the alignment means aligns the sub-pixel images, created by the sub-pixel image creation means, corresponding to the respective lighting devices.
- A surface defect discrimination apparatus comprising an image acquisition means for acquiring a plurality of images for each illumination light by receiving reflected light from an object to be inspected with a line sensor and photographing it each time the illumination light from each lighting device is switched, while the object is moved relative to lighting devices arranged at different positions and to the line sensor and the illumination lights are switched one at a time and irradiated onto the object, wherein a part of each pixel of the line sensor is an overlapping region in which the shooting ranges of the current and previous shootings overlap when the object is irradiated with illumination light from one of the lighting devices, the apparatus further comprising: a sub-pixel image creation means which, taking the portion of one pixel excluding the overlapping region as a sub-pixel, estimates the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the whole pixel in the current shooting and creates a sub-pixel image; and a discrimination means for discriminating a surface defect of the object on the basis of the sub-pixel image created by the sub-pixel image creation means.
- The surface defect discrimination apparatus according to claim 3, comprising an alignment means for aligning the sub-pixel images, created by the sub-pixel image creation means, corresponding to the respective lighting devices.
- The surface defect discrimination apparatus according to any one of claims 2 to 4, wherein the sub-pixel image creation means subtracts the light receiving amount of the overlapping region from the light receiving amount of the whole pixel after correcting it for each region.
- The surface defect discrimination apparatus according to any one of claims 2 to 5, wherein the sub-pixel image creation means obtains the light receiving amount of the overlapping region from the sum of the sub-pixel light receiving amounts estimated at and before the previous shooting, and subtracts the obtained amount from the light receiving amount of the whole pixel to estimate the light receiving amount of the current sub-pixel.
- The surface defect discrimination apparatus according to claim 6, wherein the sub-pixel image creation means estimates, as the light receiving amount of the first sub-pixel, the average value obtained by dividing the light receiving amount of the first whole pixel after the start of shooting by the number of sub-pixels per pixel.
- The surface defect discrimination apparatus according to any one of claims 2 to 7, wherein the sub-pixel image creation means estimates, when the light receiving amount of the whole pixel does not exceed a predetermined threshold value, the average value obtained by dividing the light receiving amount of the whole pixel by the number of sub-pixels per pixel as the light receiving amount of the current sub-pixel, and, when the light receiving amount of the whole pixel exceeds the threshold value, estimates the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the whole pixel.
- The surface defect discrimination apparatus according to any one of claims 1, 2, and 4 to 9, wherein, in the sub-pixel images aligned by the alignment means, when the bright points corresponding to the respective lighting devices do not overlap and each bright point is within a preset range, the discrimination means determines that a concave defect or a convex defect exists on the surface of the object to be inspected.
- The surface defect discrimination apparatus according to claim 10, wherein the discrimination means determines that a concave defect exists when, in the sub-pixel images aligned by the alignment means, the positions of the bright points corresponding to the respective lighting devices are opposite to the arrangement positions of the lighting devices, and that a convex defect exists when they are not.
- The surface defect discrimination apparatus according to any one of claims 1, 2, and 4 to 11, wherein the discrimination means determines that dust or dirt exists on the surface of the object to be inspected when, in the sub-pixel images aligned by the alignment means, the bright points corresponding to the respective lighting devices overlap.
- The surface defect discrimination apparatus according to any one of claims 2 and 4 to 12, wherein pixels whose total light receiving amount exceeds a predetermined threshold value are detected as defect candidate pixels, and, for each detected defect candidate pixel, the sub-pixel images are aligned by the alignment means and the surface defect of the object is discriminated by the discrimination means.
- The surface defect discrimination apparatus according to any one of claims 1 to 13, wherein an LED or a visible-light semiconductor laser is used as the light source of the lighting devices.
- The surface defect discrimination apparatus according to any one of claims 1 to 14, wherein there are three or more lighting devices, arranged on a circumference centered on the line sensor at angular intervals of 360 degrees divided by the number of lighting devices.
- An appearance inspection apparatus comprising:
a plurality of lighting devices arranged at different positions;
a line sensor capable of receiving the reflected light of the illumination light irradiated onto an object to be inspected from each lighting device;
a moving means for moving the object relative to the lighting devices and the line sensor;
an illumination control means for switching the illumination lights from the lighting devices one at a time at a predetermined cycle and irradiating the object with them;
a line sensor control means for controlling the line sensor so as to receive the reflected light from the object and perform photographing each time the illumination light from each lighting device is switched by the illumination control means while the object is moved relative to the lighting devices and the line sensor by the moving means; and
the surface defect discrimination apparatus according to any one of claims 1 to 15.
- A program for causing a computer to execute: an image acquisition step of acquiring a plurality of images, each displaced from the others by the switching of the illumination light, by receiving reflected light from an object to be inspected with a line sensor and photographing it each time the illumination light is switched, while the object is moved relative to lighting devices arranged at different positions and to the line sensor and the illumination lights are switched one at a time and irradiated onto the object; an alignment step of aligning the images, acquired in the image acquisition step, corresponding to the respective lighting devices; and a discrimination step of discriminating a surface defect of the object from the images aligned in the alignment step.
- The program according to claim 17, wherein a part of each pixel of the line sensor is an overlapping region in which the shooting ranges of the current and previous shootings overlap when the object is irradiated with illumination light from one of the lighting devices; the program causes the computer to execute a sub-pixel image creation step of estimating, taking the portion of one pixel excluding the overlapping region as a sub-pixel, the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the whole pixel in the current shooting and creating a sub-pixel image; and, in the alignment step, causes the computer to execute processing of aligning the sub-pixel images, created in the sub-pixel image creation step, corresponding to the respective lighting devices.
- The program according to claim 17, wherein, in the sub-pixel image creation step, the computer is caused to execute processing of subtracting the light receiving amount of the overlapping region from the light receiving amount of the whole pixel after correcting it for each region.
- The program according to claim 18 or 19, wherein, in the sub-pixel image creation step, the computer is caused to execute processing of obtaining the light receiving amount of the overlapping region from the sum of the sub-pixel light receiving amounts estimated at and before the previous shooting and subtracting the obtained amount from the light receiving amount of the whole pixel to estimate the light receiving amount of the current sub-pixel.
- The program according to claim 20, wherein, in the sub-pixel image creation step, the computer is caused to execute processing of estimating, as the light receiving amount of the first sub-pixel, the average value obtained by dividing the light receiving amount of the first whole pixel after the start of shooting by the number of sub-pixels per pixel.
- The program according to any one of claims 18 to 21, wherein, in the sub-pixel image creation step, the computer is caused to execute processing of estimating, when the light receiving amount of the whole pixel does not exceed a predetermined threshold value, the average value obtained by dividing the light receiving amount of the whole pixel by the number of sub-pixels per pixel as the light receiving amount of the current sub-pixel, and estimating, when it exceeds the threshold value, the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the whole pixel.
- The program according to any one of claims 17 to 23, wherein, in the discrimination step, the computer is caused to execute processing of determining that a concave defect or a convex defect exists on the surface of the object to be inspected when, in the sub-pixel images aligned in the alignment step, the bright points corresponding to the respective lighting devices do not overlap and each bright point is within a preset range.
- The program according to claim 24, wherein, in the discrimination step, the computer is caused to execute processing of determining that a concave defect exists when, in the sub-pixel images aligned in the alignment step, the positions of the bright points corresponding to the respective lighting devices are opposite to the arrangement positions of the lighting devices, and that a convex defect exists when they are not.
- The program according to any one of claims 17 to 25, wherein, in the discrimination step, the computer is caused to execute processing of determining that dust or dirt exists on the surface of the object to be inspected when, in the sub-pixel images aligned in the alignment step, the bright points corresponding to the respective lighting devices overlap.
- The program according to any one of claims 17 to 26, wherein the computer is caused to execute processing of detecting pixels whose total light receiving amount exceeds a predetermined threshold value as defect candidate pixels and, for each detected defect candidate pixel, aligning the sub-pixel images in the alignment step and discriminating the surface defect of the object in the discrimination step.
- The program according to any one of claims 17 to 27, wherein an LED or a visible-light semiconductor laser is used as the light source of the lighting devices.
- The program according to any one of claims 17 to 28, wherein the plurality of lighting devices are arranged on a circumference centered on the line sensor at angular intervals of 360 degrees / (number of lighting devices).
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021545215A JP7444171B2 (ja) | 2019-09-13 | 2020-08-28 | 表面欠陥判別装置、外観検査装置及びプログラム |
CN202080063954.0A CN114364973B (zh) | 2019-09-13 | 2020-08-28 | 表面缺陷判别装置、外观检查装置以及程序 |
KR1020227008488A KR20220043219A (ko) | 2019-09-13 | 2020-08-28 | 표면 결함 판별 장치, 외관 검사 장치 및 프로그램 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019167576 | 2019-09-13 | ||
JP2019-167576 | 2019-09-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021049326A1 true WO2021049326A1 (ja) | 2021-03-18 |
Family
ID=74866162
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/032574 WO2021049326A1 (ja) | 2019-09-13 | 2020-08-28 | 表面欠陥判別装置、外観検査装置及びプログラム |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP7444171B2 (ja) |
KR (1) | KR20220043219A (ja) |
CN (1) | CN114364973B (ja) |
WO (1) | WO2021049326A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006194828A (ja) * | 2005-01-17 | 2006-07-27 | Mega Trade:Kk | 検査装置 |
EP1742041A1 (de) * | 2005-07-04 | 2007-01-10 | Massen Machine Vision Systems GmbH | Kostengünstige multi-sensorielle Oberflächeninspektion |
JP2012521559A (ja) * | 2009-03-24 | 2012-09-13 | オルボテック・リミテッド | マルチモード・イメージング |
US20160103079A1 (en) * | 2013-05-23 | 2016-04-14 | Centro Sviluppo Materiali S.P.A. | Method for the surface inspection of long products and apparatus suitable for carrying out such a method |
WO2019150693A1 (ja) * | 2018-02-05 | 2019-08-08 | 株式会社Screenホールディングス | 画像取得装置、画像取得方法および検査装置 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11118450A (ja) | 1997-10-14 | 1999-04-30 | Mitsubishi Heavy Ind Ltd | 液晶基板の突起欠陥検出装置 |
JP4190636B2 (ja) * | 1998-11-24 | 2008-12-03 | 日本エレクトロセンサリデバイス株式会社 | 表面検査装置 |
JP5673621B2 (ja) * | 2012-07-18 | 2015-02-18 | オムロン株式会社 | 欠陥検査方法及び欠陥検査装置 |
JP6470506B2 (ja) | 2014-06-09 | 2019-02-13 | 株式会社キーエンス | 検査装置 |
JP6370177B2 (ja) * | 2014-09-05 | 2018-08-08 | 株式会社Screenホールディングス | 検査装置および検査方法 |
KR20180009792A (ko) * | 2015-06-25 | 2018-01-29 | 제이에프이 스틸 가부시키가이샤 | 표면 결함 검출 장치, 표면 결함 검출 방법 및, 강재의 제조 방법 |
JP6682809B2 (ja) | 2015-11-09 | 2020-04-15 | 大日本印刷株式会社 | 検査システムおよび検査方法 |
JP6648869B2 (ja) * | 2017-11-27 | 2020-02-14 | 日本製鉄株式会社 | 形状検査装置及び形状検査方法 |
-
2020
- 2020-08-28 KR KR1020227008488A patent/KR20220043219A/ko not_active Application Discontinuation
- 2020-08-28 JP JP2021545215A patent/JP7444171B2/ja active Active
- 2020-08-28 CN CN202080063954.0A patent/CN114364973B/zh active Active
- 2020-08-28 WO PCT/JP2020/032574 patent/WO2021049326A1/ja active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006194828A (ja) * | 2005-01-17 | 2006-07-27 | Mega Trade:Kk | 検査装置 |
EP1742041A1 (de) * | 2005-07-04 | 2007-01-10 | Massen Machine Vision Systems GmbH | Kostengünstige multi-sensorielle Oberflächeninspektion |
JP2012521559A (ja) * | 2009-03-24 | 2012-09-13 | オルボテック・リミテッド | マルチモード・イメージング |
US20160103079A1 (en) * | 2013-05-23 | 2016-04-14 | Centro Sviluppo Materiali S.P.A. | Method for the surface inspection of long products and apparatus suitable for carrying out such a method |
WO2019150693A1 (ja) * | 2018-02-05 | 2019-08-08 | 株式会社Screenホールディングス | 画像取得装置、画像取得方法および検査装置 |
Also Published As
Publication number | Publication date |
---|---|
CN114364973A (zh) | 2022-04-15 |
KR20220043219A (ko) | 2022-04-05 |
CN114364973B (zh) | 2024-01-16 |
JP7444171B2 (ja) | 2024-03-06 |
JPWO2021049326A1 (ja) | 2021-03-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11216687B2 (en) | Image detection scanning method for object surface defects and image detection scanning system thereof | |
CN110596134B (zh) | 一种基于图像采集的片状玻璃边缘瑕疵检测方法 | |
JP5014003B2 (ja) | 検査装置および方法 | |
US20180195858A1 (en) | Measurement apparatus for measuring shape of target object, system and manufacturing method | |
KR101679205B1 (ko) | 디바이스 결함 검출장치 | |
EP3789728B1 (en) | Thread shape measuring apparatus and measuring method | |
JP6859627B2 (ja) | 外観検査装置 | |
JP5682419B2 (ja) | 検査方法及び検査装置 | |
US20170053394A1 (en) | Inspection apparatus, inspection method, and article manufacturing method | |
JP2013534312A (ja) | ウェハのソーマークの三次元検査のための装置および方法 | |
CN110596139A (zh) | 一种屏缺陷检测方法和系统 | |
US20180367722A1 (en) | Image acquisition device and image acquisition method | |
JP2009168454A (ja) | 表面欠陥検査装置及び表面欠陥検査方法 | |
JP4932595B2 (ja) | 表面疵検査装置 | |
WO2021049326A1 (ja) | 表面欠陥判別装置、外観検査装置及びプログラム | |
JP6031751B2 (ja) | ガラス基板検査装置及びガラス基板製造方法 | |
JP2021139817A (ja) | ワークの表面検査装置、表面検査システム、表面検査方法及びプログラム | |
KR20100093213A (ko) | 휘도값을 이용한 글래스 기판 상의 이물 감지 시스템 및 그방법 | |
TWI786522B (zh) | 表面檢查裝置、表面檢查方法、鋼材製造方法、鋼材品質管理方法以及鋼材製造設備 | |
CN107727654B (zh) | 膜层检测方法、装置及膜层检测系统 | |
JP2004125396A (ja) | 駆動伝達ベルトの検査方法 | |
JP2014122825A (ja) | ボトルキャップの外観検査装置及び外観検査方法 | |
WO2011101893A1 (ja) | 可撓性を有する検査対象物の表面の傷を検査する方法および装置 | |
JP2021173704A (ja) | ワーク検査装置およびワーク検査方法 | |
RU2810913C1 (ru) | Устройство для контроля поверхности, способ контроля поверхности, способ изготовления стального материала, способ сортировки стального материала, производственное оборудование для изготовления стального материала |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20862587 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021545215 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20227008488 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20862587 Country of ref document: EP Kind code of ref document: A1 |