CN114364973A - Surface defect discriminating device, appearance inspecting device, and program

Publication number: CN114364973A
Authority: CN (China)
Prior art keywords: pixel, sub-pixel, illumination, light receiving amount
Application number: CN202080063954.0A
Other languages: Chinese (zh)
Other versions: CN114364973B (granted publication)
Inventors: 原田孝仁, 阿部芳久, 山田正之
Assignee (original and current): Konica Minolta Inc
Application filed by Konica Minolta Inc
Legal status: Active (granted as CN114364973B)


Classifications

    • G01N 21/892 - Investigating the presence of flaws or contamination in moving material (e.g. running paper or textiles), characterised by the flaw, defect or object feature examined
    • G01N 21/8851 - Scan or image signal processing specially adapted therefor (e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges)
    • G01N 21/8901 - Optical details; Scanning details
    • G06T 7/0004 - Industrial image inspection
    • G01N 2021/8854 - Grading and classifying of flaws
    • G01N 2021/8887 - Scan or image signal processing based on image processing techniques

Abstract

The invention provides a surface defect discriminating device, an appearance inspecting device, and a program. The device includes: an image acquisition unit (10) that, while an object to be inspected (5) is moved relative to illumination devices (2a, 2b) and a line sensor (1) arranged at different positions and the illumination light applied to the object is switched from one illumination device to the next, receives the reflected light from the object with the line sensor and captures an image each time the illumination light is switched, thereby acquiring a plurality of images whose positions are shifted from one another by the amount the object moves per switching of the illumination light; an alignment unit (10) that aligns the acquired images corresponding to the respective illumination devices; and a discrimination unit (10) that discriminates a surface defect of the object from the images aligned by the alignment unit.

Description

Surface defect discriminating device, appearance inspecting device, and program
Technical Field
The present invention relates to a surface defect discriminating device for discriminating a surface defect of an object to be inspected such as a product or a component having a surface with a strong specular reflection property, and an appearance inspecting device and a program provided with the surface defect discriminating device.
Background
Scratches on the surfaces of products and parts spoil their appearance. Moreover, when a film-forming plate used to produce a thin film has surface irregularities caused by scratches or the like, the irregularities are transferred to the produced film and become defects of the film.
Therefore, an appearance inspection apparatus for detecting surface defects of various products, parts, film-forming plates, and the like has been proposed.
For example, patent document 1 discloses a technique in which a component is imaged while light sources in a plurality of directions are switched, and whether a shadow in the image is a defect or dirt is discriminated by analyzing the direction of each illumination light source together with the image captured under that light source.
Patent document 1 Japanese patent application laid-open No. 11-118450
However, the invention described in patent document 1 presupposes that the object to be inspected is stationary. It therefore cannot discriminate surface defects of, for example, a belt-like member conveyed by driven rollers, which is difficult to hold stationary, while the member is being conveyed.
Therefore, a technique capable of discriminating a surface defect while relatively moving an object to be inspected with respect to an illumination device or a line sensor for capturing an image is desired.
Disclosure of Invention
The present invention has been made in view of the above-described technical background, and an object of the present invention is to provide a surface defect determination device, an appearance inspection device, and a program that can determine a surface defect while relatively moving an object to be inspected with respect to an illumination device and a line sensor.
The above object is achieved by the following means.
(1) A surface defect discriminating device comprising: an image acquisition unit that, while an object to be inspected is moved relative to illumination devices and a line sensor arranged at different positions and the illumination light applied to the object is switched from one illumination device to the next, receives the reflected light from the object with the line sensor and captures an image each time the illumination light is switched, thereby acquiring a plurality of images whose positions are shifted from one another by the amount of movement per switching of the illumination light; an alignment unit that aligns the images corresponding to the respective illumination devices acquired by the image acquisition unit; and a discrimination unit that discriminates a surface defect of the object to be inspected from the images aligned by the alignment unit.
(2) The surface defect discriminating device according to the preceding item 1, wherein, because the object is irradiated with the illumination light of one of the illumination devices, a part of each pixel of the line sensor is an overlapping region in which the imaging ranges of the current imaging and the previous imaging overlap; the surface defect discriminating device includes a sub-pixel image creating unit that, defining the part of one pixel other than the overlapping region as a sub-pixel, creates a sub-pixel image by subtracting the light receiving amount of the overlapping region from the light receiving amount of the entire pixel in the current imaging; and the alignment unit aligns the sub-pixel images corresponding to the respective illumination devices created by the sub-pixel image creating unit.
(3) A surface defect discriminating device comprising an image acquisition unit that, while an object to be inspected is moved relative to illumination devices and a line sensor arranged at different positions and the illumination light applied to the object is switched from one illumination device to the next, receives the reflected light from the object with the line sensor and captures an image for each illumination light, wherein, because the object is irradiated with the illumination light of one illumination device, a part of each pixel of the line sensor is an overlapping region in which the imaging ranges of the current imaging and the previous imaging overlap, the surface defect discriminating device further comprising: a sub-pixel image creating unit that, defining the part of one pixel other than the overlapping region as a sub-pixel, estimates the light receiving amount of the sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the entire pixel in the current imaging, and creates a sub-pixel image; and a discrimination unit that discriminates a surface defect of the object to be inspected based on the sub-pixel images created by the sub-pixel image creating unit.
(4) The surface defect discriminating device according to the preceding item 3, comprising an alignment unit that aligns the sub-pixel images corresponding to the respective illumination devices created by the sub-pixel image creating unit.
(5) The surface defect discriminating device according to any one of the preceding items 2 to 4, wherein the sub-pixel image creating unit subtracts the light receiving amount of the overlapping region from the light receiving amount of the entire pixel in a state in which the light receiving amount of the overlapping region has been corrected for each region of the pixel.
(6) The surface defect discriminating device according to any one of the preceding items 2 to 5, wherein the sub-pixel image creating unit obtains the light receiving amount of the overlapping region from the sum of the light receiving amounts of the previously estimated sub-pixels, and estimates the light receiving amount of the current sub-pixel by subtracting the obtained amount from the light receiving amount of the entire pixel.
(7) The surface defect discriminating device according to the preceding item 6, wherein the sub-pixel image creating unit estimates, as the light receiving amount of the first sub-pixel, the average value obtained by dividing the light receiving amount of the entire first pixel after the start of imaging by the number of sub-pixels per pixel.
(8) The surface defect discriminating device according to any one of the preceding items 2 to 7, wherein the sub-pixel image creating unit estimates the light receiving amount of the current sub-pixel as the average value obtained by dividing the light receiving amount of the entire pixel by the number of sub-pixels per pixel when the light receiving amount of the entire pixel does not exceed a predetermined threshold, and estimates the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the entire pixel when the light receiving amount of the entire pixel exceeds the threshold.
(9) The surface defect discriminating device according to any one of the preceding items 2 and 4 to 8, wherein the alignment unit aligns the sub-pixel images corresponding to the respective illumination devices created by the sub-pixel image creating unit by correcting the luminance value K_i^j to a correction value K'_i^j according to the following equation.

[Formula 1]

K'_i^j = ((j - 1) * K_(i-1)^j + (N - j + 1) * K_i^j) / N

where i is the index of the sub-pixel estimated position, j is the identification number of the lit illumination device (j = 1, ..., N), and N is the number of illumination devices.
(10) The surface defect discriminating device according to any one of the preceding items 1, 2, and 4 to 9, wherein the discrimination unit determines that a concave defect or a convex defect exists on the surface of the object to be inspected when, in the sub-pixel images aligned by the alignment unit, the bright points corresponding to the respective illumination devices do not overlap and lie within a predetermined range of each other.
(11) The surface defect discriminating device according to the preceding item 10, wherein the discrimination unit determines that a concave defect exists when, in the sub-pixel images aligned by the alignment unit, the positions of the bright points corresponding to the respective illumination devices are inverted with respect to the arrangement positions of the illumination devices, and determines that a convex defect exists when they are not inverted.
(12) The surface defect discriminating device according to any one of the preceding items 1, 2, and 4 to 11, wherein the discrimination unit determines that dust or dirt is present on the surface of the object to be inspected when, in the sub-pixel images aligned by the alignment unit, the bright points corresponding to the respective illumination devices overlap.
(13) The surface defect discriminating device according to any one of the preceding items 1, 2, and 4 to 12, wherein pixels whose entire light receiving amount exceeds a predetermined threshold are detected as defect candidate pixels, and, for the detected defect candidate pixels, the sub-pixel images are aligned by the alignment unit and the surface defect of the object is discriminated by the discrimination unit.
(14) The surface defect discriminating device according to any one of the preceding items 1 to 13, wherein an LED or a visible light semiconductor laser is used as the light source of each illumination device.
(15) The surface defect discriminating device according to any one of the preceding items 1 to 14, wherein three or more illumination devices are arranged on a circumference centered on the line sensor with an angular difference of 360 degrees divided by the number of illumination devices.
(16) An appearance inspection device is provided with: a plurality of lighting devices disposed at different positions; a line sensor capable of receiving reflected light of illumination light irradiated from each illumination device to the object to be inspected; a moving unit that moves the object to be inspected relative to the illuminating device and the line sensor; an illumination control unit for switching the illumination light from each illumination device to irradiate the object to be inspected at a predetermined cycle; a line sensor control unit that controls the line sensor so as to receive reflected light from the test object and take an image of the reflected light each time the illumination light from each illumination device is switched while relatively moving the test object with respect to the illumination device and the line sensor by the moving unit; and a surface defect discriminating device as defined in any one of the preceding items 1 to 15.
(17) A program for causing a computer to execute: an image acquisition step of, while an object to be inspected is moved relative to illumination devices and a line sensor arranged at different positions and the illumination light applied to the object is switched from one illumination device to the next, receiving the reflected light from the object with the line sensor and capturing an image each time the illumination light is switched, thereby acquiring a plurality of images whose positions are shifted from one another by the amount of movement per switching of the illumination light; an alignment step of aligning the images corresponding to the respective illumination devices acquired in the image acquisition step; and a discrimination step of discriminating a surface defect of the object to be inspected from the images aligned in the alignment step.
(18) The program according to the aforementioned item 17, wherein, because the object is irradiated with the illumination light of one of the illumination devices, a part of each pixel of the line sensor is an overlapping region in which the imaging ranges of the current imaging and the previous imaging overlap, the program causing the computer to execute: a sub-pixel image creating step of, defining the part of one pixel other than the overlapping region as a sub-pixel, creating a sub-pixel image by subtracting the light receiving amount of the overlapping region from the light receiving amount of the entire pixel in the current imaging; and, in the alignment step, aligning the sub-pixel images corresponding to the respective illumination devices created in the sub-pixel image creating step.
(19) The program according to the aforementioned item 18, causing the computer to execute: in the sub-pixel image creating step, subtracting the light receiving amount of the overlapping region from the light receiving amount of the entire pixel in a state in which the light receiving amount of the overlapping region has been corrected for each region of the pixel.
(20) The program according to the aforementioned item 18 or 19, causing the computer to execute: in the sub-pixel image creating step, obtaining the light receiving amount of the overlapping region from the sum of the light receiving amounts of the previously estimated sub-pixels, and estimating the light receiving amount of the current sub-pixel by subtracting the obtained amount from the light receiving amount of the entire pixel.
(21) The program according to the aforementioned item 20, causing the computer to execute: in the sub-pixel image creating step, estimating, as the light receiving amount of the first sub-pixel, the average value obtained by dividing the light receiving amount of the entire first pixel after the start of imaging by the number of sub-pixels per pixel.
(22) The program according to any one of the preceding items 18 to 21, causing the computer to execute: in the sub-pixel image creating step, estimating the light receiving amount of the current sub-pixel as the average value obtained by dividing the light receiving amount of the entire pixel by the number of sub-pixels per pixel when the light receiving amount of the entire pixel does not exceed a predetermined threshold, and estimating the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the entire pixel when the light receiving amount of the entire pixel exceeds the threshold.
(23) The program according to any one of the preceding items 18 to 22, causing the computer to execute: in the alignment step, aligning the sub-pixel images corresponding to the respective illumination devices created in the sub-pixel image creating step by correcting the luminance value K_i^j to a correction value K'_i^j according to the following equation.

[Formula 2]

K'_i^j = ((j - 1) * K_(i-1)^j + (N - j + 1) * K_i^j) / N

where i is the index of the sub-pixel estimated position, j is the identification number of the lit illumination device (j = 1, ..., N), and N is the number of illumination devices.
(24) The program according to any one of the preceding items 17 to 23, causing the computer to execute: in the discrimination step, determining that a concave defect or a convex defect exists on the surface of the object to be inspected when, in the sub-pixel images aligned in the alignment step, the bright points corresponding to the respective illumination devices do not overlap and lie within a predetermined range of each other.
(25) The program according to the aforementioned item 24, causing the computer to execute: in the discrimination step, determining, in the sub-pixel images aligned in the alignment step, that a concave defect exists when the positions of the bright points corresponding to the respective illumination devices are inverted with respect to the arrangement positions of the illumination devices, and that a convex defect exists when they are not inverted.
(26) The program according to any one of the preceding items 17 to 25, causing the computer to execute: in the discrimination step, determining that dust or dirt is present on the surface of the object to be inspected when the bright points corresponding to the respective illumination devices overlap in the sub-pixel images aligned in the alignment step.
(27) The program according to any one of the preceding items 17 to 26, causing the computer to execute: detecting pixels whose entire light receiving amount exceeds a predetermined threshold as defect candidate pixels, and, for the detected defect candidate pixels, aligning the sub-pixel images in the alignment step and discriminating the surface defect of the object in the discrimination step.
(28) The program according to any one of the preceding items 17 to 27, wherein an LED or a visible light semiconductor laser is used as the light source of each illumination device.
(29) The program according to any one of the preceding items 17 to 28, wherein the plurality of illumination devices are arranged on a circumference centered on the line sensor with an angular difference of 360 degrees divided by the number of illumination devices.
According to the invention described in the aforementioned item (1), while the object to be inspected is moved relative to the illumination devices and the line sensor arranged at different positions, the illumination light applied to the object is switched from one illumination device to the next. Each time the illumination light is switched, the reflected light from the object is received by the line sensor and an image is captured, so that a plurality of images are acquired whose positions are shifted from one another by the amount of movement per switching of the illumination light. After the acquired images corresponding to the respective illumination devices are aligned, a surface defect of the object is discriminated from the aligned images.
In this way, although the images acquired from the line sensor are positionally shifted by the switching of the illumination light because the object moves relative to the illumination devices and the line sensor, the plurality of images corresponding to the respective illumination devices are aligned, and the surface defect of the object is discriminated in the aligned state. A surface defect can therefore be discriminated while the object is moving.
According to the invention described in the aforementioned item (2), because the object is irradiated with the illumination light of one illumination device, a part of each pixel of the line sensor is an overlapping region in which the imaging ranges of the current and previous shots overlap. Defining the part of one pixel other than the overlapping region as a sub-pixel, the light receiving amount of the current sub-pixel is estimated by subtracting the light receiving amount of the overlapping region from the light receiving amount of the entire pixel in the current shot, and a sub-pixel image is created. The created sub-pixel images corresponding to the respective illumination devices are then aligned, and a surface defect of the object is discriminated from the aligned images. Since a sub-pixel is the part of a pixel other than the overlapping region and is therefore smaller than one pixel, the resolution of the appearance inspection is improved and finer surface defects can be detected.
In other words, when an object moving relative to the line sensor is imaged, the distance between the line sensor and the imaging surface of the object is unstable and the depth of field must be set deep; yet there is a trade-off in that a deeper depth of field lowers the resolution, so minute defects cannot be inspected. Using sub-pixel images improves the resolution without deepening the depth of field.
According to the invention described in the aforementioned item (3), since a defect is discriminated from a sub-pixel image smaller than one pixel, the resolution of the appearance inspection is improved, and more detailed surface defect detection can be performed.
According to the invention described in the aforementioned item (4), the defect detection can be performed with higher accuracy by the alignment of the sub-pixel images corresponding to the respective illumination devices.
According to the invention described in the aforementioned item (5), since the light receiving amount of the overlap region is subtracted from the light receiving amount of the entire pixel in a state where the light receiving amount of the overlap region is corrected for each region, it is possible to estimate the light receiving amount of the current sub-pixel by subtracting a more accurate light receiving amount of the overlap region, and further, it is possible to perform more accurate defect determination.
According to the invention described in the aforementioned item (6), the light receiving amount of the overlapping region is obtained from the sum of the light receiving amounts of the sub-pixels estimated in the previous time, and the light receiving amount of the sub-pixel at this time is estimated by subtracting the obtained light receiving amount from the light receiving amount of the entire pixel, so that the estimation process can be simplified.
According to the invention described in the aforementioned item (7), since the average value obtained by dividing the light receiving amount of the entire first pixel after the start of image capturing by the number of sub-pixels per pixel is estimated as the light receiving amount of the first sub-pixel, the process of estimating the light receiving amount of the sub-pixel next and later can be smoothly performed.
According to the invention described in the aforementioned item (8), when the light receiving amount of the entire pixel does not exceed the predetermined threshold, in other words when a surface defect is unlikely to be present, the light receiving amount of the current sub-pixel is estimated as the average value obtained by dividing the light receiving amount of the entire pixel by the number of sub-pixels per pixel. On the other hand, when the light receiving amount of the entire pixel exceeds the threshold, in other words when a surface defect is likely to be present, the light receiving amount of the current sub-pixel is estimated by subtracting the light receiving amount of the overlapping region from that of the entire pixel. The defect discriminating process can thus be concentrated on regions where a surface defect is likely to exist.
According to the invention described in the aforementioned item (9), the sub-pixel images corresponding to the respective illumination devices can be accurately aligned, and further, the defect can be determined with high accuracy.
According to the invention described in the aforementioned item (10), a concave defect such as a scratch or a convex defect on the surface of the object can be discriminated.
According to the invention as recited in the aforementioned item (11), it is possible to discriminate the concave defect and the convex defect on the surface of the object to be inspected.
According to the invention as recited in the aforementioned item (12), dust or dirt on the surface of the object to be inspected can be discriminated.
According to the invention described in the aforementioned item (13), since the pixels whose total light receiving amount exceeds the predetermined threshold are detected as the defect candidate pixels, the sub-pixel images are aligned with the detected defect candidate pixels, and the surface defect of the object to be inspected is discriminated, the defect discriminating process can be performed with the pixels being concentrated on the region where the possibility of the surface defect is high.
According to the invention as recited in the aforementioned item (14), since the LED or the visible light semiconductor laser can be used as the light source of the illumination device, the switching of each illumination device can be performed at high speed.
According to the invention described in the aforementioned item (15), since three or more illumination devices are arranged on a circumference centered on the line sensor with an angular difference of 360 degrees divided by the number of devices, an illumination device whose illumination light does not strike a linear scratch or the like at a right angle can always be secured, and surface defects such as scratches can be discriminated with high accuracy.
According to the invention described in the aforementioned item (16), a surface defect of the object to be inspected is accurately discriminated while the object is moved relative to the illumination devices and the line sensor arranged at different positions.
According to the inventions described in the aforementioned items (17) to (29), a computer can execute the surface defect discriminating process for the object to be inspected while the object is moved relative to the illumination devices and the line sensor arranged at different positions.
Drawings
Fig. 1 is a configuration diagram of an appearance inspection apparatus according to an embodiment of the present invention.
Fig. 2 (A) and (B) are diagrams for explaining the arrangement relationship of a plurality of lighting devices.
Fig. 3 is a diagram for explaining a relative positional relationship between an imaging range and a pixel when imaging is performed while switching a plurality of illumination devices.
Fig. 4 is a diagram for explaining a relative positional relationship between the imaging range and the pixel when the first to fourth imaging is performed by one illumination device.
Fig. 5 is a diagram for explaining a method of estimating the light receiving amount of the sub-pixel 23.
Fig. 6 is a diagram for explaining alignment of sub-pixel images by a plurality of lighting devices.
Fig. 7 is a diagram showing an example of the sensitivity distribution of a pixel.
Fig. 8 is a diagram showing an example of the corrected estimated light-receiving amount value of each region of the pixel calculated in consideration of the weighting of the sensitivity distribution of the pixel.
Fig. 9 is a diagram for explaining a case where the distribution of the irradiation light amount of the plurality of illumination devices is different, and therefore the light receiving amount of the reflected light is also different depending on the region of the pixel.
Fig. 10 is a diagram for explaining a method of discriminating a void defect.
Fig. 11 is a diagram for explaining a method of discriminating a convex defect.
Fig. 12 is a diagram for explaining a method of discriminating a scratch defect.
Fig. 13 is a diagram for explaining a method of discriminating dust and dirt.
Fig. 14 is a diagram schematically showing an image determined as a void defect by combining a plurality of sub-pixel images.
Fig. 15 is a diagram schematically showing an image determined as a convex defect by combining a plurality of sub-pixel images.
Fig. 16 is a diagram schematically showing an image determined as a scratch defect by combining a plurality of sub-pixel images.
Fig. 17 is a diagram schematically showing an image that is discriminated as dust or dirt by combining a plurality of sub-pixel images.
Detailed Description
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
[ Structure of appearance inspection apparatus ]
Fig. 1 is a configuration diagram of an appearance inspection apparatus according to an embodiment of the present invention. As shown in fig. 1, the appearance inspection apparatus includes a line sensor 1, two illumination devices 2a and 2b, an illumination control unit 8 for controlling the illumination devices 2a and 2b, a line sensor control unit 9 for controlling the line sensor 1, transport drums 3 and 3 for transporting an inspection object 5, a drum encoder 4 for detecting an imaging position of the inspection object 5, a display device 6, a computer 10, a transport drum control unit 11 for controlling the rotation speed of the transport drum 3 to control the transport speed of the inspection object 5, and the like.
The computer 10 processes the image captured by the line sensor 1 to perform defect determination, and synchronously controls the lighting devices 2a, 2b and the line sensor 1. The display device 6 displays the image, the processing result, and the like after the defect determination processing by the computer 10.
The object to be inspected 5 is a belt-like member whose surface has a high reflectance; it is supported on the transport drums 3, 3 and is transported in the Y direction by their rotation in the arrow direction. The imaging position of the object 5 imaged by the line sensor 1 is detected by the drum encoder 4.
The line sensor 1 is disposed so as to extend in the X direction orthogonal to the moving direction Y of the object 5. As shown in fig. 2 (A), the two illumination devices 2a and 2b are disposed at opposed positions centered on the line sensor 1 with an angular difference of 180 degrees as viewed from above, so that the object can be illuminated from two different directions. The opposing direction of the illumination devices 2a and 2b may be the X direction, the Y direction, or any other direction. In the present embodiment two illumination devices 2a and 2b are used, but three or more may be used. In the case of three or more illumination devices, they are preferably arranged, as viewed from above, on a circumference centered on the line sensor 1 with an angular difference of 360 degrees divided by the number of devices, as shown in fig. 2 (B); this secures, for a linear surface defect such as a scratch, an illumination device whose light does not strike the scratch at a right angle, so that such defects can be discriminated accurately. In fig. 2 (B), three illumination devices 2a, 2b, and 2c are arranged with an angular difference of 120 degrees from one another.
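As a small illustration of this placement rule (not part of the patent; the function name is invented), the angular positions of N illumination devices on a circle centered on the line sensor can be computed as follows, in Python:

    def illumination_angles(num_devices: int) -> list[float]:
        """Angular positions, in degrees, of illumination devices placed
        on a circle centered on the line sensor, 360/N degrees apart."""
        step = 360.0 / num_devices
        return [k * step for k in range(num_devices)]

    print(illumination_angles(2))  # [0.0, 180.0]: opposed pair, fig. 2 (A)
    print(illumination_angles(3))  # [0.0, 120.0, 240.0]: fig. 2 (B)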
The lighting devices 2a and 2b can be switched at any timing by control of the lighting control unit 8.
When the imaging range of one line of the line sensor 1 is small relative to the object 5, the entire object 5 may be imaged by repeating the following steps: after the object 5 is imaged once along the Y direction, the sensor illumination transport unit 12 moves the illumination devices 2a and 2b together with the line sensor 1 in the X direction by the length of the line sensor 1, and the object is imaged once again along the Y direction.
The line sensor 1 receives the reflected light from the object 5 while the object 5 is moved in the Y direction and the illumination devices 2a and 2b are switched on and off. The line sensor 1 is not at a position directly facing the illumination devices 2a and 2b; it receives the illumination light from the illumination devices 2a and 2b after that light has been diffusely reflected by the object 5. The image captured by the line sensor 1 is therefore a dark-field image.
Since the surface of the object 5 to be inspected has a high reflectance, when a concave defect, a convex defect, a scratch-like defect, dust, dirt, or the like exists at the illumination position, the reflected light diffusely reflected by the defect, dust, dirt, or the like enters the line sensor 1.
The line sensor control unit 9 and the illumination control unit 8 are connected to the computer 10, and imaging by the line sensor 1 is synchronized with the light emission of the illumination device 2a and with that of the illumination device 2b.
In the present embodiment, the line rate of the line sensor 1 is set to 100 kHz (shutter speed 0.01 ms) in accordance with the specifications of a common line sensor. Accordingly, as the illumination devices 2a and 2b, LED light sources, LD (visible light semiconductor laser) light sources, or the like that can be switched alternately at high speed every 0.01 ms are preferably used.
[ surface Defect discrimination treatment ]
Next, the surface defect determination processing of the test object 5 by the computer 10 will be described. The computer 10 includes a CPU, a RAM, a storage device, and the like, and the surface defect determination process is executed by the CPU operating according to an operation program stored in the storage device or the like.
As described above, the line sensor 1 captures images while the object 5 is moved and the plurality of illumination devices (two in the present embodiment: 2a and 2b) are alternately switched on and off. The computer 10 sequentially acquires the images captured by the line sensor 1.
< creation of a sub-pixel image >
Fig. 3 is a diagram for explaining a relative positional relationship between the imaging range and the pixel 20 when imaging while switching the illumination devices 2a and 2 b.
As shown in fig. 3, the size of the defect 30 to be detected is 12A, the resolution of the line sensor 1 (the length of one pixel 20) is 6A, and the illumination devices 2a and 2b are switched and an image is captured every time the object 5 is conveyed by a distance A. The resolution 6A of the line sensor 1 corresponds to the area imaged by one pixel in one shot. As shown in fig. 3, the first image is captured under the illumination light of the illumination device 2a; when the object 5 has been conveyed by A, the illumination is switched and the second image is captured under the illumination light of the illumination device 2b. Between the first and second shots, the imaging area on the object 5 moves by A. The same applies to the third and subsequent shots. In the example of fig. 3, for convenience of explanation, the pixel 20 is drawn as moving by A at every switching.
Fig. 4 is a diagram for explaining a relative positional relationship between the pixel 20 and the imaging range when the first to fourth imaging is performed by one illumination device 2 a.
As shown in fig. 4, focusing on the illumination device 2a, every time the object 5 moves by 2A, the illumination device 2a is turned on to irradiate the object 5 with illumination light, and an image is captured by the line sensor 1. In other words, an image corresponding to the illumination light of the illumination device 2a is captured every time the object 5 moves by 2A. The irradiation time of the illumination light of the illumination device 2a, in other words the light receiving time of each pixel 20 of the line sensor 1, corresponds to the movement distance A. The same applies to the illumination device 2b.
As shown in fig. 4, in the imaging by the illumination device 2a, a 4A-wide part of the 6A sensor resolution images the same range of the object 5 in the current shot and in the previous shot; that is, the imaging ranges overlap. In other words, when the pixel 20 is divided lengthwise into three regions, a first region 21, a second region 22, and a third region 23, each 2A long, the second region 22 and third region 23 of the previous shot and the first region 21 and second region 22 of the current shot are overlapping regions covering the same imaging range. For example, the first region 21 and second region 22 of the fourth shot overlap the second region 22 and third region 23 of the third shot, respectively. In fig. 4, the region of the current shot that overlaps the previous shot is shown in gray.
The third region 23 of the current shot is the part that does not overlap the imaging range of the previous shot; it is the newly added imaging range and is treated as a sub-pixel. Hereinafter, the third region 23 is also referred to as the sub-pixel 23.
Fig. 5 is a diagram for explaining a method of estimating the amount of light received by the sub-pixel 23, and shows the relative positional relationship between the pixel 20 and the imaging range when the lighting device 2a performs the i-th imaging and the imaging a plurality of times before and after the i-th imaging.
As shown in fig. 5, in the i-th shot, the light receiving amount of the sub-pixel 23 is estimated by subtracting, from the light receiving amount of the entire pixel 20 (the 6A-wide area) in the i-th shot, the light receiving amounts of the first region 21 and the second region 22, which are the regions overlapping the previous shots.
Comparing the (i-2)-th shot with the (i-3)-th shot, a new 2A-wide sub-pixel 23 is exposed, and a new sub-pixel 23 is exposed again at each of the (i-1)-th and i-th shots. Each newly exposed sub-pixel 23 becomes part of the overlapping region in the next shot, remains part of the overlapping region in the shot after that, and moves out of the pixel in the following shot. In other words, the overlapping region between the current shot and the previous shots consists of the sub-pixels 23 of the previous shot and of the shot before that. Therefore, in the i-th shot, the light receiving amount of the sub-pixel 23 is the value obtained by subtracting, from the total light receiving amount of the 6A pixel in the i-th shot, the sum of the sub-pixel estimates of the (i-1)-th and (i-2)-th shots:

(estimated light receiving amount of the i-th sub-pixel) = (i-th total light receiving amount) - {(estimated light receiving amount of the (i-1)-th sub-pixel) + (estimated light receiving amount of the (i-2)-th sub-pixel)}
The numerical values written in the first to third regions 21 to 23 of each pixel 20 in fig. 5 are examples of the estimated light receiving amounts of those regions, and coincide with the estimates of the sub-pixels 23 of the preceding shots. The value to the right of each pixel 20 is the total light receiving amount of the pixel. In the example of fig. 5, the total light receiving amount of the 6A pixel in the i-th shot is 3.8, the sub-pixel estimate of the (i-1)-th shot is 1.3, and that of the (i-2)-th shot is 0.5, so the sub-pixel estimate of the i-th shot is 3.8 - (1.3 + 0.5) = 2.0.
The estimation of the light receiving amount of the sub-pixel 23 may be performed only for pixels 20 detected as defect candidate pixels, that is, pixels that are highly likely to contain a defect. Based on the information of the drum encoder 4, the estimated position i of a detected defect candidate pixel is obtained, and the light receiving amount of the sub-pixel 23 at that time is stored in association with the position information to create a sub-pixel image. This makes it possible to concentrate the defect discriminating process on portions where a surface defect is highly likely to exist, improving efficiency. As the defect candidate pixels, pixels 20 whose total light receiving amount exceeds a predetermined threshold may be detected.
For pixels 20 whose total light receiving amount does not exceed the predetermined threshold, the probability of a defect is low, so the light receiving amount of the sub-pixel 23 may simply be estimated as the average per 2A region, that is, 1/3 of the light receiving amount of the entire pixel (for example, the (i-4)-th and (i-3)-th shots in fig. 5).
In the first shot after the start of inspection, there is no previously estimated sub-pixel light receiving amount. Therefore, the average value obtained by dividing the light receiving amount of the entire pixel by the number of sub-pixels 23 per pixel is taken as the estimated light receiving amount of the first sub-pixel 23, and the light receiving amounts of the subsequent sub-pixels are estimated from it.
In this way, for the periphery of a defect candidate pixel, a sub-pixel image composed of 1/3-pixel (2A-wide) light receiving amounts is created instead of the image of whole pixels 20. This effectively triples the resolution of the line sensor 1 and enables minute surface defects to be detected and discriminated with high accuracy. When an object 5 that moves relative to the line sensor 1 and the illumination devices 2a and 2b is imaged by the line sensor 1, the distance between the line sensor 1 and the imaging surface of the object 5 is unstable, so the depth of field must be set deep; however, there is a trade-off in that a deeper depth of field lowers the resolution, so minute defects may escape detection. By using sub-pixel images smaller than one pixel, the resolution is improved without deepening the depth of field, and minute defects can be detected.
The above description concerns the illumination device 2a, but a 1/3-pixel sub-pixel image can be created in the same way for the illumination device 2b.
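In outline, the estimation procedure for one illumination device can be sketched as follows, in Python (a minimal illustration, not the patent's implementation: the names are invented, and it assumes three 2A regions per pixel, the threshold fallback described above, and average-based initialization for the first shots, for which no previous estimates exist):

    def estimate_subpixels(totals, threshold, regions_per_pixel=3):
        """Estimate the light receiving amount of the newly exposed
        sub-pixel for each shot of one illumination device.

        totals[i] is the light receiving amount of the entire pixel (6A)
        in the i-th shot."""
        estimates = []
        for i, total in enumerate(totals):
            if i < 2 or total <= threshold:
                # First shots (no previous estimates yet), or a low total
                # that makes a defect unlikely: use the per-region average.
                estimates.append(total / regions_per_pixel)
            else:
                # The overlap is the sum of the two most recent sub-pixel
                # estimates: s_i = T_i - (s_{i-1} + s_{i-2}).
                estimates.append(total - estimates[i - 1] - estimates[i - 2])
        return estimates

    # Fig. 5 example: a total of 3.8 with previous estimates 1.3 and 0.5
    # gives 3.8 - (1.3 + 0.5) = 2.0 for the current sub-pixel.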
< alignment of sub-pixel images for illumination devices 2a, 2b >
As shown in fig. 6, let a1, a2, a3 ... be the positions of the sub-pixel images of one pixel 20 corresponding to the illumination device 2a, and b1, b2, b3 ... be those corresponding to the illumination device 2b. Since the object 5 moves relative to the line sensor 1 and the illumination devices 2a and 2b during imaging, the sub-pixel images of the illumination device 2a and those of the illumination device 2b are alternately shifted in position, as a1, b1, a2, b2, a3, b3 ..., by the movement distance A corresponding to the switching interval of the illumination light.
This positional deviation is therefore corrected. Specifically, when b2' denotes the position of the sub-pixel of the illumination device 2b corresponding to the position a2 of the illumination device 2a, the light receiving amount (luminance value) at the position b2' is taken as (light receiving amount at b1 + light receiving amount at b2)/2; correcting the luminance values in this way corrects the positional deviation. The same applies to the positions b3', b4' ... of the sub-pixels of the illumination device 2b corresponding to the positions a3, a4 ... of the illumination device 2a.
Conversely, the sub-pixel images of the illumination device 2a may instead be aligned to the positions b1, b2, b3 ... of the illumination device 2b.
The above correction applies to the case of two illumination devices; the correction for two or more illumination devices can be expressed by the following formula.
[Formula 3]

K'_i^j = ((j - 1) * K_(i-1)^j + (N - j + 1) * K_i^j) / N

where i is the index of the sub-pixel estimated position, j is the identification number of the lit illumination device (j = 1, ..., N), and N is the number of illumination devices.
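Under the interpolation formula reconstructed above, the correction can be sketched as follows (the function name is invented, and the general formula is an inference from the two-device example, in which the value at b2' is (b1 + b2)/2):

    def align_subpixels(values, j, num_devices):
        """Correct the sub-pixel luminance values of illumination device j
        (1-based) onto the positions of device 1 by linear interpolation:
        K'_i = ((j - 1) * K_{i-1} + (N - j + 1) * K_i) / N."""
        n = num_devices
        corrected = []
        for i, k in enumerate(values):
            prev = values[i - 1] if i > 0 else k  # no earlier sample at i = 0
            corrected.append(((j - 1) * prev + (n - j + 1) * k) / n)
        return corrected

    # Two devices: device 1 (j = 1) is unchanged; device 2 (j = 2) becomes
    # the average of adjacent samples, matching b2' = (b1 + b2) / 2.
    print(align_subpixels([1.0, 3.0, 5.0], j=2, num_devices=2))  # [1.0, 2.0, 4.0]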
< correction of the estimated light receiving amount of the sub-pixel >
The light receiving amount of the sub-pixel 23 has so far been estimated on the assumption that all regions of the 6A-wide pixel have the same light receiving sensitivity. In reality, however, as shown in the sensitivity distribution of fig. 7, the light receiving sensitivity differs from part to part of the pixel 20: it is relatively high in the central part and low at both end parts. In fig. 7, the hatched portion indicates high sensitivity; even between the two end portions, the sensitivity of the third region is higher than that of the first region.
As described with reference to fig. 5, if the current shot is the i-th, the sub-pixel of the previous, (i-1)-th, shot was the third region 23 at the right end of the pixel, where the light receiving sensitivity is low. In the i-th shot, the same 2A area of the object overlaps the central second region 22, where the light receiving sensitivity is high. Therefore, the light receiving amount of the central second region 22 in the i-th shot should be larger than the light receiving amount estimated for the sub-pixel 23 in the (i-1)-th shot.
Likewise, the light receiving amount of the first region 21 in the i-th shot is taken as the light receiving amount of the sub-pixel 23 in the (i-2)-th shot, but in reality it should be smaller than that estimate.
Therefore, in order to correct the light receiving amount of each of the regions 21 to 23 of the pixel 20, a weighting coefficient is set for each region. Specifically, with ε1 the weighting coefficient of the first region 21 at the left end of the pixel 20, ε2 that of the central second region 22, and ε3 that of the third region 23 at the right end, the light receiving amount of the sub-pixel 23 in the i-th shot is calculated by the following equation.

(estimated light receiving amount of the i-th sub-pixel) = (i-th total light receiving amount) - {(estimated light receiving amount of the (i-1)-th sub-pixel) * ε2/ε3 + (estimated light receiving amount of the (i-2)-th sub-pixel) * ε1/ε3}
As specific examples of ε1, ε2, and ε3, in the present embodiment ε1 = 1/3, ε2 = 1, and ε3 = 2/3. Fig. 8 shows an example of the corrected estimated light receiving amounts of the regions 21 to 23 calculated with this weighting taken into account.
In the example of fig. 8, when the light receiving amount of the sub-pixel 23 in the (i-2)-th shot is 0.3, the corrected light receiving amount increases to 0.5 in the second region 22 of the (i-1)-th shot and decreases to 0.2 in the first region 21 of the i-th shot. When the light receiving amount of the sub-pixel 23 in the (i-1)-th shot is 0.9, the corrected light receiving amount increases to 1.3 in the second region 22 of the i-th shot.
Further, not only the light receiving sensitivity of the pixel 20 but also the distribution of the irradiation light amount of the illumination devices 2a and 2b is non-uniform, so the light receiving amount of the reflected light 40 may also differ depending on the region of the pixel 20, as shown in fig. 9. In the example of fig. 9, the reflected light amount in the first region 21 and the third region 23 at the ends of the pixel 20 is 1, while that in the central second region 22 is twice as large. The weighting coefficients ε1 to ε3 may therefore be set for the regions 21 to 23 so as to correct this difference as well, for example ε1 = 1/2, ε2 = 1, and ε3 = 1/2.
In this way, by subtracting the light receiving amount of the overlapping region from the light receiving amount of the entire pixel with the light receiving amount corrected for each of the regions 21 to 23, a more accurate overlapping-region light receiving amount is subtracted when estimating the light receiving amount of the current sub-pixel 23, and defect discrimination can be performed with higher accuracy.
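The weighted estimate with the above coefficients can be sketched as follows (invented names; a minimal illustration of the corrected recurrence under the weights given in the text, not the patent's implementation):

    def estimate_subpixel_weighted(total_i, sub_prev, sub_prev2,
                                   eps=(1 / 3, 1.0, 2 / 3)):
        """Weighted sub-pixel estimate for the i-th shot:
        s_i = T_i - (s_{i-1} * e2/e3 + s_{i-2} * e1/e3),
        where e1, e2, e3 weight the left, central, and right regions."""
        e1, e2, e3 = eps
        return total_i - (sub_prev * e2 / e3 + sub_prev2 * e1 / e3)

    # With e1 = 1/3, e2 = 1, e3 = 2/3, a previous estimate of 0.9 counts as
    # 0.9 / (2/3) = 1.35 in the central region and 0.3 counts as
    # 0.3 * (1/3) / (2/3) = 0.15 at the left end, consistent with the
    # corrected values of about 1.3 and 0.2 shown in fig. 8.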
< Defect discrimination >
Surface defects are discriminated based on the mutually aligned sub-pixel images.
As shown in fig. 10, in the case of a spherical concave defect 51, called a "void defect", the illumination beams from the oppositely arranged illumination devices 2a and 2b cross each other inside the dent, so the positional relationship between the illumination devices and their reflection positions is inverted. In other words, in the aligned sub-pixel image 61, when the bright points 61a and 61b corresponding to the illumination devices 2a and 2b do not overlap, lie within a predetermined range of each other, and their positions are inverted with respect to the arrangement positions of the illumination devices 2a and 2b, a void defect 51 is determined.
On the other hand, in the case of a "convex defect", as shown in fig. 11, the illumination beams from the illumination devices 2a and 2b toward the convex defect 52 do not cross, and the positional relationship between the illumination devices and their reflection positions is preserved. Therefore, in the aligned sub-pixel image 62, when the bright points 62a and 62b corresponding to the illumination devices 2a and 2b do not overlap, lie within a predetermined range of each other, and their positions are in the same positional relationship as the arrangement positions of the illumination devices 2a and 2b, a convex defect 52 is determined.
In fig. 10 and 11, the bright points 61a and 62a corresponding to the lighting device 2a are shown by double hatching, and the bright points 61b and 62b corresponding to the lighting device 2b are shown by dashed hatching. The same applies to fig. 12 and later.
As shown in fig. 12, for the planar defect 53 called a "scratch defect" among the concave defects, the illumination lights from the illumination devices 2a and 2b disposed opposite to each other intersect, and the positional relationship between the positions of the illumination devices 2a and 2b and their reflection positions is reversed. Further, since the directions of the scratched surfaces are not uniform, reflections of the illumination light from the illumination device 2a and reflections of the illumination light from the illumination device 2b are interspersed. However, since each scratched facet is a flat, highly specular surface, the two illumination lights are not reflected at the same point in a mixed state. Therefore, in the aligned sub-pixel image 63, when the bright points 63a corresponding to the illumination device 2a and the bright points 63b corresponding to the illumination device 2b do not overlap but are present in an interspersed manner, and the positions of the bright points 63a and 63b are opposite to the arrangement positions of the illumination devices 2a and 2b, it is determined that the scratch defect 53 is present on the surface of the test object 5.
Since the surfaces of "dust" and "dirt" are diffuse reflection surfaces, the illumination lights of the illumination devices 2a and 2b are diffusely reflected, and the respective illumination lights are reflected in a mixed state by the defect 54, as shown in fig. 13. Therefore, in the aligned sub-pixel image 64, when the bright points 64a and 64b corresponding to the respective illumination devices 2a and 2b overlap, it is determined that dust or dirt is present on the surface of the test object 5.
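The four rules of figs. 10 to 13 can be condensed into a single decision function. The Python sketch below is an illustrative summary, not code from the patent; the following paragraphs describe how defect candidates and these bright-point patterns are actually obtained from the images.

def classify_defect(overlapping: bool, interspersed: bool, opposite: bool) -> str:
    """Condensed form of the rules of figs. 10-13 (illustrative only).

    overlapping  -- bright points of illuminations 2a and 2b coincide
    interspersed -- points from both illuminations are mixed together
                    without overlapping (scratch-like pattern)
    opposite     -- bright-point positions are reversed with respect to
                    the arrangement positions of devices 2a and 2b
    """
    if overlapping:
        return "dust or dirt"    # diffuse reflection mixes both lights
    if interspersed and opposite:
        return "scratch defect"  # scratched facets face many directions
    if opposite:
        return "void defect"     # illumination rays cross in the dimple
    return "convex defect"       # rays do not cross; same-side reflection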
Each of the sub-pixel images of the illumination devices 2a and 2b is a dark-field image, in which concave-convex defects, scratch defects, dust, dirt, and the like appear as white points. Defect candidates are detected on the images as follows.
That is, with the size of a defect candidate defined as W1 or more and W2 or less in terms of area on the image, and its luminance defined as B2 or more, each sub-pixel image is binarized at the threshold B2, and discrete pixels are clustered by dilation and erosion processing. Each resulting pixel set is then labeled, by color distinction or the like.
Only the labeled pixel sets whose area S satisfies W1 ≦ S ≦ W2 are kept, and the other pixel sets are deleted from each sub-pixel image. W1 represents not only the minimum defect size but also the minimum size of what is considered to be "a portion of a defect".
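A minimal sketch of this candidate-detection step, assuming 8-bit grayscale sub-pixel images and the OpenCV library; the 3×3 kernel and all names are illustrative choices, not prescribed by the patent.

import cv2
import numpy as np

def detect_candidates(subpixel_image: np.ndarray, B2: int, W1: int, W2: int):
    """Binarize a dark-field sub-pixel image at luminance B2, cluster
    discrete pixels by dilation/erosion, label the pixel sets, and keep
    only those whose area S satisfies W1 <= S <= W2."""
    _, binary = cv2.threshold(subpixel_image, B2, 255, cv2.THRESH_BINARY)
    # Closing (dilation followed by erosion) merges nearby white points.
    kernel = np.ones((3, 3), np.uint8)
    closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
    # Label connected pixel sets; label 0 is the background.
    n, _, stats, centroids = cv2.connectedComponentsWithStats(closed)
    return [
        (tuple(centroids[i]), int(stats[i, cv2.CC_STAT_AREA]))
        for i in range(1, n)
        if W1 <= stats[i, cv2.CC_STAT_AREA] <= W2
    ]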
Next, with the size of a defect defined as X or more in terms of the number of pixels, the coordinates Vi (i being the label of a pixel set) of each remaining pixel set in the sub-pixel image of the illumination device 2a are examined, and if a pixel set exists within the range Vi ± X/2 in the sub-pixel image of the illumination device 2b, the candidate is regarded as a defect and is classified into one of the four types, namely "void defect", "scratch defect", "convex defect", and "dust or dirt", according to the defect discrimination method described above.
Specifically, as shown in fig. 14, when the aligned sub-pixel image SPa corresponding to the illumination device 2a and the aligned sub-pixel image SPb corresponding to the illumination device 2b are superimposed as shown in the right-hand drawing, the bright points 61a and 61b corresponding to the illumination devices 2a and 2b do not overlap, the bright points 61a and 61b lie within the range of the coordinates Vi ± X/2, and the positions of the bright points 61a and 61b are opposite to the arrangement positions of the illumination devices 2a and 2b; it is therefore determined that a void defect is present.
As shown in fig. 15, when the aligned sub-pixel image SPa corresponding to the illumination device 2a and the aligned sub-pixel image SPb corresponding to the illumination device 2b are superimposed as shown in the right-hand drawing, the bright points 62a and 62b corresponding to the illumination devices 2a and 2b do not overlap, the bright points 62a and 62b lie within the range of the coordinates Vi ± X/2, and the positions of the bright points 62a and 62b are in the same positional relationship as the arrangement positions of the illumination devices 2a and 2b; it is therefore determined that a convex defect is present.
As shown in fig. 16, when the aligned sub-pixel image SPa corresponding to the illumination device 2a and the aligned sub-pixel image SPb corresponding to the illumination device 2b are superimposed as shown in the right-hand drawing, the bright points 63a and 63b corresponding to the illumination devices 2a and 2b lie within the range of the coordinates Vi ± X/2, are present in an interspersed manner without overlapping, and the positions of the bright points 63a and 63b are opposite to the arrangement positions of the illumination devices 2a and 2b; it is therefore determined that a scratch defect is present.
As shown in fig. 17, when the aligned sub-pixel image SPa corresponding to the illumination device 2a and the aligned sub-pixel image SPb corresponding to the illumination device 2b are superimposed as shown in the right-hand drawing, the bright points 64a and 64b corresponding to the illumination devices 2a and 2b overlap each other; it is therefore determined that dust or dirt is present.
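Continuing the illustrative sketches above, the coordinate matching and classification of figs. 14 to 17 could look as follows. Treating the devices as facing each other along the x axis, deriving "opposite" from the x-order of the bright points, and the one-pixel overlap tolerance are all simplifying assumptions made for this example.

def match_and_classify(cands_2a, cands_2b, X: int):
    """Pair pixel sets from the images of devices 2a and 2b whose
    centroids lie within +/- X/2 of each other, then classify them
    with classify_defect() from the sketch above."""
    results = []
    for (xa, ya), _ in cands_2a:
        for (xb, yb), _ in cands_2b:
            if abs(xa - xb) <= X / 2 and abs(ya - yb) <= X / 2:
                overlapping = abs(xa - xb) <= 1 and abs(ya - yb) <= 1
                # Assumed geometry: devices face each other along x, so a
                # reversed order of the bright points reads as 'opposite'.
                opposite = xb > xa
                interspersed = False  # would need the full point pattern
                results.append(
                    ((xa, ya), classify_defect(overlapping, interspersed, opposite))
                )
    return results

In practice, the "interspersed" flag used for scratch discrimination would be derived from the full pattern of bright points within the Vi ± X/2 window rather than from a single centroid pair.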
As described above, surface defects of the moving test object 5 can be detected and discriminated.
The detection result is displayed on the display device 6. Preferably, the display can show the determined type of each defect and the range of the coordinates Vi ± X/2 of each defect together with the superimposed images of the two sub-pixel images SPa and SPb shown on the right side of figs. 14 to 17.
Although one embodiment of the present invention has been described above, the present invention is not limited to this embodiment. For example, although imaging was performed with the line sensor 1 and the illumination devices 2a and 2b fixed and the test object 5 moved, imaging may instead be performed with the test object 5 fixed and the line sensor 1 and the illumination devices 2a and 2b moved; it suffices that at least one of the test object 5, the line sensor 1, and the illumination devices 2a and 2b moves relative to the others.
In addition, although the case was shown in which the relative movement distance of the test object 5 per imaging is A and the length of the sub-pixel 23 is 2A, it suffices that, for one illumination device, a sub-pixel is formed by the overlap of the imaging ranges of the current imaging and the previous imaging. The relative movement distance of the test object 5 per imaging is therefore preferably not more than 1/2 of one pixel.
Further, although two illumination devices 2a and 2b were used, three or more illumination devices may be lit in sequence by switching as described above, and defects may be detected and discriminated by comparing three or more kinds of sub-pixel images corresponding to the respective illumination devices. Since the directions of the irradiated light then become more varied, this is preferable in that surface defects can be detected and discriminated with still higher accuracy.
The present invention can be used for discriminating surface defects of an object to be inspected such as a product or a member having a surface with a strong specular reflection property.
Description of the reference numerals
1 … line sensor; 2a, 2b … lighting device; 4 … drum encoder; 5 … test object; 6 … display device; 8 … lighting control part; 9 … line sensor control part; 10 … computer; 11 … drum conveyance control unit; 20 … pixels; 21 … first area; 22 … second area; 23 … subpixels (third area); 30 … defect; 51 … concave defects (void defects); 52 … convex defect; 53 … scratch defects; 54 … dust or dirt; 61a to 64a … bright spots of the lighting device 2a; 61b to 64b … bright spots of the lighting device 2b.

Claims (29)

1. A surface defect discriminating device includes:
an image acquisition unit that receives and captures images of light reflected from the object via the line sensor every time the illumination light from each of the illumination devices is switched when the object is irradiated with the illumination light from each of the illumination devices while relatively moving the object with respect to the illumination devices and the line sensor arranged at different positions, and that acquires a plurality of images in a state in which the positions are shifted by the switching amount of the illumination light;
an alignment unit that aligns the images corresponding to the respective illumination devices acquired by the image acquisition unit; and
and a determination unit for determining the surface defect of the object to be inspected according to the image aligned by the alignment unit.
2. The surface defect discriminating apparatus as defined in claim 1,
a part of each pixel of the line sensor is an overlapping area where imaging ranges overlap in the current imaging and the previous imaging due to the irradiation of the object with the illumination light of one of the illumination devices,
the surface defect discriminating device includes a sub-pixel image creating unit that, when the portion of one pixel excluding the overlapping region is taken as a sub-pixel, estimates the light receiving amount of the current sub-pixel by subtracting the light receiving amount of the overlapping region from the light receiving amount of the entire pixel in the current imaging, and creates a sub-pixel image,
the alignment unit aligns the subpixel images corresponding to the respective illumination devices created by the subpixel image creating unit.
3. A surface defect discriminating device includes:
an image acquisition unit which receives and captures images of light reflected from the object through the line sensor every time the illumination light from each of the illumination devices is switched when the object is irradiated with the illumination light from each of the illumination devices while the object is moved relative to the illumination devices and the line sensor arranged at different positions, and acquires a plurality of images for each of the illumination lights,
a part of each pixel of the line sensor is an overlapping area where imaging ranges overlap in the current imaging and the previous imaging due to the irradiation of the object with the illumination light of one of the illumination devices,
the surface defect discriminating device further includes:
a sub-pixel image creating unit that, when a portion of one pixel other than the overlap region is a sub-pixel, subtracts the light receiving amount of the overlap region from the light receiving amount of the entire pixel in the current image, thereby estimating the light receiving amount of the sub-pixel and creating a sub-pixel image; and
and a discrimination unit for discriminating the surface defect of the object based on the sub-pixel image created by the sub-pixel image creation unit.
4. The surface defect discriminating apparatus as defined in claim 3,
the surface defect discriminating device includes an alignment unit that aligns, for each illumination device, the sub-pixel images created by the sub-pixel image creating unit.
5. The surface defect discriminating apparatus as defined in any one of claims 2 to 4,
the sub-pixel image creating means subtracts the light receiving amount of the overlapping region from the light receiving amount of the entire pixel in a state where the light receiving amount of the overlapping region is corrected for each region.
6. The surface defect discriminating apparatus as defined in any one of claims 2 to 5,
the sub-pixel image creating means calculates the light receiving amount of the overlapping region from the sum of the light receiving amounts of the sub-pixels estimated previously, and estimates the light receiving amount of the sub-pixel of this time by subtracting the calculated light receiving amount from the light receiving amount of the entire pixel.
7. The surface defect discriminating apparatus as defined in claim 6,
the sub-pixel image creating means estimates the light receiving amount of the first sub-pixel as an average value obtained by dividing the light receiving amount of the entire first pixel after the start of image capturing by the number of sub-pixels per pixel.
8. The surface defect discriminating apparatus as defined in any one of claims 2 to 7,
the sub-pixel image creating means estimates the amount of light received by the sub-pixel of this time by dividing the amount of light received by the entire pixel by the number of sub-pixels per pixel when the amount of light received by the entire pixel does not exceed a predetermined threshold, and estimates the amount of light received by the sub-pixel of this time by subtracting the amount of light received by the overlapping region from the amount of light received by the entire pixel when the amount of light received by the entire pixel exceeds the predetermined threshold.
9. The surface defect discriminating apparatus as defined in any one of claims 2 and 4 to 8, wherein,
the alignment unit corrects the luminance value K_i^j to a correction value K'_i^j according to the following formula, thereby performing alignment of the sub-pixel images corresponding to the respective illumination devices created by the sub-pixel image creating unit,
[Formula 4] (the correction formula is shown as an image in the original publication)
where i is an index of the sub-pixel estimated position, and j is the identification number of the lit illumination device.
10. The surface defect discriminating apparatus as defined in any one of claims 1, 2, 4 to 9,
in the sub-pixel image aligned by the aligning unit, the determining unit determines that a concave defect or a convex defect exists on the surface of the object to be inspected when the bright points corresponding to the illuminating devices do not overlap and the bright points are within a predetermined range.
11. The surface defect discriminating apparatus as defined in claim 10,
in the sub-pixel image aligned by the aligning unit, the determining unit determines that a concave defect exists when the position of the bright point corresponding to each of the illuminating devices is opposite to the arrangement position of the illuminating devices, and determines that a convex defect exists when the position of the bright point corresponding to each of the illuminating devices is not opposite to the arrangement position of the illuminating devices.
12. The surface defect discriminating device as defined in any one of claims 1, 2, 4 to 11,
the determination means determines that dust or dirt is present on the surface of the object to be inspected when the bright spots corresponding to the respective illumination devices overlap in the sub-pixel images aligned by the alignment means.
13. The surface defect discriminating device as defined in any one of claims 1, 2, 4 to 12,
pixels having an entire light receiving amount exceeding a predetermined threshold are detected as defect candidate pixels, and with respect to the detected defect candidate pixels, the sub-pixel images are aligned by the aligning means, and the surface defect of the object is determined by the determining means.
14. The surface defect discriminating device as defined in any one of claims 1 to 13,
as a light source of the above-described illumination device, an LED or a visible light semiconductor laser is used.
15. The surface defect discriminating device as defined in any one of claims 1 to 14,
the lighting devices are arranged on a circumference centered on the line sensor, and are arranged with an angular difference of 360 degrees divided by the number of the lighting devices.
16. An appearance inspection device is provided with:
a plurality of lighting devices disposed at different positions;
a line sensor capable of receiving reflected light of illumination light irradiated from each illumination device to the object to be inspected;
a moving unit that moves the object to be inspected relative to the illuminating device and the line sensor;
an illumination control unit for switching the illumination light from each illumination device to irradiate the object to be inspected at a predetermined cycle;
a line sensor control unit that controls the line sensor so as to receive reflected light from the test object and take an image of the reflected light each time the illumination light from each illumination device is switched while relatively moving the test object with respect to the illumination device and the line sensor by the moving unit; and
a surface defect discriminating device as defined in any one of claims 1 to 15.
17. A program for causing a computer to execute the steps of:
an image acquisition step of receiving and capturing images of the object by the line sensor every time the illumination light from each of the illumination devices is switched when the object is irradiated with the illumination light from each of the illumination devices while relatively moving the object to be inspected with respect to the illumination devices and the line sensor arranged at different positions, and acquiring a plurality of images in a state where the positions are shifted by the switching amount of the illumination light;
an alignment step of aligning the images corresponding to the respective illumination apparatuses acquired in the image acquisition step; and
and a determination step of determining the surface defect of the object to be inspected based on the image aligned by the alignment step.
18. The program according to claim 17, wherein,
a part of each pixel of the line sensor is an overlapping area where imaging ranges overlap in the current imaging and the previous imaging due to the irradiation of the object with the illumination light of one of the illumination devices,
causing the computer to perform the following sub-pixel image creation steps: when a portion of one pixel other than the overlapping region is a sub-pixel, the amount of light received by the sub-pixel at this time is estimated by subtracting the amount of light received by the overlapping region from the amount of light received by the entire pixel at this time, and a sub-pixel image is created,
causing the computer to execute: in the aligning step, the subpixel images corresponding to the respective lighting devices created in the subpixel image creating step are aligned.
19. The program according to claim 17, wherein,
causing the computer to execute: in the sub-pixel image creating step, the light receiving amount of the overlapping area is subtracted from the light receiving amount of the entire pixel in a state where the light receiving amount of the overlapping area is corrected for each area.
20. The program according to claim 18 or 19, wherein,
causing the computer to execute: in the sub-pixel image creating step, the light receiving amount of the overlapping area is obtained from the sum of the light receiving amounts of the sub-pixels estimated in the previous time, and the light receiving amount of the current sub-pixel is estimated by subtracting the obtained light receiving amount from the light receiving amount of the entire pixel.
21. The program according to claim 20, wherein,
causing the computer to execute: in the sub-pixel image creating step, the average value obtained by dividing the light receiving amount of the entire first pixel after the start of image capturing by the number of sub-pixels per pixel is estimated as the light receiving amount of the first sub-pixel.
22. The program according to any one of claims 18 to 21, wherein,
causing the computer to execute: in the subpixel image creating step, when the light receiving amount of the entire pixel does not exceed the predetermined threshold, the light receiving amount of the current subpixel is estimated by dividing the light receiving amount of the entire pixel by the number of subpixels per pixel, and when the light receiving amount of the entire pixel exceeds the predetermined threshold, the light receiving amount of the current subpixel is estimated by subtracting the light receiving amount of the overlap region from the light receiving amount of the entire pixel.
23. The program according to any one of claims 18 to 22, wherein,
causing the computer to execute: in the alignment step, the luminance value K_i^j is corrected to a correction value K'_i^j according to the following formula, thereby performing alignment of the sub-pixel images corresponding to the respective illumination devices created in the sub-pixel image creating step,
[Formula 5] (the correction formula is shown as an image in the original publication)
where i is an index of the sub-pixel estimated position, and j is the identification number of the lit illumination device.
24. The program according to any one of claims 17 to 23, wherein,
causing the computer to execute: in the determination step, it is determined that a concave defect or a convex defect exists on the surface of the object to be inspected when the bright points corresponding to the respective illumination devices do not overlap and the respective bright points are within a predetermined range in the sub-pixel image aligned in the alignment step.
25. The program according to claim 24, wherein,
causing the computer to execute: in the determination step, in the sub-pixel image aligned in the alignment step, it is determined that a concave defect exists when the positions of the bright points corresponding to the respective illumination devices are opposite to the arrangement positions of the illumination devices, and it is determined that a convex defect exists when the positions of the bright points corresponding to the respective illumination devices are not opposite to the arrangement positions of the illumination devices.
26. The program according to any one of claims 17 to 25, wherein,
causing the computer to execute: in the determination step, when the bright points corresponding to the respective illumination devices overlap in the sub-pixel image aligned in the alignment step, it is determined that dust or dirt is present on the surface of the test object.
27. The program according to any one of claims 17 to 26, wherein,
causing the computer to execute: pixels whose total light receiving amount exceeds a predetermined threshold are detected as defect candidate pixels, the sub-pixel images are aligned in the alignment step with respect to the detected defect candidate pixels, and the surface defect of the object is determined in the determination step.
28. The program according to any one of claims 17 to 27, wherein,
as a light source of the above-described illumination device, an LED or a visible light semiconductor laser is used.
29. The program according to any one of claims 17 to 28, wherein,
the plurality of illumination devices are arranged on a circumference centered on the line sensor, with an angular difference of 360 degrees divided by the number of illumination devices.
CN202080063954.0A 2019-09-13 2020-08-28 Surface defect determination device, appearance inspection device, and program Active CN114364973B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2019-167576 2019-09-13
JP2019167576 2019-09-13
PCT/JP2020/032574 WO2021049326A1 (en) 2019-09-13 2020-08-28 Surface defect discerning device, appearance inspection device, and program

Publications (2)

Publication Number Publication Date
CN114364973A (en) 2022-04-15
CN114364973B CN114364973B (en) 2024-01-16

Family

ID=74866162

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080063954.0A Active CN114364973B (en) 2019-09-13 2020-08-28 Surface defect determination device, appearance inspection device, and program

Country Status (4)

Country Link
JP (1) JP7444171B2 (en)
KR (1) KR20220043219A (en)
CN (1) CN114364973B (en)
WO (1) WO2021049326A1 (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000162146A (en) * 1998-11-24 2000-06-16 Nippon Electro Sensari Device Kk Surface inspecting device
JP2006194828A (en) * 2005-01-17 2006-07-27 Mega Trade:Kk Inspection device
DE102005031490A1 (en) * 2005-07-04 2007-02-01 Massen Machine Vision Systems Gmbh Cost-effective multi-sensor surface inspection
US20100245813A1 (en) * 2009-03-24 2010-09-30 Orbotech Ltd. Multi-modal imaging
CN103575737A (en) * 2012-07-18 2014-02-12 欧姆龙株式会社 Defect detection method and device
CN105358966A (en) * 2013-05-23 2016-02-24 材料开发中心股份公司 Method for the surface inspection of long products and apparatus suitable for carrying out such a method
CN106796179A (en) * 2014-09-05 2017-05-31 株式会社斯库林集团 Check device and inspection method
CN107735674A (en) * 2015-06-25 2018-02-23 杰富意钢铁株式会社 The manufacture method of surface defect detection apparatus, detection method of surface flaw and steel
WO2019103153A1 (en) * 2017-11-27 2019-05-31 日本製鉄株式会社 Shape inspecting device and shape inspecting method
JP2019135461A (en) * 2018-02-05 2019-08-15 株式会社Screenホールディングス Image acquisition device, image acquisition method, and inspection device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11118450A (en) 1997-10-14 1999-04-30 Mitsubishi Heavy Ind Ltd Device for detecting protrusion defect of liquid crystal substrate
JP6470506B2 (en) 2014-06-09 2019-02-13 株式会社キーエンス Inspection device
JP6682809B2 (en) 2015-11-09 2020-04-15 大日本印刷株式会社 Inspection system and inspection method

Also Published As

Publication number Publication date
CN114364973B (en) 2024-01-16
JP7444171B2 (en) 2024-03-06
WO2021049326A1 (en) 2021-03-18
KR20220043219A (en) 2022-04-05
JPWO2021049326A1 (en) 2021-03-18

Similar Documents

Publication Publication Date Title
US11238303B2 (en) Image scanning method for metallic surface and image scanning system thereof
KR101832081B1 (en) Surface defect detection method and surface defect detection device
US11727613B2 (en) Systems and methods for stitching sequential images of an object
CN107709977B (en) Surface defect detection device and surface defect detection method
EP3399302A1 (en) Egg surface inspection apparatus
US20170053394A1 (en) Inspection apparatus, inspection method, and article manufacturing method
CN110596139A (en) Screen defect detection method and system
CN112986258A (en) Surface defect detection device and method for judging surface where surface defect is located
US10955354B2 (en) Cylindrical body surface inspection device and cylindrical body surface inspection method
JP4318579B2 (en) Surface defect inspection equipment
CN1844899A (en) Wide article detection method
JP6782449B2 (en) Surface inspection method and its equipment
CN114364973B (en) Surface defect determination device, appearance inspection device, and program
JP2006177852A (en) Surface inspection device and its method
JP7267665B2 (en) WORK INSPECTION DEVICE AND WORK INSPECTION METHOD
CN114981645A (en) Surface inspection device, surface inspection method, steel product manufacturing method, steel product quality management method, and steel product manufacturing facility
WO2011101893A1 (en) Method and device for detecting flaw on surface of flexible object to be tested
JP6409606B2 (en) Scratch defect inspection device and scratch defect inspection method
JP3160838B2 (en) Surface defect inspection equipment
JP3609136B2 (en) Semiconductor device inspection method and apparatus
JP3043530B2 (en) Dot pattern inspection equipment
JP2023082350A (en) Surface defect inspection device and surface defect inspection method
JP2022128536A (en) Workpiece inspection device and workpiece inspection method
KR100529929B1 (en) Apparatus for inspecting PDP Rib and Method for the same
JP2020094877A (en) Optical evaluation device, optical evaluation method, test object conveyance method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant