WO2024070101A1 - Surface defect detection method and surface defect detection device - Google Patents
Surface defect detection method and surface defect detection device
- Publication number
- WO2024070101A1 (PCT/JP2023/024078)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- strip
- defect detection
- surface defect
- images
- Prior art date
Classifications
- G01N21/892 — Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles, characterised by the flaw, defect or object feature examined
- G01N21/8851 — Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N21/8901 — Optical details; Scanning details
- G01N21/952 — Inspecting the exterior surface of cylindrical bodies or wires
- G06T7/0004 — Industrial image inspection
- G01N2021/8854 — Grading and classifying of flaws
- G01N2021/8887 — Scan or image signal processing based on image processing techniques
- G01N2021/8918 — Metal (material examined in moving-material flaw investigation)
- G01N2201/127 — Calibration; base line adjustment; drift compensation
- G06T2207/30108 — Industrial image inspection
- G06T2207/30136 — Metal
Description
- This disclosure relates to a surface defect detection method and a surface defect detection device for detecting surface defects on strips, including steel plates, etc.
- The steel materials described in this disclosure include steel products such as seamless steel pipes, welded steel pipes, hot-rolled steel sheets, cold-rolled steel sheets, thick plates, and other steel plates and shaped steel, as well as semi-finished products such as slabs that are generated during the manufacturing process of these steel products.
- The techniques described in Patent Documents 1 to 3 have been conventionally known as surface inspection methods for optically detecting surface defects in steel materials.
- Patent Document 1 discloses a surface defect detection method that uses two distinguishable light sources to irradiate illumination light from different directions onto the same inspection target portion of a steel material, obtains images based on the reflected light of each illumination light, and performs differential processing between the obtained images to detect surface defects in the inspection target portion.
- A commonly used technique for optically detecting surface defects in steel materials is to illuminate the surface of the steel material to be inspected, detect the reflected light with a camera having a two-dimensional or one-dimensional light-receiving element to capture an image, and then process the resulting image to detect defects.
- A common method for detecting defects in an image is to compare the brightness of each pixel in the image with a predetermined threshold value to determine whether each pixel is healthy or defective.
- A typical configuration is to fix the lighting and cameras to the production line and repeatedly image the steel material as it is transported through the line.
- When the steel material meanders or its width changes, the area in which the steel material appears in the images captured by the fixed cameras changes. Areas in the image that do not contain the steel material do not need to be inspected, and at the same time, it is necessary to prevent the mistaken detection of defects in such areas. Therefore, it is necessary to dynamically recognize the range of the captured image in which the steel material appears, i.e., the inspection target area, and to detect defects only within that area.
- Raw images captured by a camera have "shading": the brightness level varies with position in the image due to factors such as uneven lighting and the drop in brightness toward the periphery of the camera's field of view caused by lens characteristics.
- A process known as shading correction compensates for this.
- Shading patterns can be broadly classified into fixed patterns and dynamic patterns.
- With a fixed pattern, shading correction is always performed using the same pattern.
- The fixed pattern is determined in advance, for example experimentally.
- One method is to use the average image obtained from multiple images of defect-free steel as the shading pattern.
- With a dynamic pattern, shading correction is performed on the captured image while the shading pattern is calculated or updated sequentially from one or more images captured during the inspection.
- A basic method of shading correction using such dynamic patterns is to use a blurred image, obtained by applying moving-average or low-pass filtering to a captured one- or two-dimensional image, as the shading pattern to correct the original image.
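The blur-based dynamic method described above can be sketched as follows in Python with NumPy. The function names and the 64-pixel window are illustrative assumptions, not taken from any of the patent documents.

```python
import numpy as np

def moving_average_pattern(row: np.ndarray, window: int = 64) -> np.ndarray:
    """Shading pattern: a moving-average blur of one luminance line."""
    kernel = np.ones(window) / window
    # mode="same" keeps alignment with the input; values near the ends are
    # attenuated, one reason this method struggles near the strip edge.
    return np.convolve(row, kernel, mode="same")

def correct_row(row: np.ndarray, window: int = 64) -> np.ndarray:
    """Division-type correction: healthy parts come out close to 1."""
    pattern = moving_average_pattern(row, window)
    return row / np.maximum(pattern, 1e-6)
```

Because the blur follows the local luminance, a large defect partially lifts its own shading pattern, which is what produces the "hollowing" and "folding" artifacts discussed below.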
- Patent Document 2 discloses a shading correction method for a surface inspection device in which a threshold signal is generated based on a first shading correction signal generated from a pre-stored scanning detection signal from the previous scan, the signal value of the position of the defective part contained in the scanning signal from the current scan is replaced with the first shading correction signal to obtain a corrected scanning signal in real time, and the scanning detection signal from the current scan is normalized.
- Patent Document 3 discloses a method of detecting the image area of the object to be inspected by detecting the boundary between the object to be inspected and the area outside the object to be inspected from an image from a camera, extracting the shading components as an image of the same size as the image from the camera by applying a low-pass filter to the image from the camera, approximating the luminance change of each line in the image area of the object to be inspected using a curve, or approximating the luminance change of the entire image area of the object to be inspected using a curved surface, reconstructing the image into one of the same size with the luminance value of the approximated curve or approximated surface, and correcting the shading by dividing or subtracting the image from the camera by the reconstructed image.
- In the method of Patent Document 2, the current scan signal is corrected by a first shading correction signal based on the previous scan signal. Therefore, when the edge position of the target material in the image moves due to meandering or a change in width of the material under inspection, the area where the edge position has changed is corrected by a first shading correction signal that no longer matches it, making it difficult to perform normal shading correction near the edge.
- The present disclosure has been made in consideration of the above problems, and aims to provide a surface defect detection method and a surface defect detection device that can detect surface defects on a strip with greater accuracy.
- In particular, the aim is to provide a surface defect detection method and a surface defect detection device that suppress "folding" around a defect and "hollowing" of a defect with a large area, and that perform appropriate shading correction up to the vicinity of the edge even when the edge position in the image fluctuates due to meandering or width changes of the material being inspected.
- A surface defect detection method according to an embodiment of the present disclosure is a surface defect detection method for optically detecting surface defects on a strip, comprising: an image acquisition step of detecting reflected light from the strip obtained by illuminating the surface of the strip, and capturing images of the surface of the strip while relatively scanning it, to acquire a plurality of images including the surface of the strip; an average image calculation step of calculating an average image of the plurality of acquired images; an image correction step of performing shading correction on each of the acquired images using the average image to obtain a corrected image; and a defect detection step of detecting surface defects of the strip based on the corrected image, wherein the average image calculation step includes identifying, in each of the plurality of images, an inspection target area in which the strip exists, and allowing only pixels in the inspection target area to contribute to the average image.
- The image correction step may include dividing each of the acquired images by the average image.
- The image correction step may include subtracting the average image from each of the acquired images.
- The average image calculation step may include recognizing, in each of the plurality of images, non-steady parts within the inspection target area, and allowing only pixels of steady parts to contribute to the average image.
- The strip may comprise steel.
- A surface defect detection device according to an embodiment of the present disclosure is a surface defect detection device for optically detecting surface defects on a strip, comprising: an illumination unit that illuminates a surface of the strip; an imaging unit that detects reflected light from the strip obtained by illuminating the surface of the strip with the illumination unit; and an image processing unit that detects surface defects of the strip by processing a plurality of images including the surface of the strip, captured while relatively scanning the surface of the strip using the illumination unit and the imaging unit, wherein the image processing unit calculates an average image of the plurality of images, obtains corrected images by performing shading correction on the plurality of images using the average image, detects surface defects on the strip based on the corrected images, and, in calculating the average image, identifies in each of the plurality of images an inspection target area in which the strip exists and allows only pixels in the inspection target area to contribute to the average image.
- According to the surface defect detection method and surface defect detection device of the present disclosure, surface defects on a strip can be detected with greater accuracy. For example, "folding" around a defect and "hollowing" of a large defect can be suppressed. In addition, even if the edge position in the image fluctuates due to meandering or width changes of the material being inspected, appropriate shading correction can be performed up to the vicinity of the edge.
- FIG. 1 is a schematic diagram illustrating a configuration of a surface defect detection device according to an embodiment of the present disclosure.
- FIG. 2 is a flowchart showing image processing executed by the image processing units in FIG. 1.
- FIG. 3 shows several images of the steel material of FIG. 1.
- FIG. 4 is a diagram showing an example according to an embodiment of the present disclosure.
- FIG. 5 is a diagram showing a comparative example according to the prior art.
- FIG. 6 is a diagram showing a comparative example according to another prior art.
- FIG. 7 is a graph showing a corrected luminance waveform along line A-A in the shading-corrected image of FIG. 4(d).
- FIG. 8 is a graph showing a corrected luminance waveform along line A-A in the shading-corrected image of FIG. 5(d).
- FIG. 9 is a graph showing a corrected luminance waveform along line A-A in the shading-corrected image of FIG. 6(d).
- FIG. 1 is a schematic diagram showing the configuration of a surface defect detection device 1 according to one embodiment of the present disclosure.
- The surface defect detection device 1 optically detects surface defects on a strip.
- The "strip" includes, for example, steel materials.
- The steel materials mentioned in this disclosure include steel products such as seamless steel pipes, welded steel pipes, hot-rolled steel sheets, cold-rolled steel sheets, thick plates, and other steel plates and shaped steel, as well as semi-finished products such as slabs produced during the manufacturing process of these steel products.
- The surface defect detection device 1 detects surface defects in a strip of steel material PL transported on a transport roller table TB in the direction of arrow D in the figure.
- The surface defect detection device 1 has, as its main components, lighting devices 2a and 2b, area sensors 3a and 3b, a calculation processing device 4, and a display device 5.
- The lighting devices 2a and 2b correspond to the "illumination unit" described in the claims.
- The area sensors 3a and 3b correspond to the "imaging unit" described in the claims.
- The lighting devices 2a and 2b have any light-emitting element that illuminates the surface of the strip.
- The lighting devices 2a and 2b emit light each time the steel material PL moves a certain distance in the direction of the arrow D on the transport roller table TB, illuminating the inspection position on the surface of the steel material PL.
- The area sensors 3a and 3b have any light-receiving element that detects reflected light from the strip obtained by illuminating the surface of the strip with the lighting devices 2a and 2b.
- The area sensors 3a and 3b include, for example, a camera.
- The area sensors 3a and 3b detect reflected light from the inspection position on the steel material PL surface in synchronization with the light emission of the lighting devices 2a and 2b, respectively, and capture an image of the steel material PL surface.
- In this embodiment, the surface defect detection device 1 has two sets of lighting devices and area sensors, but is not limited to this.
- The surface defect detection device 1 may have one set, or three or more sets, of lighting devices and area sensors depending on, for example, the product width of the steel material PL to be inspected.
- The surface defect detection device 1 may have multiple lighting devices arranged for one area sensor, or multiple area sensors arranged for one lighting device.
- The calculation processing device 4 has one or more processors.
- The "processor" is a general-purpose processor or a dedicated processor specialized for specific processing, but is not limited to these.
- The calculation processing device 4 acquires the two-dimensional images captured using the area sensors 3a and 3b, and detects surface defects of the steel material PL.
- The calculation processing device 4 has image processing units 41a and 41b and an integrated processing unit 42 inside.
- The image processing units 41a and 41b detect surface defects on the strip by processing multiple images including the surface of the strip, captured by relatively scanning the surface of the strip using the lighting devices 2a and 2b and the area sensors 3a and 3b.
- The image processing units 41a and 41b execute the series of image processing operations described below to detect the range of the steel material PL reflected within the field of view of each area sensor and surface defects on the steel material PL.
- The image processing units 41a and 41b may determine the type and harmfulness of each detected surface defect from its characteristics.
- The integrated processing unit 42 consolidates, across the entire steel material PL, the information on the extent of the steel material PL and the information on surface defects detected by the image processing units 41a and 41b.
- The integrated processing unit 42 creates inspection result information such as a defect map showing where defects exist on the steel material PL, and a summary table tallying the number of defects on the steel material PL by type and harmfulness.
- The integrated processing unit 42 outputs the created inspection result information to the display device 5, which displays it to the operator.
- The display device 5 has one or more output interfaces that output information and present it to the operator.
- For example, the display device 5 has a display that outputs information as video.
- FIG. 2 is a flowchart showing the image processing executed by the image processing units 41a and 41b in FIG. 1.
- The image processing procedures executed by the image processing units 41a and 41b are the same. Therefore, in the following, the image processing in the system consisting of the lighting device 2a, the area sensor 3a, and the image processing unit 41a will be described as an example.
- In step S1, the image processing unit 41a acquires an input image I_n(x, y) from the area sensor 3a.
- The input image I_n(x, y) is an image captured by the area sensor 3a by detecting reflected light from the surface of the steel material PL illuminated by the lighting device 2a, in synchronization with the light emission of the lighting device 2a.
- The subscript n indicates that this is the nth image captured since the start of the inspection of the steel material PL.
- (x, y) are two-dimensional coordinates in the input image.
- x is the coordinate corresponding to the width direction of the steel material PL, i.e., the direction perpendicular to the arrow D in FIG. 1.
- y is the coordinate corresponding to the traveling direction of the steel material PL, i.e., the direction of the arrow D in FIG. 1.
- The value of I_n(x, y) represents the brightness at each coordinate (x, y).
- In step S2, the image processing unit 41a detects the inspection target area, i.e., the range in the input image I_n(x, y) where the steel material PL exists, based on the input image I_n(x, y) acquired in step S1.
- The method for detecting the inspection target area may be any method suitable for the material to be inspected and its background.
- For example, a method can be adopted in which a threshold value TR is used to distinguish between the background and the material to be inspected, and a binary image R_n(x, y) indicating the inspection target area is calculated at each coordinate (x, y) using the following equations (1) and (2):
R_n(x, y) = 1 (I_n(x, y) ≥ TR) … (1)
R_n(x, y) = 0 (I_n(x, y) < TR) … (2)
- Alternatively, a method can be adopted in which a search is performed in the x direction for each y coordinate of the input image I_n(x, y), and the position where I_n(x, y) first exceeds TR for a predetermined number of consecutive pixels is recognized as the boundary between the steel material PL and the background, thereby detecting the inspection target area.
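The edge-search variant can be sketched as follows; this is a minimal illustration assuming a single left edge and a strip that is brighter than its background, with `tr` and `run` standing in for the threshold TR and the consecutive-pixel count.

```python
import numpy as np

def detect_inspection_area(image, tr, run=20):
    """Binary ROI mask R_n: 1 where the strip is judged to be present.

    For each row, scan from the left; the first run of `run` consecutive
    pixels at or above `tr` marks the edge, and everything from there to
    the right is taken as inspection target area (single-edge assumption).
    """
    h, w = image.shape
    roi = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        above = image[y] >= tr
        count = 0
        for x in range(w):
            count = count + 1 if above[x] else 0
            if count == run:
                roi[y, x - run + 1:] = 1  # strip extends rightward from edge
                break
    return roi
```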
- In step S3, the image processing unit 41a adds the luminance of the pixels included in the inspection target area of the input image I_n(x, y) detected in step S2 to an accumulated luminance image S(x, y), as in the following equation (3):
S(x, y) ← S(x, y) + I_n(x, y) · R_n(x, y) … (3)
- In step S4, the image processing unit 41a adds 1 to a count image C(x, y) for the pixels included in the inspection target area of the input image I_n(x, y) detected in step S2, as in the following equation (4):
C(x, y) ← C(x, y) + R_n(x, y) … (4)
- The image processing unit 41a initializes the accumulated luminance image S(x, y) and the count image C(x, y) to zero for each steel material PL to be inspected.
- In step S5, the image processing unit 41a judges whether or not there is an input image that has not yet been acquired. If so, the process returns to step S1, and steps S1 to S4 are executed for the next input image I_{n+1}(x, y). If not, the process proceeds to step S6.
- In step S6, the image processing unit 41a calculates an average image A(x, y) of the multiple input images acquired in step S1. More specifically, the image processing unit 41a calculates the average image A(x, y) by dividing the accumulated luminance image S(x, y) by the count image C(x, y) at each coordinate (x, y), using the following equation (5):
A(x, y) = S(x, y) / C(x, y) … (5)
- The value of the average image A(x, y) calculated by equation (5) is the average brightness of the steel material PL at the position corresponding to the coordinates (x, y) of the input image within the field of view of the area sensor 3a, and the average image as a whole represents the shading pattern caused by unevenness in the lighting device 2a and the like.
- In this way, the image processing unit 41a identifies, in each of the multiple input images, the inspection target area in which the strip exists, and only pixels in the inspection target area contribute to the average image A(x, y).
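Steps S3, S4, and S6 amount to a masked accumulation. A minimal NumPy sketch (function and variable names are illustrative):

```python
import numpy as np

def average_image(images, rois):
    """A(x, y) per equations (3)-(5): each pixel is averaged only over the
    frames in which it lies inside the inspection target area."""
    s = np.zeros_like(images[0], dtype=np.float64)  # accumulated luminance S
    c = np.zeros_like(images[0], dtype=np.float64)  # count image C
    for img, roi in zip(images, rois):
        s += img * roi        # eq. (3)
        c += roi              # eq. (4)
    # eq. (5); pixels never inside any ROI are left at 0
    return s / np.maximum(c, 1.0)
```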
- In step S7, the image processing unit 41a performs shading correction on each of the multiple images acquired in step S1 using the average image A(x, y), to obtain a shading-corrected image J_n(x, y). More specifically, the image processing unit 41a performs shading correction on the input image I_n(x, y) using the average image A(x, y) calculated in step S6, for example by division (J_n(x, y) = I_n(x, y) / A(x, y)) or by subtraction (J_n(x, y) = I_n(x, y) − A(x, y)).
- As a result, the average luminance level of the healthy parts of the steel material PL becomes a constant value regardless of position.
- The brightness of an input image contributes to the average value only when the coordinates (x, y) in that input image are within the inspection target area, and contributions from the background are excluded. Therefore, shading correction is appropriately performed even at coordinates that are sometimes the surface of the steel material PL and sometimes the background across the series of input images I_n(x, y), due to meandering or width changes of the steel material PL.
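A division-type correction for step S7 might look like the following sketch; the choice to fill background pixels with the healthy level 1 is an assumption made here so that the background never trips the defect thresholds.

```python
import numpy as np

def shading_correct(image, avg, roi):
    """J_n(x, y) = I_n(x, y) / A(x, y) inside the inspection target area."""
    j = np.ones_like(image, dtype=np.float64)  # background -> healthy level 1
    mask = (roi > 0) & (avg > 0)               # avoid dividing by zero
    j[mask] = image[mask] / avg[mask]
    return j
```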
- In step S8, the image processing unit 41a binarizes the shading-corrected image J_n(x, y) obtained in step S7 using threshold values T1 and T2 to obtain a binary image B_n(x, y).
- The threshold value T1 is a value smaller than the average luminance level.
- The threshold value T2 is a value larger than the average luminance level.
- When the shading correction is of the division type, the average luminance level after correction is 1, so T1 < 1 < T2.
- In general, the threshold values T1 and T2 are each values around the average luminance level after correction.
- Depending on the type of defect to be detected, the image processing unit 41a may compare only with either threshold value T1 or T2.
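The two-sided thresholding of step S8 for a division-type corrected image can be written in one line; the default values 0.9 and 1.1 are illustrative, not taken from the disclosure.

```python
import numpy as np

def binarize(corrected, t1=0.9, t2=1.1):
    """B_n: 1 where the corrected luminance leaves the healthy band (t1, t2).

    For division-type correction the healthy level is about 1, so t1 < 1 < t2.
    """
    return ((corrected <= t1) | (corrected >= t2)).astype(np.uint8)
```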
- In step S9, the image processing unit 41a extracts blobs from the binary image B_n(x, y) obtained in step S8, and labels the extracted blobs.
- A blob refers to a region of consecutive pixels in the binary image B_n(x, y) whose value is 1, that is, pixels belonging to a defect. Labeling involves assigning an identification label (serial number) to each blob.
- The image processing unit 41a may regard blobs as one blob and assign them the same label when the distance between them is smaller than a predetermined distance. Since small blobs may occur even in healthy parts due to harmless patterns on the surface of the steel material PL, the image processing unit 41a may remove blobs with a small area (number of pixels).
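Blob extraction with small-blob removal can be sketched with a plain flood fill; 4-connectivity is assumed here, and a production system would typically use an optimized connected-component routine instead.

```python
import numpy as np
from collections import deque

def label_blobs(binary, min_area=20):
    """4-connected labeling of defect pixels, discarding small blobs as noise.

    Returns a label image (0 = background, 1..n = blobs) and the blob count.
    """
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=np.int32)
    n = 0
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and labels[sy, sx] == 0:
                # flood-fill one connected component starting at (sy, sx)
                queue, pixels = deque([(sy, sx)]), []
                labels[sy, sx] = -1  # visited marker
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = -1
                            queue.append((ny, nx))
                if len(pixels) >= min_area:  # keep; small blobs stay at -1
                    n += 1
                    for y, x in pixels:
                        labels[y, x] = n
    labels[labels < 0] = 0  # erase blobs rejected as noise
    return labels, n
```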
- In step S10, the image processing unit 41a calculates the feature amounts of the blobs, i.e., the defects, labeled in step S9.
- The feature amounts are calculated for the pixel region belonging to each blob (defect) in the shading-corrected image J_n(x, y) (or the input image I_n(x, y)) and the binary image B_n(x, y).
- The feature amounts include, for example, those expressing the size of the defect, such as width, length, and area; those relating to the shape of the defect, such as aspect ratio and circularity; and those relating to the shading of the defect, such as average brightness and brightness histogram.
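A few of these feature amounts can be computed from the label image and the corrected image as follows; `fill_ratio` is used here as a simple stand-in for circularity, which is an assumption of this sketch.

```python
import numpy as np

def blob_features(labels, corrected, label):
    """Size, shape, and shading feature amounts for one labeled blob."""
    ys, xs = np.nonzero(labels == label)
    width = int(xs.max() - xs.min() + 1)   # extent across the strip (x)
    length = int(ys.max() - ys.min() + 1)  # extent along travel direction (y)
    area = int(len(xs))                    # number of pixels
    return {
        "width": width,
        "length": length,
        "area": area,
        "aspect_ratio": length / width,
        "fill_ratio": area / (width * length),  # crude circularity proxy
        "mean_luminance": float(corrected[ys, xs].mean()),
    }
```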
- In step S11, the image processing unit 41a detects surface defects of the strip based on the shading-corrected image J_n(x, y). More specifically, the image processing unit 41a performs defect judgment based on the feature amounts of each blob (defect) calculated in step S10. For example, the image processing unit 41a assigns a defect type and a degree of harmfulness to each blob. As the judgment method, predetermined IF-THEN rules may be applied, a classifier generated by various machine learning methods may be applied, or a combination of these may be applied.
- In step S12, the image processing unit 41a judges whether there is an input image I_n(x, y) for which the processing from step S7 to step S11 has not been completed. If so, the image processing unit 41a returns to step S7 and repeats the processing from step S7 to step S11 for that image. If the processing for all input images has been completed, the processing ends.
- As described above, the surface defect detection device 1 can properly perform shading correction near the plate edge even when the steel material PL meanders and the edge position fluctuates, and can achieve shading correction that does not cause "folding" around a defective area. This enables the surface defect detection device 1 to detect surface defects with high sensitivity and to appropriately determine the type and harmfulness of each defect.
- Note that the image processing unit 41a may calculate, in step S2, a binary image Q_n(x, y) in addition to R_n(x, y), in which pixels of non-steady parts within the inspection target area (for example, halation near the edge) are further set to 0.
- In step S3, the image processing unit 41a then performs the calculation by replacing equation (3) with the following equation (3'):
S(x, y) ← S(x, y) + I_n(x, y) · Q_n(x, y) … (3')
- In step S4, the image processing unit 41a performs the calculation by replacing equation (4) with the following equation (4'):
C(x, y) ← C(x, y) + Q_n(x, y) … (4')
- In this way, the image processing unit 41a may recognize non-steady parts within the inspection target area in each of the multiple input images, and have only the pixels of steady parts contribute to the average image A(x, y).
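One way to form Q_n(x, y) for this modification, assuming the only non-steady part is a fixed-width halation band at the strip edge (the 18-pixel margin follows the worked example for FIG. 4 given later):

```python
import numpy as np

def steady_mask(roi, margin=18):
    """Q_n: the inspection target area R_n with `margin` pixels next to the
    left edge zeroed out, excluding edge halation as a non-steady part."""
    q = roi.copy()
    for y in range(roi.shape[0]):
        xs = np.nonzero(roi[y])[0]
        if xs.size:
            q[y, xs[0]:xs[0] + margin] = 0
    return q
```

The accumulation of equations (3') and (4') then proceeds exactly as before, with Q_n in place of R_n.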
- FIG. 3 shows multiple images of the steel material PL in FIG. 1.
- FIG. 3 shows some of 61 images obtained by repeatedly capturing, at regular intervals in the longitudinal direction, images from the edge E to the center of a thick steel plate as the steel material PL being transported on the transport roller table TB, using a camera of the imaging unit fixed above the transport roller table TB together with flash lighting of the illumination unit.
- (a) in the upper row of FIG. 3 shows an image near the leading end of the thick steel plate in the longitudinal direction.
- (b) in the middle row of FIG. 3 shows an image of the longitudinal center of the thick steel plate.
- (c) in the lower row of FIG. 3 shows an image near the trailing end of the thick steel plate in the longitudinal direction.
- FIG. 4 is a diagram showing an example according to an embodiment of the present disclosure.
- FIG. 4 shows an example of an input image having a stain-like defect, and the processed images thereof, obtained by applying the image processing of the surface defect detection device 1 according to an embodiment of the present disclosure to the series of images of the thick steel plate in FIG. 3.
- FIG. 4(a) is a diagram showing the input image I_n(x, y).
- FIG. 4(b) is a diagram showing the average image A(x, y). The average image A(x, y) in FIG. 4(b) corresponds to the shading pattern.
- FIG. 4(c) is a diagram showing the binary image R_n(x, y) indicating the inspection target area.
- FIG. 4(d) is a diagram showing the shading-corrected image J_n(x, y).
- FIG. 4(e) is a diagram showing the binary image B_n(x, y) obtained by subjecting the shading-corrected image J_n(x, y) to threshold processing.
- The binary image B_n(x, y) in FIG. 4(e) is an image in which the defective portion has been detected.
- In this example, in detecting the inspection target area in step S2 of FIG. 2, the image processing unit 41a searches for the edge of the thick steel plate from the left side of the input image, and determines the edge position as the position from which a predetermined brightness value is exceeded for 20 or more consecutive pixels to the right. Since halation occurs near the edge, the image processing unit 41a excludes 18 pixels from the edge, as a non-steady part, from the inspection target area. In the labeling process in step S9 of FIG. 2, the image processing unit 41a removes blobs with an area smaller than 20 pixels as noise.
- Figure 5 shows a comparative example based on conventional technology.
- Figure 5 shows the results of image processing when shading patterns are calculated using a moving average filter and shading correction is performed on the same input image as the image shown in Figure 4(a).
- the shading pattern is calculated for each y coordinate using a moving average filter of 64 pixels in the x direction.
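For reference, a per-row moving-average shading pattern of this kind can be sketched as follows; this is a generic illustration with an assumed border treatment, not necessarily the exact filter of the comparative example:

```python
import numpy as np

def moving_average_shading(image, window=64):
    """Estimate a shading pattern row by row with a 1-D moving average of
    `window` pixels in the x direction (window renormalized at the borders)."""
    kernel = np.ones(window) / window
    pattern = np.empty_like(image, dtype=float)
    for y in range(image.shape[0]):
        num = np.convolve(image[y], kernel, mode="same")
        # renormalize so border pixels average only the in-bounds samples
        den = np.convolve(np.ones(image.shape[1]), kernel, mode="same")
        pattern[y] = num / den
    return pattern
```

Because the window follows the local brightness, a wide defect or an edge inside the window distorts the estimated pattern, which is the weakness the averaging-over-images approach of the present disclosure avoids.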
- (a) in FIG. 5 is a diagram showing the same input image as (a) in FIG. 4.
- (b) in FIG. 5 is a diagram showing a shading pattern generated by applying a moving average filter to the input image in (a) in FIG. 5.
- (c) in FIG. 5 is a diagram showing the same binary image as (c) in FIG. 4 showing the inspection target area.
- (d) in FIG. 5 is a diagram showing a shading-corrected image based on the shading pattern in (b) in FIG. 5.
- (e) in FIG. 5 is a diagram showing a binary image obtained by thresholding the shading-corrected image in (d) in FIG. 5.
- the binary image in (e) in FIG. 5 is an image in which a defective portion has been detected. For processes other than shading correction, the same processes and parameters as in FIG. 4 were used.
- Figure 6 shows a comparative example based on another conventional technique.
- Figure 6 shows the result of image processing when a shading pattern is calculated using a second-order polynomial approximation and shading correction is performed on the same input image as the image shown in Figure 4(a).
- (a) of FIG. 6 is a diagram showing the same input image as (a) of FIG. 4.
- (b) of FIG. 6 is a diagram showing a shading pattern generated by applying a second-order polynomial approximation to the luminance waveform in the x direction for each y coordinate of the input image in (a) of FIG. 6.
- (c) of FIG. 6 is a diagram showing the same binary image as (c) of FIG. 4 showing the inspection target area.
- (d) of FIG. 6 is a diagram showing a shading-corrected image based on the shading pattern in (b) of FIG. 6.
- (e) of FIG. 6 is a diagram showing a binary image obtained by threshold processing the shading-corrected image in (d) of FIG. 6.
- the binary image in (e) of FIG. 6 is an image in which a defective portion has been detected. For processes other than shading correction, the same processes and parameters as those in FIG. 4 were used.
- FIGS. 7A to 7C compare the results of shading correction between an embodiment based on the technology of the present disclosure and two comparative examples.
- FIG. 7A is a graph showing the corrected luminance waveform between A and A in the shading-corrected image of FIG. 4(d).
- FIG. 7B is a graph showing the corrected luminance waveform between E and E in the shading-corrected image of FIG. 5(d).
- FIG. 7C is a graph showing the corrected luminance waveform between C and C in the shading-corrected image of FIG. 6(d).
- the average image A(x, y) in Fig. 4(b), showing an embodiment of the present disclosure, represents the shading of the input image In(x, y) in Fig. 4(a) well, without being affected by patterns and defects on the steel plate surface.
- the shading-corrected image Jn(x, y) in Fig. 4(d) achieves uniform brightness overall, as shown in the graph in Fig. 7A, and no "folding" occurs around the defects.
- compared with conventional technology, the technology disclosed herein makes it possible to perform appropriate shading correction even when the material being inspected meanders or changes in width, and also suppresses "folding" around defects.
- an average image calculated from multiple images is used as the shading pattern, and only pixels within the inspection target area in each of the multiple images contribute to the average image, making it possible to detect surface defects on a strip with greater accuracy. More specifically, "folding" around defects and "hollowing" of large defects in the corrected image are suppressed. In addition, even if the edge position shown in the image fluctuates due to meandering or a change in width of the material being inspected, appropriate shading correction can be performed up to the vicinity of the edge.
- the shape, size, arrangement, orientation, and number of each of the above-mentioned components are not limited to the above description and the illustrations in the drawings.
- the shape, size, arrangement, orientation, and number of each of the components may be configured arbitrarily as long as the function can be realized.
- a general-purpose electronic device such as a smartphone or computer can be configured to function as the surface defect detection device 1 according to the embodiment described above.
- a program describing the processing content that realizes each function of the surface defect detection device 1 according to the embodiment is stored in the memory of the electronic device, and the program is read and executed by the processor of the electronic device. Therefore, the disclosure according to one embodiment can also be realized as a program that can be executed by a processor.
- the disclosure of one embodiment may also be realized as a non-transitory computer-readable medium storing a program executable by one or more processors to cause the surface defect detection device 1 according to the embodiment to execute each function. It should be understood that these are also included within the scope of the present disclosure.
- the "imaging unit" is configured using an area sensor, i.e., a camera having imaging elements arranged two-dimensionally, but the configuration is not limited to this.
- the "imaging unit" may also be configured using a line sensor, i.e., a camera having imaging elements arranged one-dimensionally.
- the surface defect detection device 1 may be configured to set the size of the input image In(x, y) in the Y direction to 1, or to form the input image In(x, y) each time a certain number of lines are accumulated.
- the strip has been described as including, for example, steel material, but is not limited to this.
- the strip may include any strip- or sheet-shaped object other than steel material.
Abstract
Description
(1)
A surface defect detection method for optically detecting a surface defect of a strip, the method comprising:
an image acquisition step of detecting reflected light from the strip obtained by illuminating the surface of the strip, and capturing images while relatively scanning the surface of the strip, thereby acquiring a plurality of images including the surface of the strip;
an average image calculation step of calculating an average image of the acquired plurality of images;
an image correction step of obtaining corrected images by applying shading correction to each of the acquired plurality of images using the average image; and
a defect detection step of detecting a surface defect of the strip based on the corrected images,
wherein the average image calculation step includes recognizing, in each of the plurality of images, an inspection target region in which the strip is present in the image, so that only pixels within the inspection target region contribute to the average image.
(2)
In the surface defect detection method described in (1) above, the image correction step may include dividing each of the acquired plurality of images by the average image.
(3)
In the surface defect detection method described in (1) above, the image correction step may include subtracting the average image from each of the acquired plurality of images.
(4)
In the surface defect detection method described in any one of (1) to (3) above, the average image calculation step may include recognizing a non-steady part within the inspection target region in each of the plurality of images, so that only pixels of the steady part contribute to the average image.
(5)
In the surface defect detection method described in any one of (1) to (4) above, the strip may include steel material.
(6)
A surface defect detection apparatus for optically detecting a surface defect of a strip, the apparatus comprising:
a lighting unit that illuminates the surface of the strip;
an imaging unit that detects reflected light from the strip obtained by illuminating the surface of the strip with the lighting unit; and
an image processing unit that detects a surface defect of the strip by processing a plurality of images including the surface of the strip captured while relatively scanning the surface of the strip using the lighting unit and the imaging unit,
wherein the image processing unit calculates an average image of the plurality of images, obtains corrected images by applying shading correction to the plurality of images using the average image, detects a surface defect of the strip based on the corrected images, and recognizes, in each of the plurality of images, an inspection target region in which the strip is present in the image, so that only pixels within the inspection target region contribute to the average image.
When In(x, y) < TR: Rn(x, y) = 0 (background)  (2)
S(x, y) := S(x, y) + In(x, y) × Rn(x, y)  (3)
C(x, y) := C(x, y) + Rn(x, y)  (4)
The image processing unit 41a initializes the count image C(x, y) for each steel material PL to be inspected.
When C(x, y) ≠ 0: A(x, y) := S(x, y) / C(x, y)  (5)
(When C(x, y) = 0, the value of A(x, y) may be arbitrary.)
Jn(x, y) := In(x, y) / A(x, y)  (6)
Jn(x, y) := In(x, y) − A(x, y)  (7)
Jn′(x, y) = Jn(x, y) × 128  (8)
When Jn(x, y) ≤ T1 or Jn(x, y) ≥ T2: Bn(x, y) := 1  (9)
When T1 < Jn(x, y) < T2: Bn(x, y) := 0  (10)
When Rn(x, y) = 1 and (x, y) is a steady part: Qn(x, y) = 1  (11)
When Rn(x, y) = 0 or (x, y) is a non-steady part: Qn(x, y) = 0  (12)
S(x, y) := S(x, y) + In(x, y) × Qn(x, y)  (3′)
C(x, y) := C(x, y) + Qn(x, y)  (4′)
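Putting the formulas above together, the average-image shading correction of equations (3) through (10) can be sketched as follows; this is a minimal illustration in which the thresholds T1 and T2 and all function names are assumed, and the masks Rn are taken as given:

```python
import numpy as np

def average_image(images, masks):
    """Equations (3)-(5): accumulate the sum image S and count image C over
    the masked images, then A = S / C wherever C != 0 (A is arbitrary,
    here 1, where C == 0)."""
    S = np.zeros(images[0].shape, dtype=float)
    C = np.zeros(images[0].shape, dtype=float)
    for I, R in zip(images, masks):
        S += I * R
        C += R
    A = np.ones_like(S)  # arbitrary value where C == 0
    np.divide(S, C, out=A, where=C != 0)
    return A

def shading_correct(I, A, mode="divide"):
    """Equation (6) (division) or equation (7) (subtraction)."""
    return I / A if mode == "divide" else I - A

def threshold(J, T1, T2):
    """Equations (9)-(10): flag pixels outside the band (T1, T2) as defects."""
    return ((J <= T1) | (J >= T2)).astype(np.uint8)
```

Replacing the masks Rn with Qn from equations (11) and (12), i.e. additionally zeroing non-steady parts, yields the variant of equations (3′) and (4′).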
2a Lighting device (lighting unit)
2b Lighting device (lighting unit)
3a Area sensor (imaging unit)
3b Area sensor (imaging unit)
4 Arithmetic processing device
41a Image processing unit
41b Image processing unit
42 Integration processing unit
5 Display device
E Edge portion
PL Steel material (strip)
TB Transport roller table
Claims (6)
- A surface defect detection method for optically detecting a surface defect of a strip, the method comprising:
an image acquisition step of detecting reflected light from the strip obtained by illuminating the surface of the strip, and capturing images while relatively scanning the surface of the strip, thereby acquiring a plurality of images including the surface of the strip;
an average image calculation step of calculating an average image of the acquired plurality of images;
an image correction step of obtaining corrected images by applying shading correction to each of the acquired plurality of images using the average image; and
a defect detection step of detecting a surface defect of the strip based on the corrected images,
wherein the average image calculation step includes recognizing, in each of the plurality of images, an inspection target region in which the strip is present in the image, so that only pixels within the inspection target region contribute to the average image.
- The surface defect detection method according to claim 1, wherein the image correction step includes dividing each of the acquired plurality of images by the average image.
- The surface defect detection method according to claim 1, wherein the image correction step includes subtracting the average image from each of the acquired plurality of images.
- The surface defect detection method according to any one of claims 1 to 3, wherein the average image calculation step includes recognizing a non-steady part within the inspection target region in each of the plurality of images, so that only pixels of the steady part contribute to the average image.
- The surface defect detection method according to any one of claims 1 to 3, wherein the strip includes steel material.
- A surface defect detection apparatus for optically detecting a surface defect of a strip, the apparatus comprising:
a lighting unit that illuminates the surface of the strip;
an imaging unit that detects reflected light from the strip obtained by illuminating the surface of the strip with the lighting unit; and
an image processing unit that detects a surface defect of the strip by processing a plurality of images including the surface of the strip captured while relatively scanning the surface of the strip using the lighting unit and the imaging unit,
wherein the image processing unit calculates an average image of the plurality of images, obtains corrected images by applying shading correction to the plurality of images using the average image, detects a surface defect of the strip based on the corrected images, and recognizes, in each of the plurality of images, an inspection target region in which the strip is present in the image, so that only pixels within the inspection target region contribute to the average image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020257008097A KR20250048337A (ko) | 2022-09-27 | 2023-06-28 | 표면 결함 검출 방법 및 표면 결함 검출 장치 |
EP23871346.5A EP4579225A1 (en) | 2022-09-27 | 2023-06-28 | Surface defect detecting method and surface defect detecting device |
CN202380066619.XA CN119895252A (zh) | 2022-09-27 | 2023-06-28 | 表面缺陷检测方法以及表面缺陷检测装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022154254A JP7626117B2 (ja) | 2022-09-27 | 2022-09-27 | 表面欠陥検出方法及び表面欠陥検出装置 |
JP2022-154254 | 2022-09-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024070101A1 true WO2024070101A1 (ja) | 2024-04-04 |
Family
ID=90476933
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/024078 WO2024070101A1 (ja) | 2022-09-27 | 2023-06-28 | 表面欠陥検出方法及び表面欠陥検出装置 |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP4579225A1 (ja) |
JP (1) | JP7626117B2 (ja) |
KR (1) | KR20250048337A (ja) |
CN (1) | CN119895252A (ja) |
WO (1) | WO2024070101A1 (ja) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004226347A (ja) * | 2003-01-27 | 2004-08-12 | Nippon Steel Corp | 表面欠陥検査装置におけるシェーディング補正方法 |
JP2008102818A (ja) * | 2006-10-20 | 2008-05-01 | Seiko Epson Corp | 撮像手段の出力値補正方法、シェーディング補正方法、欠陥検出方法、欠陥検出プログラムおよび欠陥検出装置 |
JP2016156671A (ja) * | 2015-02-24 | 2016-09-01 | Jfeスチール株式会社 | 金属帯エッジ部の欠陥検出方法および金属帯エッジ部の欠陥検出装置 |
JP2017219343A (ja) * | 2016-06-03 | 2017-12-14 | 住友化学株式会社 | 欠陥検査装置、欠陥検査方法、フィルム製造装置及びフィルム製造方法 |
JP2018155690A (ja) * | 2017-03-21 | 2018-10-04 | Jfeスチール株式会社 | 表面欠陥検査方法及び表面欠陥検査装置 |
US20180374220A1 (en) * | 2015-12-09 | 2018-12-27 | Kyungpook National University Industry-Academic Cooperation Foundation | Apparatus and method for dividing of static scene based on statistics of images |
WO2021149588A1 (ja) * | 2020-01-20 | 2021-07-29 | Jfeスチール株式会社 | 表面検査装置、表面検査方法、鋼材の製造方法、鋼材の品質管理方法、及び鋼材の製造設備 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012112729A (ja) | 2010-11-22 | 2012-06-14 | Toshiba Corp | 表面検査装置のシェーディング補正方法 |
JP6149990B2 (ja) | 2016-09-02 | 2017-06-21 | Jfeスチール株式会社 | 表面欠陥検出方法及び表面欠陥検出装置 |
- 2022-09-27: JP application JP2022154254A filed (JP7626117B2, Active)
- 2023-06-28: KR application KR1020257008097A filed (KR20250048337A, Pending)
- 2023-06-28: PCT application PCT/JP2023/024078 filed (WO2024070101A1, Application Filing)
- 2023-06-28: CN application CN202380066619.XA filed (CN119895252A, Pending)
- 2023-06-28: EP application EP23871346.5A filed (EP4579225A1, Pending)
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004226347A (ja) * | 2003-01-27 | 2004-08-12 | Nippon Steel Corp | 表面欠陥検査装置におけるシェーディング補正方法 |
JP2008102818A (ja) * | 2006-10-20 | 2008-05-01 | Seiko Epson Corp | 撮像手段の出力値補正方法、シェーディング補正方法、欠陥検出方法、欠陥検出プログラムおよび欠陥検出装置 |
JP2016156671A (ja) * | 2015-02-24 | 2016-09-01 | Jfeスチール株式会社 | 金属帯エッジ部の欠陥検出方法および金属帯エッジ部の欠陥検出装置 |
US20180374220A1 (en) * | 2015-12-09 | 2018-12-27 | Kyungpook National University Industry-Academic Cooperation Foundation | Apparatus and method for dividing of static scene based on statistics of images |
JP2017219343A (ja) * | 2016-06-03 | 2017-12-14 | 住友化学株式会社 | 欠陥検査装置、欠陥検査方法、フィルム製造装置及びフィルム製造方法 |
JP2018155690A (ja) * | 2017-03-21 | 2018-10-04 | Jfeスチール株式会社 | 表面欠陥検査方法及び表面欠陥検査装置 |
WO2021149588A1 (ja) * | 2020-01-20 | 2021-07-29 | Jfeスチール株式会社 | 表面検査装置、表面検査方法、鋼材の製造方法、鋼材の品質管理方法、及び鋼材の製造設備 |
Also Published As
Publication number | Publication date |
---|---|
JP2024048292A (ja) | 2024-04-08 |
KR20250048337A (ko) | 2025-04-08 |
CN119895252A (zh) | 2025-04-25 |
JP7626117B2 (ja) | 2025-02-04 |
EP4579225A1 (en) | 2025-07-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110431404B (zh) | 表面缺陷检查方法及表面缺陷检查装置 | |
JP6264132B2 (ja) | 車体塗装面の検査装置および検査方法 | |
US20220011241A1 (en) | Surface-defect detecting method, surface-defect detecting apparatus, steel-material manufacturing method, steel-material quality management method, steel-material manufacturing facility, surface-defect determination model generating method, and surface-defect determination model | |
WO2016158873A1 (ja) | 溶融めっき鋼板の表面欠陥検査装置および表面欠陥検査方法 | |
CN107709977B (zh) | 表面缺陷检测装置及表面缺陷检测方法 | |
JP5347661B2 (ja) | 帯状体の表面検査装置、表面検査方法及びプログラム | |
JP2010085166A (ja) | プリプレグ欠点検査方法 | |
JP2007132757A (ja) | 外観検査方法および同装置 | |
JP2011013007A (ja) | 磁粉探傷装置 | |
JP2002148195A (ja) | 表面検査装置及び表面検査方法 | |
JP6035124B2 (ja) | 欠陥検査装置、及び欠陥検査方法 | |
JP2003329601A (ja) | 欠陥検査装置及び欠陥検査方法 | |
JP7626117B2 (ja) | 表面欠陥検出方法及び表面欠陥検出装置 | |
US6614918B1 (en) | Apparatus for inspecting light-and-shade portions and method thereof | |
JP4395057B2 (ja) | 帯状体や柱状体の周期疵検出方法およびその装置 | |
JP2015059854A (ja) | 欠陥検査方法及び欠陥検査装置 | |
JP4403036B2 (ja) | 疵検出方法及び装置 | |
JP2006343185A (ja) | 表面欠陥検査装置 | |
JP2003156451A (ja) | 欠陥検出装置 | |
JPWO2015159352A1 (ja) | ウェブ検査装置、ウェブ検査方法、ウェブ検査プログラム | |
JP2019060836A (ja) | 印刷物の浮き検出装置及び浮き検出方法、検査装置及び検査方法 | |
JP4349960B2 (ja) | 表面欠陥検査装置 | |
JP3126634B2 (ja) | 外観検査方法 | |
JP2003329596A (ja) | 欠陥検査装置及び欠陥検査方法 | |
JP2000329699A (ja) | 欠陥検査方法及びその装置 |
Legal Events
- 121 — Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 23871346; Country: EP; Kind code: A1)
- ENP — Entry into the national phase (Ref document number: 20257008097; Country: KR; Kind code: A)
- WWE — Wipo information: entry into national phase (Ref document number: 1020257008097; Country: KR)
- WWE — Wipo information: entry into national phase (Ref document number: 202517022319; Country: IN)
- WWE — Wipo information: entry into national phase (Ref document number: 2023871346; Country: EP)
- WWP — Wipo information: published in national office (Ref document number: 202517022319; Country: IN)
- ENP — Entry into the national phase (Ref document number: 2023871346; Country: EP; Effective date: 20250324)
- REG — Reference to national code (Country code: BR; Legal event code: B01A; Ref document number: 112025005491)
- WWP — Wipo information: published in national office (Ref document number: 1020257008097; Country: KR)
- NENP — Non-entry into the national phase (Country code: DE)
- WWP — Wipo information: published in national office (Ref document number: 2023871346; Country: EP)