WO2018168510A1 - Cylindrical body surface inspection device and cylindrical body surface inspection method (円筒体表面検査装置および円筒体表面検査方法) - Google Patents
Cylindrical body surface inspection device and cylindrical body surface inspection method
- Publication number
- WO2018168510A1 (PCT/JP2018/007917; JP2018007917W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- luminance
- unit
- cylindrical body
- scanning
- image data
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/30—Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/30—Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
- G01B11/306—Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces for measuring evenness
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/95—Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
- G01N21/952—Inspecting the exterior surface of cylindrical bodies or wires
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
- G01N2021/8835—Adjustable illumination, e.g. software adjustable screen
Definitions
- The present invention relates to a cylindrical body surface inspection device and a cylindrical body surface inspection method for inspecting the surface of a cylindrical body used in a process of manufacturing a sheet-like object.
- In a process of manufacturing a sheet-like object such as a film, a plurality of cylindrical conveyance rolls are used for conveyance, stretching, and the like. When the roll surface of such a conveyance roll is deformed or foreign matter adheres to it, the irregular shape of the roll surface is transferred to the surface of the sheet-like object, and defects may occur repeatedly at a period determined by the roll diameter (periodic defects). Such periodic defects are discovered by human inspection or by a defect detection device in the manufacturing process or before product shipment, and their discovery leads to quality assurance and process improvement.
- When a periodic defect is discovered, the conveyance roll that is the source of the defects is identified and measures are taken against the source: foreign-matter adhesion is dealt with by cleaning the entire roll surface to remove the foreign matter.
- For surface deformation, the deformed portion of the roll surface is identified and dealt with by local unevenness removal work such as polishing or film formation. Therefore, to cope with surface deformation, the unevenness removal work must be carried out after the deformed portion has been identified with certainty.
- Although automatic inspection equipment can be expected to realize high-precision, high-sensitivity detection more stably than visual inspection, permanently installing an automatic inspection apparatus on every roll to be inspected may be difficult for various reasons, for example because the number of available inspection apparatuses is smaller than the number of rolls to be inspected.
- In that case, the automatic inspection apparatus is temporarily installed in the vicinity of the roll to be inspected as necessary, but it must then be set up with high accuracy according to a predetermined configuration.
- In particular, in an automatic inspection apparatus employing a line sensor camera, which is commonly used in continuous conveyance processes such as the manufacturing process of sheet-like objects, the distances, angles, and parallelism between the illumination, the sensor, and the roll to be inspected must be adjusted with high accuracy.
- Moreover, to detect irregularity (unevenness) defects, it is desirable to inspect under an optical-system condition in which the optical axes of the illumination and the camera are slightly shifted from the specular reflection condition, a condition called off-axis regular reflection (near-specular reflection).
- For these reasons, it was difficult for a temporarily installed automatic inspection apparatus to stably deliver the inspection performance expected of it.
- In response to the problem that it is difficult to adjust such an automatic inspection apparatus, Patent Document 1 proposes a method that addresses this with an automatic inspection apparatus using an area sensor.
- The inspection method using an area sensor will be outlined with reference to Patent Document 1.
- The image input method of Patent Document 1 provides an effect as if an image were captured by a line sensor camera while actually using an area sensor camera. Specifically, it proceeds as follows.
- (i) Irradiation light is applied from one direction by an illumination means onto the surface of the object to be measured, which moves relatively.
- (ii) The reflected light from the surface of the object to be measured is imaged using an imaging means based on a two-dimensional optical element capable of partial readout.
- (iii) Based on the reflected-light distribution of each pixel column in the sub-scanning direction (the direction corresponding to the movement of the object to be measured) in the two-dimensional image acquired by the imaging means, the pixel at which the bright line is brightest in that column is identified.
- (iv) A plurality of pixels, including the pixel at a specified offset position from the pixel identified in (iii) and pixels adjacent to it, are selected.
- (v) A line image in the main scanning direction, orthogonal to the sub-scanning direction, is obtained from the sum or average of the light amounts of the plural pixels selected in (iv).
- (vi) Using the line image obtained in (v) as the main-scanning image, sub-scanning is performed by imaging continuously a plurality of times, yielding a two-dimensional image of the surface of the object to be measured.
- By obtaining the line image through steps (iii) to (v), the surface of the object to be measured can be inspected without strictly maintaining the relative positions of the object surface, the light source, and the imaging system (a rough code sketch of this extraction idea follows).
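As a rough illustration only (this is not code from Patent Document 1; the array layout, the function name, and the handling of image borders are our assumptions), the extraction summarized in steps (iii) to (v) can be sketched as follows:

```python
import numpy as np

def extract_line_prior_art(frame: np.ndarray, offset: int, n_rows: int = 3) -> np.ndarray:
    """Rough sketch of the line extraction of steps (iii)-(v): locate the bright-line
    center for each main-scanning column, step a fixed offset away from it, and
    average a few neighboring rows into one main-scanning line image.
    frame axis 0 = sub-scanning direction, axis 1 = main scanning direction."""
    line = np.empty(frame.shape[1], dtype=np.float64)
    for col in range(frame.shape[1]):
        center = int(np.argmax(frame[:, col]))                              # (iii) brightest pixel in the column
        start = int(np.clip(center + offset, 0, frame.shape[0] - n_rows))   # (iv) fixed offset position
        line[col] = frame[start:start + n_rows, col].mean()                 # (v) average of the selected pixels
    return line
```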
- However, even with the method of Patent Document 1, it is difficult to detect irregularity defects when the position adjustment of the automatic inspection apparatus is insufficient.
- FIG. 15 is an explanatory diagram showing the positional relationship between the two-dimensional imaging unit and the illumination unit when the relative positions of the detected object surface, the light source, and the imaging system are shifted in the parallel direction in the prior art.
- FIG. 16 is an explanatory diagram illustrating a positional relationship between the two-dimensional imaging unit and the illumination unit when the relative positions of the detected object surface, the light source, and the imaging system are shifted in the vertical direction in the conventional technology.
- FIG. 17A is an explanatory diagram illustrating an example of a scanning position when the bright line width on the surface of the object to be detected changes in the conventional technique and the bright line width is an assumed thickness.
- FIG. 17B is an explanatory diagram illustrating an example of a scanning position in the case where the bright line width on the surface of the object to be detected changes in the conventional technique and the bright line width is narrower than the assumed thickness.
- In the method of Patent Document 1, the brightest bright line position (bright line center) in the sub-scanning direction on the surface of the detected object 100 is detected for each main-scanning position, and a plurality of pixels including a pixel at a fixed offset from that bright line position and pixels adjacent to it are selected.
- Therefore, for example, as shown in FIG. 15, the relative positions of the surface of the detected object 100, the illumination unit 101, and the two-dimensional imaging unit 102 may shift in the tangential direction L, that is, the reflection image from the surface of the detected object 100 may shift. Here, the tangent is the tangent at the intersection of the optical axis of the imaging optical system and the surface of the object to be detected.
- In addition, as shown in FIG. 16, when the distance between the surface of the detected object 100 and the illumination unit 101 and two-dimensional imaging unit 102 changes in the normal direction N as viewed from the surface, the width of the bright line changes as shown in FIGS. 17A and 17B. For example, when the bright line becomes thin (see FIG. 17B), the pixel selection positions (scanning positions) in step (iv) above may protrude beyond the bright line; the height difference of the surface then cannot be detected, and it becomes difficult to detect unevenness.
- Accordingly, an object of the present invention is to provide, for cylindrical body surface defect inspection that inspects the surface of a cylindrical body, a cylindrical body surface inspection method capable of performing stable, highly accurate inspection even when the relative position between the surface of the cylindrical body and the inspection apparatus changes, and a cylindrical body surface inspection device that realizes the method.
- The cylindrical body surface inspection apparatus of the present invention that achieves the above object is a cylindrical body surface inspection apparatus that inspects the surface of a cylindrical body moving relatively in one direction at an inspection position, and comprises: a light irradiation unit that irradiates the cylindrical body with light; a two-dimensional imaging unit disposed at a position for receiving the light emitted from the light irradiation unit and reflected by the surface of the cylindrical body; a scanning position determination unit that, for the two-dimensional image data obtained by the two-dimensional imaging unit, determines at a predetermined cycle a scanning position in a first direction of the two-dimensional image data, the scanning position corresponding to the circumferential direction of the cylindrical body; a time-series scanned image generation unit that extracts, from the two-dimensional image data, the image data in a second direction perpendicular to the first direction at the scanning position determined by the scanning position determination unit, performs this extraction on a plurality of pieces of the two-dimensional image data obtained by the two-dimensional imaging unit, and arranges the extracted pieces of image data in the second direction in time-series order in the first direction to generate a time-series scanned image; and an inspection unit that inspects the time-series scanned image to detect defects.
- The scanning position determination unit comprises: a luminance profile creation unit that calculates, from the two-dimensional image data obtained by the two-dimensional imaging unit, the integrated value of the luminance of each pixel in the second direction at each position in the first direction and arranges these values in the first direction to create a luminance profile; a luminance peak position calculation unit that calculates, from the luminance profile created by the luminance profile creation unit, the peak position with the highest luminance; a luminance measurement unit that measures the luminance at the luminance peak position calculated by the luminance peak position calculation unit; a luminance reduction position calculation unit that calculates, from the luminance profile created by the luminance profile creation unit, the position in the first direction corresponding to the luminance obtained by multiplying the luminance at the peak position measured by the luminance measurement unit by a predetermined coefficient of less than 1; and a scanning position holding unit that holds the position calculated by the luminance reduction position calculation unit as the scanning position.
- In the cylindrical body surface inspection apparatus, preferably, the light irradiation unit is a linear light source, and the direction of the central axis of the cylindrical body, the longitudinal direction of the linear light source, and the second direction of the two-dimensional imaging unit are arranged parallel to one another.
- The cylindrical body surface inspection method of the present invention that achieves the above object is a cylindrical body surface inspection method for inspecting the surface of a cylindrical body, and includes: irradiating the cylindrical body with light while relatively moving the cylindrical body in one direction at an inspection position; imaging, two-dimensionally, the light reflected by the surface of the cylindrical body; determining, only at a predetermined cycle, a scanning position in a first direction of the two-dimensional image data, the scanning position corresponding to the circumferential direction of the cylindrical body; extracting, from the two-dimensional image data, the image data in a second direction perpendicular to the first direction at each determined position in the first direction, performing this extraction on a plurality of pieces of the two-dimensional image data, and arranging the extracted pieces of image data in time-series order in the first direction to generate a time-series scanned image; and inspecting the time-series scanned image to detect defects.
- The scanning position is obtained by calculating, from the two-dimensional image data, the integrated value of the luminance of each pixel in the second direction at each position in the first direction, arranging these values in the first direction to obtain a luminance profile, and taking, from the luminance profile, the position in the first direction corresponding to the luminance obtained by multiplying the luminance at the peak position with the highest luminance by a predetermined coefficient of less than 1.
- In the cylindrical body surface inspection method of the present invention, preferably, the surface of the cylindrical body is irradiated with linear light, and the direction of the central axis of the cylindrical body, the longitudinal direction of the linear light, and the main scanning direction in the two-dimensional imaging are made parallel to one another.
- the “cylindrical body relatively moving in one direction” refers to a cylindrical body that continuously moves in a predetermined direction at the inspection position.
- it may be an object rotating in one direction such as a transport roll used for transporting a film, or may be a product roll wound with a sheet product such as a film.
- According to the present invention, it is possible to provide a cylindrical body surface inspection apparatus and a cylindrical body surface inspection method that can perform stable, highly accurate inspection even when the relative position between the surface of the cylindrical body and the inspection apparatus changes.
- FIG. 1 is an explanatory diagram showing a configuration of a transport roll in an embodiment of the present invention.
- FIG. 2 is an explanatory diagram of an inspection configuration in one embodiment of the present invention.
- FIG. 3 is an explanatory diagram of a main part of the inspection configuration in one embodiment of the present invention.
- FIG. 4 is an explanatory diagram of a method for acquiring time-series scanned images according to an embodiment of the present invention.
- FIG. 5 is an explanatory diagram of a scanning position determination flow in one embodiment of the present invention.
- FIG. 6A is an explanatory diagram illustrating an example of the scanning position when the bright line width on the surface of the object to be detected changes and the bright line width has the assumed thickness, in an embodiment of the present invention.
- FIG. 6B is an explanatory diagram illustrating an example of the scanning position when the bright line width on the surface of the object to be detected changes and the bright line width is narrower than the assumed thickness, in an embodiment of the present invention.
- FIG. 7 is an explanatory diagram illustrating an example of a two-dimensional image capturing result according to the embodiment of the present invention.
- FIG. 8 is an explanatory diagram illustrating an example of a vertical profile acquisition result from a two-dimensional image according to an embodiment of the present invention.
- FIG. 9 is an explanatory diagram of a method for calculating the maximum luminance peak position in an embodiment of the present invention.
- FIG. 10 is an explanatory diagram illustrating an example of a generation result of a time-series scan image according to an embodiment of the present invention.
- FIG. 11 is an explanatory diagram illustrating an example of the result of the binarization process using the bright threshold according to the embodiment of this invention.
- FIG. 12 is an explanatory diagram illustrating an example of a result of binarization processing using a dark threshold according to an embodiment of the present invention.
- FIG. 13 is an explanatory diagram illustrating an example of a logical sum operation result of a binarized image in one embodiment of the present invention.
- FIG. 14 is an explanatory diagram showing an example of the result of the expansion / contraction process in one embodiment of the present invention.
- FIG. 15 is an explanatory diagram illustrating the positional relationship between the two-dimensional imaging unit and the illumination unit when the relative positions of the surface of the object to be detected, the light source, and the imaging system are shifted in the parallel direction in the prior art.
- FIG. 16 is an explanatory diagram illustrating a positional relationship between the two-dimensional imaging unit and the illumination unit when the relative positions of the detected object surface, the light source, and the imaging system are shifted in the vertical direction in the conventional technology.
- FIG. 17A is an explanatory diagram illustrating an example of the scanning position when the bright line width on the surface of the object to be detected changes and the bright line width has the assumed thickness, in the related art.
- FIG. 17B is an explanatory diagram illustrating an example of the scanning position when the bright line width on the surface of the object to be detected changes and the bright line width is narrower than the assumed thickness, in the related art.
- FIG. 1 is an explanatory diagram showing the configuration of the transport roll.
- Reference numeral 1 denotes a cylindrical roll body, which conveys a sheet-like object such as a film by rotating in one direction while keeping the sheet in contact with its surface.
- Reference numeral 2 denotes a core of the roll body 1
- reference numeral 3 denotes a bearing of the roll.
- a surface defect 4 having a concave shape is present on the surface of the roll body 1.
- During inspection, the roll body 1 is preferably rotated at a constant speed by power transmitted from an electric motor or the like.
- Alternatively, the roll body 1 may be rotated manually while its rotation amount is monitored with a shaft encoder, rotary encoder, or the like so that appropriate imaging timing control can be performed, or the rotation speed of the roll body 1 may be monitored with a shaft encoder, rotary encoder, or the like while the roll is rotated at a constant speed by external power such as an electric motor, and appropriate imaging timing control performed accordingly (see the sketch after this paragraph).
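For example (a minimal sketch under our own assumptions; the text above only states that the rotation amount or speed may be monitored with an encoder, so the pulse-based trigger rule and the names below are hypothetical), imaging triggers can be tied to encoder pulses so that a fixed number of images is captured per revolution:

```python
def should_capture(pulse_count: int, pulses_per_rev: int, images_per_rev: int) -> bool:
    """Return True when an image should be captured: the camera is triggered every
    (pulses_per_rev // images_per_rev) encoder pulses, so the imaging timing follows
    the actual rotation of the roll body rather than elapsed time."""
    pulses_per_image = max(pulses_per_rev // images_per_rev, 1)
    return pulse_count % pulses_per_image == 0
```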
- FIG. 2 is a view for explaining the configuration of a roll surface defect inspection apparatus for inspecting the surface of the roll body 1.
- In FIG. 2, the roll body 1 is shown in cross section, taken along a plane that is perpendicular to the longitudinal direction of the roll body 1 and passes through the surface defect 4.
- Reference numeral 5 denotes a light irradiation unit which irradiates the roll body 1 with light.
- The light irradiation unit 5 may be a fluorescent lamp, a halogen light source, a metal halide light source, a xenon light source, or an LED light source.
- A light source having a specific wavelength characteristic or specific directivity may also be used.
- Preferably, the light source is a linear light source having a light projecting portion that is long in one direction and emits a substantially uniform amount of light from the light projecting portion.
- In this embodiment, LED illumination is used in which a plurality of LED light sources are arranged in a row (in the direction orthogonal to the paper surface of FIG. 2) so as to emit substantially uniform light with high directivity in one direction.
- the longitudinal direction of the LED illumination is substantially parallel to the inspection width direction.
- Alternatively, the longitudinal direction of the LED illumination may lie in a plane parallel to the rotation axis of the roll body 1 and may be rotated within that plane.
- Reference numeral 6 denotes a two-dimensional imaging unit, which is disposed so as to receive reflected light and scattered light that are irradiated from the light irradiation unit 5 and reflected from the surface of the roll body 1.
- the two-dimensional imaging unit 6 includes an area sensor camera 6a and a lens 6b.
- the area sensor camera 6a has a plurality of photoelectric conversion elements configured two-dimensionally. Each photoelectric conversion element preferably has high sensitivity, is resistant to noise, and has a small difference in sensitivity between elements. Moreover, it is preferable that all the photoelectric conversion elements can perform exposure control simultaneously.
- In this embodiment, a global shutter type area sensor camera capable of simultaneous exposure control of all photoelectric conversion elements is used.
- The horizontal alignment direction of the photoelectric conversion elements, that is, the main scanning direction (second direction), is substantially parallel to the longitudinal direction of the light irradiation unit 5 (linear light source).
- The two-dimensional imaging unit 6 is arranged such that the angle at which the light irradiation unit 5 irradiates the roll body 1 with light and the angle at which the two-dimensional imaging unit 6 receives the reflected or scattered light from the roll body 1 are the same, that is, under a regular (specular) reflection condition.
- The two-dimensional imaging unit 6 may use an optical auxiliary means such as a polarizing filter or a wavelength selection filter in order to obtain a light amount distribution or some other optical difference depending on the type of defect.
- FIG. 3 is an explanatory diagram of a main part of the inspection configuration in one embodiment of the present invention.
- FIG. 3 corresponds to a view in the direction of arrow Q in FIG. 2. The plane P_N shown in FIG. 3 is parallel to the optical axis of the two-dimensional imaging unit 6 and perpendicular to the longitudinal axis of the roll body 1.
- The light emitting surface 5a of the light irradiation unit 5 is rectangular.
- The longitudinal direction of the light emitting surface 5a is parallel to the longitudinal axis of the roll body 1.
- The light emitting surface 5a is disposed so as to be orthogonal to the regular-reflection optical axis shared with the two-dimensional imaging unit 6 via the inspection surface of the roll body 1.
- The length of the light emitting surface 5a in the longitudinal direction is set sufficiently longer than the spread defined by the angle of view captured by the two-dimensional imaging unit 6, so that the light source can be regarded as effectively infinite in the measurement.
- Specifically, when the length of the light emitting surface 5a in the longitudinal direction (illumination length) is L1 and the length required for the light receiving range of the two-dimensional imaging unit 6 in the main scanning direction (the longitudinal-axis direction of the roll body 1) is L2 (required length), L1 > L2 is satisfied, enabling light irradiation of uniform brightness in the main scanning direction.
- In this embodiment, the installation position of the light irradiation unit 5 (its distance to the surface of the roll body 1) is the same as that of the two-dimensional imaging unit 6, whereby the required illumination length is ensured.
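As an aside (the relation below is ordinary viewing geometry, not a formula given in this specification; d denotes the distance from the two-dimensional imaging unit 6 to the roll surface and θ its angle of view, both our notation), the required length L2 of the light receiving range can be estimated and compared with the illumination length L1:

```latex
L_2 \approx 2\, d \tan\!\left(\tfrac{\theta}{2}\right), \qquad L_1 > L_2
```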
- Reference numeral 7 denotes an image processing unit, which is connected to the two-dimensional imaging unit 6. Information on the light received by the two-dimensional imaging unit 6 is photoelectrically converted and received as two-dimensional image data by the image processing unit 7.
- the image processing unit 7 extracts a defect portion from the two-dimensional image data, and records / displays the information.
- the defect occurrence position in the conveyance direction may be determined based on a signal from an encoder for measuring a conveyance distance (not shown), or the defect occurrence position may be determined based on an elapsed time from the start of inspection.
- the origin position of the inspection may be determined based on information from a position detection sensor (not shown) prepared so that the origin of the roll body 1 in the conveyance direction can be detected.
- the defect occurrence position in the width direction of the roll body 1 may be determined based on which element position in the main scanning direction of the photoelectric conversion element of the area sensor camera 6a of the two-dimensional imaging unit 6 is detected.
- The two-dimensional imaging unit 6 may be placed on a slider (not shown) that can move in a direction substantially parallel to the longitudinal direction of the roll body 1, and the defect occurrence position in the width direction of the roll body 1 may be managed using the value obtained by adding the amount of movement of the slider to the element position, in the main scanning direction, of the photoelectric conversion elements of the area sensor camera 6a (see the sketch below).
- The light irradiation unit 5 may be installed on the slider in the same manner as the two-dimensional imaging unit 6, or it may be arranged independently with a length capable of irradiating the entire length of the roll body 1 with substantially uniform brightness.
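As an illustration of the width-direction position management described above (the function name and the conversion of the element index to a length via an object-side pixel pitch are our assumptions, not values given in the text), the bookkeeping amounts to:

```python
def width_direction_position(slider_travel_mm: float, element_index: int, pixel_pitch_mm: float) -> float:
    """Width-direction defect position = slider movement + position implied by which
    photoelectric conversion element (main-scanning index) detected the defect,
    converted to a length with an assumed object-side pixel pitch."""
    return slider_travel_mm + element_index * pixel_pitch_mm
```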
- The image processing unit 7 comprises: a scanning position determination unit 7a that determines the scanning position (for example, scanning position PA) in the sub-scanning direction (first direction) of the two-dimensional image data, executed only at a predetermined cycle; a time-series scanned image generation unit 7b that, every time the two-dimensional imaging unit 6 obtains two-dimensional image data, extracts the image data in the main scanning direction (second direction) at the scanning position PA determined by the scanning position determination unit 7a and arranges the extracted main-scanning image data in time-series order in the sub-scanning direction to generate a time-series scanned image; and an inspection unit 7c that inspects the time-series scanned image and detects defect images.
- the scanning position determination unit 7a has a memory such as a buffer (not shown), and holds the determined scanning position PA in this memory.
- the sub-scanning direction corresponds to the movement (rotation) direction of the object to be detected (the circumferential direction of the roll body 1), and the main scanning direction and the sub-scanning direction are perpendicular to each other.
- The scanning position determination unit 7a includes: a luminance profile creation unit 71 that calculates, from the two-dimensional image data obtained by the two-dimensional imaging unit 6, the integrated value of the luminance of the pixels in the main scanning direction at each position in the sub-scanning direction and arranges these values in the sub-scanning direction to create a luminance profile; a luminance peak position calculation unit 72 that calculates the position of the highest luminance peak from the luminance profile created by the luminance profile creation unit 71; a luminance measurement unit 73 that measures the luminance at the luminance peak position calculated by the luminance peak position calculation unit 72; a luminance reduction position calculation unit 74 that calculates, from the luminance profile created by the luminance profile creation unit 71, the position at which the luminance equals the luminance at the peak position measured by the luminance measurement unit 73 multiplied by a predetermined coefficient of less than 1; and a scanning position holding unit 75 that holds the position calculated by the luminance reduction position calculation unit 74 as the scanning position PA.
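To make the division of roles between the units 7a, 7b, and 7c concrete, a minimal data-flow sketch is given below (Python/NumPy under our own assumptions; the function and dictionary names are placeholders, and the helper functions `determine_scan_position` and `inspect` are sketched later in this description):

```python
import numpy as np

def process_frame(frame: np.ndarray, state: dict) -> None:
    """One imaging cycle. frame axis 0 = sub-scanning direction, axis 1 = main scanning."""
    # 7a: redetermine the scanning position PA only at the predetermined cycle
    if state["frame_index"] % state["redetermine_every"] == 0:
        state["scan_position"] = determine_scan_position(frame)   # sketched after step S105
    # 7b: extract the single main-scanning row at PA and append it in time-series order
    state["rows"].append(frame[state["scan_position"], :].copy())
    state["frame_index"] += 1
    # 7c: once the predetermined number of rows is collected, inspect the time-series image
    if len(state["rows"]) >= state["rows_per_image"]:
        inspect(np.stack(state["rows"], axis=0))                  # sketched in the Example
        state["rows"].clear()
```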
- FIG. 4 is an explanatory diagram of a method for acquiring time-series scanned images.
- FIG. 4A shows an image at each position in the sub-scanning direction.
- FIG. 4B is a diagram showing a luminance profile at each position for each time.
- The two-dimensional image data 8 is image data captured by the two-dimensional imaging unit 6 while the imaging position on the surface of the roll body 1, which moves relatively in one direction, shifts little by little; portions in which the reflection image of the light irradiation unit 5 appears are bright and portions in which it does not appear are dark (see FIG. 4A).
- Since the two-dimensional imaging unit 6 captures images with the surface of the roll body 1 in focus, the reflection image, which is a mirror image of the light irradiation unit 5 formed by the roll body 1, has edge portions that are unclear due to defocus; the image becomes brighter toward the center of the reflection image and darker away from its edges.
- Here, the brightness indicates the distribution of the reflection intensity (luminance) of the light from the light irradiation unit within the imaging region.
- Since the light irradiation unit 5 is long in one direction and has a light projecting portion of substantially uniform brightness, the unclear region at the edges is limited to the sub-scanning direction. Therefore, the two-dimensional image data 8 is captured as a linear bright line that extends in the main scanning direction and whose brightness undulates in the sub-scanning direction.
- The luminance profile 9 is shown to facilitate understanding and represents the change in luminance value in the sub-scanning direction as a profile waveform (see FIG. 4B).
- The luminance peak position 10 of the luminance profile 9 corresponds to the brightest sub-scanning position of the two-dimensional image data 8; this position corresponds to the imaging position under the specular reflection condition, that is, the condition in which the light incident angle formed by the light irradiation unit 5 and the roll body 1 equals the light receiving angle formed by the two-dimensional imaging unit 6 and the roll body 1.
- The position 11 corresponds to the position on the luminance profile 9 at which the luminance equals the luminance at the peak position 10 multiplied by a predetermined coefficient of less than 1.
- When an uneven (concavo-convex) deformation portion (for example, the surface defect 4 in FIG. 2) passes through the position 11 in the two-dimensional image data 8, the luminance at that position fluctuates toward either the bright side or the dark side.
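Stated as a formula (our notation; the coefficient value 0.3 appears in the Example later in this description), the position 11 is the sub-scanning position p11 at which the profile luminance equals a fixed fraction k of the luminance at the peak position:

```latex
I(p_{11}) = k \cdot I(p_{\mathrm{peak}}), \qquad 0 < k < 1 \quad (\text{e.g. } k = 0.3)
```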
- The scanning position determination unit 7a determines the scanning position PA, that is, the inspection position in the vertical (sub-scanning) direction of the two-dimensional image data, based on the two-dimensional image data received from the two-dimensional imaging unit 6. The flow by which the scanning position determination unit 7a determines the scanning position PA will be described with reference to FIG. 5.
- FIG. 5 is an explanatory diagram of a determination flow of the scanning position PA in the present invention.
- Step S101 is the step that starts the determination flow of the scanning position PA and determines whether or not to execute the subsequent flow.
- When the image processing unit 7 determines to execute the determination flow (step S101: Yes), the process proceeds to step S102.
- When the image processing unit 7 determines not to execute the determination flow of the scanning position PA (step S101: No), the process proceeds to step S106.
- Whether or not to execute the flow follows a predetermined cycle: for example, the flow may be executed only at the start of inspection, every time a certain distance or time has been inspected, or every time two-dimensional image data is received from the two-dimensional imaging unit 6.
- Step S102 is a step in which the luminance profile creation unit 71 acquires a luminance profile in the sub-scanning direction from the two-dimensional image data.
- The luminance profile may be obtained using all pixel information of the two-dimensional image data in the main scanning direction, or using only a predetermined partial region in the main scanning direction.
- The profile value at each position in the sub-scanning direction may be any of: the integrated value of the luminance values of the pixels in the main scanning direction, the average value of those luminance values, or a value obtained by fitting a model based on a Gaussian function to all calculation-target pixels in the main scanning direction (a sketch follows).
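A minimal sketch of step S102 under our own assumptions (NumPy/SciPy, our function names; the choice between sum, average, and a Gaussian-based model fit is left open by the text above, and the Gaussian fit shown here is applied to the sub-scanning profile as one possible reading of the model-fitting option):

```python
import numpy as np
from scipy.optimize import curve_fit

def luminance_profile(frame: np.ndarray, use_mean: bool = False) -> np.ndarray:
    """Integrate (or average) the pixel luminance along the main scanning direction
    (axis 1) at each position in the sub-scanning direction (axis 0)."""
    return frame.mean(axis=1) if use_mean else frame.sum(axis=1)

def fit_gaussian_profile(profile: np.ndarray) -> np.ndarray:
    """Optional refinement: fit a Gaussian-plus-offset model to the profile and
    return the fitted parameters (amplitude, center, width, offset)."""
    x = np.arange(profile.size, dtype=np.float64)
    def model(x, a, mu, sigma, c):
        return a * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)) + c
    p0 = (float(profile.max() - profile.min()), float(np.argmax(profile)), 10.0, float(profile.min()))
    params, _ = curve_fit(model, x, profile, p0=p0)
    return params
```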
- Step S103 is a step in which the luminance peak position calculation unit 72 calculates the luminance peak position from the luminance profile.
- The luminance peak position may be the position in the sub-scanning direction at which the luminance profile takes its maximum value, or the peak position may be obtained by model fitting using a reflection image model function determined from the shape of the light emitting area of the light irradiation unit 5, the distance between the light irradiation unit 5 and the roll body 1, and the distance between the two-dimensional imaging unit 6 and the roll body 1.
- Step S104 is a step in which the luminance measuring unit 73 measures the luminance of the luminance peak value.
- As the luminance peak value, the profile value at the luminance peak position of the luminance profile may be used, or a model profile value at the peak position obtained by model fitting using a reflection image model function determined from the distance between the light emitting surface of the light irradiation unit 5 (for example, the light emitting surface 5a shown in FIG. 2) and the roll body 1 and the distance between the two-dimensional imaging unit 6 (the light receiving surface of its elements) and the roll body 1 may be used.
- Step S105 is a step in which the luminance reduction position calculation unit 74 calculates the luminance reduction position.
- The luminance reduction position may be determined as the position corresponding to the luminance obtained by multiplying the luminance at the peak position measured in the preceding steps by a luminance reduction coefficient of less than 1, or as the position corresponding to the luminance obtained by multiplying the luminance value at the luminance peak position obtained from a luminance profile model function by a luminance reduction coefficient of less than 1.
- Since there are two luminance reduction positions, one on each side of the peak position in the sub-scanning direction, either of them may be adopted as the luminance reduction position.
- Alternatively, the distances from the peak position to the luminance reduction position in the negative direction and to the luminance reduction position in the positive direction may be calculated, and the position separated from the peak position by the average of these distances, in either the negative or the positive direction, may be adopted as the luminance reduction position.
- In any case, the calculation direction of the luminance reduction position is always limited to one of the negative and positive directions with respect to the peak position (a sketch of steps S103 to S105 follows).
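A sketch of steps S103 to S105 under our own assumptions (plain arg-max peak detection, a one-sided search in the positive sub-scanning direction, and the coefficient k passed as a parameter; the two-sided averaging variant described above is only noted in a comment):

```python
import numpy as np

def determine_scan_position(frame: np.ndarray, k: float = 0.3) -> int:
    """Determine the scanning position PA from one two-dimensional image:
    take the peak of the sub-scanning luminance profile (S103), measure its
    luminance (S104), then return the first position in the positive direction
    whose profile value falls below k * peak, with k < 1 (S105)."""
    profile = frame.sum(axis=1)                 # S102: luminance profile (sum over main scanning)
    peak_pos = int(np.argmax(profile))          # S103: luminance peak position
    peak_val = float(profile[peak_pos])         # S104: luminance at the peak
    threshold = k * peak_val                    # S105: reduced-luminance threshold
    below = np.flatnonzero(profile[peak_pos:] < threshold)
    # Variant (not shown): also search in the negative direction and place PA at the
    # average of the two distances from the peak, as described above.
    if below.size == 0:
        return profile.shape[0] - 1             # fallback: no crossing found in this frame
    return peak_pos + int(below[0])
```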
- Step S106 is a step in which the scanning position holding unit 75 holds the scanning position PA.
- In this step, the luminance reduction position calculated in step S105 is held as the scanning position PA. When the determination flow of the scanning position PA is not executed, the luminance reduction position calculated in the most recent execution of the flow continues to be held as the scanning position PA.
- FIG. 6A is an explanatory diagram illustrating an example of the scanning position when the bright line width on the surface of the object to be detected changes and the bright line width has the assumed thickness.
- FIG. 6B is an explanatory diagram illustrating an example of the scanning position when the bright line width on the surface of the object to be detected changes and the bright line width is narrower than the assumed thickness. As shown in FIGS. 6A and 6B, because the scanning position PA is determined from the luminance profile of the actually captured image, the scanning position remains within the bright line even when the bright line width changes.
- Every time two-dimensional image data is received from the two-dimensional imaging unit 6, the time-series scanned image generation unit 7b extracts the image data of the single row in the main scanning direction corresponding to the scanning position PA held by the scanning position determination unit 7a, and generates a time-series scanned image by combining a predetermined number of such rows (that is, a predetermined number of imaging operations) in time-series order in the sub-scanning direction (that is, the horizontal direction).
- the inspection unit 7c processes the generated time-series scanning image to detect a defect image.
- the method for detecting the defect image is not particularly specified, but it is desirable to detect the defect by detecting a local luminance change in the time-series scanned image.
- The various parameters used at this time may be threshold values set for the luminance values at defect image positions in the time-series scanned image, signal/image processing filters applied to the time-series scanned image, or thresholds set for the shape feature amounts of defect candidates that satisfy those thresholds or for luminance-information feature amounts contained in the defect candidate regions.
- These parameters may be optimized in advance, outside of inspection, or optimized sequentially during inspection; optimization in advance is preferable, since the amount of data used for this optimization is large.
- Parameter optimization in the inspection unit 7c means adjusting the parameters so that the defect locations extracted with them coincide with the locations that a person has confirmed in the defect images and judged to be defects. In practice it is difficult for the extracted defect locations and the humanly determined defect locations to coincide perfectly, but this optimization improves detection accuracy.
- a film transport roll having a hard chrome plated surface was inspected.
- an apparatus having the configuration shown in FIG. 2 was used.
- As the light irradiation unit 5, white LED illumination of 650,000 lux was used.
- The light irradiation unit 5 was installed such that its longitudinal direction was parallel to the roll rotation axis direction and its irradiation axis was inclined at 20° with respect to the normal direction of the inspection surface of the roll body 1.
- The two-dimensional imaging unit 6 was installed such that the main scanning direction of its photoelectric conversion elements was substantially parallel to the longitudinal direction of the light irradiation unit 5 (light source) and to the rotation axis direction of the roll body 1, and such that its light receiving center optical axis was inclined at 20° with respect to the normal direction of the inspection surface of the roll body 1.
- The light receiving center optical axis of the two-dimensional imaging unit 6 was inclined to the side opposite to the irradiation axis of the light irradiation unit 5 with respect to the normal of the inspection surface of the roll body 1.
- the image processing unit 7 is configured by combining a frame grabber board and a personal computer.
- the image processing unit 7 performs image processing on the two-dimensional image data obtained from the two-dimensional imaging unit 6 to generate a time series scanning image, and detects a defect image from the time series scanning image.
- The specific image processing flow is as shown in (1) to (8) below.
- (1) The two-dimensional image shown in FIG. 7 was acquired from the two-dimensional imaging unit 6 while rotating the transport roll at a constant rotational speed.
- (2) From the two-dimensional image shown in FIG. 7, the integrated value of all pixels in the main scanning direction was obtained at each pixel position in the sub-scanning direction, thereby calculating the profile value at each position in the sub-scanning direction shown in FIG. 8.
- (3) From the luminance profile shown in FIG. 8, the maximum value of the luminance profile was searched for as shown in FIG. 9, and the maximum luminance peak position was calculated by obtaining the sub-scanning position of the found maximum luminance value.
- (4) The profile value at the maximum luminance peak position was acquired as the maximum luminance value of the profile. In the luminance profile shown in FIG. 9, the maximum luminance value was 787840.
- (5) Using the value obtained by multiplying the maximum luminance value of the profile by 0.3 (here, 236352) as a threshold, a position falling below the threshold was searched for from the maximum luminance peak position in the positive sub-scanning direction and taken as the luminance reduction position.
- (6) The luminance reduction position was registered as the scanning position PA.
- (7) Image data for one row at the registered scanning position PA was acquired from each two-dimensional image received from the two-dimensional imaging unit 6; this was done for every imaging operation, with 4096 imaging operations constituting one cycle. The acquired image data of the 4096 scanning positions PA were then combined in time series to generate the time-series scanned image shown in FIG. 10. Note that, for ease of understanding, the time-series scanned image shown in FIG. 10 shows only a cropped 50 pixel × 50 pixel region containing the defect image 12, which is part of the full 4096 pixel × 4096 pixel image.
- (8) For the obtained time-series scanned image shown in FIG. 10, binarization was performed with a bright-side threshold corresponding to a 20% increase in the luminance value of a normal portion (for example, a portion where no defect exists) and with a dark-side threshold corresponding to a 20% decrease, yielding the two binary images of FIGS. 11 and 12. The two binary images were then combined by a logical OR operation to generate the combined image shown in FIG. 13. Further, dilation/erosion processing was performed, the detected bright and dark regions were joined, and only regions whose area exceeded 100 pixels were taken as defect regions 13, giving the image shown in FIG. 14. Note that FIGS. 11 to 14 each show, for ease of understanding, only a cropped 50 pixel × 50 pixel region containing the defect image, which is part of the full 4096 pixel × 4096 pixel image.
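A sketch of processing step (8) under our own assumptions (OpenCV/NumPy; only the ±20% thresholds, the logical OR, the dilation/erosion step, and the 100-pixel area cut come from the Example itself, while the estimate of the normal-portion luminance by the image median, the 3×3 structuring element, and the function names are ours):

```python
import cv2
import numpy as np

def inspect(ts_image: np.ndarray, area_min: int = 100) -> list:
    """Detect defect regions in a time-series scanned image: binarize with bright/dark
    thresholds set +/-20% around the normal-portion luminance, OR the two masks,
    apply dilation/erosion, and keep only regions larger than area_min pixels."""
    img = ts_image.astype(np.float32)
    normal = float(np.median(img))                    # assumed estimate of the normal luminance
    bright = (img > normal * 1.2).astype(np.uint8)    # bright-side binarization (cf. FIG. 11)
    dark = (img < normal * 0.8).astype(np.uint8)      # dark-side binarization (cf. FIG. 12)
    combined = cv2.bitwise_or(bright, dark)           # logical OR of the two masks (cf. FIG. 13)
    kernel = np.ones((3, 3), np.uint8)                # hypothetical structuring element
    combined = cv2.morphologyEx(combined, cv2.MORPH_CLOSE, kernel)   # dilation then erosion
    n, _, stats, _ = cv2.connectedComponentsWithStats(combined, connectivity=8)
    # keep only connected regions whose area exceeds area_min pixels (defect regions, cf. FIG. 14)
    return [tuple(int(v) for v in stats[i, :4]) for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] > area_min]
```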
- As described above, a defect image can be detected by generating a time-series scanned image from the two-dimensional image data acquired by the two-dimensional imaging unit 6 and performing image processing on the time-series scanned image.
- The cylindrical body surface inspection method and cylindrical body surface inspection apparatus according to the present invention are useful for performing stable, highly accurate inspection in cylindrical body surface defect inspection, even when the relative position between the surface of the cylindrical body and the inspection apparatus changes.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Biochemistry (AREA)
- Immunology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Quality & Reliability (AREA)
- Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
- Image Analysis (AREA)
- Length Measuring Devices By Optical Means (AREA)
Description
(i) Irradiation light is applied from one direction by an illumination means onto the surface of the object to be measured, which moves relatively.
(ii) The light reflected by the surface of the object to be measured is imaged using an imaging means based on a two-dimensional optical element capable of partial readout.
(iii) Based on the reflected-light distribution of each pixel column in the sub-scanning direction, which corresponds to the movement direction of the object to be measured in the two-dimensional image acquired by the imaging means, the pixel at which the bright line is brightest in that sub-scanning pixel column is identified.
(iv) A plurality of pixels, including the pixel at a specified offset position from the pixel identified in (iii) and pixels adjacent to it, are selected.
(v) A line image in the main scanning direction, orthogonal to the sub-scanning direction, is obtained from the sum or average of the light amounts of the plural pixels selected in (iv).
(vi) Using the line image obtained in (v) as the main-scanning image, sub-scanning is performed by imaging continuously a plurality of times, yielding a two-dimensional image of the surface of the object to be measured.
a light irradiation unit that irradiates the cylindrical body with light,
a two-dimensional imaging unit disposed at a position for receiving the light emitted from the light irradiation unit and reflected by the surface of the cylindrical body,
a scanning position determination unit that, for the two-dimensional image data obtained by the two-dimensional imaging unit, determines at a predetermined cycle a scanning position in a first direction of the two-dimensional image data, the scanning position corresponding to the circumferential direction of the cylindrical body,
a time-series scanned image generation unit that extracts, from the two-dimensional image data, the image data in a second direction perpendicular to the first direction at the scanning position determined by the scanning position determination unit, performs this extraction on a plurality of pieces of the two-dimensional image data obtained by the two-dimensional imaging unit, and arranges the extracted pieces of image data in the second direction in time-series order in the first direction to generate a time-series scanned image,
and an inspection unit that inspects the time-series scanned image to detect defects,
wherein the scanning position determination unit comprises:
a luminance profile creation unit that calculates, from the two-dimensional image data obtained by the two-dimensional imaging unit, the integrated value of the luminance of each pixel in the second direction at each position in the first direction and arranges these values in the first direction to create a luminance profile,
a luminance peak position calculation unit that calculates, from the luminance profile created by the luminance profile creation unit, the peak position with the highest luminance,
a luminance measurement unit that measures the luminance at the luminance peak position calculated by the luminance peak position calculation unit,
a luminance reduction position calculation unit that calculates, from the luminance profile created by the luminance profile creation unit, the position in the first direction corresponding to the luminance obtained by multiplying the luminance at the peak position measured by the luminance measurement unit by a predetermined coefficient of less than 1,
and a scanning position holding unit that holds the position calculated by the luminance reduction position calculation unit as the scanning position.
While relatively moving the cylindrical body in one direction at the inspection position, the cylindrical body is irradiated with light,
the light reflected by the surface of the cylindrical body is imaged two-dimensionally,
a scanning position in a first direction of the two-dimensional image data, corresponding to the circumferential direction of the cylindrical body, is determined only at a predetermined cycle,
from the two-dimensional image data, the image data in a second direction perpendicular to the first direction at each determined position in the first direction is extracted, this extraction being performed on a plurality of pieces of two-dimensional image data, and the extracted pieces of image data are arranged in time-series order in the first direction to generate a time-series scanned image,
and the time-series scanned image is inspected to detect defects,
wherein the scanning position is determined by:
calculating, from the two-dimensional image data, the integrated value of the luminance of each pixel in the second direction at each position in the first direction and arranging these values in the first direction to obtain a luminance profile,
and taking, from the luminance profile, the position in the first direction corresponding to the luminance obtained by multiplying the luminance at the peak position with the highest luminance by a predetermined coefficient of less than 1.
(1) The two-dimensional image shown in FIG. 7 was acquired from the two-dimensional imaging unit 6 while rotating the transport roll at a constant rotational speed.
(2) From the two-dimensional image shown in FIG. 7, the integrated value of all pixels in the main scanning direction was obtained at each pixel position in the sub-scanning direction, thereby calculating the profile value at each position in the sub-scanning direction shown in FIG. 8.
(3) From the luminance profile shown in FIG. 8, the maximum value of the luminance profile was searched for as shown in FIG. 9, and the maximum luminance peak position was calculated by obtaining the sub-scanning position of the found maximum luminance value.
(4) The profile value at the maximum luminance peak position was acquired as the maximum luminance value of the profile. In the luminance profile shown in FIG. 9, the maximum luminance value was 787840.
(5) Using the value obtained by multiplying the maximum luminance value of the profile by 0.3 (here, 236352) as a threshold, a position falling below the threshold was searched for from the maximum luminance peak position in the positive sub-scanning direction and taken as the luminance reduction position.
(6) The luminance reduction position was registered as the scanning position PA.
(7) Image data for one row at the registered scanning position PA was acquired from each two-dimensional image received from the two-dimensional imaging unit 6; this was done for every imaging operation, with 4096 imaging operations constituting one cycle. The acquired image data of the 4096 scanning positions PA were then combined in time series to generate the time-series scanned image shown in FIG. 10. Note that, for ease of understanding, the time-series scanned image in FIG. 10 shows only a cropped 50 pixel × 50 pixel region containing the defect image 12, which is part of the full 4096 pixel × 4096 pixel image.
(8) For the obtained time-series scanned image shown in FIG. 10, binarization was performed with a bright-side threshold corresponding to a 20% increase in the luminance value of a normal portion (for example, a portion with no defect) and with a dark-side threshold corresponding to a 20% decrease, yielding the two binary images of FIGS. 11 and 12. The two binary images were then combined by a logical OR operation to generate the combined image shown in FIG. 13. Furthermore, dilation/erosion processing was performed, the detected bright and dark regions were joined, and only regions whose area exceeded 100 pixels were taken as defect regions 13, giving the image shown in FIG. 14. Note that FIGS. 11 to 14 each show, for ease of understanding, only a cropped 50 pixel × 50 pixel region containing the defect image, which is part of the full 4096 pixel × 4096 pixel image.
2 Core
3 Bearing
4 Surface defect
5 Light irradiation unit
6 Two-dimensional imaging unit
6a Area sensor camera
6b Lens
7 Image processing unit
7a Scanning position determination unit
7b Time-series scanned image generation unit
7c Inspection unit
8 Two-dimensional image data
9 Luminance profile
10 Luminance peak position
11 Position
71 Luminance profile creation unit
72 Luminance peak position calculation unit
73 Luminance measurement unit
74 Luminance reduction position calculation unit
75 Scanning position holding unit
Claims (4)
- A cylindrical body surface inspection apparatus for inspecting the surface of a cylindrical body that moves relatively in one direction at an inspection position, comprising:
a light irradiation unit that irradiates the cylindrical body with light;
a two-dimensional imaging unit disposed at a position for receiving the light emitted from the light irradiation unit and reflected by the surface of the cylindrical body;
a scanning position determination unit that, for the two-dimensional image data obtained by the two-dimensional imaging unit, determines at a predetermined cycle a scanning position in a first direction of the two-dimensional image data, the scanning position corresponding to the circumferential direction of the cylindrical body;
a time-series scanned image generation unit that extracts, from the two-dimensional image data, the image data in a second direction perpendicular to the first direction at the scanning position determined by the scanning position determination unit, performs this extraction on a plurality of pieces of the two-dimensional image data obtained by the two-dimensional imaging unit, and arranges the extracted pieces of image data in the second direction in time-series order in the first direction to generate a time-series scanned image;
and an inspection unit that inspects the time-series scanned image to detect defects,
wherein the scanning position determination unit comprises:
a luminance profile creation unit that calculates, from the two-dimensional image data obtained by the two-dimensional imaging unit, the integrated value of the luminance of each pixel in the second direction at each position in the first direction and arranges these values in the first direction to create a luminance profile;
a luminance peak position calculation unit that calculates, from the luminance profile created by the luminance profile creation unit, the peak position with the highest luminance;
a luminance measurement unit that measures the luminance at the luminance peak position calculated by the luminance peak position calculation unit;
a luminance reduction position calculation unit that calculates, from the luminance profile created by the luminance profile creation unit, the position in the first direction corresponding to the luminance obtained by multiplying the luminance at the peak position measured by the luminance measurement unit by a predetermined coefficient of less than 1;
and a scanning position holding unit that holds the position calculated by the luminance reduction position calculation unit as the scanning position. - The cylindrical body surface inspection apparatus according to claim 1, wherein the light irradiation unit is a linear light source, and the direction of the central axis of the cylindrical body, the longitudinal direction of the linear light source, and the second direction of the two-dimensional imaging unit are arranged parallel to one another.
- A cylindrical body surface inspection method for inspecting the surface of a cylindrical body, comprising:
irradiating the cylindrical body with light while relatively moving the cylindrical body in one direction at an inspection position;
imaging, two-dimensionally, the light reflected by the surface of the cylindrical body;
determining, only at a predetermined cycle, a scanning position in a first direction of the two-dimensional image data, the scanning position corresponding to the circumferential direction of the cylindrical body;
extracting, from the two-dimensional image data, the image data in a second direction perpendicular to the first direction at each determined position in the first direction, performing this extraction on a plurality of pieces of the two-dimensional image data, and arranging the extracted pieces of image data in time-series order in the first direction to generate a time-series scanned image;
and inspecting the time-series scanned image to detect defects,
wherein the scanning position is determined by:
calculating, from the two-dimensional image data, the integrated value of the luminance of each pixel in the second direction at each position in the first direction and arranging these values in the first direction to obtain a luminance profile;
and taking, from the luminance profile, the position in the first direction corresponding to the luminance obtained by multiplying the luminance at the peak position with the highest luminance by a predetermined coefficient of less than 1. - The cylindrical body surface inspection method according to claim 3, wherein the surface of the cylindrical body is irradiated with linear light, and the direction of the central axis of the cylindrical body, the longitudinal direction of the linear light, and the main scanning direction in the two-dimensional imaging are made parallel to one another.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/489,390 US10955354B2 (en) | 2017-03-17 | 2018-03-02 | Cylindrical body surface inspection device and cylindrical body surface inspection method |
EP18768694.4A EP3598112B1 (en) | 2017-03-17 | 2018-03-02 | Cylindrical body surface inspection device and cylindrical body surface inspection method |
JP2018512346A JP7010213B2 (ja) | 2017-03-17 | 2018-03-02 | 円筒体表面検査装置および円筒体表面検査方法 |
MYPI2019005132A MY195515A (en) | 2017-03-17 | 2018-03-02 | Cylindrical Body Surface Inspection Device and Cylindrical Body Surface Inspection Method |
CN201880017052.6A CN110402386B (zh) | 2017-03-17 | 2018-03-02 | 圆筒体表面检查装置及圆筒体表面检查方法 |
KR1020197023846A KR102409084B1 (ko) | 2017-03-17 | 2018-03-02 | 원통체 표면 검사 장치 및 원통체 표면 검사 방법 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-052310 | 2017-03-17 | ||
JP2017052310 | 2017-03-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018168510A1 true WO2018168510A1 (ja) | 2018-09-20 |
Family
ID=63522259
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/007917 WO2018168510A1 (ja) | 2017-03-17 | 2018-03-02 | 円筒体表面検査装置および円筒体表面検査方法 |
Country Status (7)
Country | Link |
---|---|
US (1) | US10955354B2 (ja) |
EP (1) | EP3598112B1 (ja) |
JP (1) | JP7010213B2 (ja) |
KR (1) | KR102409084B1 (ja) |
CN (1) | CN110402386B (ja) |
MY (1) | MY195515A (ja) |
WO (1) | WO2018168510A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6939640B2 (ja) * | 2018-02-23 | 2021-09-22 | オムロン株式会社 | 画像検査装置および画像検査方法 |
CN114074294B (zh) * | 2020-08-19 | 2022-11-18 | 宝山钢铁股份有限公司 | 一种用于圆钢修磨质量检测系统的抑制炫光方法 |
CN117250208B (zh) * | 2023-11-20 | 2024-02-06 | 青岛天仁微纳科技有限责任公司 | 基于机器视觉的纳米压印晶圆缺陷精准检测系统及方法 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004108828A (ja) | 2002-09-13 | 2004-04-08 | Ricoh Co Ltd | 画像入力方法、画像入力装置及び表面欠陥検査装置 |
JP2007071562A (ja) * | 2005-09-05 | 2007-03-22 | Toray Ind Inc | 表面検査装置、表面検査方法およびフィルムの製造方法 |
US20080192243A1 (en) * | 2007-02-09 | 2008-08-14 | Kamran Uz Zaman | Plural light source and camera to detect surface flaws |
JP2013195169A (ja) * | 2012-03-16 | 2013-09-30 | Jfe Steel Corp | 円筒体又は円柱体材料の表面欠陥検査方法及び装置 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6169600B1 (en) * | 1998-11-20 | 2001-01-02 | Acuity Imaging, Llc | Cylindrical object surface inspection system |
JP3919618B2 (ja) * | 2002-07-10 | 2007-05-30 | キヤノン株式会社 | 記録媒体判別方法、プログラム、記憶媒体および記録装置 |
CN101287960A (zh) * | 2004-10-13 | 2008-10-15 | 阿克罗米特里克斯有限责任公司 | 测量连续移动样品的样品表面平整度的系统与方法 |
JP5006551B2 (ja) * | 2006-02-14 | 2012-08-22 | 住友化学株式会社 | 欠陥検査装置及び欠陥検査方法 |
US8818746B1 (en) * | 2010-03-26 | 2014-08-26 | The United States Of America As Represented By The Secretary Of The Army | Crack detection in thick-walled cylinders |
US9148601B2 (en) * | 2012-09-26 | 2015-09-29 | Teledyne Dalsa, Inc. | CMOS TDI image sensor with rolling shutter pixels |
CN103286081B (zh) * | 2013-05-07 | 2015-04-22 | 浙江工业大学 | 基于单目多视角机器视觉的钢珠表面缺陷在线自动分选装置 |
US9106857B1 (en) * | 2014-05-09 | 2015-08-11 | Teledyne Dalsa, Inc. | Dynamic fixed-pattern noise reduction in a CMOS TDI image sensor |
US10242437B2 (en) * | 2016-01-15 | 2019-03-26 | W. L. Gore & Associates, Inc. | Systems and methods for detecting syringe seal defects |
-
2018
- 2018-03-02 EP EP18768694.4A patent/EP3598112B1/en active Active
- 2018-03-02 WO PCT/JP2018/007917 patent/WO2018168510A1/ja active Application Filing
- 2018-03-02 MY MYPI2019005132A patent/MY195515A/en unknown
- 2018-03-02 CN CN201880017052.6A patent/CN110402386B/zh active Active
- 2018-03-02 KR KR1020197023846A patent/KR102409084B1/ko active IP Right Grant
- 2018-03-02 US US16/489,390 patent/US10955354B2/en active Active
- 2018-03-02 JP JP2018512346A patent/JP7010213B2/ja active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004108828A (ja) | 2002-09-13 | 2004-04-08 | Ricoh Co Ltd | 画像入力方法、画像入力装置及び表面欠陥検査装置 |
JP2007071562A (ja) * | 2005-09-05 | 2007-03-22 | Toray Ind Inc | 表面検査装置、表面検査方法およびフィルムの製造方法 |
US20080192243A1 (en) * | 2007-02-09 | 2008-08-14 | Kamran Uz Zaman | Plural light source and camera to detect surface flaws |
JP2013195169A (ja) * | 2012-03-16 | 2013-09-30 | Jfe Steel Corp | 円筒体又は円柱体材料の表面欠陥検査方法及び装置 |
Also Published As
Publication number | Publication date |
---|---|
US20200116648A1 (en) | 2020-04-16 |
KR102409084B1 (ko) | 2022-06-15 |
CN110402386B (zh) | 2021-08-06 |
MY195515A (en) | 2023-01-29 |
EP3598112A4 (en) | 2021-01-06 |
EP3598112B1 (en) | 2021-12-15 |
EP3598112A1 (en) | 2020-01-22 |
JPWO2018168510A1 (ja) | 2020-01-16 |
CN110402386A (zh) | 2019-11-01 |
US10955354B2 (en) | 2021-03-23 |
JP7010213B2 (ja) | 2022-01-26 |
KR20190128151A (ko) | 2019-11-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP3754003B2 (ja) | 欠陥検査装置及びその方法 | |
US7924418B2 (en) | Inspection apparatus and method | |
JP5031691B2 (ja) | 表面疵検査装置 | |
TWI648534B (zh) | 磊晶晶圓之裏面檢查方法、磊晶晶圓裏面檢查裝置、磊晶成長裝置之升降銷管理方法以及磊晶晶圓之製造方法 | |
WO2018168510A1 (ja) | 円筒体表面検査装置および円筒体表面検査方法 | |
JP5068731B2 (ja) | 表面疵検査装置、表面疵検査方法及びプログラム | |
JP2012078144A (ja) | 透明体シート状物の表面欠陥検査装置 | |
WO2017169242A1 (ja) | 欠陥検査装置、及び欠陥検査方法 | |
KR20180050369A (ko) | 에피택셜 웨이퍼 이면 검사 장치 및 그것을 이용한 에피택셜 웨이퍼 이면 검사 방법 | |
JP5842373B2 (ja) | 表面欠陥検出方法、および表面欠陥検出装置 | |
JP2008025990A (ja) | 鋼帯表面欠陥の検出方法及び検出装置 | |
JP2010078485A (ja) | 印刷物の検査方法 | |
WO2023047866A1 (ja) | シート状物の凹凸測定装置、シート状物の凹凸測定方法 | |
JP2012058206A (ja) | マスクの欠陥検査方法及び欠陥検査装置 | |
JP2011085468A (ja) | シート状物の欠陥検査方法 | |
JP2012173194A (ja) | 表面検査装置、表面検査方法およびフィルムの製造方法 | |
JP2009222683A (ja) | 表面検査方法および装置 | |
JP4023295B2 (ja) | 表面検査方法及び表面検査装置 | |
JP4549838B2 (ja) | 光沢度測定方法および装置 | |
JP2010230450A (ja) | 物体表面検査装置 | |
JP2001141659A (ja) | 画像撮像装置及び欠陥検出装置 | |
JP2007198762A (ja) | 欠陥検出方法および欠陥検出装置 | |
JP2012141322A (ja) | 表面欠陥検査装置 | |
JP2004125629A (ja) | 欠陥検出装置 | |
JP2015049091A (ja) | ロール状フィルムの凹凸欠陥検査装置及び検査方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2018512346 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18768694 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20197023846 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2018768694 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2018768694 Country of ref document: EP Effective date: 20191017 |