WO2019171474A1 - Surface property inspection device, surface property inspection method, and program


Info

Publication number
WO2019171474A1
WO2019171474A1 (PCT/JP2018/008605)
Authority
WO
WIPO (PCT)
Prior art keywords
image
line width
light
line
luminance
Prior art date
Application number
PCT/JP2018/008605
Other languages
English (en)
Japanese (ja)
Inventor
武男 中田
酒井 宏樹
俊博 筒井
剛史 真坂
Original Assignee
日本製鉄株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本製鉄株式会社 filed Critical 日本製鉄株式会社
Priority to PCT/JP2018/008605 priority Critical patent/WO2019171474A1/fr
Publication of WO2019171474A1 publication Critical patent/WO2019171474A1/fr

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/89 Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
    • G01N21/892 Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles, characterised by the flaw, defect or object feature examined

Definitions

  • The present invention relates to a surface property inspection device, a surface property inspection method, and a program.
  • Examples of such machined products include a cylindrical product manufactured by hot forming a billet or the like and then adjusting the outer diameter of the resulting material by mechanical cutting or the like.
  • As inspection methods for such machined steel materials, Patent Document 1 below proposes, for example, an inspection device using a so-called light section method, in which the surface of a machined steel material is irradiated with a line laser and the entire surface is imaged, and an optical inspection that monitors the surface state from changes in the received intensity of light projected onto the surface of the machined steel material.
  • As a method for detecting minute unevenness, Patent Document 2 below proposes irradiating the surface of a machined product with slit laser light (linear light); the irradiated light is focused at concave portions and spread over convex portions.
  • This technique utilizes the characteristic that the width of the irradiated light is small where the surface of the machined product is smooth and large where there is a recess.
  • However, the inspection technique proposed in Patent Document 2 inspects only a specific part of the surface of a machined product and relies on changes in the line width of the irradiated light at that part, so the position of a defective portion is poorly visible. Further, because the technique of Patent Document 2 does not capture a two-dimensional image of portions with large surface roughness (unevenness), such portions cannot be extracted by image processing, nor can their size be specified. Furthermore, since the change in line width of the linear laser light is slight, visibility is low with the projected image of the irradiated light alone.
  • For the material obtained by hot forming a billet, which is an intermediate material, the present situation is that neither the unevenness information obtained by the light section method nor the brightness change information of the illumination reflected light can clearly reveal portions where the background of the hot-formed material remains or where rust has formed.
  • Although it is possible to image the surface of a machined product based on the luminance information of illumination reflected light, such imaging is affected by surrounding noise components (for example, the machined skin and dust), and it is very difficult to distinguish a defective portion from a normal portion.
  • The present invention has been made in view of the above problems, and an object of the present invention is to provide a surface property inspection apparatus, a surface property inspection method, and a program capable of inspecting the surface shape of a machined product obtained by machining an intermediate material while making the background remaining portions and rust of the intermediate material apparent.
  • As a result of their studies, the present inventors found that, by taking a machined product obtained by machining an intermediate material as an object to be inspected and irradiating the object with linear laser light, it is possible to obtain a depth image representing the uneven state of the surface of the object to be inspected, a luminance image representing the luminance distribution of the linear laser light on the surface of the object to be inspected, and a line width image representing the line width distribution of the linear laser light on the surface of the object to be inspected.
  • (1) A surface property inspection apparatus that takes a machined product made from an intermediate material as an object to be inspected, comprising: an illumination device that irradiates the surface of the moving object to be inspected with linear laser light; an imaging device that, by imaging the surface irradiated with the linear laser light, generates a plurality of light section images, which are captured images of the linear laser light on the surface, along the moving direction of the object to be inspected; an image calculation unit that, based on a fringe image frame in which light section lines, which are line segments corresponding to the irradiated portions of the linear laser light in each of the plurality of light section images, are arranged in order along the moving direction, calculates a depth image representing the uneven state of the surface of the object to be inspected, a luminance image representing the luminance distribution of the linear laser light on the surface, and a line width image in which the distribution of the line width of the linear laser light in the moving direction is associated with luminance values; and a detection processing unit that performs detection processing for detecting the surface properties of the object to be inspected.
  • The image calculation unit calculates the line width image by calculating, for each light section line in the fringe image frame, the difference between the line width at each position in the extending direction of the light section line and a predetermined threshold line width, and assigning a luminance value according to the magnitude of the calculated difference; and the detection processing unit detects, based on the line width image, whether a background remaining portion of the intermediate material or rust is present on the surface of the object to be inspected.
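The line-width image computation described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the frame layout (one light section image per capture, line extending along x), the intensity threshold, the gain used to map width differences to luminance, and all function names are our own choices.

```python
def line_widths(section_image, intensity_thresh):
    """For one light section image, count bright pixels per column
    (the line extends along x; its width is measured along y)."""
    height = len(section_image)
    width = len(section_image[0])
    return [
        sum(1 for y in range(height) if section_image[y][x] >= intensity_thresh)
        for x in range(width)
    ]

def line_width_image(section_images, threshold_line_width, gain=64,
                     intensity_thresh=100):
    """Build a line width image: one row per captured light section image
    (i.e. per position along the moving direction), with luminance
    proportional to (measured line width - threshold line width)."""
    rows = []
    for img in section_images:  # ordered along the moving direction
        widths = line_widths(img, intensity_thresh)
        rows.append([
            max(0, min(255, gain * (w - threshold_line_width)))
            for w in widths
        ])
    return rows
```

A column whose line is no wider than the threshold maps to luminance 0, while a widened line (a rough surface portion) maps to a bright value, so background remnants and rust stand out in the resulting image.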
  • (2) The surface property inspection apparatus according to (1), wherein the detection processing unit detects the background remaining portion and the rust based on whether the luminance value of the line width image is equal to or higher than a first threshold value for detecting the background remaining portion and rust.
  • (3) The surface property inspection apparatus according to (1) or (2), wherein the image calculation unit calculates the barycentric position in the line width direction of the light section line along the moving direction of the object to be inspected, and calculates the depth image based on the amount of displacement between the barycentric position and a reference position, which is a position designated in advance with respect to the light section image.
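The centroid-displacement depth calculation can be sketched as follows. This is a hedged illustration: the intensity-weighted centroid, the reference row `y_ref`, and the millimetre calibration factor are assumptions standing in for the patent's calibration, and the function names are our own.

```python
def centroid_positions(section_image, intensity_thresh):
    """Barycentric (intensity-weighted) y position of the light section
    line at each x position along the line; None where no line is found."""
    height = len(section_image)
    width = len(section_image[0])
    centroids = []
    for x in range(width):
        num = den = 0.0
        for y in range(height):
            v = section_image[y][x]
            if v >= intensity_thresh:
                num += y * v
                den += v
        centroids.append(num / den if den else None)
    return centroids

def depth_profile(section_image, y_ref, mm_per_px, intensity_thresh=100):
    """Depth at each x: displacement of the line centroid from the
    pre-designated reference position y_ref, scaled to millimetres."""
    return [
        (c - y_ref) * mm_per_px if c is not None else 0.0
        for c in centroid_positions(section_image, intensity_thresh)
    ]
```

Applying this per captured image and stacking the resulting rows along the moving direction yields the depth image described in the claim.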
  • (4) The surface property inspection apparatus according to any one of (1) to (3), wherein the detection processing unit specifies a defective portion based on whether the luminance values of the depth image and the luminance image are equal to or greater than a second threshold value for specifying the defective portion, and determines a defect existing on the surface of the object to be inspected.
  • (5) The surface property inspection apparatus according to (4), wherein the detection processing unit performs the detection of the defective portion on portions other than those where the background remaining portion or the rust has been detected based on the line width image.
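The two-stage thresholding of claims (2), (4), and (5) can be sketched together: a first threshold on the line width image flags rough surface portions (background remnant / rust), and a second threshold on the depth and luminance images flags defects, excluding pixels already classified as rough. The threshold values and the OR-combination of depth and luminance anomalies are assumptions for illustration.

```python
def detect_regions(line_width_img, depth_img, luminance_img,
                   first_thresh=128, second_thresh=128):
    """Stage 1: pixels whose line-width luminance reaches the first
    threshold are background remnant / rust (rough surface portions).
    Stage 2: defect candidates from the depth and luminance images,
    excluding pixels already classified as rough surface."""
    rough, defect = [], []
    for lw_row, d_row, lum_row in zip(line_width_img, depth_img, luminance_img):
        rough_row = [lw >= first_thresh for lw in lw_row]
        defect_row = [
            (d >= second_thresh or lum >= second_thresh) and not r
            for d, lum, r in zip(d_row, lum_row, rough_row)
        ]
        rough.append(rough_row)
        defect.append(defect_row)
    return rough, defect
```

The exclusion in stage 2 is what keeps widened-line regions from being double-reported as unevenness defects.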
  • (6) The surface property inspection apparatus according to any one of (1) to (5), wherein the imaging device includes at least two imaging devices that image the surface irradiated with the linear laser light from the upstream side and the downstream side in the moving direction, respectively.
  • (7) A surface property inspection method that takes a machined product made from an intermediate material as an object to be inspected, comprising: a light section image generation step of irradiating the surface of the moving object to be inspected with linear laser light and, by imaging the surface irradiated with the linear laser light, generating a plurality of light section images, which are captured images of the linear laser light on the surface, along the moving direction of the object to be inspected; an image calculation step of calculating, based on a fringe image frame in which light section lines, which are line segments corresponding to the irradiated portions of the linear laser light in each of the plurality of light section images, are arranged in order along the moving direction, a depth image representing the uneven state of the surface of the object to be inspected, a luminance image representing the luminance distribution of the linear laser light on the surface, and a line width image in which the distribution of the line width of the linear laser light in the moving direction is associated with luminance values; and a detection processing step of detecting the surface properties of the object to be inspected based on the calculated depth image, luminance image, and line width image, wherein the image calculation step calculates the line width image by calculating, for each light section line in the fringe image frame, the difference between the line width at each position in the extending direction of the light section line and a predetermined threshold line width and assigning a luminance value according to the magnitude of the calculated difference, and the detection processing step detects, based on the line width image, whether a background remaining portion of the intermediate material or rust is present on the surface of the object to be inspected.
  • (8) The surface property inspection method according to (7), wherein the background remaining portion and the rust are detected based on whether the luminance value of the line width image is equal to or higher than a first threshold value for detecting the background remaining portion and rust.
  • (9) The surface property inspection method according to (7) or (8), wherein the image calculation step calculates the barycentric position in the line width direction of the light section line along the moving direction of the object to be inspected, and calculates the depth image based on the amount of displacement between the barycentric position and a reference position, which is a position designated in advance with respect to the light section image.
  • (10) The surface property inspection method according to any one of (7) to (9), wherein a defective portion is specified based on whether the luminance values of the depth image and the luminance image are equal to or higher than a second threshold value for specifying the defective portion, and a defect present on the surface of the object to be inspected is determined.
  • (12) The surface property inspection method according to any one of (7) to (11), wherein at least two imaging devices image the surface irradiated with the linear laser light from the upstream side and the downstream side in the moving direction, respectively.
  • (13) A program for a computer capable of communicating with each of an illumination device that irradiates the surface of a moving object to be inspected, which is a machined product made from an intermediate material, with linear laser light, and an imaging device that, by imaging the surface irradiated with the linear laser light, generates a plurality of light section images, which are captured images of the linear laser light on the surface, along the moving direction of the object to be inspected, the program causing the computer to realize: an image calculation function of calculating, based on a fringe image frame in which light section lines, which are line segments corresponding to the irradiated portions of the linear laser light in each of the generated plurality of light section images, are arranged in order along the moving direction, a depth image representing the uneven state of the surface, a luminance image representing the luminance distribution of the linear laser light on the surface, and a line width image in which the distribution of the line width of the linear laser light in the moving direction is associated with luminance values; and a detection processing function of detecting the surface properties of the object to be inspected.
  • The image calculation function calculates the line width image by calculating, for each light section line in the fringe image frame, the difference between the line width at each position in the extending direction of the light section line and a predetermined threshold line width, and assigning a luminance value according to the magnitude of the calculated difference; and the detection processing function detects, based on the line width image, whether a background remaining portion of the intermediate material or rust is present on the surface of the object to be inspected.
  • (14) The program according to (13), wherein the detection processing function detects the background remaining portion and the rust based on whether the luminance value of the line width image is equal to or higher than a first threshold value for detecting the background remaining portion and rust.
  • (15) The program according to (13) or (14), wherein the image calculation function calculates the barycentric position in the line width direction of the light section line along the moving direction of the object to be inspected, and calculates the depth image based on the amount of displacement between the barycentric position and a reference position, which is a position designated in advance with respect to the light section image.
  • (16) The program according to any one of (13) to (15), wherein the detection processing function specifies a defective portion based on whether the luminance values of the depth image and the luminance image are equal to or greater than a second threshold value for specifying the defective portion, and determines a defect present on the surface of the object to be inspected.
  • (17) The program according to (16), wherein the detection processing function performs the detection of the defective portion on portions other than those where the background remaining portion or the rust has been detected based on the line width image.
  • (18) The program according to any one of (13) to (17), wherein the imaging device includes at least two imaging devices that image the surface irradiated with the linear laser light from the upstream side and the downstream side in the moving direction, respectively.
  • According to the present invention as described above, the surface shape of a machined product obtained by machining a material hot-formed from a billet or the like serving as an intermediate material can be inspected while the background remaining portions and rust on the surface of the material are made apparent.
  • FIG. 1 is an explanatory diagram showing an example of the configuration of the surface texture inspection apparatus 10 according to the present embodiment.
  • The surface texture inspection apparatus 10 is an apparatus for inspecting the surface properties of an inspection object 1 that is moving in a predetermined direction, the inspection object being a machined product obtained by machining a material hot-formed from a billet or the like serving as an intermediate material (hereinafter simply referred to as a "machined product made from an intermediate material").
  • The intermediate material is a material, such as a slab or billet, before it becomes a final product. After such an intermediate material is hot-formed, machining such as polishing and cutting is performed, whereby the machined product focused on in this embodiment is manufactured.
  • Examples of such machined products include a disk-shaped product manufactured by hot forming a billet, which is a kind of intermediate material mainly composed of iron (Fe), and then adjusting the shape of the resulting material by mechanical cutting, and a cylindrical product manufactured by hot forming a billet or the like and then adjusting the outer diameter of the material by mechanical cutting or the like.
  • the intermediate material is not limited to a material mainly composed of iron, and may be a material mainly composed of a non-ferrous metal.
  • The surface texture inspection apparatus 10 mainly includes an inspection object imaging device 100 that images the surface of the inspection object 1 (in particular, the machined surface of the machined product that is the inspection object 1), and an arithmetic processing device 200 that performs image processing on the images obtained as a result of the imaging.
  • the inspection object imaging device 100 is installed above the moving inspection object 1 as described in detail below.
  • the inspection object imaging apparatus 100 is an apparatus that sequentially images the surface of the inspection object 1 moving in a predetermined direction and outputs a captured image obtained as a result of the imaging to the arithmetic processing apparatus 200. Imaging processing by the inspected object imaging device 100 is controlled by the arithmetic processing device 200.
  • A PLG (Pulse Logic Generator: a pulse-type speed detector) is installed on the transport line that transports the machined product serving as the inspection object 1, in order to detect the moving speed of the inspection object 1.
  • The arithmetic processing device 200 periodically transmits a control signal to the inspection object imaging device 100 based on the pulses of the PLG signal input from the PLG, and the inspection object imaging device 100 performs imaging based on the transmitted control signal. As a result, the inspection object imaging device 100 can image the surface of the inspection object 1 every time the inspection object 1 moves a predetermined distance, or at predetermined time intervals.
  • As described above, the arithmetic processing device 200 controls the entire process of imaging the surface of the inspection object 1 with the inspection object imaging device 100.
  • The arithmetic processing device 200 generates a fringe image frame using the captured images generated by the inspection object imaging device 100, and performs image processing on the fringe image frame to detect various defects that may be present on the surface of the inspection object 1. Examples of such defects include flaws accompanied by changes in unevenness, pattern-type flaws, portions where the surface of the material remains as it was after the intermediate material was hot-formed (hereinafter simply referred to as "background remaining portions"), and rust.
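The construction of the fringe image frame from successive light section images can be sketched as follows. Reducing each capture to the peak brightness per column is a simplification of ours: the patent's processing also retains the line's width and centroid position, which are computed separately.

```python
def to_stripe_row(section_image):
    """Collapse one light section image to a single row: the peak
    brightness of the light section line at each x position."""
    height = len(section_image)
    width = len(section_image[0])
    return [max(section_image[y][x] for y in range(height))
            for x in range(width)]

def fringe_image_frame(section_images):
    """Arrange the light section lines of successive captures in order
    along the moving direction (one row per capture)."""
    return [to_stripe_row(img) for img in section_images]
```

Each row of the resulting frame corresponds to one capture position along the moving direction, which is what makes the frame a two-dimensional map of the inspected surface.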
  • FIGS. 2A to 4C are explanatory diagrams schematically showing an example of the configuration of the inspection subject imaging apparatus according to the present embodiment.
  • FIG. 3A is an explanatory diagram for explaining a state of reflection on the surface of a normal object to be inspected
  • FIG. 3B is an explanatory diagram for explaining a state of reflection on a rough surface portion of the object to be inspected.
  • FIG. 4A is an explanatory view schematically showing a light section image when a normal surface of an object to be inspected is imaged.
  • FIG. 4B is an explanatory view schematically showing a light section image when the surface of the object to be inspected including the concave portion is imaged.
  • FIG. 4C is an explanatory view schematically showing a light section image when the surface of the inspection object including the rough surface portion is imaged.
  • The diagram shown in the upper part of FIG. 2A is a schematic view of the inspection object imaging device 100 as seen from above the inspection object 1, and the diagram shown in the lower part of FIG. 2A is a schematic view of the inspection object imaging device 100 as seen from the side of the inspection object 1.
  • Similarly, the diagram shown in the upper part of FIG. 2B is a schematic view of the inspection object imaging device 100 as seen from above the inspection object 1, and the diagram shown in the lower part of FIG. 2B is a schematic view of the inspection object imaging device 100 as seen from the side of the inspection object 1.
  • In the following description, the moving direction of the machined product that is the inspection object 1 is taken as the positive y-axis direction, the direction orthogonal to the moving direction as the positive x-axis direction, and the vertical direction as the positive z-axis direction.
  • the inspected object imaging device 100 includes an illumination device 101 and an area camera 103 which is an example of the imaging device.
  • the illumination device 101 and the area camera 103 are fixed by known means (not shown) so that their installation positions do not change.
  • The illumination device 101 illuminates the surface of the machined product that is the inspection object 1 by irradiating it with predetermined light.
  • the illuminating device 101 includes at least a laser light source that irradiates the surface of the inspection object 1 with a linear laser beam L.
  • The illumination device 101 includes a light source unit that emits laser light of a predetermined wavelength (for example, in the visible light band), and a lens (for example, a cylindrical lens, a rod lens, or a Powell lens) that expands the laser light emitted from the light source unit in the x-axis direction while condensing it in the line width direction.
  • the line width of the linear laser beam L immediately before reaching the surface of the device under test 1 can be, for example, about several hundred ⁇ m (for example, about 200 ⁇ m).
  • On the surface of the inspection object 1 irradiated with the linear laser beam L, a linear bright portion is formed along the x-axis direction.
  • a line segment corresponding to this linear bright part is called a light cutting line.
  • The area camera 103, which is an example of the imaging device, is a device that images the surface of the inspection object 1 irradiated with the linear laser light L.
  • the area camera 103 includes a lens having a predetermined open aperture value and a focal length, and various sensors such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) that function as an imaging device.
  • the area camera 103 may be a monochrome camera or a color camera.
  • The focal length and angle of view of the lens mounted on the area camera 103, and the distance between the illumination device 101 and the image sensor of the area camera 103, are not particularly limited, but are preferably selected so that the entire x-axis direction of the surface of the inspection object 1 falls within the field of view VF. The size and pixel size of the image sensor mounted on the area camera 103 are also not particularly limited, but it is preferable to use a larger image sensor in consideration of the image quality and resolution of the generated image. Further, from the viewpoint of the image processing described below, it is preferable that the line width of the linear laser beam L be adjusted so as to correspond to about 2 to 4 pixels on the image sensor.
  • By imaging the linear laser light L irradiated onto the surface of the inspection object 1, the area camera 103 generates a so-called light section image, in which the light section line, a line segment corresponding to the irradiated portion of the linear laser light L, is captured. When the area camera 103 generates the light section image, it outputs the generated image to the arithmetic processing device 200.
  • the optical positional relationship between the illumination device 101 and the area camera 103 can be set as appropriate.
  • For example, the illumination device 101 may be provided vertically above the inspection object 1 and irradiate the inspection object 1 with the linear laser beam L vertically, while the area camera 103 images the reflected light of the linear laser beam L from a direction inclined at the angle shown in FIG. 2A with respect to the vertical direction (z-axis direction).
  • The angle shown in FIG. 2A is preferably as large as possible within the range permitted by the installation constraints of the area camera 103. This makes it possible to capture the irregular (diffuse) reflection of the light section line with the area camera 103.
  • Specifically, the angle shown in FIG. 2A is preferably about 30 to 60 degrees, for example.
  • FIG. 2A illustrates a case where the inspection object imaging device 100 includes only one area camera as an example of the imaging device; however, as illustrated in FIG. 2B, the inspection object imaging device 100 may include at least two area cameras 103 and 105 so that the surface of the inspection object 1 irradiated with the linear laser beam L can be imaged from both the upstream side and the downstream side in the moving direction.
  • In the example shown in FIG. 2B, the area camera 103 and the area camera 105 are arranged at equal angles on the upstream side and the downstream side in the moving direction of the inspection object 1.
  • By providing an area camera on each of the upstream and downstream sides in the moving direction and using the light section images output from both cameras, it becomes possible to inspect the surface properties of the inspection object 1 more accurately without being affected by the direction of surface tilt.
  • Inspection object (machined product): width (length in the x-axis direction) of about 600 mm to 1750 mm.
  • Illumination device 101: irradiates red laser light from a laser light source with an output of 100 mW. The line width of the linear laser beam L irradiated onto the surface of the inspection object 1 is 0.25 mm (250 μm), where the line width is defined at 13.5% of the peak intensity value.
  • Area camera: a CCD of 2048 pixels × 2048 pixels (pixel size: 5.5 μm × 5.5 μm) is mounted as the image sensor, and the frame rate is 200 fps. The focal length of the lens is 24 mm and the angle of view is 26°. The pixel size of the captured image is 0.25 mm × 0.25 mm, and the line width of the linear laser beam L is captured with a width of 2 to 4 pixels on the captured image.
  • When the width of the object to be inspected is large, a plurality of area cameras can be arranged in the width direction as necessary to secure pixel resolution.
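The figures above allow a quick coverage check: 2048 pixels at 0.25 mm per image pixel give a 512 mm field of view per camera, so the widest stated product (1750 mm) needs at least four cameras across the width. The camera count is our inference from the stated numbers, not a figure given in the document, and ignores any field-of-view overlap.

```python
import math

SENSOR_PIXELS = 2048    # pixels across the width direction
PIXEL_SIZE_MM = 0.25    # size of one image pixel on the object surface
MAX_WIDTH_MM = 1750     # widest inspection object stated above

fov_mm = SENSOR_PIXELS * PIXEL_SIZE_MM             # field of view per camera
cameras_needed = math.ceil(MAX_WIDTH_MM / fov_mm)  # lower bound, no overlap
```

In practice adjacent fields of view would overlap slightly, so the actual count could be higher.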
  • Image acquisition method: while the inspection object 1 is moving, images are continuously acquired in synchronization with the signal output from the PLG. Specifically, imaging is performed every 5 msec of the inspection object 1's advance. At this time, the two area cameras installed as shown in FIG. 2B are synchronized and simultaneously image the same field of view.
  • The illumination device 101 is provided vertically above and irradiates the linear laser beam L vertically downward.
  • The installation angle of each of the two area cameras is 45 degrees, one on the upstream side and one on the downstream side. Note that the imaging resolution is determined according to the size of the flaws or the like to be detected; here, the capture pitch is set to 0.25 mm to match the pixel size of the captured image.
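The 5 ms capture interval and 0.25 mm capture pitch together fix the line speed this setup assumes. A quick consistency check, derived purely from the stated numbers:

```python
CAPTURE_PITCH_MM = 0.25  # object advance between frames (matches pixel size)
FRAME_PERIOD_MS = 5.0    # one frame every 5 ms (200 fps)

frames_per_second = 1000.0 / FRAME_PERIOD_MS            # = 200 fps
line_speed_mm_s = CAPTURE_PITCH_MM * frames_per_second  # implied line speed
```

So the captured image pixels are square (0.25 mm in both x and y) only when the line moves at this implied speed; the PLG synchronization described earlier is what keeps the pitch constant if the speed varies.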
  • Since the machined product considered as the inspection object 1 in this embodiment has been subjected to machining such as polishing or grinding, its surface has substantially uniform surface properties and strong specular reflectivity. Accordingly, the surface of a normal portion (hereinafter also simply referred to as a "normal portion"), which has no unevenness flaws, background remaining portions, rust, or the like, has only extremely small unevenness of 10 μm or less.
  • the linear laser light L irradiated to the normal part shows substantially the same reflection characteristics and is imaged by the area camera.
  • As a result, the line width of the light section line takes a substantially constant value, as schematically shown in FIG. 4A.
  • Even at a concave portion, the light section image shows a light section line of substantially constant width; as schematically shown in FIG. 4B, the light section line is bent at the concave portion while maintaining its width.
  • On the other hand, the background remaining portions and rust considered in this embodiment have unevenness of about 100 μm and are rougher than the normal portion.
  • The background remaining portions and rust (hereinafter collectively referred to as "rough surface portions") function as diffusing surfaces because of their surface roughness, as schematically shown in FIG. 3B. Therefore, the linear laser beam L irradiated onto a rough surface portion is irregularly reflected.
  • As a result of this irregular reflection, in the light section image the line width of the portion corresponding to the rough surface portion is expanded compared with that of the normal portion, as schematically shown in FIG. 4C.
  • In the present embodiment, the background remaining portions and rust are made visible based on the line width of the light section line in the light section images schematically shown in FIGS. 4A to 4C. Further, the arithmetic processing device 200 generates information on the surface shape of the inspection object 1 by the so-called light section method using the light section images, and detects unevenness flaws and the like existing on the surface of the inspection object 1.
  • the configuration of the inspection object imaging device 100 according to the present embodiment and the light section image generated by the inspection object imaging device 100 have been described in detail above with reference to FIGS. 2A to 4C.
  • the arithmetic processing device 200 mainly includes an imaging control unit 201, an image processing unit 203, a display control unit 205, and a storage unit 207.
  • the imaging control unit 201 is realized by a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a communication device, and the like.
  • the imaging control unit 201 performs imaging control of the inspection object 1 by the inspection object imaging apparatus 100 according to the present embodiment. More specifically, the imaging control unit 201 sends a control signal for starting oscillation of laser light to the illumination device 101 when imaging of the inspection object 1 is started.
  • A PLG signal is periodically sent from the conveyance line of the inspection object 1 (for example, one PLG pulse is sent every time the inspection object 1 moves 0.25 mm).
  • the imaging control unit 201 sends a trigger signal for starting imaging to the area cameras 103 and 105 every time a PLG signal is acquired.
  • the image processing unit 203 is realized by, for example, a CPU, a ROM, a RAM, a communication device, and the like.
  • The image processing unit 203 generates a fringe image frame, described later, using the imaging data of the light section images acquired from the inspection object imaging apparatus 100 (more specifically, the area cameras 103 and 105 of the inspection object imaging apparatus 100). Thereafter, the image processing unit 203 performs image processing, as described below, on the generated fringe image frame to inspect the surface properties of the machined product that is the inspection object 1 and to detect various defects that may exist on its surface. When the defect detection processing on the surface of the inspection object 1 is finished, the image processing unit 203 transmits information on the obtained detection result to the display control unit 205.
  • the image processing unit 203 will be described in detail later.
  • the display control unit 205 is realized by, for example, a CPU, a ROM, a RAM, an output device, and the like.
  • the display control unit 205 transmits the surface property inspection result of the machined product that is the inspection object 1 transmitted from the image processing unit 203 to an output device such as a display provided in the arithmetic processing device 200 or the outside of the arithmetic processing device 200. Display control when displaying on the provided output device or the like is performed. Thereby, the user of the surface texture inspection apparatus 10 can grasp the inspection result regarding the surface texture of the machined product that is the inspection object 1 on the spot.
  • the storage unit 207 is realized by, for example, a RAM or a storage device included in the arithmetic processing device 200 according to the present embodiment.
  • The storage unit 207 appropriately records various parameters that need to be saved when the arithmetic processing apparatus 200 according to the present embodiment performs some processing, intermediate results of processing, and various databases, programs, and the like.
  • The storage unit 207 can be freely accessed for read/write processing by the imaging control unit 201, the image processing unit 203, the display control unit 205, and the like.
  • FIG. 5 is a block diagram illustrating an example of a configuration of an image processing unit included in the arithmetic processing apparatus according to the present embodiment.
  • the image processing unit 203 mainly includes a data acquisition unit 211, a fringe image frame generation unit 213, an image calculation unit 215, and a detection processing unit 225.
  • the data acquisition unit 211 is realized by, for example, a CPU, a ROM, a RAM, a communication device, and the like.
  • The data acquisition unit 211 acquires the image data (digital multi-valued image data) of the light section images output from the inspection object imaging apparatus 100 (more specifically, the area cameras 103 and 105) and sequentially stores them in an image memory provided in the storage unit 207 or the like. By using these digital multi-valued image data sequentially along the moving direction of the inspection object 1, a fringe image frame as described later is generated.
  • The light section image acquired by the data acquisition unit 211 is an image of the linear laser light L irradiated on the surface of the inspection object 1, captured at a certain position along the moving direction of the surface of the inspection object 1.
  • By appropriately setting the gain of the area cameras and the aperture of the lenses in advance, the light section image can be an image in which, for example, the portion irradiated with the linear laser light L appears white and the other portions appear black.
  • The unevenness superimposed on the light section line existing in the light section image, and the line width of the light section line itself, contain information on the cross-sectional shape of the surface of the inspection object 1 and on various defects, including the rough surface portion, existing on the surface.
  • the striped image frame generation unit 213 is realized by, for example, a CPU, a ROM, a RAM, and the like.
  • The fringe image frame generation unit 213 sequentially acquires the light section images stored along the moving direction of the inspection object 1 from the image memory provided in the storage unit 207 or the like. Thereafter, the fringe image frame generation unit 213 generates a fringe image frame by cutting out, from each of the obtained light section images, the region including the light section line, and sequentially arranging the images of these regions along the moving direction of the inspection object 1.
  • the number of light cut images constituting one stripe image frame may be set as appropriate, but for example, one stripe image frame may be constituted by 256 light cut images.
  • FIG. 6 shows an example of a stripe image frame generated by the stripe image frame generation unit 213.
  • the striped image frame shown in FIG. 6 shows 16 light cut images out of 256 light cut images.
  • In FIG. 6, one line segment extending in the horizontal direction of the drawing corresponds to one light section image, and the horizontal direction of the drawing corresponds to the x-axis direction in FIG. 2A and the like.
  • The vertical direction of the drawing corresponds to the y-axis direction (that is, the moving direction of the inspection object 1) in FIG. 2A and the like.
  • When the fringe image frame generation unit 213 generates a fringe image frame as illustrated in FIG. 6, it outputs the generated fringe image frame to the image calculation unit 215 described later. Further, the fringe image frame generation unit 213 may associate time information related to the date and time when the fringe image frame was generated with the data of the generated fringe image frame, and store the result in the storage unit 207 or the like as history information.
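As a rough illustration, the stacking of cut-out light-section regions into a fringe image frame might be sketched as follows (the function name, the centering of the cut-out band around a reference row, and the use of NumPy are assumptions for illustration only):

```python
import numpy as np

def build_fringe_frame(section_images, y_s, y_n):
    """Cut a band of y_n rows around the reference row y_s out of each
    light section image and stack the bands along the moving direction."""
    top = y_s - y_n // 2
    strips = [img[top:top + y_n, :] for img in section_images]
    return np.vstack(strips)  # shape: (len(section_images) * y_n, M)
```

With 256 light section images per frame, as in the example in the text, the resulting frame would hold 256 such stacked bands.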
  • The image calculation unit 215 is realized by, for example, a CPU, a ROM, a RAM, and the like. Based on the fringe image frame generated by the fringe image frame generation unit 213, the image calculation unit 215 calculates a depth image representing the uneven state of the surface of the inspection object 1, a luminance image representing the luminance distribution of the linear laser light L on the surface of the inspection object 1, and a line width image in which the distribution of the line width of the linear laser light L in the moving direction on the surface of the inspection object 1 is associated with luminance values. As shown in FIG. 5, the image calculation unit 215 includes a light section line processing unit 217, a depth image calculation unit 219, a luminance image calculation unit 221, and a line width image calculation unit 223.
  • the light section line processing unit 217 is realized by, for example, a CPU, a ROM, a RAM, and the like.
  • the light section line processing unit 217 calculates, for each light section line included in the fringe image frame, a light section line feature amount including a displacement amount of the light section line (bending degree of the bright line).
  • FIG. 7A is an explanatory diagram schematically showing a fringe image frame.
  • FIG. 7B is an explanatory diagram for explaining the optical section line processing performed by the optical section line processing unit.
  • As shown in FIG. 7A, it is assumed that there are N light section lines in one fringe image frame, and that the horizontal length of the fringe image frame is M pixels. Further, one light section image, which includes a single light section line, has a size of y_n pixels vertically × M pixels horizontally.
  • In other words, the number of vertical pixels y_n to be cut out from one light section image when the fringe image frame is generated can be determined by estimating in advance, based on past operation data or the like, the range of heights of the concave and convex portions that may exist on the inspection object 1.
  • The X-axis is taken in the x-axis direction orthogonal to the moving direction of the inspection object 1 (the lateral direction of the fringe image frame in FIG. 7A), the Y-axis is taken in the y-axis direction corresponding to the moving direction of the inspection object 1 (the vertical direction of the fringe image frame in FIG. 7A), and pixel positions in the fringe image frame are expressed in XY coordinates.
  • The position of the m-th pixel (1 ≤ m ≤ M) from the left of the j-th (1 ≤ j ≤ N) light section line existing in the fringe image frame is represented by X_{j,m}.
  • The light section line processing unit 217 first focuses on an X coordinate position (a position represented by X_{j,m} in this description) of the light section line (hereinafter also simply referred to as a “line”) of interest, and examines the distribution of pixel values (that is, the luminance values of the light section line) at that position.
  • Here, the light section line processing unit 217 does not perform the processing described below for all pixels at the X coordinate position in the light section image, but only for pixels belonging to the range of W before and after the reference position Y_s of the Y coordinate in the light section image (that is, pixels belonging to the range Y_s − W to Y_s + W).
  • the reference position Y s of the Y coordinate is a position in the y-axis direction that is designated in advance with respect to the j-th light section image of the striped image frame.
  • The parameter W that defines the processing range can be determined as follows. That is, the range of heights of the concave and convex portions that may exist on the inspection object 1 is specified based on past operation data and the like, and the size of W may be determined appropriately so that the range Y_s ± W around the reference position Y_s of the Y coordinate in the light section image covers that height range.
  • The light section line processing unit 217 identifies, from among the pixels included in the range Y_s − W to Y_s + W, those pixels having a luminance value equal to or higher than a predetermined threshold Th for specifying pixels corresponding to the light section line.
  • In the example shown in FIG. 7B, the three pixels represented by Y_{j,k}, Y_{j,k+1}, and Y_{j,k+2} have luminance values I_{j,k}, I_{j,k+1}, and I_{j,k+2} that are equal to or higher than the threshold Th, respectively.
  • In this case, the number p_{j,m} obtained by counting, in the line width direction, the pixels having luminance values equal to or greater than the predetermined threshold Th corresponds to the number of bright-line pixels at the position (j, m), and is one of the light section line feature amounts.
  • In the following processing, the light section line processing unit 217 calculates further light section line feature amounts using the information on the extracted pixels: (Y_{j,k}, I_{j,k}), (Y_{j,k+1}, I_{j,k+1}), and (Y_{j,k+2}, I_{j,k+2}) (hereinafter sometimes abbreviated simply as (Y, I)).
  • the light section line processing unit 217 calculates the total sum K j, m of the luminances of the extracted pixels using the parameters p j, m and the information (Y, I) regarding the extracted pixels.
  • This total luminance K j, m is also one of the features of the light section line.
  • the center-of-gravity position Y C (j, m) is a value represented by the following expression 101, where A is a set of extracted pixels. Therefore, in the case of the example shown in FIG. 7B, the center-of-gravity position Y C (j, m) is a value represented by the following expression 101a.
  • Here, the position in the Y-axis direction corresponding to each pixel is a value quantized by the capture pitch (for example, 0.25 mm) of the inspection object imaging apparatus 100.
  • On the other hand, the center-of-gravity position Y_C(j, m) calculated by Expression 101 above involves division and can therefore take values smaller than the capture pitch (the so-called quantization unit). Accordingly, the displacement amount Δd_{j,m} calculated using the center-of-gravity position Y_C(j, m) can also take values smaller than the capture pitch.
  • the displacement amount ⁇ d j, m calculated in this way is also one of the light section line feature amounts.
  • The light section line processing unit 217 calculates the above three types of feature amounts for the M elements included in each light section line. As a result, as shown in FIGS. 8A to 8C, two-dimensional arrays of M columns × N rows are generated for the displacement amount Δd of the light section line, the luminance sum K, and the number of bright-line pixels p.
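A minimal sketch of these per-position feature computations is given below. It assumes the centroid is the luminance-weighted mean of the extracted pixel positions and measures the displacement Δd relative to the reference position Y_s; the function and variable names, and the use of NumPy, are illustrative assumptions:

```python
import numpy as np

def light_section_features(column, y_s, w, th):
    """Compute the three light-section-line feature amounts for one X
    position: bright-line pixel count p, luminance sum K, displacement dd.

    column: 1-D array of luminance values along the Y axis
    y_s:    reference Y position; w: half-width of the search range
    th:     luminance threshold Th for bright-line pixels
    """
    ys = np.arange(max(y_s - w, 0), min(y_s + w + 1, len(column)))
    lum = column[ys].astype(float)
    mask = lum >= th                       # pixels belonging to the bright line
    p = int(mask.sum())                    # number of bright-line pixels p
    k = float(lum[mask].sum())             # luminance sum K
    if p == 0:
        return p, k, None                  # no bright line found at this X
    y_c = float((ys[mask] * lum[mask]).sum() / k)  # luminance-weighted centroid
    dd = y_c - y_s                         # displacement relative to reference
    return p, k, dd
```

Because the centroid is a weighted average, `dd` can take sub-pixel values, matching the sub-pixel resolution noted in the text.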
  • The light section line processing unit 217 outputs, among the calculated light section line feature amounts, the feature amount related to the displacement amount Δd of the light section line to the depth image calculation unit 219 described later; the luminance sum K and the feature amount related to the number of bright-line pixels p to the luminance image calculation unit 221 described later; and the feature amount related to the number of bright-line pixels p to the line width image calculation unit 223 described later.
  • the depth image calculation unit 219 is realized by, for example, a CPU, a ROM, a RAM, and the like.
  • The depth image calculation unit 219 calculates a depth image representing the uneven state of the surface of the inspection object 1 based on the light section line feature amounts generated by the light section line processing unit 217 (in particular, the feature amount related to the displacement amount Δd).
  • More specifically, the depth image calculation unit 219 calculates the depth image using the two-dimensional array of displacement amounts Δd shown in FIG. 8A and the angle θ (FIGS. 2A and 2B) formed by the linear laser light L and the optical axis of the area camera.
  • This depth image is an image representing a two-dimensional distribution of the uneven state, in which the one-dimensional distribution of the uneven state in the X-axis direction at each position in the Y-axis direction is sequentially arranged along the Y-axis direction.
  • FIG. 9 is an explanatory diagram showing the relationship between the displacement of the optical cutting line and the height of the defect.
  • FIG. 9 schematically shows a case where a concave portion exists on the surface of the inspection object 1.
  • In FIG. 9, the difference between the height of the surface position when no recess is present on the surface of the inspection object 1 and the height of the bottom of the recess is represented by Δh.
  • When the incident linear laser light L is surface-reflected at the normal surface position, the reflected light propagates like ray A in FIG. 9; when it is reflected at the bottom of the recess, the reflected light propagates like ray B in FIG. 9.
  • The deviation between ray A and ray B is observed as the displacement Δd of the light section line in this embodiment.
  • Although FIG. 9 illustrates the case where a recess exists on the surface of the inspection object 1, the same relationship holds when a projection exists on the surface of the inspection object 1.
  • The depth image calculation unit 219 uses the above relationship to calculate the height change Δh of the surface unevenness of the inspection object 1 based on the feature amount related to the displacement amount Δd of the light section line calculated by the light section line processing unit 217.
  • Here, the displacement amount Δd of the light section line used for calculating the depth image is calculated based on the centroid position of the light section line as described above, and can therefore take values smaller than the capture pitch. Accordingly, the depth image calculated by the depth image calculation unit 219 is an image in which the unevenness is reproduced with a resolution finer than the pixel size of the image sensor.
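Under the geometry of FIG. 9, the displacement and the depth are related through the angle θ. The sketch below assumes the common light-section relation Δh = Δd / tan θ (the exact relation depends on the optical arrangement and is not stated explicitly here), with the 0.25 mm capture pitch taken from the example given earlier:

```python
import numpy as np

def depth_image(dd, theta_deg, pitch_mm=0.25):
    """Convert the N×M displacement array Δd (in pixels along the Y axis)
    into a depth image Δh in millimetres, assuming Δd = Δh · tanθ."""
    dd_mm = np.asarray(dd, dtype=float) * pitch_mm   # pixels -> mm
    return dd_mm / np.tan(np.radians(theta_deg))     # Δh = Δd / tanθ
```

Because Δd carries sub-pixel values from the centroid computation, the resulting Δh is likewise finer than one capture pitch.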
  • Note that the light section line may in some cases have distortion such as curvature, whereas it is the unevenness superimposed on the light section line that carries the information on the cross-sectional shape of the surface of the inspection object 1 and on the surface defects present on the surface. Therefore, when calculating the depth image based on the displacement amount Δd of the light section line, the depth image calculation unit 219 may perform distortion correction processing for each light section line and extract only the information related to the unevenness superimposed on the line. By performing such distortion correction processing, it is possible to obtain only the information on the unevenness present on the surface of the inspection object 1 even when the light section line has distortion such as curvature.
  • the depth image calculation unit 219 outputs information on the depth image calculated as described above to the detection processing unit 225 described later.
  • the luminance image calculation unit 221 is realized by, for example, a CPU, a ROM, a RAM, and the like.
  • Based on the light section line feature amounts generated by the light section line processing unit 217 (in particular, the feature amounts related to the luminance sum K and the number of bright-line pixels p), the luminance image calculation unit 221 calculates a luminance image representing the luminance distribution of the linear laser light L on the surface of the inspection object 1.
  • More specifically, the luminance image calculation unit 221 uses the feature amount (two-dimensional array) related to the luminance sum K shown in FIG. 8B and the feature amount related to the number of bright-line pixels p shown in FIG. 8C to calculate the average luminance K_AVE(j, m) = K_{j,m} / p_{j,m} (1 ≤ j ≤ N, 1 ≤ m ≤ M), which is the average of the luminance in the line width direction.
  • the luminance image calculation unit 221 sets the data array formed of the calculated average luminance K AVE (j, m) as the luminance image of the object 1 to be inspected.
  • Such a luminance image is an image representing a two-dimensional luminance distribution in which the one-dimensional luminance distribution of the linear laser light L in the X-axis direction at each position in the Y-axis direction is sequentially arranged along the Y-axis direction.
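The element-wise average K_AVE = K / p over the N×M feature arrays can be sketched as follows (treating positions where no bright line was found, p = 0, as zero luminance is an assumption made here for illustration):

```python
import numpy as np

def luminance_image(k_sum, p_count):
    """Average luminance K_AVE(j, m) = K / p over the N×M feature arrays;
    positions with p = 0 (no bright line) are set to 0."""
    k = np.asarray(k_sum, dtype=float)
    p = np.asarray(p_count, dtype=float)
    return np.divide(k, p, out=np.zeros_like(k), where=p > 0)
```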
  • the luminance image calculation unit 221 outputs information on the luminance image calculated as described above to the detection processing unit 225 described later.
  • the line width image calculation unit 223 is realized by, for example, a CPU, a ROM, a RAM, and the like.
  • Based on the light section line feature amounts generated by the light section line processing unit 217 (in particular, the feature amount related to the number of bright-line pixels p), the line width image calculation unit 223 calculates a line width image in which the distribution of the line width of the linear laser light L in the moving direction on the surface of the inspection object 1 is associated with luminance values.
  • FIG. 11 is an explanatory diagram for explaining the line widths of the light cutting lines in the normal part and the rough surface part.
  • FIG. 12A is an explanatory diagram for explaining the line width of the light section line in the normal portion, and FIG. 12B is an explanatory diagram for explaining the line width of the light section line in the rough surface portion.
  • FIG. 13 is an explanatory diagram for describing a line width image according to the present embodiment.
  • As described above, the linear laser light L irradiated onto the rough surface portion is scattered by the rough surface portion. Therefore, as schematically shown in FIGS. 4C and 11, an increase in the line width of the light section line is recognized at the positions of the light section image corresponding to the rough surface portion.
  • In FIG. 11, the position corresponding to the A-A′ cutting line corresponds to the normal portion, and the position corresponding to the B-B′ cutting line corresponds to the rough surface portion.
  • In the luminance distribution of the normal portion along the A-A′ cutting line, assume that there are 3 pixels having a luminance value equal to or higher than the predetermined threshold Th for specifying pixels corresponding to the light section line, as schematically shown in FIG. 12A. Similarly, in the luminance distribution of the rough surface portion along the B-B′ cutting line, assume that there are 8 such pixels, as schematically shown in FIG. 12B.
  • In the inspection object imaging apparatus 100, it is assumed that the line width of the linear laser light L is set to correspond to 2 to 4 pixels. Since an increase in line width is recognized in the rough surface portion, the line width of the linear laser light L in the rough surface portion exceeds the 4-pixel set value of the inspection object imaging apparatus 100. Therefore, by setting in advance a predetermined threshold Th2 for specifying pixels corresponding to the rough surface portion, the normal portion and the rough surface portion can be distinguished based on the line width of the linear laser light L. For example, when the normal-portion line width of the linear laser light L is set to 2 to 4 pixels in the inspection object imaging apparatus 100, setting the threshold Th2 to 5 pixels with a margin makes it possible to distinguish portions having a line width of 5 pixels or more as the rough surface portion.
  • By thresholding the feature amount related to the number of bright-line pixels p generated by the light section line processing unit 217 with the threshold Th2, the line width image calculation unit 223 can specify the pixel positions corresponding to the rough surface portion.
  • While setting the threshold Th2 as described above makes it possible to specify the position of the rough surface portion, it is also important to determine the luminance values constituting the line width image so that the normal portion does not become conspicuous in the generated line width image. Therefore, the line width image calculation unit 223 according to the present embodiment calculates, for each light section line in the fringe image frame, the difference between the line width at each position in the extending direction of the light section line (the X-axis direction) and a predetermined threshold line width, and assigns a luminance value according to the magnitude of the calculated difference, thereby calculating the line width image.
  • More specifically, the line width image calculation unit 223 refers to the feature amount related to the number of bright-line pixels p generated by the light section line processing unit 217, and calculates the difference between the number of bright-line pixels p at each position and the threshold Th2 corresponding to the threshold line width. Note that, for the normal portion, this difference may be negative; in that case, the difference value is preferably set to zero instead of taking a negative value.
  • the line width image calculation unit 223 determines the luminance value at each pixel position so that the luminance value of the pixels constituting the line width image increases as the calculated difference value increases.
  • For example, the line width image calculation unit 223 can determine the luminance value at each pixel position as follows.
  • Although the example below uses the full 8-bit scale, it is not necessary to do so; however, setting the luminance values over the full-scale range can improve the visibility of the generated line width image.
  • Note that the method of assigning luminance values according to the difference value is not limited to the above example; any assignment method can be adopted as long as the luminance value increases as the difference value increases.
  • In the above description, the line width image is generated as an image having 8-bit luminance values, but it goes without saying that the line width image may be generated as an image having luminance values exceeding 8 bits.
  • By the processing described above, a line width image as shown on the right side of FIG. 13 can be generated.
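A sketch of this mapping from bright-line pixel counts to an 8-bit line width image follows. The threshold Th2 = 5 pixels follows the example above, while the difference value mapped to full scale (`diff_full`) is an assumed parameter:

```python
import numpy as np

def line_width_image(p_count, th2=5, diff_full=8):
    """Map the N×M array of bright-line pixel counts p to 8-bit luminance:
    negative differences (normal portion) become 0, and larger line widths
    become brighter, saturating at 255."""
    p = np.asarray(p_count, dtype=float)
    diff = np.clip(p - th2, 0, None)                       # normal portion -> 0
    scaled = np.clip(diff / diff_full, 0.0, 1.0) * 255.0   # wider -> brighter
    return scaled.astype(np.uint8)
```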
  • The line width image generated by such processing is an image representing a two-dimensional line width distribution in which the one-dimensional distribution of the line width of the linear laser light L in the X-axis direction at each position in the Y-axis direction is sequentially arranged along the Y-axis direction.
  • the line width image calculation unit 223 outputs information on the line width image calculated as described above to the detection processing unit 225 described later.
  • The detection processing unit 225 is realized by, for example, a CPU, a ROM, a RAM, and the like. Based on the depth image, the luminance image, and the line width image calculated by the depth image calculation unit 219, the luminance image calculation unit 221, and the line width image calculation unit 223, respectively, the detection processing unit 225 detects the surface properties of the inspection object 1.
  • The detection processing unit 225 has a defect part specifying function for specifying defective parts based on the depth image, the luminance image, and the line width image; a feature amount extracting function for extracting feature amounts related to the form and pixel values of the specified defective parts; and a defect discriminating function for discriminating the type of defect, the degree of harmfulness, and the like based on the extracted feature amounts.
  • The detection processing unit 225 emphasizes the rough surface regions by applying, to each pixel of the acquired line width image, filtering processing that takes a linear sum of luminance values with surrounding pixels as necessary, and then determines whether the obtained value is equal to or greater than a first threshold for specifying the rough surface portion.
  • Similarly, the detection processing unit 225 applies, to each pixel of the acquired depth image and luminance image, filtering processing that takes a linear sum of pixel values (values representing depth, or luminance values) with surrounding pixels, thereby emphasizing regions such as vertical linear flaws, horizontal linear flaws, and fine flaws, and then determines whether the obtained value is equal to or greater than a second threshold for specifying a defective part. By performing such filtering processing and the determination processing based on its result, the detection processing unit 225 can generate a binarized image for specifying defective parts.
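The filter-and-threshold step might be sketched as below. The kernel and threshold are illustrative assumptions (a 1×1 kernel, as in the test, reduces the linear sum to the pixel value itself, while a 3×3 averaging kernel would smooth before thresholding):

```python
import numpy as np

def binarize_defects(img, kernel, threshold):
    """For each pixel, take a linear sum of pixel values with its neighbors
    (a convolution with edge padding), then binarize against a threshold:
    1 marks a candidate defect or rough-surface pixel."""
    img = np.asarray(img, dtype=float)
    k = np.asarray(kernel, dtype=float)
    kh, kw = k.shape
    padded = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = (padded[i:i + kh, j:j + kw] * k).sum()
    return (out >= threshold).astype(np.uint8)
```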
  • Next, the detection processing unit 225 identifies individual defective parts by linking defect pixels that are contiguous in the binarized image.
  • Then, for each specified defective part, the detection processing unit 225 extracts feature amounts related to the form and pixel values of the defective part.
  • Examples of feature amounts related to the form of a defective part include the width of the defective part, the length of the defective part, the perimeter of the defective part, the area of the defective part, and the area of the circumscribed rectangle of the defective part.
  • Examples of feature amounts related to pixel values include, for the depth image, the maximum, minimum, and average depth of the defective part, and, for the luminance image, the maximum, minimum, and average luminance of the defective part.
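A minimal sketch of the linking and form-feature-extraction steps (4-connected flood-fill labeling followed by area, width, length, and circumscribed-rectangle area; the actual algorithm of the embodiment is not specified, so this is an illustrative assumption):

```python
import numpy as np

def extract_defect_features(binary):
    """Label 4-connected components of a binarized defect image and return
    simple form features (area, width, length, circumscribed-rect area)."""
    binary = np.asarray(binary)
    visited = np.zeros_like(binary, dtype=bool)
    features = []
    h, w = binary.shape
    for si in range(h):
        for sj in range(w):
            if binary[si, sj] and not visited[si, sj]:
                stack, pixels = [(si, sj)], []
                visited[si, sj] = True
                while stack:                       # flood fill one component
                    i, j = stack.pop()
                    pixels.append((i, j))
                    for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                        if 0 <= ni < h and 0 <= nj < w and binary[ni, nj] and not visited[ni, nj]:
                            visited[ni, nj] = True
                            stack.append((ni, nj))
                rows = [p[0] for p in pixels]
                cols = [p[1] for p in pixels]
                width = max(cols) - min(cols) + 1
                length = max(rows) - min(rows) + 1
                features.append({"area": len(pixels), "width": width,
                                 "length": length, "rect_area": width * length})
    return features
```

In practice a library routine such as connected-component labeling from an image processing package would typically replace the hand-written flood fill.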
  • Subsequently, the detection processing unit 225 determines the type of defect, the degree of harmfulness, and the like for each defective part based on the extracted feature amounts.
  • The determination of the defect type, harmfulness level, and the like based on the feature amounts is performed using, for example, a logic table as shown in FIG. 14. That is, the detection processing unit 225 determines the defect type and harmfulness level based on the determination conditions represented by the logic table illustrated in FIG. 14.
  • In the logic table, defect types (defect A1 to defect An) are listed as the vertical items, feature amount types (feature amount B1 to feature amount Bm) are listed as the horizontal items, and a discrimination conditional expression (conditional expression C11 to conditional expression Cnm) is described in each cell.
  • Each row of the logic table forms a set and serves as the determination condition for one defect type. The determination processing is performed in order from the type described in the top row, and ends when all the determination conditions described in any one row are satisfied.
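The row-by-row evaluation of such a logic table can be sketched as follows. The table contents, defect names, and predicate shapes are purely hypothetical; the embodiment's actual conditional expressions C11 to Cnm are not given here:

```python
def classify_defect(features, logic_table):
    """Evaluate a logic table row by row: each row is
    (defect_type, harmfulness, [predicates]); the first row whose
    predicates all hold on the feature dict determines the result."""
    for defect_type, harm, conditions in logic_table:
        if all(cond(features) for cond in conditions):
            return defect_type, harm
    return None, None  # no row matched

# Hypothetical two-row table for illustration only.
table = [
    ("rough surface", "minor", [lambda f: f["width"] >= 5]),
    ("vertical flaw", "severe", [lambda f: f["length"] >= 10,
                                 lambda f: f["width"] < 5]),
]
```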
  • Such a logic table can be generated by a known method using a database constructed through a learning process in which past operation data, together with the defect types and harmfulness levels specified by inspectors based on that data, are used as teacher data.
  • The detection processing unit 225 specifies the type and harmfulness level of each defective part detected in this manner, and outputs the obtained detection result to the display control unit 205. Thereby, information on defects existing on the surface of the machined product that is the inspection object 1 is output to a display unit (not shown).
  • The detection processing unit 225 may output the obtained detection result to an external device such as a manufacturing control process computer, or may create a product defect report using the obtained detection result.
  • The detection processing unit 225 may also store information related to the detection result of the defective part in the storage unit 207 or the like as history information, in association with time information on the date and time when the information was calculated.
  • Alternatively, a discriminator such as a neural network or a support vector machine (SVM) may be generated by a learning process that uses, as teacher data, past operation data and the defect types and harmfulness levels specified by an examiner based on that data, and such a discriminator may be used to discriminate the type of defect and the degree of harmfulness.
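As a minimal stand-in for the discriminator described above (the patent mentions neural networks and SVMs; for brevity this sketch uses a perceptron-style linear discriminator instead, and the feature vectors and labels are invented teacher data, not from the patent):

```python
import numpy as np

# Minimal stand-in for the learned discriminator: a perceptron-style
# linear classifier. Features (normalized max line width, normalized
# average luminance) and examiner labels are invented teacher data.

def train_perceptron(X, y, lr=0.1, epochs=100):
    """Fit w, b so that sign(X @ w + b) matches labels y in {-1, +1}."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:  # misclassified: update weights
                w += lr * yi * xi
                b += lr * yi
    return w, b

def predict(w, b, X):
    return np.where(X @ w + b > 0, 1, -1)

# Teacher data: +1 = harmful (e.g. rust), -1 = harmless.
X = np.array([[0.6, 0.31], [0.7, 0.27], [0.2, 0.71], [0.3, 0.78]])
y = np.array([1, 1, -1, -1])
w, b = train_perceptron(X, y)
print(predict(w, b, X))  # -> [ 1  1 -1 -1]
```

This toy set is linearly separable, so the perceptron converges and reproduces the teacher labels; a real deployment would use a proper SVM or neural-network library and far more operation data.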
  • As described above, the detection processing unit 225 performs defect detection processing based on a predetermined threshold determination using the line width image, so that rough surface portions (that is, background remaining portions and rust), which are difficult to distinguish by the conventional light section method, can easily be revealed.
  • In addition, during the defect detection processing based on the depth image and the luminance image, the detection processing unit 225 may perform the defect site detection process only on portions other than those where a rough surface portion (that is, a background remaining portion or rust) has been detected based on the line width image. This makes it possible to speed up the defect detection processing based on the depth image and the luminance image.
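The speed-up described above amounts to masking out pixels already flagged from the line width image before thresholding the depth and luminance images. A toy sketch, with invented pixel values and an invented depth threshold:

```python
import numpy as np

# Toy sketch of the speed-up: pixels already flagged as rough surface
# (background remaining portion / rust) from the line width image are
# excluded before thresholding the depth image. All values and the
# depth threshold are invented for illustration.

line_width_img = np.array([[0, 255, 0],
                           [0, 255, 0],
                           [0,   0, 0]], dtype=np.uint8)  # rough-surface flags
depth_img = np.array([[0.0, 0.5, 0.0],
                      [0.4, 0.5, 0.0],
                      [0.0, 0.0, 0.0]])                   # depth in mm

rough_mask = line_width_img > 0         # detected from the line width image
deep = depth_img >= 0.3                 # depth-based defect candidates
recess_candidates = deep & ~rough_mask  # search only outside rough areas

print(recess_candidates.astype(int))
# -> [[0 0 0]
#     [1 0 0]
#     [0 0 0]]
```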
  • Further, the detection processing unit 225 may confirm whether or not the rough surface portion specified based on the line width image is also extracted as a defect candidate portion in the luminance image, thereby double-checking the detection result.
  • Note that the light section line processing unit 217 may execute the approximate correction process before calculating the light section line feature amounts.
  • each component described above may be configured using a general-purpose member or circuit, or may be configured by hardware specialized for the function of each component.
  • Alternatively, a CPU or the like may perform all the functions of each component. Therefore, the configuration to be used can be changed as appropriate according to the technical level at the time of carrying out the present embodiment.
  • A computer program for realizing each function of the arithmetic processing device according to the present embodiment as described above can be created and installed on a personal computer or the like.
  • a computer-readable recording medium storing such a computer program can be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above computer program may be distributed via a network, for example, without using a recording medium.
  • the configuration of the surface property inspection apparatus 10 according to this embodiment has been described in detail above.
  • According to the surface texture inspection apparatus 10 of the present embodiment, it is possible to detect the background remaining portions and rust on a machined surface with the same apparatus configuration as an apparatus based on the conventional light section method.
  • Since detection based on the conventional light section method, such as of surface unevenness and surface patterns, is also possible, a wide variety of defective parts can be inspected over the entire machined surface of a machined product.
  • The application destination of the surface texture inspection apparatus 10 according to the present embodiment is not limited to the specific machined product described above; the apparatus can be applied to any product in which the roughness of a defective portion differs from that of the normal portion.
  • In addition, obtaining a two-dimensional image of the distribution of the background remaining portions and rust portions, which have low visibility, makes it easy to identify defective parts by image processing, and also makes it possible to use an image with good visibility for visual inspection.
  • FIG. 15 is a flowchart showing an example of the flow of the surface texture inspection method according to the present embodiment.
  • First, under the control of the imaging control unit 201 of the arithmetic processing device 200, the inspection object imaging device 100 of the surface property inspection apparatus 10 irradiates the machined product that is the inspection object 1 with the linear laser beam L, images the surface, and generates a light section image (step S101).
  • the inspection object imaging device 100 outputs the generated light section image to the arithmetic processing device 200.
  • Then, the data acquisition unit 211 of the image processing unit 203 included in the arithmetic processing device 200 acquires the image data of the light section images, and sequentially stores the acquired light section images along the moving direction of the inspection object 1 in an image memory provided in the storage unit.
  • Next, the stripe image frame generation unit 213 of the image processing unit 203 included in the arithmetic processing device 200 generates a stripe image frame by sequentially arranging the acquired light section images along the moving direction of the inspection object 1 (step S103).
  • the stripe image frame generation unit 213 outputs the generated stripe image frame to the light section line processing unit 217.
  • Next, the light section line processing unit 217 uses the generated stripe image frame to calculate, for each light section line, the number of pixels having a luminance equal to or higher than the predetermined threshold Th, the sum of the luminance of those pixels, and the displacement amount of the light section line (step S105). These calculation results are used as the light section line feature amounts.
  • the calculated light section line feature amount is output to the depth image calculation unit 219, the luminance image calculation unit 221 and the line width image calculation unit 223, respectively.
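Under simplified assumptions, the per-line feature extraction of step S105 can be sketched as follows. This is illustrative only, not the patent's implementation: it assumes the bright line runs horizontally in the image, and the threshold value and reference row are made up.

```python
import numpy as np

# Illustrative sketch of step S105: each image column is scanned for
# pixels at or above a luminance threshold, yielding the line width,
# the luminance sum, and the displacement of the line centroid from a
# reference row. Threshold th and ref_row are assumptions.

def light_section_features(img, th=128, ref_row=2.0):
    """Per column: bright pixel count (line width), luminance sum, and
    displacement of the line centroid from a reference row."""
    widths, sums, disps = [], [], []
    for x in range(img.shape[1]):
        col = img[:, x].astype(float)
        bright = col >= th
        n = int(bright.sum())                 # line width in pixels
        s = float(col[bright].sum())          # sum of bright luminances
        if n > 0:                             # luminance-weighted centroid
            centre = float((np.nonzero(bright)[0] * col[bright]).sum() / s)
        else:
            centre = ref_row                  # no line found: no displacement
        widths.append(n)
        sums.append(s)
        disps.append(centre - ref_row)        # displacement amount
    return widths, sums, disps

img = np.zeros((5, 3), dtype=np.uint8)
img[2, :] = 200          # flat bright line on the reference row
img[3, 1] = 200          # middle column: line two pixels wide
widths, sums, disps = light_section_features(img)
print(widths)  # -> [1, 2, 1]
```

The middle column shows how a widened line (here 2 pixels instead of 1) is captured by the width feature while also shifting the centroid slightly.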
  • Next, the depth image calculation unit 219 calculates a depth image using the calculated light section line feature amounts (in particular, the feature amount related to the displacement amount of the light section line) (step S107). The luminance image calculation unit 221 calculates a luminance image using the calculated feature amounts (in particular, the feature amount related to the number of bright-line pixels and the feature amount related to the luminance sum) (step S107). The line width image calculation unit 223 calculates a line width image using the calculated feature amounts (in particular, the feature amount related to the number of bright-line pixels) (step S107). The depth image calculation unit 219, the luminance image calculation unit 221, and the line width image calculation unit 223 output the calculated images to the detection processing unit 225.
  • the detection processing unit 225 performs a surface property detection process of the inspection object 1 using the calculated depth image, luminance image, and line width image (step S109). As a result, the detection processing unit 225 can specify the type of defect and the degree of harmfulness for various types of defective parts existing on the surface of the inspection object 1. Through the flow as described above, various types of defects existing on the surface of the machined product that is the inspection object 1 are detected.
  • FIG. 16 is a block diagram for explaining a hardware configuration of the arithmetic processing device 200 according to the embodiment of the present invention.
  • the arithmetic processing apparatus 200 mainly includes a CPU 901, a ROM 903, and a RAM 905.
  • the arithmetic processing device 200 further includes a bus 907, an input device 909, an output device 911, a storage device 913, a drive 915, a connection port 917, and a communication device 919.
  • The CPU 901 functions as a central processing device and a control device, and controls all or part of the operation of the arithmetic processing device 200 according to various programs recorded in the ROM 903, the RAM 905, the storage device 913, or the removable recording medium 921.
  • the ROM 903 stores programs used by the CPU 901, calculation parameters, and the like.
  • The RAM 905 temporarily stores programs used by the CPU 901, parameters that change as appropriate during execution of the programs, and the like. These are connected to each other by the bus 907, which is constituted by an internal bus such as a CPU bus.
  • The bus 907 is connected to an external bus such as a PCI (Peripheral Component Interconnect/Interface) bus via a bridge.
  • the input device 909 is an operation means operated by the user such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
  • The input device 909 may be, for example, a remote control means (a so-called remote controller) using infrared rays or other radio waves, or may be an externally connected device 923, such as a PDA, that supports operation of the arithmetic processing device 200.
  • the input device 909 includes, for example, an input control circuit that generates an input signal based on information input by a user using the operation unit and outputs the input signal to the CPU 901. By operating the input device 909, the user can input various data or instruct processing operations to the arithmetic processing device 200.
  • the output device 911 is configured by a device that can notify the user of the acquired information visually or audibly.
  • Such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices and lamps, audio output devices such as speakers and headphones, printer devices, mobile phones, and facsimiles.
  • the output device 911 outputs results obtained by various processes performed by the arithmetic processing device 200, for example. Specifically, the display device displays the results obtained by various processes performed by the arithmetic processing device 200 as text or images.
  • the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs the analog signal.
  • the storage device 913 is a data storage device configured as an example of a storage unit of the arithmetic processing device 200.
  • the storage device 913 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the storage device 913 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
  • the drive 915 is a recording medium reader / writer, and is built in or externally attached to the arithmetic processing unit 200.
  • the drive 915 reads information recorded on a removable recording medium 921 such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 905.
  • the drive 915 can also write a record on a removable recording medium 921 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • the removable recording medium 921 is, for example, a CD medium, a DVD medium, a Blu-ray (registered trademark) medium, or the like.
  • the removable recording medium 921 may be a compact flash (registered trademark) (CompactFlash: CF), a flash memory, an SD memory card (Secure Digital memory card), or the like. Further, the removable recording medium 921 may be, for example, an IC card (Integrated Circuit card) on which a non-contact IC chip is mounted, an electronic device, or the like.
  • connection port 917 is a port for directly connecting a device to the arithmetic processing device 200.
  • Examples of the connection port 917 include a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, an RS-232C port, and an HDMI (registered trademark) (High-Definition Multimedia Interface) port.
  • the communication device 919 is a communication interface configured by a communication device for connecting to the communication network 925, for example.
  • the communication device 919 is, for example, a communication card for a wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 919 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
  • the communication device 919 can transmit and receive signals and the like according to a predetermined protocol such as TCP / IP, for example, with the Internet and other communication devices.
  • The communication network 925 connected to the communication device 919 is configured by a network connected by wire or wirelessly, and may be, for example, the Internet, a home LAN, an in-house LAN, infrared communication, radio wave communication, satellite communication, or the like.
  • each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Therefore, it is possible to change the hardware configuration to be used as appropriate according to the technical level at the time of carrying out this embodiment.
  • the surface texture inspection apparatus and the surface texture inspection method according to the present invention will be specifically described with reference to examples.
  • Note that the examples shown below are merely examples of the surface property inspection apparatus and surface property inspection method according to the present invention, and the surface property inspection apparatus and surface property inspection method according to the present invention are not limited to the examples below.
  • the surface property inspection device 10 having the inspection object imaging device 100 having the configuration shown in FIG. 2B was used to inspect the surface property of the inspection object.
  • a disk-shaped processed product manufactured by grinding a material after hot forming a billet, which is an intermediate material mainly composed of Fe, is used as the object to be inspected 1.
  • the ground surface (the surface spreading in the radial direction of the disk) was the inspection target surface.
  • the diameter of such a disk-shaped processed product is 860 mm.
  • A red LD module that emits red laser light was used as the illumination device of the inspection object imaging apparatus 100, and the point-like laser beam (output: 100 mW) emitted from the module was made incident on a cylindrical lens to obtain the linear laser beam L.
  • The linear laser beam L was controlled to have a length corresponding to the radius (430 mm) of the disk-shaped workpiece (the length in the x-axis direction in FIG. 2B) and was irradiated onto the ground surface of the rotating workpiece.
  • the line width of the linear laser beam L irradiated on the surface of the inspection object 1 was set to 0.25 mm (250 ⁇ m).
  • As the area cameras, commercially available cameras each equipped with a 2048 pixel × 2048 pixel CCD (pixel size: 5.5 μm × 5.5 μm) as the image sensor were used.
  • the frame rate of such an image sensor is 200 fps.
  • the focal length of the lens mounted on the camera is 24 mm and the field angle is 26 °.
  • The pixel size of the captured image was 0.25 mm × 0.25 mm, and the line width of the linear laser beam L was captured as a bright line 2 to 4 pixels wide on the captured image.
  • Images were captured continuously by the two area cameras in synchronization with the signal output from the PLG; specifically, an image was captured every 5 msec of rotation of the disk-shaped product.
  • The illumination device 101 was provided vertically above the inspection target, irradiating the linear laser beam L vertically downward, and the installation angle θ of the two area cameras was ±45 degrees.
  • the image capture pitch was set to 0.25 mm to match the pixel size of the image sensor.
  • the light cut image obtained by the inspection object imaging device 100 as described above was subjected to image processing by the arithmetic processing device 200 having the configuration as described above, and the surface property of the ground surface of the disk-like processed product was inspected.
  • The line width threshold Th2 was set to 5 pixels, and luminance values were assigned using the 8-bit full scale as mentioned above.
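With these settings, the line width image assigns each position a luminance according to how much the measured line width exceeds Th2 (= 5 pixels), mapped onto the 8-bit full scale. The patent does not specify the exact mapping in this excerpt, so the linear scaling and the saturation width below are assumptions:

```python
import numpy as np

# Sketch of assigning line width image luminances with the example's
# settings (Th2 = 5 pixels, 8-bit full scale). The linear scaling and
# the saturation width max_excess are assumptions, not from the patent.

def line_width_to_luminance(line_widths, th2=5, max_excess=10):
    """Map (line width - Th2), clipped to [0, max_excess], onto 0-255."""
    widths = np.asarray(line_widths, dtype=float)
    excess = np.clip(widths - th2, 0, max_excess)  # difference from Th2
    return (excess / max_excess * 255).astype(np.uint8)

# Widths at or below Th2 stay dark; enlarged lines (rough surface) get bright.
print(line_width_to_luminance([3, 5, 10, 20]).tolist())  # -> [0, 0, 127, 255]
```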
  • The light section image of a portion corresponding to the normal portion, the light section image of a portion corresponding to the recess, and the light section image of a portion corresponding to the background remaining portion are shown together in the figure.
  • In the light section image corresponding to the normal portion, the light section line is imaged as a straight line segment with a substantially constant line width, and in the light section image corresponding to the recess, the light section line is bent at the position corresponding to the recess while its line width remains substantially unchanged. In the light section image corresponding to the background remaining portion, by contrast, the line width of the light section line is clearly enlarged compared with the light section image corresponding to the normal portion.
  • Here, area camera 1 is the area camera provided on the downstream side in the rotational direction of the disk-shaped workpiece, and area camera 2 is the area camera provided on the upstream side. It is clear that the line width image based on the light section images from area camera 1 is sharper than the line width image based on the light section images from area camera 2. This result suggests that the background portion of the disk-shaped product used as the inspection object 1 has surface directionality. Even in such a case, it was confirmed that imaging the linear laser beam L from a plurality of directions makes it possible to reveal the background remaining portions more reliably and avoids the risk of overlooking the background remaining portions and rust.
  • DESCRIPTION OF SYMBOLS: 10 Surface texture inspection apparatus; 100 Inspection object imaging device; 101 Illumination device; 103, 105 Area cameras; 200 Arithmetic processing device; 201 Imaging control unit; 203 Image processing unit; 205 Display control unit; 207 Storage unit; 211 Data acquisition unit; 213 Stripe image frame generation unit; 215 Image calculation unit; 217 Light section line processing unit; 219 Depth image calculation unit; 221 Luminance image calculation unit; 223 Line width image calculation unit; 225 Detection processing unit


Abstract

An object of the present invention is to inspect the surface texture of a machined product, obtained by machining a material after hot-forming a billet or other intermediate material, while making any background remaining portion or rust readily apparent. To this end, the invention relates to a surface property inspection apparatus that inspects a machined product as an inspection object and that comprises: an illumination device for irradiating a linear laser beam onto the surface of a moving inspection object; an imaging device for generating light section images by imaging the surface irradiated with the linear laser beam; an image calculation unit for calculating a depth image, a luminance image, and a line width image using a stripe image frame obtained from the light section images; and a detection processing unit for detecting surface properties of the inspection object based on the calculated depth image, luminance image, and line width image. The image calculation unit calculates, for each light section line in the stripe image frame, the differences between the line widths at each position along the extension direction of the light section line and a prescribed threshold line width, and calculates the line width image by assigning luminance values according to the magnitudes of the differences. The detection processing unit detects a background remaining portion or rust based on the line width image.
PCT/JP2018/008605 2018-03-06 2018-03-06 Dispositif et procédé d'inspection de propriété de surface et programme WO2019171474A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/008605 WO2019171474A1 (fr) 2018-03-06 2018-03-06 Dispositif et procédé d'inspection de propriété de surface et programme

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/008605 WO2019171474A1 (fr) 2018-03-06 2018-03-06 Dispositif et procédé d'inspection de propriété de surface et programme

Publications (1)

Publication Number Publication Date
WO2019171474A1 true WO2019171474A1 (fr) 2019-09-12

Family

ID=67845926

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/008605 WO2019171474A1 (fr) 2018-03-06 2018-03-06 Dispositif et procédé d'inspection de propriété de surface et programme

Country Status (1)

Country Link
WO (1) WO2019171474A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113211190A (zh) * 2021-06-03 2021-08-06 洛阳迈锐网络科技有限公司 一种数控加工中心刀具破损磨损在线检测装置及检测方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004003930A (ja) * 2002-04-04 2004-01-08 Nippon Steel Corp 光学的形状測定装置及び光学的形状測定方法
JP2005030812A (ja) * 2003-07-08 2005-02-03 Nippon Steel Corp 鋼板表面の検査方法、システム、画像処理装置、及びコンピュータプログラム
JP2012159491A (ja) * 2011-01-14 2012-08-23 Nippon Steel Corp 欠陥検出装置及び欠陥検出方法
US20150116727A1 (en) * 2012-04-04 2015-04-30 Siemens Vai Metals Technologies Gmbh Method and device for measuring the flatness of a metal product
JP2018048979A (ja) * 2016-09-23 2018-03-29 新日鐵住金株式会社 表面性状検査装置、表面性状検査方法及びプログラム



Similar Documents

Publication Publication Date Title
JP5672480B2 (ja) ビードの終端部の形状を判定する装置及びその方法
JP6515344B2 (ja) 欠陥検出装置及び欠陥検出方法
JP4343911B2 (ja) 欠陥検査装置
JP5351673B2 (ja) 外観検査装置、外観検査方法
WO2017179243A1 (fr) Dispositif servant à capturer une image d'un objet à inspecter, procédé servant à capturer une image d'un objet à inspecter, dispositif d'inspection de surface et procédé d'inspection de surface
JP5742655B2 (ja) 欠陥検出装置及び欠陥検出方法
JP6119926B1 (ja) 金属体の形状検査装置及び金属体の形状検査方法
JP6683088B2 (ja) 表面性状検査装置、表面性状検査方法及びプログラム
JP6696278B2 (ja) ドリフト検査装置
JP4150390B2 (ja) 外観検査方法及び外観検査装置
JP6079948B1 (ja) 表面欠陥検出装置および表面欠陥検出方法
US6344897B2 (en) Inspection apparatus for foreign matter and pattern defect
JP6481217B1 (ja) 管状体内表面検査装置及び管状体内表面検査方法
JP5655045B2 (ja) 光学式表面欠陥検査装置及び光学式表面欠陥検査方法
WO2019171474A1 (fr) Dispositif et procédé d'inspection de propriété de surface et programme
US20230020684A1 (en) Laser based inclusion detection system and methods
JP5506243B2 (ja) 欠陥検査装置
JP5784796B2 (ja) 表面検査装置およびその方法
JP5605010B2 (ja) 表面検査方法
JP2003329428A (ja) 表面検査装置及び表面検査方法
JP2003247954A (ja) 円形体周縁部の欠陥検出方法
Yan et al. A boosted decision tree approach to shadow detection in scanning electron microscope (SEM) images for machine vision applications
Ekwongmunkong et al. Automated machine vision system for inspecting cutting quality of cubic zirconia
JP2007199089A (ja) 表面検査装置
JPH0772909B2 (ja) 外観検査による溶接状態判定方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18908982

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18908982

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP