WO2019171474A1 - Surface property inspection device, surface property inspection method, and program - Google Patents

Surface property inspection device, surface property inspection method, and program

Info

Publication number
WO2019171474A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
line width
light
line
luminance
Prior art date
Application number
PCT/JP2018/008605
Other languages
French (fr)
Japanese (ja)
Inventor
武男 中田
酒井 宏樹
俊博 筒井
剛史 真坂
Original Assignee
日本製鉄株式会社 (Nippon Steel Corporation)
Priority date
Filing date
Publication date
Application filed by 日本製鉄株式会社 (Nippon Steel Corporation)
Priority to PCT/JP2018/008605 priority Critical patent/WO2019171474A1/en
Publication of WO2019171474A1 publication Critical patent/WO2019171474A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/89 Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
    • G01N21/892 Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles characterised by the flaw, defect or object feature examined

Definitions

  • The present invention relates to a surface property inspection device, a surface property inspection method, and a program.
  • Examples of such machined products include a cylindrical product manufactured by hot forming a billet or the like and then adjusting the outer diameter of the material by mechanical cutting or the like.
  • As inspection methods for such machined steel materials, Patent Document 1 below proposes, for example, an inspection device using the so-called light section method, which irradiates the surface of the machined steel material with a line laser and images the entire surface, and an optical inspection that monitors the surface state through changes in the received intensity of light projected onto the surface of the machined steel material.
  • As a method for detecting minute unevenness, Patent Document 2 below proposes irradiating the surface of a machined product with slit laser light (linear light) and observing how the irradiated light behaves at concave and convex portions.
  • This technique utilizes the characteristic that the width of the irradiated light is small where the surface of the machined product is smooth and large where there is a recess.
  • However, the inspection technique proposed in Patent Document 2 inspects only a specific part of the surface of the machined product and relies on a change in the line width of the irradiated light at that part, so the position of a defective portion is poorly visible. Moreover, because that technique does not produce a two-dimensional image of portions with large surface roughness (unevenness), such portions cannot be extracted by image processing, and their size cannot be determined. Furthermore, since the change in the line width of the linear laser beam is slight, visibility is low when only the projected image of the irradiated light is used.
  • At present, regardless of whether the unevenness information obtained by the light section method or the luminance-change information of reflected illumination light is used, it is impossible to clearly reveal portions of the billet, which is an intermediate material, where the original surface of the hot-formed material remains (background remaining portions), or where rust or the like is present.
  • Although it is possible to image the surface of the machined product based on the luminance information of reflected illumination light, the image is affected by surrounding noise components (for example, the machined skin and dust), and it is very difficult to distinguish a defective portion from a normal portion.
  • The present invention has been made in view of the above problems, and an object of the present invention is to provide a surface property inspection apparatus, a surface property inspection method, and a program capable of inspecting the surface shape of a machined product obtained by machining an intermediate material while clearly revealing the background remaining portions and rust of the intermediate material.
  • The present inventors found that, by using a machined product obtained by machining an intermediate material as the object to be inspected and irradiating the object with a linear laser beam, it is possible to obtain a depth image representing the uneven state of the surface of the object, a luminance image representing the luminance distribution of the linear laser light on the surface, and a line width image representing the line width distribution of the linear laser light on the surface.
  • (1) A surface property inspection apparatus in which a machined product made from an intermediate material is the object to be inspected, the apparatus comprising: an illumination device that irradiates the surface of the moving inspection object with a linear laser beam; an imaging device that images the surface irradiated with the linear laser light, thereby generating a plurality of light section images, each a captured image of the linear laser light on the surface, along the moving direction of the inspection object; an image calculation unit that calculates, based on a fringe image frame in which the light section lines (the line segments corresponding to the irradiated portion of the linear laser light in each of the plurality of light section images) are arranged in order along the moving direction, a depth image representing the uneven state of the surface of the inspection object, a luminance image representing the luminance distribution of the linear laser light on the surface of the inspection object, and a line width image in which the distribution of the line width of the linear laser light in the moving direction is associated with luminance values; and a detection processing unit that performs detection processing for detecting the surface properties of the inspection object, wherein the image calculation unit calculates, for each light section line in the fringe image frame, the line width at each position in the extending direction of the line and its difference from a predetermined threshold line width, and produces the line width image by assigning a luminance value according to the magnitude of the calculated difference, and the detection processing unit detects, based on the line width image, whether a background remaining portion or rust originating from the intermediate material exists on the surface of the inspection object.
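The per-position line-width-to-luminance mapping described in claim (1) can be sketched as follows. This is an illustrative Python sketch, not part of the patent disclosure; the binarization level and the `gain` scaling are assumptions.

```python
import numpy as np

def line_width_image_row(section_image, threshold_width_px, gain=32):
    """One row of the line width image from a single light section image:
    per column, measure the bright-line width in pixels, take the difference
    from a predetermined threshold line width, and map the excess to a
    luminance value. Binarization level and gain are illustrative."""
    binary = section_image > 0.5 * section_image.max()   # bright-line pixels
    widths = binary.sum(axis=0)                          # line width per column
    excess = np.clip(widths - threshold_width_px, 0, None)
    return np.clip(excess * gain, 0, 255).astype(np.uint8)

# Toy 6x4 light section image: the line is 2 px wide except in column 2,
# where a rough spot widens it to 5 px.
img = np.zeros((6, 4))
img[2:4, [0, 1, 3]] = 1.0
img[1:6, 2] = 1.0
print(line_width_image_row(img, threshold_width_px=3))  # [ 0  0 64  0]
```

Stacking one such row per light section image along the moving direction yields the full line width image.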
  • (2) The surface property inspection apparatus according to (1), wherein the detection processing unit detects the background remaining portion and the rust based on whether the luminance value of the line width image is equal to or greater than a first threshold value for detecting the background remaining portion and rust.
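The first-threshold test of claim (2) amounts to a simple binarization of the line width image; a minimal sketch, assuming an 8-bit luminance encoding and an illustrative threshold value not given in the patent:

```python
import numpy as np

def detect_rough_surface(line_width_image, first_threshold):
    """Binary mask of candidate background-remaining/rust pixels: luminance
    of the line width image at or above the first threshold. The threshold
    value is an assumed tuning parameter."""
    return line_width_image >= first_threshold

# Luminance here encodes line-width excess; bright pixels mean a widened line.
lw_img = np.array([[10, 20, 130, 200, 15]], dtype=np.uint8)
print(detect_rough_surface(lw_img, first_threshold=100))
# [[False False  True  True False]]
```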
  • (3) The surface property inspection apparatus according to (1) or (2), wherein the image calculation unit calculates the centroid position in the line width direction of the light section line along the moving direction of the inspection object, and calculates the depth image based on the amount of displacement between that centroid position and a reference position, which is a position in the moving direction designated in advance for the light section image.
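The centroid-displacement step of claim (3) can be sketched as follows. The conversion from pixel displacement to depth assumes the geometry of FIG. 2A (vertical laser, camera viewing at angle θ from the z-axis); the sign convention and parameter names are illustrative assumptions.

```python
import numpy as np

def depth_from_centroid(section_image, reference_row, mm_per_px, theta_deg):
    """Depth profile from the displacement between the intensity centroid of
    the light section line (computed per column) and a pre-designated
    reference row. Geometry: vertical beam viewed at theta from vertical."""
    img = section_image.astype(float)
    rows = np.arange(img.shape[0])[:, None]
    total = img.sum(axis=0)
    centroid = (rows * img).sum(axis=0) / np.where(total == 0, 1.0, total)
    shift_mm = (centroid - reference_row) * mm_per_px
    return shift_mm / np.tan(np.radians(theta_deg))  # positive = assumed raised

# Line sits on row 2 except in column 1, where it is displaced to row 3.
img = np.zeros((5, 3))
img[2, [0, 2]] = 1.0
img[3, 1] = 1.0
print(depth_from_centroid(img, reference_row=2, mm_per_px=0.25, theta_deg=45.0))
# approximately [0, 0.25, 0] (mm)
```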
  • (4) The surface property inspection apparatus according to any one of (1) to (3), wherein the detection processing unit identifies a defective portion based on whether the luminance values of the depth image and the luminance image are equal to or greater than a second threshold value for identifying defective portions, and thereby determines the defects existing on the surface of the inspection object.
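Claims (4) and (5) together describe a second-threshold defect test applied outside the areas already flagged as background remaining portion or rust. A minimal sketch, with illustrative threshold values and an assumed OR-combination of the depth and luminance responses:

```python
import numpy as np

def detect_defects(depth_img, luminance_img, rough_mask, second_threshold):
    """Candidate defect pixels: depth or luminance response at or above the
    second threshold, excluding pixels already flagged as background
    remaining portion or rust. Threshold and combination rule are assumed."""
    candidate = (depth_img >= second_threshold) | (luminance_img >= second_threshold)
    return candidate & ~rough_mask

depth = np.array([[0, 180, 0, 200]], dtype=np.uint8)
lum = np.array([[0, 0, 190, 0]], dtype=np.uint8)
rough = np.array([[False, False, False, True]])  # col 3 already flagged as rust
print(detect_defects(depth, lum, rough, 150))  # [[False  True  True False]]
```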
  • (5) The surface property inspection apparatus according to (4), wherein the detection processing unit performs the detection of defective portions on portions other than those where a background remaining portion or rust has been detected based on the line width image.
  • (6) The surface property inspection apparatus according to any one of (1) to (5), wherein the imaging device includes at least two imaging devices that image the surface irradiated with the linear laser light from the upstream side in the moving direction and from the downstream side in the moving direction, respectively.
  • (7) A surface property inspection method in which a machined product made from an intermediate material is the object to be inspected, the method comprising: an illumination step of irradiating the surface of the moving inspection object with a linear laser beam; a light section image generation step of imaging the surface irradiated with the linear laser light, thereby generating a plurality of light section images, each a captured image of the linear laser light on the surface, along the moving direction of the inspection object; an image calculation step of calculating, based on a fringe image frame in which the light section lines corresponding to the irradiated portions of the linear laser light in each of the plurality of light section images are arranged in order along the moving direction, a depth image representing the uneven state of the surface of the inspection object, a luminance image representing the luminance distribution of the linear laser light on the surface of the inspection object, and a line width image in which the distribution of the line width of the linear laser light in the moving direction is associated with luminance values; and a detection processing step of detecting the surface properties of the inspection object based on the calculated depth image, luminance image, and line width image, wherein the image calculation step calculates, for each light section line in the fringe image frame, the line width at each position in the extending direction of the line and its difference from a predetermined threshold line width, and produces the line width image by assigning a luminance value according to the magnitude of the calculated difference, and the detection processing step detects, based on the line width image, whether a background remaining portion or rust originating from the intermediate material exists on the surface of the inspection object.
  • (8) The surface property inspection method according to (7), wherein the detection processing step detects the background remaining portion and the rust based on whether the luminance value of the line width image is equal to or greater than a first threshold value for detecting the background remaining portion and rust.
  • (9) The surface property inspection method according to (7) or (8), wherein the image calculation step calculates the centroid position in the line width direction of the light section line along the moving direction of the inspection object, and calculates the depth image based on the amount of displacement between that centroid position and a reference position, which is a position in the moving direction designated in advance for the light section image.
  • (10) The surface property inspection method according to any one of (7) to (9), wherein a defective portion is identified based on whether the luminance values of the depth image and the luminance image are equal to or greater than a second threshold value for identifying defective portions, and the defects existing on the surface of the inspection object are thereby determined.
  • (12) The surface property inspection method according to any one of (7) to (11), wherein the imaging device includes at least two imaging devices that image the surface irradiated with the linear laser light from the upstream side in the moving direction and from the downstream side in the moving direction, respectively.
  • (13) A program for a computer capable of communicating with a surface property inspection apparatus in which a machined product made from an intermediate material is the object to be inspected, the apparatus including an illumination device that irradiates the surface of the moving inspection object with a linear laser beam and an imaging device that images the surface irradiated with the linear laser light, thereby generating a plurality of light section images of the linear laser light on the surface along the moving direction of the inspection object, the program causing the computer to realize: an image calculation function of calculating, based on a fringe image frame in which the light section lines corresponding to the irradiated portions of the linear laser light in each of the generated light section images are arranged in order along the moving direction, a depth image representing the uneven state of the surface of the inspection object, a luminance image representing the luminance distribution of the linear laser light on the surface, and a line width image in which the distribution of the line width of the linear laser light in the moving direction is associated with luminance values; and a detection processing function of detecting the surface properties of the inspection object, wherein the image calculation function calculates, for each light section line in the fringe image frame, the line width at each position in the extending direction of the line and its difference from a predetermined threshold line width, produces the line width image by assigning a luminance value according to the magnitude of the calculated difference, and the detection processing function detects, based on the line width image, whether a background remaining portion or rust originating from the intermediate material exists on the surface of the inspection object.
  • (14) The program according to (13), wherein the detection processing function detects the background remaining portion and the rust based on whether the luminance value of the line width image is equal to or greater than a first threshold value for detecting the background remaining portion and rust.
  • (15) The program according to (13) or (14), wherein the image calculation function calculates the centroid position in the line width direction of the light section line along the moving direction of the inspection object, and calculates the depth image based on the amount of displacement between that centroid position and a reference position, which is a position in the moving direction designated in advance for the light section image.
  • (16) The program according to any one of (13) to (15), wherein the detection processing function identifies a defective portion based on whether the luminance values of the depth image and the luminance image are equal to or greater than a second threshold value for identifying defective portions, and thereby determines the defects existing on the surface of the inspection object.
  • (17) The program according to (16), wherein the detection processing function performs the detection of defective portions on portions other than those where a background remaining portion or rust has been detected based on the line width image.
  • (18) The program according to any one of (13) to (17), wherein the imaging device includes at least two imaging devices that image the surface irradiated with the linear laser light from the upstream side in the moving direction and from the downstream side in the moving direction, respectively.
  • As described above, for a machined product obtained by machining a material after hot forming an intermediate material such as a billet, the surface shape of the machined product can be inspected while portions where the surface of the material remains and portions where rust is present are made clearly apparent.
  • FIG. 1 is an explanatory diagram showing an example of the configuration of the surface property inspection apparatus 10 according to the present embodiment.
  • The surface property inspection device 10 according to the present embodiment is a device for inspecting the surface properties of the inspection object 1, which is a machined product obtained by machining a material after hot forming an intermediate material such as a billet (hereinafter simply referred to as a "machined product using an intermediate material"), while the inspection object 1 moves in a predetermined direction.
  • The intermediate material is a material, such as a slab or a billet, before it becomes a final product. After such an intermediate material is hot formed, machining such as polishing or cutting is performed, whereby the machined product focused on in this embodiment is manufactured.
  • Examples of such machined products include a disk-shaped product manufactured by hot forming a billet, which is a kind of intermediate material mainly composed of iron (Fe), and then adjusting the shape of the material by mechanical cutting, and a cylindrical product manufactured by hot forming a billet or the like and then adjusting the outer diameter of the material by mechanical cutting or the like.
  • the intermediate material is not limited to a material mainly composed of iron, and may be a material mainly composed of a non-ferrous metal.
  • The surface property inspection apparatus 10 mainly includes an inspected object imaging apparatus 100 that images the surface of the inspection object 1 (in particular, the machined surface of the machined product that is the inspection object 1), and an arithmetic processing device 200 that performs image processing on the images obtained as a result of the imaging.
  • the inspection object imaging device 100 is installed above the moving inspection object 1 as described in detail below.
  • the inspection object imaging apparatus 100 is an apparatus that sequentially images the surface of the inspection object 1 moving in a predetermined direction and outputs a captured image obtained as a result of the imaging to the arithmetic processing apparatus 200. Imaging processing by the inspected object imaging device 100 is controlled by the arithmetic processing device 200.
  • A PLG (Pulse Logic Generator: a pulse-type speed detector) is installed on the transport line that transports the machined product serving as the inspection object 1, in order to detect the moving speed of the inspection object 1.
  • The arithmetic processing device 200 periodically transmits a control signal to the inspected object imaging device 100 based on the pulses of the PLG signal input from the PLG, and the inspected object imaging device 100 performs imaging based on that control signal. As a result, the inspected object imaging device 100 can image the surface of the inspection object 1 each time the inspection object 1 moves a predetermined distance or a predetermined time elapses.
  • the arithmetic processing unit 200 controls the entire imaging process of the surface of the inspection object 1 in the inspection object imaging apparatus 100 as described above.
  • The arithmetic processing device 200 also generates a fringe image frame from the captured images generated by the inspected object imaging device 100, and performs image processing on the fringe image frame to detect various defects that may exist on the surface of the inspection object 1. Examples of such defects include flaws accompanied by changes in unevenness, pattern-type flaws, portions where the surface of the material remains after the intermediate material is hot formed (hereinafter simply referred to as "background remaining portions"), and rust.
  • FIGS. 2A to 4C are explanatory diagrams schematically showing an example of the configuration of the inspection subject imaging apparatus according to the present embodiment.
  • FIG. 3A is an explanatory diagram for explaining a state of reflection on the surface of a normal object to be inspected
  • FIG. 3B is an explanatory diagram for explaining a state of reflection on a rough surface portion of the object to be inspected.
  • FIG. 4A is an explanatory view schematically showing a light section image when a normal surface of an object to be inspected is imaged.
  • FIG. 4B is an explanatory view schematically showing a light section image when the surface of the object to be inspected including the concave portion is imaged.
  • FIG. 4C is an explanatory view schematically showing a light section image when the surface of the inspection object including the rough surface portion is imaged.
  • The diagram shown in the upper part of FIG. 2A is a schematic view of the inspected object imaging apparatus 100 as seen from above the inspection object 1, and the diagram shown in the lower part of FIG. 2A is a schematic view of the apparatus as seen from the side of the inspection object 1.
  • Likewise, the diagram shown in the upper part of FIG. 2B is a schematic view of the inspected object imaging apparatus 100 as seen from above the inspection object 1, and the diagram shown in the lower part of FIG. 2B is a schematic view of the apparatus as seen from the side of the inspection object 1.
  • In the following, the moving direction of the machined product that is the inspection object 1 is taken as the positive y-axis direction, the direction orthogonal to the moving direction as the positive x-axis direction, and the vertical direction as the positive z-axis direction.
  • the inspected object imaging device 100 includes an illumination device 101 and an area camera 103 which is an example of the imaging device.
  • the illumination device 101 and the area camera 103 are fixed by known means (not shown) so that their installation positions do not change.
  • the illumination device 101 is a device that illuminates the surface of the inspection object 1 by irradiating the surface of the machined product that is the inspection object 1 with predetermined light.
  • the illuminating device 101 includes at least a laser light source that irradiates the surface of the inspection object 1 with a linear laser beam L.
  • The illumination device 101 includes a light source unit that emits laser light of a predetermined wavelength, for example in the visible band, and a lens (for example, a cylindrical lens, a rod lens, or a Powell lens) that expands the laser light emitted from the light source unit in the x-axis direction while condensing it in the line width direction.
  • the line width of the linear laser beam L immediately before reaching the surface of the device under test 1 can be, for example, about several hundred ⁇ m (for example, about 200 ⁇ m).
  • When the surface of the inspection object 1 is irradiated with the linear laser beam L, a linear bright portion is formed on the surface along the x-axis direction. The line segment corresponding to this linear bright portion is called a light section line.
  • The area camera 103, which is an example of the imaging device, is a device that images the surface of the inspection object 1 irradiated with the linear laser light L.
  • the area camera 103 includes a lens having a predetermined open aperture value and a focal length, and various sensors such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) that function as an imaging device.
  • the area camera 103 may be a monochrome camera or a color camera.
  • The focal length and angle of view of the lens mounted on the area camera 103, and the distance between the illumination device 101 and the image sensor of the area camera 103, are not particularly limited, but are preferably selected so that the entire x-direction extent of the surface of the inspection object 1 falls within the field of view VF. The size and pixel size of the image sensor mounted on the area camera 103 are also not particularly limited, but a larger image sensor is preferable in consideration of the image quality and resolution of the generated image. Furthermore, from the viewpoint of the image processing described below, the line width of the linear laser beam L is preferably adjusted to be about 2 to 4 pixels on the image sensor.
  • The area camera 103 images the linear laser light L irradiated onto the surface of the inspection object 1, thereby generating a so-called light section image in which the light section line, that is, the line segment corresponding to the irradiated portion of the linear laser light L, is captured. Each time the area camera 103 generates a light section image, it outputs the generated image to the arithmetic processing device 200.
  • The optical positional relationship between the illumination device 101 and the area camera 103 can be set as appropriate. For example, the illumination device 101 can be provided vertically above the inspection object 1 so as to irradiate the inspection object 1 with the linear laser beam L vertically, and the area camera 103 can be arranged so as to image the reflected light of the linear laser beam L from a direction at an angle θ with respect to the vertical direction (z-axis direction).
  • The angle θ shown in FIG. 2A is preferably as large as possible within the range permitted by the installation constraints on the area camera 103. This makes it possible to capture the irregularly reflected light of the light section line with the area camera 103.
  • the size of ⁇ shown in FIG. 2A is preferably about 30 to 60 degrees, for example.
  • FIG. 2A illustrates a case where the inspected object imaging apparatus 100 includes only one area camera as an example of the imaging device; however, as illustrated in FIG. 2B, the inspected object imaging apparatus 100 may include at least two area cameras 103 and 105 so that the surface of the inspection object 1 irradiated with the linear laser beam L can be imaged from both the upstream side and the downstream side in the moving direction.
  • In this case, the area camera 103 and the area camera 105 are arranged at the same angle θ on the upstream side and the downstream side, respectively, in the moving direction of the inspection object 1.
  • By providing an area camera on each of the upstream and downstream sides in the moving direction and using the light section images output from both cameras, the surface properties of the inspection object 1 can be inspected more accurately without being affected by the direction of surface inclination.
  • Inspection object (machined product): width (length in the x-axis direction) of about 600 mm to 1750 mm
  • Illumination device 101: irradiates red laser light from a laser light source with an output of 100 mW.
  • The line width of the linear laser beam L irradiated onto the surface of the inspection object 1 is 0.25 mm (250 μm). The line width here is defined as the width at 13.5% of the peak intensity value.
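The 13.5% level corresponds to the conventional 1/e² laser beam width definition (e⁻² ≈ 0.135). A minimal sketch of measuring such a width from a sampled intensity profile; the function name and pixel-counting convention are illustrative:

```python
import numpy as np

def line_width_at_fraction(profile, fraction=0.135):
    """Full width of a 1-D intensity profile at `fraction` of its peak;
    13.5% (~1/e^2) is the definition used for the laser line width here."""
    peak = profile.max()
    above = np.flatnonzero(profile >= fraction * peak)
    if above.size == 0:
        return 0
    return above[-1] - above[0] + 1  # inclusive width in pixels

# Gaussian profile with sigma = 10 px: the 1/e^2 full width is ~4*sigma.
x = np.arange(201)
profile = np.exp(-(x - 100.0) ** 2 / (2 * 10.0 ** 2))
print(line_width_at_fraction(profile))  # 41 (pixels, ~4*sigma inclusive)
```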
  • Area camera: a CCD of 2048 pixels × 2048 pixels (pixel size: 5.5 μm × 5.5 μm) is mounted as the image sensor, and the frame rate is 200 fps.
  • the focal length of the lens is 24 mm and the field angle is 26 °.
  • The pixel size of the captured image is 0.25 mm × 0.25 mm, and the line width of the linear laser beam L appears as a bright line about 2 to 4 pixels wide on the captured image.
  • When the width of the inspection object is large, a plurality of area cameras can be arranged in the width direction as necessary to secure pixel resolution.
  • Image acquisition method: while the inspection object 1 is moving, images are continuously acquired in synchronization with the signal output from the PLG. Specifically, imaging is performed every 5 msec as the inspection object 1 advances. At this time, the two area cameras installed as shown in FIG. 2B are synchronized and simultaneously image the same field of view.
  • the illumination device 101 is provided above the vertical direction and irradiates the linear laser beam L downward in the vertical direction.
  • The installation angle θ of the two area cameras is ±45 degrees. The imaging resolution is determined according to the size of the flaws and other features of interest; here, the capture pitch is set to 0.25 mm to match the 0.25 mm pixel size of the captured image.
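As a quick consistency check of the example setup (assuming the PLG keeps the pitch exact), a 0.25 mm capture pitch at one frame per 5 ms corresponds to a line speed of 50 mm/s:

```python
# Values taken from the example configuration in the text above.
pitch_mm = 0.25            # capture pitch per frame
frame_interval_s = 0.005   # 5 ms per frame, i.e. 200 fps
speed_mm_per_s = pitch_mm / frame_interval_s
print(speed_mm_per_s)  # 50.0
```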
  • Since the machined product focused on as the inspection object 1 in the present embodiment has undergone machining such as polishing or grinding, its surface has substantially uniform surface properties and strong specular reflectivity. Therefore, the surface of a normal portion (hereinafter also simply referred to as a "normal portion"), which has no unevenness flaws, background remaining portion, rust, or the like, has only extremely small unevenness of 10 μm or less.
  • Accordingly, the linear laser light L irradiated onto a normal portion exhibits substantially the same reflection characteristics everywhere and is imaged by the area camera as such. As a result, the line width of the light section line takes a substantially constant value, as schematically shown in FIG. 4A.
  • In the light section image of a surface containing a concave portion, as schematically shown in FIG. 4B, the light section line bends at the concave portion while maintaining a substantially constant line width.
  • The background remaining portion and the rust surface focused on in this embodiment have unevenness on the order of 100 μm and are rougher than the normal portion.
  • Because of this surface roughness, the background remaining portion and rust (hereinafter collectively referred to as the "rough surface portion") act as diffusing surfaces, as schematically shown in FIG. 3B. Therefore, the linear laser beam L irradiated onto the rough surface portion undergoes irregular reflection.
  • As a result of this irregular reflection, in the light section image the line width of the portion corresponding to the rough surface portion is expanded compared with that of the normal portion, as schematically shown in FIG. 4C.
  • In this embodiment, the background remaining portion and the rust are made visible based on the line width of the light section line in the light section images schematically shown in FIGS. 4A to 4C. In addition, the arithmetic processing device 200 generates information on the surface shape of the inspection object 1 by the so-called light section method using the light section images, and detects unevenness flaws and the like existing on the surface of the inspection object 1.
  • the configuration of the inspection object imaging device 100 according to the present embodiment and the light section image generated by the inspection object imaging device 100 have been described in detail above with reference to FIGS. 2A to 4C.
  • the arithmetic processing device 200 mainly includes an imaging control unit 201, an image processing unit 203, a display control unit 205, and a storage unit 207.
  • the imaging control unit 201 is realized by a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a communication device, and the like.
  • the imaging control unit 201 performs imaging control of the inspection object 1 by the inspection object imaging apparatus 100 according to the present embodiment. More specifically, the imaging control unit 201 sends a control signal for starting oscillation of laser light to the illumination device 101 when imaging of the inspection object 1 is started.
  • a PLG signal is periodically sent from the conveyance line of the inspection object 1 (for example, one PLG pulse is sent each time the inspection object 1 moves 0.25 mm).
  • the imaging control unit 201 sends a trigger signal for starting imaging to the area cameras 103 and 105 every time a PLG signal is acquired.
  • the image processing unit 203 is realized by, for example, a CPU, a ROM, a RAM, a communication device, and the like.
  • the image processing unit 203 generates a fringe image frame, described later, using the imaging data of the light section images acquired from the inspection object imaging device 100 (more specifically, from the area cameras 103 and 105 of the inspection object imaging device 100). Thereafter, the image processing unit 203 performs image processing described below on the generated fringe image frame, inspects the surface property of the machined product that is the inspection object 1, and detects various defects that may exist on its surface. When the image processing unit 203 finishes the defect detection processing on the surface of the inspection object 1, it transmits information on the obtained detection result to the display control unit 205.
  • the image processing unit 203 will be described in detail later.
  • the display control unit 205 is realized by, for example, a CPU, a ROM, a RAM, an output device, and the like.
  • the display control unit 205 performs display control when the surface property inspection result of the machined product that is the inspection object 1, transmitted from the image processing unit 203, is displayed on an output device such as a display provided in the arithmetic processing device 200 or on an output device provided outside the arithmetic processing device 200. Thereby, the user of the surface property inspection apparatus 10 can grasp on the spot the inspection result regarding the surface property of the machined product that is the inspection object 1.
  • the storage unit 207 is realized by, for example, a RAM or a storage device included in the arithmetic processing device 200 according to the present embodiment.
  • the storage unit 207 appropriately records various parameters and intermediate results of processing that need to be saved when the arithmetic processing device 200 according to the present embodiment performs some processing, as well as various databases and programs.
  • the imaging control unit 201, the image processing unit 203, the display control unit 205, and the like can freely perform read/write processing on the storage unit 207.
  • FIG. 5 is a block diagram illustrating an example of a configuration of an image processing unit included in the arithmetic processing apparatus according to the present embodiment.
  • the image processing unit 203 mainly includes a data acquisition unit 211, a fringe image frame generation unit 213, an image calculation unit 215, and a detection processing unit 225.
  • the data acquisition unit 211 is realized by, for example, a CPU, a ROM, a RAM, a communication device, and the like.
  • the data acquisition unit 211 acquires the image data (digital multi-valued image data) of the light section images output from the inspection object imaging device 100 (more specifically, the area cameras 103 and 105) and sequentially stores it in an image memory provided in the storage unit 207 or the like. By using these digital multi-valued image data sequentially along the moving direction of the inspection object 1, a fringe image frame described later is generated.
  • the light section image acquired by the data acquisition unit 211 is an image of the linear laser light L irradiated on the surface of the inspection object 1, captured at a certain position along the moving direction of the inspection object 1.
  • by appropriately setting the area camera gain and the lens aperture in advance, the light section image can be an image in which, for example, the portion irradiated with the linear laser light L is displayed in white and the other portions are displayed in black.
  • the unevenness superimposed on the light section line existing in the light section image and the line width of the light section line itself contain information about the cross-sectional shape of the surface of the inspection object 1 and about various defects, including the rough surface portion, existing on the surface.
  • the striped image frame generation unit 213 is realized by, for example, a CPU, a ROM, a RAM, and the like.
  • the fringe image frame generation unit 213 sequentially acquires the light section images, stored along the moving direction of the inspection object 1, from the image memory provided in the storage unit 207 or the like. Thereafter, the fringe image frame generation unit 213 generates a fringe image frame by cutting out an area including the light section line from each obtained light section image and sequentially arranging the images of these areas along the moving direction of the inspection object 1.
  • the number of light section images constituting one fringe image frame may be set as appropriate; for example, one fringe image frame may be composed of 256 light section images.
  • FIG. 6 shows an example of a stripe image frame generated by the stripe image frame generation unit 213.
  • the fringe image frame shown in FIG. 6 shows 16 of the 256 light section images.
  • one line segment extending in the horizontal direction of the drawing corresponds to one light section image, and the horizontal direction of the drawing corresponds to the x-axis direction in FIG. 2A and the like.
  • the vertical direction of the drawing corresponds to the y-axis direction (that is, the moving direction of the inspection object 1) in FIG. 2A and the like.
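  The assembly of a fringe image frame from successive light section images, as described above, might be sketched as follows. The function name, the array layout, and the cropping parameters are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def build_fringe_frame(section_images, y_start, y_size):
    """Crop the band containing the light section line from each
    light section image and stack the bands in acquisition order
    (i.e., along the moving direction of the inspection object).

    section_images : list of 2-D uint8 arrays, one per capture trigger
    y_start, y_size : top row and height (y_n pixels) of the cropped band
    """
    bands = [img[y_start:y_start + y_size, :] for img in section_images]
    return np.vstack(bands)  # shape: (N * y_size, M)
```

  With 256 captured images, the result corresponds to one fringe image frame of N = 256 stacked line regions.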
  • when the fringe image frame generation unit 213 generates a fringe image frame as illustrated in FIG. 6, it outputs the generated fringe image frame to the image calculation unit 215 described later. Further, the fringe image frame generation unit 213 may associate time information on the date and time of generation with the data of the generated fringe image frame and store it in the storage unit 207 or the like as history information.
  • the image calculation unit 215 is realized by, for example, a CPU, a ROM, a RAM, and the like. Based on the fringe image frame generated by the fringe image frame generation unit 213, the image calculation unit 215 calculates a depth image representing the uneven state of the surface of the inspection object 1, a luminance image representing the luminance distribution of the linear laser light L on the surface of the inspection object 1, and a line width image in which the distribution of the line width of the linear laser light L along the moving direction on the surface of the inspection object 1 is associated with luminance values. As shown in FIG. 5, the image calculation unit 215 includes a light section line processing unit 217, a depth image calculation unit 219, a luminance image calculation unit 221, and a line width image calculation unit 223.
  • the light section line processing unit 217 is realized by, for example, a CPU, a ROM, a RAM, and the like.
  • the light section line processing unit 217 calculates, for each light section line included in the fringe image frame, light section line feature amounts including the displacement amount of the light section line (the degree of bending of the bright line).
  • FIG. 7A is an explanatory diagram schematically showing a fringe image frame.
  • FIG. 7B is an explanatory diagram for explaining the optical section line processing performed by the optical section line processing unit.
  • as shown in FIG. 7A, it is assumed that there are N light section lines in one fringe image frame and that the horizontal length of the fringe image frame is M pixels. Further, one light section image, which includes a single light section line, is y_n pixels vertically by M pixels horizontally.
  • the number of vertical pixels y_n of one light section image (in other words, the number of vertical pixels to be cut out from one light section image when the fringe image frame is generated) can be determined by roughly estimating in advance, based on past operation data or the like, the range of heights of the concave and convex portions that may exist on the inspection object 1.
  • the X-axis is taken in the x-axis direction orthogonal to the moving direction of the inspection object 1 (the lateral direction of the fringe image frame in FIG. 7A), the Y-axis is taken in the y-axis direction corresponding to the moving direction of the inspection object 1 (the vertical direction of the fringe image frame in FIG. 7A), and the pixel position in the fringe image frame is expressed in X-Y coordinates. The position of the m-th pixel (1 ≤ m ≤ M) from the left of the j-th (1 ≤ j ≤ N) light section line existing in the fringe image frame is represented by X_{j,m}.
  • the light section line processing unit 217 first determines the X-coordinate position (the position represented by X_{j,m} in this description) of the light section line (hereinafter also simply referred to as a “line”) to be focused on, and examines the distribution of pixel values (that is, the luminance values of the light section line) along the Y-axis at that position.
  • the light section line processing unit 217 does not perform the processing described below for all pixels at the X-coordinate position in the light section image, but only for the pixels belonging to the range of W before and after the reference position Y_s of the Y coordinate in the light section image (that is, pixels belonging to the range Y_s − W to Y_s + W).
  • the reference position Y s of the Y coordinate is a position in the y-axis direction that is designated in advance with respect to the j-th light section image of the striped image frame.
  • the parameter W that defines the processing range can be determined as follows: the range of heights of the concave and convex portions that may exist on the inspection object 1 is estimated based on past operation data and the like, and the size of W is determined appropriately so that the range of W before and after the reference position Y_s covers that height range.
  • the light section line processing unit 217 identifies, from among the pixels included in the range Y_s − W to Y_s + W, the pixels having a luminance value equal to or higher than a predetermined threshold Th for specifying pixels corresponding to the light section line.
  • in the example shown in FIG. 7B, the three pixels represented by Y_{j,k}, Y_{j,k+1}, Y_{j,k+2} have luminance values I_{j,k}, I_{j,k+1}, I_{j,k+2}, respectively, that are equal to or higher than the threshold Th.
  • the number p_{j,m}, obtained by counting in the line width direction the pixels having luminance values equal to or greater than the predetermined threshold Th, corresponds to the number of pixels of the bright line at the position (j, m), and is one of the light section line feature amounts.
  • the light section line processing unit 217 uses the information on the extracted pixels, (Y_{j,k}, I_{j,k}), (Y_{j,k+1}, I_{j,k+1}), (Y_{j,k+2}, I_{j,k+2}) (hereinafter sometimes abbreviated simply as (Y, I)), in the following processing to calculate further light section line feature amounts.
  • the light section line processing unit 217 calculates the luminance sum K_{j,m} of the extracted pixels using the parameter p_{j,m} and the information (Y, I) on the extracted pixels. This luminance sum K_{j,m} is also one of the light section line feature amounts.
  • the center-of-gravity position Y_C(j, m) is the value given by the following Expression 101, where A is the set of extracted pixels:
  Y_C(j, m) = Σ_{(Y,I)∈A} (Y · I) / Σ_{(Y,I)∈A} I … (Expression 101)
  Therefore, in the example shown in FIG. 7B, the center-of-gravity position Y_C(j, m) is given by the following Expression 101a:
  Y_C(j, m) = (Y_{j,k} · I_{j,k} + Y_{j,k+1} · I_{j,k+1} + Y_{j,k+2} · I_{j,k+2}) / (I_{j,k} + I_{j,k+1} + I_{j,k+2}) … (Expression 101a)
  • the position in the Y-axis direction corresponding to each pixel is a value quantized with the capture pitch (for example, 0.25 mm) of the inspection object imaging device 100.
  • however, since the center-of-gravity position Y_C(j, m) calculated by Expression 101 involves a division, it can take values finer than the capture pitch (the so-called quantization unit). Therefore, the displacement amount Δd_{j,m} calculated using the center-of-gravity position Y_C(j, m) can also take values smaller than the capture pitch.
  • the displacement amount ⁇ d j, m calculated in this way is also one of the light section line feature amounts.
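  The extraction of bright-line pixels and the three feature amounts described above (pixel count p, luminance sum K, and the centroid-based sub-pixel displacement) can be sketched as follows in Python. The function name and the use of the reference position Y_s as the zero-displacement baseline are assumptions made for this illustration, not taken from the patent:

```python
import numpy as np

def section_line_features(column, y_s, w, th):
    """Feature amounts for one X position X_{j,m} of a light section line.

    column : 1-D array of luminance values along the Y axis
    y_s    : reference Y position of the section line
    w      : half-width of the search window (pixels in Y_s - W .. Y_s + W)
    th     : threshold Th selecting bright-line pixels
    Returns (p, K, delta_d): bright-line pixel count, luminance sum,
    and the displacement of the luminance centroid from Y_s (Eq. 101).
    """
    ys = np.arange(y_s - w, y_s + w + 1)
    vals = column[y_s - w:y_s + w + 1]
    mask = vals >= th
    p = int(mask.sum())              # number of bright-line pixels p_{j,m}
    k = float(vals[mask].sum())      # luminance sum K_{j,m}
    if p == 0:
        return p, k, 0.0
    y_c = float((ys[mask] * vals[mask]).sum() / k)  # centroid Y_C(j, m)
    return p, k, y_c - y_s           # displacement delta_d (assumed baseline)
```

  Because the centroid involves a division over the extracted luminances, the returned displacement can be a fraction of a pixel, consistent with the sub-pixel resolution discussed above.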
  • the light section line processing unit 217 calculates the above three types of feature amounts for the M elements included in each light section line. As a result, as shown in FIGS. 8A to 8C, two-dimensional arrays of M columns × N rows are generated for the displacement amount Δd of the light section line, the luminance sum K, and the number of pixels p of the bright line.
  • the light section line processing unit 217 outputs the feature amount related to the displacement amount ⁇ d of the light section line among the calculated light section line feature amounts to the depth image calculation unit 219 described later. In addition, the light section line processing unit 217 outputs, to the brightness image calculation unit 221, which will be described later, among the calculated light section line feature amounts, the brightness sum K and the feature amount related to the number of bright line pixels p. Further, the light section line processing unit 217 outputs the feature amount related to the number of pixels p of the bright line among the calculated light section line feature amounts to the line width image calculation unit 223 described later.
  • the depth image calculation unit 219 is realized by, for example, a CPU, a ROM, a RAM, and the like.
  • the depth image calculation unit 219 calculates a depth image representing the uneven state of the surface of the inspection object 1 based on the light section line feature amounts generated by the light section line processing unit 217 (particularly, the feature amount related to the displacement amount Δd).
  • more specifically, the depth image calculation unit 219 calculates the depth image using the feature amount (two-dimensional array) related to the displacement amount Δd as shown in FIG. 8A and the angle formed by the linear laser light L and the optical axis of the area camera (the angle θ in FIGS. 2A and 2B).
  • This depth image is an image representing a two-dimensional uneven state distribution in which the one-dimensional distribution of the uneven state at each position in the Y-axis direction is sequentially arranged along the Y-axis direction.
  • FIG. 9 is an explanatory diagram showing the relationship between the displacement of the optical cutting line and the height of the defect.
  • FIG. 9 schematically shows a case where a concave portion exists on the surface of the inspection object 1.
  • the difference between the height of the surface position when no recess is present on the surface of the inspection object 1 and the height of the bottom of the recess is represented by Δh.
  • when the obliquely incident linear laser light L is surface-reflected at a position where no recess exists, the reflected light propagates like ray A in FIG. 9; when it is reflected at the bottom of the recess, the reflected light propagates like ray B in FIG. 9. The deviation between ray A and ray B is observed as the displacement Δd of the light section line in this embodiment.
  • although FIG. 9 illustrates the case where a recess exists on the surface of the inspection object 1, the same relationship holds when a protrusion exists on the surface of the inspection object 1.
  • the depth image calculation unit 219 uses the above relationship to calculate the amount Δh related to the unevenness of the surface of the inspection object 1, based on the feature amount related to the displacement amount Δd of the light section line calculated by the light section line processing unit 217.
  • here, the displacement amount Δd of the light section line used for calculating the depth image is calculated based on the center-of-gravity position of the light section line as described above, and can take values smaller than the capture pitch. Therefore, the depth image calculated by the depth image calculation unit 219 is an image in which the unevenness is reproduced with a resolution finer than the pixel size of the image sensor.
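  As a rough illustration of the triangulation step, the conversion from the sub-pixel displacement Δd (in pixels) to a depth value Δh might look like the following. The relation Δh = Δd / tan θ, the function name, and the parameters are assumptions made for this sketch; the exact relation depends on the optical geometry of FIGS. 2A, 2B, and 9:

```python
import math

def depth_from_displacement(delta_d_px, pixel_pitch_mm, theta_deg):
    """Convert a section-line displacement (pixels) to a depth (mm).

    Assumes the common light-section triangulation relation
    depth = displacement / tan(theta), where theta is the angle
    between the linear laser light and the camera optical axis.
    """
    delta_d_mm = delta_d_px * pixel_pitch_mm  # displacement in physical units
    return delta_d_mm / math.tan(math.radians(theta_deg))
```

  For example, with a 0.25 mm pitch and θ = 45°, a 2-pixel displacement corresponds to a 0.5 mm depth under this assumed relation.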
  • the light section line may have distortion such as curvature.
  • the unevenness superimposed on the light section line carries information on the cross-sectional shape of the surface of the inspection object 1 and on the surface defects existing on the surface. Therefore, when calculating the depth image based on the displacement amount Δd of the light section line, the depth image calculation unit 219 may perform distortion correction processing for each light section line and extract only the information related to the unevenness superimposed on the light section line. By performing such distortion correction processing, only information on the uneven flaws present on the surface of the inspection object 1 can be obtained even when the light section line has distortion such as curvature.
  • the depth image calculation unit 219 outputs information on the depth image calculated as described above to the detection processing unit 225 described later.
  • the luminance image calculation unit 221 is realized by, for example, a CPU, a ROM, a RAM, and the like.
  • the luminance image calculation unit 221 calculates a luminance image representing the luminance distribution of the linear laser light L on the surface of the inspection object 1, based on the light section line feature amounts generated by the light section line processing unit 217 (particularly, the feature amounts related to the luminance sum K and the number of pixels p of the bright line).
  • more specifically, the luminance image calculation unit 221 uses the feature amount (two-dimensional array) related to the luminance sum K as shown in FIG. 8B and the feature amount related to the number of pixels p of the bright line as shown in FIG. 8C to calculate the average luminance K_AVE(j, m) = K_{j,m} / p_{j,m} (1 ≤ j ≤ N, 1 ≤ m ≤ M), which is the average of the luminance in the line width direction.
  • the luminance image calculation unit 221 sets the data array formed of the calculated average luminance K AVE (j, m) as the luminance image of the object 1 to be inspected.
  • Such a luminance image is an image representing a two-dimensional luminance distribution in which the one-dimensional luminance distribution of the linear laser beam L at each position in the Y-axis direction is sequentially arranged along the Y-axis direction.
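  The average-luminance calculation K_AVE(j, m) = K_{j,m} / p_{j,m} over the N-row × M-column feature arrays can be sketched as follows; the zero-fill at positions where no bright-line pixel was found (p = 0) is an assumption of this sketch:

```python
import numpy as np

def luminance_image(K, p):
    """Average luminance K_AVE = K / p, elementwise over the feature
    arrays; positions with no bright-line pixel (p == 0) are set to 0."""
    K = np.asarray(K, dtype=float)
    p = np.asarray(p, dtype=float)
    out = np.zeros_like(K)
    np.divide(K, p, out=out, where=p > 0)  # safe division, 0 where p == 0
    return out
```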
  • the luminance image calculation unit 221 outputs information on the luminance image calculated as described above to the detection processing unit 225 described later.
  • the line width image calculation unit 223 is realized by, for example, a CPU, a ROM, a RAM, and the like.
  • the line width image calculation unit 223 calculates a line width image in which the distribution of the line width of the linear laser light L along the moving direction on the surface of the inspection object 1 is associated with luminance values, based on the light section line feature amounts generated by the light section line processing unit 217 (particularly, the feature amount related to the number of pixels p of the bright line).
  • FIG. 11 is an explanatory diagram for explaining the line widths of the light cutting lines in the normal part and the rough surface part.
  • FIG. 12A is an explanatory diagram for explaining the line width of the optical cutting line in the normal portion
  • FIG. 12B is an explanatory diagram for explaining the line width of the optical cutting line in the rough surface portion.
  • FIG. 13 is an explanatory diagram for describing a line width image according to the present embodiment.
  • as described above, the linear laser light L irradiating the rough surface portion is scattered by the rough surface portion. Therefore, as schematically shown in FIGS. 4C and 11, an increase in the line width of the light section line is observed at the position of the light section image corresponding to the rough surface portion.
  • in FIG. 11, the position corresponding to the A-A′ cutting line corresponds to the normal portion, and the position corresponding to the B-B′ cutting line corresponds to the rough surface portion.
  • in the luminance distribution of the normal portion along the A-A′ cutting line, assume that there are three pixels having a luminance value equal to or higher than the predetermined threshold Th for specifying pixels corresponding to the light section line, as schematically shown in FIG. 12A. Similarly, in the luminance distribution of the rough surface portion along the B-B′ cutting line, assume that there are eight such pixels, as schematically shown in FIG. 12B.
  • suppose that, in the inspection object imaging device 100, the line width of the linear laser light L is set to correspond to 2 to 4 pixels. Since an increase in line width occurs in the rough surface portion, the line width of the linear laser light L in the rough surface portion exceeds the set value of 4 pixels. Therefore, regarding the line width of the linear laser light L, it becomes possible to distinguish the normal portion from the rough surface portion by setting in advance a predetermined threshold Th2 for specifying pixels corresponding to the rough surface portion. For example, assume that in the inspection object imaging device 100 the line width of the normal portion of the linear laser light L is set to 2 to 4 pixels. In this case, by setting the threshold Th2 to 5 pixels with a margin, a portion having a line width of 5 pixels or more can be distinguished as a rough surface portion.
  • accordingly, the line width image calculation unit 223 can specify the pixel positions corresponding to the rough surface portion by applying threshold determination based on the threshold Th2 to the feature amount related to the number of pixels p of the bright line generated by the light section line processing unit 217.
  • by setting the threshold Th2 as described above, the position of the rough surface portion can be specified; at the same time, it is important to determine the luminance values constituting the line width image so that the normal portion is not revealed in the line width image generated by the line width image calculation unit 223. Therefore, the line width image calculation unit 223 according to the present embodiment calculates, for each light section line in the fringe image frame, the difference between the line width at each position along the light section line and a predetermined threshold line width, and calculates the line width image by assigning a luminance value according to the magnitude of the calculated difference.
  • more specifically, the line width image calculation unit 223 refers to the feature amount related to the number of pixels p of the bright line generated by the light section line processing unit 217, and calculates the difference between the number of bright-line pixels p at each position and the threshold Th2 corresponding to the threshold line width. Note that when this difference is calculated for the normal portion, it may be negative; in that case, the difference value is preferably set to zero instead of being left negative.
  • the line width image calculation unit 223 determines the luminance value at each pixel position so that the luminance value of the pixels constituting the line width image increases as the calculated difference value increases.
  • the line width image calculation unit 223 can determine the luminance value at each pixel position as follows, for example.
  • although the example shown below uses the full 8-bit scale, it is not necessary to do so; however, the visibility of the generated line width image can be improved by assigning luminance values over the full-scale range.
  • the method of assigning luminance values according to the difference value is not limited to the above example; any assignment method may be adopted as long as the luminance value increases as the difference value increases.
  • in addition, although the case where the line width image is generated as an image having 8-bit luminance values has been described, the line width image may of course be generated as an image having luminance values exceeding 8 bits.
  • a line width image as shown on the right side of FIG. 13 can be generated.
  • the line width image generated by such processing is an image representing a two-dimensional line width distribution in which the one-dimensional distributions of the line width of the linear laser light L at each position in the Y-axis direction are sequentially arranged along the Y-axis direction.
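  The line-width-to-luminance mapping described above (difference between each local line width and the threshold line width, negative values clamped to zero, result assigned over the 8-bit full scale) might be sketched as follows. The linear scaling and the maximum difference mapped to 255 are illustrative assumptions; the patent only requires that the luminance value increase with the difference:

```python
import numpy as np

def line_width_image(p, th2_px, diff_max=8):
    """Map the bright-line pixel-count array p to a line width image.

    p        : N x M array of bright-line pixel counts
    th2_px   : threshold line width Th2 in pixels
    diff_max : difference mapped to full scale 255 (assumed value)
    """
    # difference from the threshold line width; normal parts clamp to 0
    diff = np.clip(np.asarray(p, dtype=float) - th2_px, 0, None)
    # linear assignment over the 8-bit full scale (one possible choice)
    img = np.clip(diff / diff_max * 255.0, 0, 255)
    return img.astype(np.uint8)
```

  With Th2 = 5 pixels, a normal-portion width of 3 or 4 pixels maps to luminance 0, so only rough surface portions become visible in the resulting image.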
  • the line width image calculation unit 223 outputs information on the line width image calculated as described above to the detection processing unit 225 described later.
  • the detection processing unit 225 is realized by, for example, a CPU, a ROM, a RAM, and the like. Based on the depth image, the luminance image, and the line width image respectively calculated by the depth image calculation unit 219, the luminance image calculation unit 221, and the line width image calculation unit 223, the detection processing unit 225 detects the surface of the object 1 to be inspected. Detect properties.
  • the detection processing unit 225 has a defect part specifying function of specifying a defective part based on the depth image, the luminance image, and the line width image; a feature amount extracting function of extracting feature amounts related to the form and pixel values of the specified defective part; and a defect discriminating function of discriminating the type of defect, its degree of harmfulness, and the like based on the extracted feature amounts.
  • more specifically, for each pixel of the acquired line width image, the detection processing unit 225 emphasizes the region of the rough surface portion by filtering processing that obtains, as necessary, a linear sum of luminance values with peripheral pixels, and then determines whether the obtained value is equal to or greater than a first threshold for specifying the rough surface portion.
  • similarly, for each pixel of the acquired depth image and luminance image, the detection processing unit 225 emphasizes regions such as vertical line flaws, horizontal line flaws, and fine flaws by filtering processing that obtains a linear sum of pixel values (values representing depth, or luminance values) with peripheral pixels, and determines whether the obtained value is equal to or greater than a second threshold for specifying a defective part. By performing such filtering processing and determination processing based on the filtering result, the detection processing unit 225 can generate a binarized image for specifying defective parts.
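  The filtering (a linear sum with peripheral pixels, i.e., a small 2-D convolution) followed by threshold binarization might be sketched as follows. The kernel shape, the padding mode, and the threshold values are illustrative assumptions; e.g., an elongated vertical kernel would accentuate vertical line flaws in the depth image:

```python
import numpy as np

def emphasize_and_binarize(image, kernel, threshold):
    """Emphasize a target region by a linear sum with peripheral pixels,
    then binarize against the threshold for specifying the defect or
    rough surface part. Returns a 0/1 uint8 image.
    """
    img = np.asarray(image, dtype=float)
    kernel = np.asarray(kernel, dtype=float)
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(img)
    # correlation: out[r, c] = sum_{i,j} kernel[i, j] * img[r+i-ph, c+j-pw]
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return (out >= threshold).astype(np.uint8)
```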
  • subsequently, the detection processing unit 225 identifies each defective part by connecting portions determined to be defective that occur contiguously, and extracts, for each specified defective part, feature amounts related to the form and pixel values of the defective part.
  • examples of the feature amounts related to the form of the defective part include the width, length, and peripheral length of the defective part, its area, and the area of its circumscribed rectangle.
  • examples of the feature amounts related to the pixel values of the defective part include, for the depth image, the maximum, minimum, and average depth of the defective part, and, for the luminance image, the maximum, minimum, and average luminance of the defective part.
  • the detection processing unit 225 determines the type of defect, the degree of harm, and the like for each defective portion based on the extracted feature amount.
  • the determination processing of the defect type, the degree of harmfulness, and the like based on the feature amounts is performed using, for example, a logic table as shown in FIG. 14. That is, the detection processing unit 225 discriminates the type of defect and the degree of harmfulness based on the determination conditions represented by the logic table illustrated in FIG. 14.
  • defect types (defect A1 to defect An) are listed as the vertical items of the logic table, feature amount types (feature amount B1 to feature amount Bm) are listed as the horizontal items, and a discrimination conditional expression (conditional expression C11 to conditional expression Cnm) is described in each cell.
  • each row of such a logic table forms a set and serves as the determination condition for one type of defect. The determination processing is performed in order from the type described in the top row, and ends when all the determination conditions described in any one row are satisfied.
  • such a logic table can be generated by a known method using a database constructed by learning processing in which past operation data, together with the defect types and harmfulness degrees specified by inspectors based on that data, are used as teacher data.
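  The row-by-row evaluation of such a logic table, stopping at the first row whose conditions are all satisfied, can be sketched as follows. Representing each conditional expression C as a callable, and the fallback label used when no row matches, are assumptions of this sketch:

```python
def classify_defect(features, logic_table):
    """Evaluate a logic table row by row.

    features    : dict mapping feature amount name -> value (B1 .. Bm)
    logic_table : ordered list of (defect_type, conditions) rows, where
                  conditions maps a feature name to a predicate (C11 .. Cnm)
    Returns the first defect type whose conditions all hold.
    """
    for defect_type, conditions in logic_table:
        if all(cond(features[name]) for name, cond in conditions.items()):
            return defect_type  # all conditions in this row satisfied
    return "unclassified"       # assumed fallback when no row matches
```

  For instance, a first row requiring a large width and area would capture broad flaws before a later, more permissive row is ever evaluated, mirroring the top-down evaluation order described above.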
  • the detection processing unit 225 specifies the type of defect and the degree of harm for each defective part detected in this manner, and outputs the obtained detection result to the display control unit 205. Thereby, the information regarding the defect which exists in the surface of the machined goods which are the to-be-inspected objects 1 will be output to a display part (not shown).
  • The detection processing unit 225 may output the obtained detection result to an external device such as a manufacturing control process computer, or may create a product defect report using the obtained detection result.
  • the detection processing unit 225 may store the information related to the detection result of the defective part as history information in the storage unit 207 or the like in association with time information related to the date and time when the information is calculated.
  • A discriminator such as a neural network or a support vector machine (SVM) may be generated by a learning process using, as teacher data, past operation data and the defect types and hazard levels specified by inspectors based on that operation data, and such a discriminator may be used to discriminate the type of defect and the degree of harm.
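As a rough illustration of such a learned discriminator, the following sketch trains a minimal linear SVM by sub-gradient descent on the hinge loss. The toy feature vectors and labels are invented for illustration; a real system would train on inspector-labelled operation data, typically with a mature library:

```python
# Minimal linear SVM trained by sub-gradient descent on the hinge loss,
# standing in for the learned discriminator described above. All feature
# vectors ([max depth (mm), max line width (px)]) and labels below are
# invented toy data, not values from the embodiment.

def train_linear_svm(xs, ys, lam=0.01, lr=0.1, epochs=200):
    """xs: list of feature vectors, ys: labels in {-1, +1}."""
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:
                # hinge-loss sub-gradient step (misclassified or inside margin)
                w = [wi + lr * (y * xi - lam * wi) for wi, xi in zip(w, x)]
                b += lr * y
            else:
                # only the regularisation term contributes
                w = [wi * (1 - lr * lam) for wi in w]

    def predict(x):
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
    return predict

# Toy training data: +1 = harmful (deep / wide traces), -1 = harmless
xs = [[0.5, 6.0], [0.4, 5.0], [0.05, 2.0], [0.1, 1.5]]
ys = [1, 1, -1, -1]
predict = train_linear_svm(xs, ys)
```

The same interface (feature vector in, defect judgment out) applies whether the discriminator is this linear sketch, a kernel SVM, or a neural network.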
  • The detection processing unit 225 performs defect detection processing based on a predetermined threshold determination using the line width image, so that rough surface portions (that is, background remaining portions and rust), which are difficult to distinguish by the conventional light section method, can be easily revealed.
  • During the defect detection processing based on the depth image and the luminance image, the detection processing unit 225 may perform the above-described defect site detection processing only on portions other than those where rough surface portions (that is, background remaining portions and rust) were detected based on the line width image. This makes it possible to speed up the defect detection processing based on the depth image and the luminance image.
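One possible way to implement this masking is sketched below with NumPy; all threshold values and the 8-bit image ranges are illustrative assumptions, not values from the embodiment:

```python
import numpy as np

# Sketch of restricting the depth/luminance defect search to pixels where
# the line width image did NOT already reveal a rough surface portion
# (background remaining portion or rust). Threshold values are assumed.

def detect_defects(depth_img, lum_img, width_img,
                   width_thresh=128, depth_thresh=200, lum_thresh=180):
    rough_mask = width_img >= width_thresh   # already explained by line width
    search_mask = ~rough_mask                # search for other defects elsewhere
    depth_hits = (depth_img >= depth_thresh) & search_mask
    lum_hits = (lum_img >= lum_thresh) & search_mask
    return rough_mask, depth_hits | lum_hits
```

Skipping the already-masked pixels is what yields the speed-up: the more expensive depth/luminance feature extraction only runs where the line width image gave no verdict.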
  • The detection processing unit 225 may also confirm whether or not a rough surface portion specified based on the line width image has been extracted as a defect candidate portion in the luminance image, thereby double-checking the detection result.
  • The light section line processing unit 217 may execute the approximate correction processing before calculating the light section line feature amounts.
  • each component described above may be configured using a general-purpose member or circuit, or may be configured by hardware specialized for the function of each component.
  • the CPU or the like may perform all functions of each component. Therefore, it is possible to appropriately change the configuration to be used according to the technical level at the time of carrying out the present embodiment.
  • A computer program for realizing each function of the arithmetic processing device according to the present embodiment as described above can be created and installed on a personal computer or the like.
  • a computer-readable recording medium storing such a computer program can be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above computer program may be distributed via a network, for example, without using a recording medium.
  • the configuration of the surface property inspection apparatus 10 according to this embodiment has been described in detail above.
  • According to the surface property inspection apparatus 10 of the present embodiment, it is possible to detect background remaining portions and rust remaining on the machined surface with the same apparatus configuration as an apparatus based on the conventional light section method.
  • Since defects detectable by the conventional light section method, such as surface unevenness and surface patterns, can also be detected, a wide variety of defective parts can be inspected over the entire machined surface of a machined product.
  • The application destination of the surface property inspection apparatus 10 according to the present embodiment is not limited to the specific machined products described above; the apparatus can be applied to any product in which the roughness of defective portions differs from that of normal portions.
  • Moreover, by producing a two-dimensional image of the distribution of background remaining portions and rust portions, which have low visibility, defective parts can easily be identified by image processing, and an image with good visibility can also be used for visual inspection.
  • FIG. 15 is a flowchart showing an example of the flow of the surface texture inspection method according to the present embodiment.
  • First, under the control of the imaging control unit 201 of the arithmetic processing device 200, the inspected object imaging device 100 of the surface property inspection apparatus 10 irradiates the machined product that is the inspection object 1 with the linear laser beam L and images the surface to generate a light section image (step S101).
  • the inspection object imaging device 100 outputs the generated light section image to the arithmetic processing device 200.
  • The data acquisition unit 211 of the image processing unit 203 included in the arithmetic processing device 200 acquires the image data of the light section images, and sequentially stores the acquired light section images, along the moving direction of the inspection object 1, in an image memory provided in the storage unit.
  • The fringe image frame generation unit 213 of the image processing unit 203 included in the arithmetic processing device 200 generates a fringe image frame by sequentially arranging the acquired light section images along the moving direction of the inspection object 1 (step S103).
  • the stripe image frame generation unit 213 outputs the generated stripe image frame to the light section line processing unit 217.
  • The light section line processing unit 217 uses the generated fringe image frame to calculate, for each light section line, the number of pixels having a luminance equal to or higher than a predetermined threshold Th, the sum of the luminance of those pixels, and the displacement amount of the light section line (step S105). These calculation results are used as the light section line feature amounts.
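The feature extraction in step S105 could be sketched as follows, assuming the light section image is a 2D array with the line width direction along the rows; the threshold Th and the reference row are placeholders, not values from the embodiment:

```python
import numpy as np

# Sketch of per-light-section-line feature extraction (step S105): for each
# position x along the line, count bright pixels (luminance >= Th), sum their
# luminance, and measure the displacement of the line's luminance-weighted
# centre of gravity from a reference row. Array layout and Th are assumed.

def section_line_features(img, th=100, reference_row=None):
    """img: 2D array, rows = y (line width direction), cols = x (along line)."""
    h, w = img.shape
    if reference_row is None:
        reference_row = h / 2.0
    bright = img >= th
    n_bright = bright.sum(axis=0)                 # line width in pixels per x
    weights = np.where(bright, img, 0).astype(float)
    lum_sum = weights.sum(axis=0)                 # luminance sum per x
    ys = np.arange(h)[:, None]
    centroid = np.divide((ys * weights).sum(axis=0), lum_sum,
                         out=np.full(w, float(reference_row)),
                         where=lum_sum > 0)
    displacement = centroid - reference_row       # depth-related feature
    return n_bright, lum_sum, displacement
```

The three returned arrays correspond to the feature amounts fed to the line width image, luminance image, and depth image calculations, respectively.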
  • the calculated light section line feature amount is output to the depth image calculation unit 219, the luminance image calculation unit 221 and the line width image calculation unit 223, respectively.
  • The depth image calculation unit 219 calculates a depth image using the calculated light section line feature amounts (particularly, the feature amount related to the displacement amount of the light section line) (step S107). The luminance image calculation unit 221 calculates a luminance image using the calculated light section line feature amounts (particularly, the feature amount related to the number of bright-line pixels and the feature amount related to the sum of luminance) (step S107). The line width image calculation unit 223 calculates a line width image using the calculated light section line feature amounts (particularly, the feature amount related to the number of bright-line pixels) (step S107). The depth image calculation unit 219, the luminance image calculation unit 221, and the line width image calculation unit 223 output the calculated images to the detection processing unit 225.
  • the detection processing unit 225 performs a surface property detection process of the inspection object 1 using the calculated depth image, luminance image, and line width image (step S109). As a result, the detection processing unit 225 can specify the type of defect and the degree of harmfulness for various types of defective parts existing on the surface of the inspection object 1. Through the flow as described above, various types of defects existing on the surface of the machined product that is the inspection object 1 are detected.
  • FIG. 16 is a block diagram for explaining a hardware configuration of the arithmetic processing device 200 according to the embodiment of the present invention.
  • the arithmetic processing apparatus 200 mainly includes a CPU 901, a ROM 903, and a RAM 905.
  • the arithmetic processing device 200 further includes a bus 907, an input device 909, an output device 911, a storage device 913, a drive 915, a connection port 917, and a communication device 919.
  • The CPU 901 functions as a central processing unit and a control device, and controls all or part of the operations in the arithmetic processing device 200 according to various programs recorded in the ROM 903, the RAM 905, the storage device 913, or the removable recording medium 921.
  • the ROM 903 stores programs used by the CPU 901, calculation parameters, and the like.
  • The RAM 905 temporarily stores programs used by the CPU 901, parameters that change as appropriate during execution of the programs, and the like. These are connected to each other by the bus 907, which is constituted by an internal bus such as a CPU bus.
  • the bus 907 is connected to an external bus such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge.
  • the input device 909 is an operation means operated by the user such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
  • The input device 909 may be, for example, a remote control means (a so-called remote controller) using infrared rays or other radio waves, or may be an externally connected device 923 such as a PDA that supports operation of the arithmetic processing device 200.
  • the input device 909 includes, for example, an input control circuit that generates an input signal based on information input by a user using the operation unit and outputs the input signal to the CPU 901. By operating the input device 909, the user can input various data or instruct processing operations to the arithmetic processing device 200.
  • the output device 911 is configured by a device that can notify the user of the acquired information visually or audibly.
  • Such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices and lamps, audio output devices such as speakers and headphones, printer devices, mobile phones, and facsimiles.
  • the output device 911 outputs results obtained by various processes performed by the arithmetic processing device 200, for example. Specifically, the display device displays the results obtained by various processes performed by the arithmetic processing device 200 as text or images.
  • the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs the analog signal.
  • the storage device 913 is a data storage device configured as an example of a storage unit of the arithmetic processing device 200.
  • the storage device 913 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the storage device 913 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
  • the drive 915 is a recording medium reader / writer, and is built in or externally attached to the arithmetic processing unit 200.
  • the drive 915 reads information recorded on a removable recording medium 921 such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 905.
  • the drive 915 can also write a record on a removable recording medium 921 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • the removable recording medium 921 is, for example, a CD medium, a DVD medium, a Blu-ray (registered trademark) medium, or the like.
  • the removable recording medium 921 may be a compact flash (registered trademark) (CompactFlash: CF), a flash memory, an SD memory card (Secure Digital memory card), or the like. Further, the removable recording medium 921 may be, for example, an IC card (Integrated Circuit card) on which a non-contact IC chip is mounted, an electronic device, or the like.
  • connection port 917 is a port for directly connecting a device to the arithmetic processing device 200.
  • Examples of the connection port 917 include a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, an RS-232C port, and an HDMI (registered trademark) (High-Definition Multimedia Interface) port.
  • the communication device 919 is a communication interface configured by a communication device for connecting to the communication network 925, for example.
  • the communication device 919 is, for example, a communication card for a wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 919 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
  • the communication device 919 can transmit and receive signals and the like according to a predetermined protocol such as TCP / IP, for example, with the Internet and other communication devices.
  • The communication network 925 connected to the communication device 919 is configured by a network connected by wire or wirelessly, and may be, for example, the Internet, a home LAN, an in-house LAN, infrared communication, radio wave communication, or satellite communication.
  • each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Therefore, it is possible to change the hardware configuration to be used as appropriate according to the technical level at the time of carrying out this embodiment.
  • the surface texture inspection apparatus and the surface texture inspection method according to the present invention will be specifically described with reference to examples.
  • The examples shown below are merely examples of the surface property inspection apparatus and the surface property inspection method according to the present invention, and the surface property inspection apparatus and the surface property inspection method according to the present invention are not limited to the following examples.
  • the surface property inspection device 10 having the inspection object imaging device 100 having the configuration shown in FIG. 2B was used to inspect the surface property of the inspection object.
  • A disk-shaped processed product, manufactured by grinding a material obtained by hot forming a billet, which is an intermediate material mainly composed of Fe, was used as the inspection object 1.
  • the ground surface (the surface spreading in the radial direction of the disk) was the inspection target surface.
  • the diameter of such a disk-shaped processed product is 860 mm.
  • A red LD module that emits red laser light was used as the illumination device of the inspected object imaging apparatus 100, and the point laser beam (output: 100 mW) emitted from the module was made incident on a cylindrical lens to obtain a linear laser beam L.
  • the linear laser beam L is controlled so as to have a length corresponding to the radius (430 mm) of the disk-shaped workpiece (length in the x-axis direction in FIG. 2B), and is rotating. Irradiated to the ground surface of the workpiece.
  • The line width of the linear laser beam L irradiated on the surface of the inspection object 1 was set to 0.25 mm (250 μm).
  • a commercially available camera in which a CCD (pixel size: 5.5 ⁇ m ⁇ 5.5 ⁇ m) of 2048 pixels ⁇ 2048 pixels was mounted as an image sensor was used.
  • the frame rate of such an image sensor is 200 fps.
  • the focal length of the lens mounted on the camera is 24 mm and the field angle is 26 °.
  • The pixel size of the captured image is 0.25 mm × 0.25 mm, and the line width of the linear laser beam L is captured with a width of 2 to 4 bright-line pixels on the captured image.
  • Images were continuously captured by the two area cameras in synchronization with the signal output from a PLG. Specifically, an image was captured every 5 msec while the disk-shaped processed product was rotating.
  • The illumination device 101 was provided above in the vertical direction and irradiated the linear laser beam L vertically downward, and the installation angle of the two area cameras was ±45 degrees.
  • the image capture pitch was set to 0.25 mm to match the pixel size of the image sensor.
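As a consistency check on these acquisition settings (an illustrative back-calculation, not stated explicitly in the text), one frame every 5 msec at a capture pitch of 0.25 mm implies a surface speed of 50 mm/s under the laser line:

```python
# Illustrative back-calculation relating the 5 ms frame interval and the
# 0.25 mm capture pitch of this example.
frame_interval_s = 0.005     # one light section image every 5 ms (200 fps)
capture_pitch_mm = 0.25      # one light section line every 0.25 mm of surface

surface_speed_mm_s = capture_pitch_mm / frame_interval_s  # speed at the laser line
frames_per_second = 1.0 / frame_interval_s
```

Matching the capture pitch to the 0.25 mm pixel size, as described above, gives the resulting fringe image frame square pixels in both directions.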
  • the light cut image obtained by the inspection object imaging device 100 as described above was subjected to image processing by the arithmetic processing device 200 having the configuration as described above, and the surface property of the ground surface of the disk-like processed product was inspected.
  • the line width threshold Th2 is set to 5 pixels, and the luminance value is assigned using the 8-bit full scale as mentioned above.
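A minimal sketch of how such a line-width-to-luminance assignment might look, using the example's Th2 = 5 pixels and an 8-bit full scale. Since the exact scaling is not stated, the sketch assumes widths at or below Th2 map to luminance 0 and the excess width is scaled by an assumed gain:

```python
import numpy as np

# Sketch of the line-width-to-luminance assignment for the line width image.
# Th2 = 5 px is taken from the example; GAIN is an assumed scaling factor.

TH2 = 5      # threshold line width in pixels (from the example)
GAIN = 32    # assumed luminance step per pixel of excess width

def line_width_to_luminance(line_widths):
    widths = np.asarray(line_widths, dtype=int)
    diff = np.clip(widths - TH2, 0, None)  # only widths above Th2 stand out
    return np.clip(diff * GAIN, 0, 255).astype(np.uint8)  # 8-bit full scale
```

Under this mapping, normal portions with narrow light section lines stay dark while broadened lines over rough surfaces saturate toward 255, which is what makes background remaining portions and rust stand out in the line width image.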
  • The light section image of a portion corresponding to a normal part, the light section image of a portion corresponding to a recess, and the light section image of a portion corresponding to a background remaining part, obtained in this manner, are shown together in FIG.
  • In the light section image corresponding to the normal part, the light section line is imaged as a straight line segment with a substantially constant line width, and in the light section image corresponding to the recess, it can be seen that the light section line is bent at the position corresponding to the recess without the line width substantially changing.
  • In the light section image corresponding to the background remaining part, the line width of the light section line was enlarged, as is clear when compared with the light section image corresponding to the normal part.
  • area camera 1 is an area camera provided on the downstream side in the rotational direction of the disk-shaped workpiece
  • area camera 2 is provided on the upstream side in the rotational direction of the disk-shaped workpiece.
  • It is clear that the line width image based on the light section images from area camera 1 is clearer than the line width image based on the light section images from area camera 2. This result suggests that the background portion of the disk-shaped processed product used as the inspection object 1 has surface directionality. Even in such a case, by imaging the linear laser beam L from a plurality of directions, it is possible to reveal background remaining portions more reliably, and it was confirmed that the risk of overlooking background remaining portions and rust can be avoided.
  • DESCRIPTION OF SYMBOLS
  10 Surface property inspection apparatus
  100 Inspected object imaging device
  101 Illumination device
  103, 105 Area camera
  200 Arithmetic processing device
  201 Imaging control unit
  203 Image processing unit
  205 Display control unit
  207 Storage unit
  211 Data acquisition unit
  213 Stripe image frame generation unit
  215 Image calculation unit
  217 Light section line processing unit
  219 Depth image calculation unit
  221 Luminance image calculation unit
  223 Line width image calculation unit
  225 Detection processing unit


Abstract

[Problem] To inspect the surface shape of a machined product obtained through machining of a material after hot forming of a billet or other intermediate material while making any remaining original surface portions or rust more evident. [Solution] This surface property inspection device inspects a machined product as an inspection object and comprises: an illumination device for illuminating linear laser light onto the surface of a moving inspection object; an imaging device for generating light-section images by photographing the surface irradiated with the linear laser light; an image calculation unit for using a striped image frame obtained from the light-section images to calculate a depth image, luminance image, and line-width image; and a detection processing unit for detecting surface characteristics of the inspection object on the basis of the calculated depth image, luminance image, and line-width image. The image calculation unit calculates, for each light-section line in the striped image frame, the differences between the line widths at each position in the extension direction of the light-section line and a prescribed threshold line width and calculates the line-width image by assigning brightness values according to the sizes of the differences. The detection processing unit detects a remaining original surface portion or rust on the basis of the line-width image.

Description

Surface property inspection device, surface property inspection method, and program
 The present invention relates to a surface property inspection device, a surface property inspection method, and a program.
 Machined steel products, manufactured by subjecting intermediate materials typified by slabs and billets to hot forming and the like followed by various kinds of machining, are inspected at the time of shipment to determine whether they have appropriate surface properties (surface inspection). Such surface inspection of machined steel is often performed as a hammering test or a visual inspection. These hammering and visual inspection methods depend largely on the skill of the inspector. Visual inspection in particular involves a plurality of inspection items, such as magnetic particle inspection and appearance inspection, which makes the work complicated. In such machined steel, various changes in surface properties can occur: minute scratches caused by machining are generated on the surface, or the surface of the material after hot forming of the billet or other intermediate material of the base metal remains on the surface of the machined steel. Accuracy is required in the inspection of surface properties so that machined steel having such minute surface scratches, or remaining surface (background) of the hot-formed intermediate material, does not leave the factory as an overlooked harmful flaw.
 Examples of the machined steel described above include a disk-shaped processed product manufactured by shaping, through machine cutting, a material obtained by hot forming a billet or other intermediate material, and a cylindrical processed product manufactured by trimming, through machine cutting or the like, the outer diameter of a material obtained by hot forming a billet or other intermediate material. As inspection methods for such machined steel, Patent Document 1 below proposes, for example, an inspection apparatus based on the so-called light section method, in which the surface of the machined steel is irradiated with a line laser and the entire surface is imaged, and an optical inspection in which the surface state is monitored through changes in the received intensity of light projected onto the surface.
 However, harmful flaws that can exist on the surface of machined steel, such as background remaining portions where the surface of the material after hot forming of a billet or other intermediate material remains, and rust, have unevenness as small as 100 μm or less, and therefore cannot be detected by the light section method proposed in Patent Document 1 below.
 Therefore, as a method of detecting minute unevenness, Patent Document 2 below proposes a technique in which the surface of a machined product is irradiated with slit laser light (linear light) and attention is paid to the change in the width of the irradiation light caused by focusing of the light at concave portions and scattering of the light at convex portions. This technique uses the characteristic that the width of the irradiation light is small if the surface of the machined product is smooth and becomes large if a concave portion is present.
Patent Document 1: Japanese Patent Laid-Open No. 2003-240521
Patent Document 2: Japanese Patent Laid-Open No. 2002-346925
 However, the inspection technique proposed in Patent Document 2 inspects only a specific part of the surface of a machined product and, because it uses the change in the line width of the irradiation light at only a part of the machined product, has the problem that the visibility of the position of a defective part is poor. In addition, since the technique proposed in Patent Document 2 does not produce a two-dimensional image of portions with large surface roughness (unevenness), it cannot extract such portions by various kinds of image processing or specify their size. Furthermore, since the change in the line width of the linear laser light is slight, a projection image of the irradiation light alone gives low visibility.
 As described above, when inspecting the surface properties of a machined product, it is currently impossible to clearly reveal portions where the background of the material after hot forming of a billet or other intermediate material remains, or rust, regardless of whether the unevenness information obtained by the light section method or the luminance change information of the illumination reflected light is used. Although it is possible to image the state of the surface of the machined product based on the luminance information of the illumination reflected light, the image is affected by surrounding noise components (for example, machine-cut texture and dust), and it is extremely difficult to separate defective parts from normal parts.
 The present invention has been made in view of the above problems, and an object of the present invention is to provide a surface property inspection apparatus, a surface property inspection method, and a program capable of inspecting the surface shape of a machined product obtained by machining an intermediate material while revealing the background remaining portions and rust of the intermediate material.
 In order to solve the above problems, the present inventors conducted intensive studies. As a result, they found that when a machined product obtained by machining an intermediate material is used as an object to be inspected and the object is inspected using linear laser light, it is possible to inspect the surface shape of the machined product while revealing the background remaining portions and rust of the intermediate material, by calculating three types of images: a depth image representing the uneven state of the surface of the object to be inspected, a luminance image representing the luminance distribution of the linear laser light on the surface of the object to be inspected, and a line width image representing the distribution of the line width of the linear laser light on the surface of the object to be inspected.
 The gist of the present invention completed based on these findings is as follows.
(1) A surface property inspection apparatus whose object to be inspected is a machined product made from an intermediate material, the apparatus comprising: an illumination device that irradiates the surface of the moving object to be inspected with a linear laser beam; an imaging device that images the surface irradiated with the linear laser beam, thereby generating a plurality of light-section images, each being a captured image of the linear laser beam on the surface, along the moving direction of the object to be inspected; an image calculation unit that calculates, based on a fringe image frame in which light-section lines, each being the line segment corresponding to the irradiated portion of the linear laser beam in one of the plurality of light-section images, are arranged in order along the moving direction, a depth image representing the uneven state of the surface of the object to be inspected, a luminance image representing the luminance distribution of the linear laser beam on the surface of the object to be inspected, and a line width image in which the distribution of the line width, in the moving direction, of the linear laser beam on the surface of the object to be inspected is associated with luminance values; and a detection processing unit that detects the surface property of the object to be inspected based on the calculated depth image, luminance image, and line width image, wherein the image calculation unit calculates, for each light-section line in the fringe image frame, the difference between the line width at each position in the extending direction of that light-section line and a predetermined threshold line width, and calculates the line width image by assigning luminance values according to the magnitudes of the calculated differences, and the detection processing unit detects, based on the line width image, whether a background remaining portion of the intermediate material or rust is present on the surface of the object to be inspected.
(2) The surface property inspection apparatus according to (1), wherein the detection processing unit detects the background remaining portion and the rust based on whether the luminance value of the line width image is equal to or greater than a first threshold value for detecting the background remaining portion and rust.
(3) The surface property inspection apparatus according to (1) or (2), wherein the image calculation unit calculates the barycentric position of the light-section line in the line width direction, which runs along the moving direction of the object to be inspected, and calculates the depth image based on the amount of displacement between the barycentric position and a reference position, which is a position in the moving direction designated in advance for the light-section image.
(4) The surface property inspection apparatus according to any one of (1) to (3), wherein the detection processing unit identifies defective portions based on whether the luminance values of the depth image and the luminance image are equal to or greater than a second threshold value for identifying defective portions, extracts, for each identified defective portion, feature quantity information relating to the form and luminance values of that defective portion, and discriminates, based on the extracted feature quantity information, the defects present on the surface of the object to be inspected.
(5) The surface property inspection apparatus according to (4), wherein the detection processing unit performs the detection of defective portions in the depth image and the luminance image only at portions other than those where the background remaining portion or the rust has been detected based on the line width image.
(6) The surface property inspection apparatus according to any one of (1) to (5), comprising, as the imaging device, at least two imaging devices that image the surface irradiated with the linear laser beam from the upstream side of the moving direction and from the downstream side of the moving direction, respectively.
(7) A surface property inspection method whose object to be inspected is a machined product made from an intermediate material, the method including: a light-section image generation step of irradiating the surface of the moving object to be inspected with a linear laser beam from an illumination device and imaging, with an imaging device, the surface irradiated with the linear laser beam, thereby generating a plurality of light-section images, each being a captured image of the linear laser beam on the surface, along the moving direction of the object to be inspected; an image calculation step of calculating, based on a fringe image frame in which light-section lines, each being the line segment corresponding to the irradiated portion of the linear laser beam in one of the plurality of light-section images, are arranged in order along the moving direction, a depth image representing the uneven state of the surface of the object to be inspected, a luminance image representing the luminance distribution of the linear laser beam on the surface of the object to be inspected, and a line width image in which the distribution of the line width, in the moving direction, of the linear laser beam on the surface of the object to be inspected is associated with luminance values; and a detection processing step of detecting the surface property of the object to be inspected based on the calculated depth image, luminance image, and line width image, wherein the image calculation step calculates, for each light-section line in the fringe image frame, the difference between the line width at each position in the extending direction of that light-section line and a predetermined threshold line width, and calculates the line width image by assigning luminance values according to the magnitudes of the calculated differences, and the detection processing step detects, based on the line width image, whether a background remaining portion of the intermediate material or rust is present on the surface of the object to be inspected.
(8) The surface property inspection method according to (7), wherein the detection processing step detects the background remaining portion and the rust based on whether the luminance value of the line width image is equal to or greater than a first threshold value for detecting the background remaining portion and rust.
(9) The surface property inspection method according to (7) or (8), wherein the image calculation step calculates the barycentric position of the light-section line in the line width direction, which runs along the moving direction of the object to be inspected, and calculates the depth image based on the amount of displacement between the barycentric position and a reference position, which is a position in the moving direction designated in advance for the light-section image.
(10) The surface property inspection method according to any one of (7) to (9), wherein the detection processing step identifies defective portions based on whether the luminance values of the depth image and the luminance image are equal to or greater than a second threshold value for identifying defective portions, extracts, for each identified defective portion, feature quantity information relating to the form and luminance values of that defective portion, and discriminates, based on the extracted feature quantity information, the defects present on the surface of the object to be inspected.
(11) The surface property inspection method according to (10), wherein the detection processing step performs the detection of defective portions in the depth image and the luminance image only at portions other than those where the background remaining portion or the rust has been detected based on the line width image.
(12) The surface property inspection method according to any one of (7) to (11), wherein at least two imaging devices are used as the imaging device, the at least two imaging devices imaging the surface irradiated with the linear laser beam from the upstream side of the moving direction and from the downstream side of the moving direction, respectively.
(13) A program for a computer capable of communicating with each of an illumination device that irradiates the surface of a moving object to be inspected with a linear laser beam, the object to be inspected being a machined product made from an intermediate material, and an imaging device that images the surface irradiated with the linear laser beam and thereby generates a plurality of light-section images, each being a captured image of the linear laser beam on the surface, along the moving direction of the object to be inspected, the program causing the computer to realize: an image calculation function of calculating, based on a fringe image frame in which light-section lines, each being the line segment corresponding to the irradiated portion of the linear laser beam in one of the generated plurality of light-section images, are arranged in order along the moving direction, a depth image representing the uneven state of the surface of the object to be inspected, a luminance image representing the luminance distribution of the linear laser beam on the surface of the object to be inspected, and a line width image in which the distribution of the line width, in the moving direction, of the linear laser beam on the surface of the object to be inspected is associated with luminance values; and a detection processing function of detecting the surface property of the object to be inspected based on the calculated depth image, luminance image, and line width image, wherein the image calculation function calculates, for each light-section line in the fringe image frame, the difference between the line width at each position in the extending direction of that light-section line and a predetermined threshold line width, and calculates the line width image by assigning luminance values according to the magnitudes of the calculated differences, and the detection processing function detects, based on the line width image, whether a background remaining portion of the intermediate material or rust is present on the surface of the object to be inspected.
(14) The program according to (13), wherein the detection processing function detects the background remaining portion and the rust based on whether the luminance value of the line width image is equal to or greater than a first threshold value for detecting the background remaining portion and rust.
(15) The program according to (13) or (14), wherein the image calculation function calculates the barycentric position of the light-section line in the line width direction, which runs along the moving direction of the object to be inspected, and calculates the depth image based on the amount of displacement between the barycentric position and a reference position, which is a position in the moving direction designated in advance for the light-section image.
(16) The program according to any one of (13) to (15), wherein the detection processing function identifies defective portions based on whether the luminance values of the depth image and the luminance image are equal to or greater than a second threshold value for identifying defective portions, extracts, for each identified defective portion, feature quantity information relating to the form and luminance values of that defective portion, and discriminates, based on the extracted feature quantity information, the defects present on the surface of the object to be inspected.
(17) The program according to (16), wherein the detection processing function performs the detection of defective portions in the depth image and the luminance image only at portions other than those where the background remaining portion or the rust has been detected based on the line width image.
(18) The program according to any one of (13) to (17), wherein at least two imaging devices are used as the imaging device, the at least two imaging devices imaging the surface irradiated with the linear laser beam from the upstream side of the moving direction and from the downstream side of the moving direction, respectively.
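As a rough sketch of the image calculations recited above (per-column line width compared against a threshold width and mapped to luminance values, and depth obtained from the displacement of the line-width-direction centroid), the following Python fragment processes one light-section image into one row each of the depth image and the line width image. The array layout, gains, and threshold value are illustrative assumptions, not values taken from the claims.

```python
import numpy as np

def process_light_section_image(img, reference_row, threshold_width=3.0,
                                width_gain=64.0, depth_gain=32.0):
    """Process one light-section image (rows = moving direction y,
    columns = x along the line) into one row of the depth image and one
    row of the line width image.

    Illustrative sketch only: the real apparatus calibrates its own
    reference position, threshold line width, and pixel-to-luminance
    gains; here they are plain function arguments.
    """
    h, w = img.shape
    depth_row = np.zeros(w)
    width_row = np.zeros(w)
    rows = np.arange(h)
    for x in range(w):
        col = img[:, x].astype(float)
        mask = col > 0  # pixels belonging to the bright light-section line
        total = col[mask].sum()
        if total == 0:
            continue  # no laser line detected in this column
        # Centroid (barycentric position) of the line in the y direction
        centroid = (rows[mask] * col[mask]).sum() / total
        # Depth value from the displacement between centroid and reference
        depth_row[x] = depth_gain * (centroid - reference_row)
        # Line width = number of bright pixels; one possible mapping
        # assigns luminance in proportion to the excess over the threshold
        line_width = mask.sum()
        width_row[x] = width_gain * max(line_width - threshold_width, 0.0)
    return depth_row, width_row
```

Stacking the rows returned for successive light-section images in acquisition order then yields the depth image and the line width image of the fringe image frame.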
As described above, according to the present invention, for a machined product obtained by machining a material after hot forming an intermediate material such as a billet, it is possible to inspect the surface shape of the machined product while revealing background remaining portions, in which the surface of the material after hot forming the intermediate material such as a billet remains, and rust.
A block diagram schematically showing the configuration of a surface property inspection apparatus according to an embodiment of the present invention.
An explanatory diagram schematically showing an example of the configuration of the inspection object imaging device according to the embodiment.
An explanatory diagram schematically showing an example of the configuration of the inspection object imaging device according to the embodiment.
An explanatory diagram for explaining the state of reflection on a normal surface of an object to be inspected.
An explanatory diagram for explaining the state of reflection at a rough surface portion of an object to be inspected.
An explanatory diagram schematically showing a light-section image obtained when a normal surface of an object to be inspected is imaged.
An explanatory diagram schematically showing a light-section image obtained when a surface of an object to be inspected including a recessed portion is imaged.
An explanatory diagram schematically showing a light-section image obtained when a surface of an object to be inspected including a rough surface portion is imaged.
A block diagram showing an example of the configuration of the image processing unit included in the arithmetic processing device according to the embodiment.
An explanatory diagram showing an example of a fringe image frame according to the embodiment.
An explanatory diagram for explaining the light-section processing according to the embodiment.
An explanatory diagram for explaining the light-section processing according to the embodiment.
An explanatory diagram showing a two-dimensional array of light-section line displacements according to the embodiment.
An explanatory diagram showing a two-dimensional array of luminance sums according to the embodiment.
An explanatory diagram showing a two-dimensional array of bright-line pixel counts according to the embodiment.
An explanatory diagram showing the relationship between the displacement of a light-section line and the height of a defect.
An explanatory diagram for explaining the approximate correction processing of a light-section line according to the embodiment.
An explanatory diagram for explaining the line width of the light-section line at a normal portion and at a rough surface portion.
An explanatory diagram for explaining the line width of the light-section line at a normal portion.
An explanatory diagram for explaining the line width of the light-section line at a rough surface portion.
An explanatory diagram for explaining a line width image according to the embodiment.
An explanatory diagram schematically showing an example of a logic table used in the surface property detection processing according to the embodiment.
A flowchart showing an example of the flow of a surface property inspection method according to the embodiment.
A block diagram showing an example of the hardware configuration of the arithmetic processing device according to the embodiment.
An explanatory diagram for explaining a working example.
An explanatory diagram for explaining a working example.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description thereof is omitted.
(Embodiment)
<Overall configuration of the surface property inspection apparatus>
First, the overall configuration of the surface property inspection apparatus 10 according to the embodiment of the present invention will be described with reference to FIG. 1. FIG. 1 is an explanatory diagram showing an example of the configuration of the surface property inspection apparatus 10 according to the present embodiment.
The surface property inspection apparatus 10 according to the present embodiment is an apparatus that inspects the surface property of an object to be inspected 1, which is a machined product obtained by machining a material after hot forming an intermediate material such as a billet (hereinafter simply referred to as a "machined product made from an intermediate material") and which is moving in a predetermined direction.
Here, the intermediate material is a material before it becomes a final product, such as a slab or a billet. After such an intermediate material is hot formed, machining such as grinding or cutting is applied, whereby the machined product of interest in the present embodiment is manufactured. Examples of such machined products include a disk-shaped product manufactured by shaping, by machine cutting, a material obtained by hot forming a billet or the like, which is a kind of intermediate material mainly composed of iron (Fe), and a cylindrical product manufactured by adjusting, by machine cutting or the like, the outer diameter of a material obtained by hot forming a billet or the like. The intermediate material is not limited to one mainly composed of iron, and may be one mainly composed of a non-ferrous metal.
As shown in FIG. 1, the surface property inspection apparatus 10 according to the present embodiment mainly includes an inspection object imaging device 100 that images the surface of the object to be inspected 1 (in particular, the machined surface of the machined product serving as the object to be inspected 1), and an arithmetic processing device 200 that performs image processing on the images obtained as a result of the imaging.
The inspection object imaging device 100 is installed above the moving object to be inspected 1, as described in detail below. The inspection object imaging device 100 sequentially images the surface of the object to be inspected 1 moving in a predetermined direction, and outputs the captured images obtained as a result of the imaging to the arithmetic processing device 200. The imaging processing by the inspection object imaging device 100 is controlled by the arithmetic processing device 200. In general, a transport line that transports the machined product serving as the object to be inspected 1 is provided with, for example, a PLG (Pulse Logic Generator: pulse-type speed detector) for detecting the moving speed of the object to be inspected 1. The arithmetic processing device 200 therefore periodically transmits a control signal to the inspection object imaging device 100 on the basis of the PLG signal input from the PLG, one pulse at a time, and causes the inspection object imaging device 100 to operate on the basis of that control signal. As a result, the inspection object imaging device 100 can image the surface of the object to be inspected 1 every time the object to be inspected 1 moves by a predetermined distance or every predetermined time.
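The pulse-synchronized triggering described above can be sketched as a small control loop. The `trigger_capture` hook and the idea of counting a fixed number of PLG pulses per capture are hypothetical placeholders for whatever camera and speed-detector interfaces the actual line provides.

```python
class AcquisitionController:
    """Sketch of the control logic: every N PLG pulses, i.e. every fixed
    travel distance of the object to be inspected, send one capture
    trigger to the camera. The pulse source and camera are injected as
    callables so the logic can be exercised without real hardware."""

    def __init__(self, pulses_per_capture, trigger_capture):
        self.pulses_per_capture = pulses_per_capture
        self.trigger_capture = trigger_capture  # hypothetical camera hook
        self._count = 0

    def on_plg_pulse(self):
        """Called once per PLG pulse from the transport line."""
        self._count += 1
        if self._count >= self.pulses_per_capture:
            self._count = 0
            self.trigger_capture()
```

Triggering on a pulse count rather than on a timer keeps the spacing of the light-section images constant in distance even when the transport speed varies.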
As described above, the arithmetic processing device 200 controls the overall imaging processing of the surface of the object to be inspected 1 by the inspection object imaging device 100. In addition, the arithmetic processing device 200 generates a fringe image frame using the captured images generated by the inspection object imaging device 100, and performs image processing on this fringe image frame, thereby detecting various defects that may be present on the surface of the object to be inspected 1. Examples of such defects include flaws accompanied by changes in unevenness, pattern-type flaws, background remaining portions in which the surface of the material after hot forming the intermediate material remains (hereinafter simply referred to as "background remaining portions of the intermediate material"), and rust.
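As a rough illustration of how the three calculated images could drive the detection just described, the sketch below thresholds the line width image to find background remaining portions and rust, and evaluates defect candidates from the depth and luminance images only outside those regions. Both threshold values are arbitrary placeholders, not values from the embodiment.

```python
import numpy as np

def detect(depth_img, lum_img, width_img,
           first_thresh=50.0, second_thresh=128.0):
    """Return (background_or_rust_mask, defect_mask) for the three images.

    first_thresh stands in for the 'first threshold' applied to the line
    width image, second_thresh for the 'second threshold' applied to the
    depth and luminance images; both are placeholder values.
    """
    # Background remaining portions and rust broaden the laser line, so
    # they appear as high luminance values in the line width image.
    bg_rust = width_img >= first_thresh
    # Defect candidates from the depth and luminance images...
    candidates = (depth_img >= second_thresh) | (lum_img >= second_thresh)
    # ...kept only outside regions already labelled background/rust.
    defects = candidates & ~bg_rust
    return bg_rust, defects
```

Masking out the background/rust regions first keeps their strong responses in the depth and luminance images from being misjudged as defects.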
The inspection object imaging device 100 and the arithmetic processing device 200 will each be described in detail below.
<Inspection Object Imaging Device 100>
Next, an example of the configuration of the inspection object imaging device 100 according to the present embodiment will be described in detail with reference to FIGS. 2A to 4C.
FIGS. 2A and 2B are explanatory diagrams schematically showing an example of the configuration of the inspection object imaging device according to the present embodiment. FIG. 3A is an explanatory diagram for explaining the state of reflection on a normal surface of the object to be inspected, and FIG. 3B is an explanatory diagram for explaining the state of reflection at a rough surface portion of the object to be inspected. FIG. 4A is an explanatory diagram schematically showing a light-section image obtained when a normal surface of the object to be inspected is imaged. FIG. 4B is an explanatory diagram schematically showing a light-section image obtained when a surface of the object to be inspected including a recessed portion is imaged. FIG. 4C is an explanatory diagram schematically showing a light-section image obtained when a surface of the object to be inspected including a rough surface portion is imaged.
The upper part of FIG. 2A is a schematic diagram of the inspection object imaging device 100 as viewed from above the object to be inspected 1, and the lower part of FIG. 2A is a schematic diagram of the inspection object imaging device 100 as viewed from the side of the object to be inspected 1. Similarly, the upper part of FIG. 2B is a schematic diagram of the inspection object imaging device 100 as viewed from above the object to be inspected 1, and the lower part of FIG. 2B is a schematic diagram of the inspection object imaging device 100 as viewed from the side of the object to be inspected 1.
In the following description, for convenience, the moving direction of the machined product serving as the object to be inspected 1 is taken as the positive y-axis direction, the direction orthogonal to the moving direction as the positive x-axis direction, and the height direction of the machined product as the positive z-axis direction.
As schematically shown in FIG. 2A, the inspection object imaging device 100 according to the present embodiment includes an illumination device 101 and an area camera 103, which is an example of the imaging device. The illumination device 101 and the area camera 103 are fixed by known means (not shown) so that their installation positions do not change.
The illumination device 101 is a device that illuminates the surface of the object to be inspected 1 by irradiating the surface of the machined product serving as the object to be inspected 1 with predetermined light. The illumination device 101 has at least a laser light source that irradiates the surface of the object to be inspected 1 with a linear laser beam L.
The illumination device 101 is composed of, for example, a light source unit that emits laser light of a predetermined wavelength, such as one in the visible band, and a lens (for example, a cylindrical lens, a rod lens, or a Powell lens) that spreads the laser light emitted from the light source unit in the x-axis direction while condensing it in the line width direction to form linear light. By changing the degree of focusing of this lens in the line width direction, the thickness of the linear laser beam L along the y-axis direction at the laser irradiation position (that is, the line width of the linear laser beam L) can be adjusted. The line width of the linear laser beam L immediately before it reaches the surface of the object to be inspected 1 can be, for example, on the order of several hundred μm (for example, about 200 μm).
In the portion of the surface of the object to be inspected 1 irradiated with the linear laser beam L, a linear bright region is formed along the x-axis direction. The line segment corresponding to this linear bright region is called a light-section line.
The area camera 103, which is an example of the imaging device, is a device that images the entire surface of the object to be inspected 1 irradiated with the linear laser beam L. The area camera 103 has a lens with a predetermined maximum aperture and focal length, and a sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor that functions as the image sensor. The area camera 103 may be a monochrome camera or a color camera.
The focal length and angle of view of the lens mounted on the area camera 103, and the distance between the illumination device 101 and the image sensor of the area camera 103, are not particularly limited, but are preferably chosen so that the entire x-direction extent of the surface of the object to be inspected 1 falls within the field of view VF. The size and pixel size of the image sensor mounted on the area camera 103 are also not particularly limited, but in view of the image quality and resolution of the generated images, it is preferable to use a larger image sensor. From the viewpoint of the image processing described below, the line width of the linear laser beam L is preferably adjusted to be about 2 to 4 pixels on the image sensor.
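The 2-to-4-pixel target couples the physical line width to the imaging optics. The following sketch estimates the imaged width with a thin-lens magnification approximation; the 35 mm focal length, 500 mm object distance, and 5.5 μm pixel pitch used in the example are illustrative assumptions, not values from the embodiment.

```python
def line_width_in_pixels(line_width_mm, focal_mm, object_distance_mm,
                         pixel_pitch_um):
    """Estimate how many sensor pixels the laser line spans.

    Thin-lens approximation: lateral magnification m = f / (d - f) for an
    object at distance d from the lens, so the imaged line width is
    line_width * m, expressed here in pixel-pitch units.
    """
    m = focal_mm / (object_distance_mm - focal_mm)
    width_on_sensor_um = line_width_mm * 1000.0 * m
    return width_on_sensor_um / pixel_pitch_um
```

With the assumed 200 μm line, 35 mm lens, 500 mm working distance, and 5.5 μm pixels, the line spans roughly 2.7 pixels, inside the preferred 2-to-4-pixel range.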
The area camera 103 images the linear laser beam L irradiated onto the surface of the object to be inspected 1, thereby generating a so-called light-section image in which the light-section line, that is, the line segment corresponding to the irradiated portion of the linear laser beam L, is captured. When the area camera 103 generates such a light-section image, it outputs the generated light-section image to the arithmetic processing device 200.
Here, the optical positional relationship between the illumination device 101 and the area camera 103 can be set as appropriate. For example, as shown in FIG. 2A, the illumination device 101 and the area camera 103 can be arranged so that the illumination device 101, provided vertically above the inspection object 1, irradiates the inspection object 1 with the linear laser light L perpendicularly, while the area camera 103 images the reflected light of the linear laser light L from a direction at an angle φ with respect to the vertical direction (z-axis direction).
The angle φ shown in FIG. 2A is preferably made as large as possible within the range permitted by the installation constraints of the area camera 103. This makes it possible for the area camera 103 to capture the irregular (diffuse) reflection of the light section line. The angle φ shown in FIG. 2A is preferably about 30 to 60 degrees, for example.
Although FIG. 2A illustrates the case where the inspection object imaging apparatus 100 includes only one area camera as an example of the imaging device, the inspection object imaging apparatus 100 may include at least two area cameras 103 and 105, as shown in FIG. 2B, so that the surface of the inspection object 1 irradiated with the linear laser light L can be imaged from both the upstream side and the downstream side in the movement direction. In this case, as shown in FIG. 2B, the area cameras 103 and 105 are preferably arranged symmetrically at the angle φ on the upstream and downstream sides in the movement direction of the inspection object 1.
As will be described below with specific examples, the surface inclination of a base-surface remaining portion, an uneven flaw, or the like may have directionality along the movement direction. By providing an area camera on each of the upstream and downstream sides in the movement direction as shown in FIG. 2B and using the light-section images output from both cameras, the surface properties of the inspection object 1 can be inspected more accurately, regardless of the directionality of such inclination.
Specific configurations, set values, and the like of the devices included in the inspection object imaging apparatus 100 according to the present embodiment are listed below. These configurations and set values are merely examples, and the inspection object imaging apparatus 100 according to the present invention is not limited to the following specific examples.
○ Inspection object (machined product)
 Width (length in the x-axis direction): about 600 mm to 1750 mm
○ Illumination device 101
 Red laser light is emitted from a laser light source at an output of 100 mW. The line width of the linear laser light L irradiating the surface of the inspection object 1 is 0.25 mm (250 μm). Here, the line width is defined as the width at 13.5% of the peak intensity value.
○ Area camera
 A CCD of 2048 pixels × 2048 pixels (pixel size: 5.5 μm × 5.5 μm) is mounted as the image sensor, and the frame rate is 200 fps. The focal length of the lens is 24 mm, and the angle of view is 26°. The pixel size of the captured image is 0.25 mm × 0.25 mm, and the line width of the linear laser light L appears on the captured image as a bright line 2 to 4 pixels wide. When the inspection object is wide, a plurality of area cameras may be arranged side by side in the width direction as necessary, for example to secure pixel resolution.
○ Image acquisition method
 While the inspection object 1 is moving, images are acquired continuously in synchronization with the signal output from the PLG. Specifically, an image is captured every 5 msec of advance of the inspection object 1. The two area cameras installed as shown in FIG. 2B are synchronized so as to image the same field of view simultaneously. The illumination device 101 is provided vertically above the inspection object and irradiates the linear laser light L vertically downward. The installation angle φ of the two area cameras is ±45 degrees. The imaging resolution is determined according to the size of the flaws or the like to be detected; here, the capture pitch is set to 0.25 mm so as to match the 0.25 mm pixel size of the captured image.
Next, the light-section images captured by the inspection object imaging apparatus 100 will be described in detail with reference to FIGS. 3A to 4C.
Because the machined product considered as the inspection object 1 in the present embodiment has undergone machining such as polishing or grinding, its surface has nearly uniform surface properties and is strongly specular. Accordingly, the surface of a normal portion having no uneven flaws, base-surface remaining portions, rust, or the like (hereinafter also simply referred to as a "normal portion") exhibits only extremely small irregularities of 10 μm or less. As a result, the linear laser light L irradiating a normal portion exhibits nearly identical reflection characteristics, as schematically shown in FIG. 3A, and is imaged by the area camera. In this case, in the light-section image captured by the area camera, the line width of the light section line takes a nearly constant value, as schematically shown in FIG. 4A. When a concave or convex portion exists on the surface of the machined product, the light section line bends while maintaining a nearly constant line width, as schematically shown for a concave portion in FIG. 4B.
On the other hand, the base-surface remaining portions and rust considered in the present embodiment have irregularities of about 100 μm and are rougher than a normal portion. As schematically shown in FIG. 3B, a base-surface remaining portion or rust (hereinafter collectively also referred to as a "rough surface portion") functions as a diffusing surface because of its surface roughness. The linear laser light L irradiating a rough surface portion therefore undergoes irregular reflection. Because of this irregular reflection, in the light section line of the light-section image captured by the area camera, the line width of the portion corresponding to the rough surface portion is enlarged compared with a normal portion, as schematically shown in FIG. 4C.
In the arithmetic processing device 200 described in detail below, base-surface remaining portions and rust are made apparent on the basis of the line width of the light section line in the light-section image, as schematically shown in FIGS. 4A to 4C. The arithmetic processing device 200 also generates information on the surface shape of the inspection object 1 by the so-called light-section method using the light-section images, and detects uneven flaws and the like existing on the surface of the inspection object 1.
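The line-width-based detection principle described above can be sketched as a simple thresholding of per-position line widths. The width values and the threshold below are illustrative, not values prescribed by the embodiment:

```python
import numpy as np

# Hypothetical bright-line widths (in pixels) measured along one light
# section line. A normal, specular portion yields roughly 2-4 pixels;
# irregular reflection at a rough surface portion broadens the line.
line_width_px = np.array([2, 3, 2, 3, 7, 8, 7, 3, 2, 2])

ROUGH_WIDTH_TH = 5  # illustrative threshold separating rough from normal
rough_positions = np.flatnonzero(line_width_px > ROUGH_WIDTH_TH)
print(rough_positions)  # indices flagged as candidate rough surface portions
```

Positions 4 to 6 of the sample array exceed the threshold and would be flagged as a candidate rough surface portion.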
The configuration of the inspection object imaging apparatus 100 according to the present embodiment and the light-section images generated by it have been described in detail above with reference to FIGS. 2A to 4C.
<About the overall configuration of the arithmetic processing device 200>
 Next, returning to FIG. 1, the overall configuration of the arithmetic processing device 200 according to the present embodiment will be described.
 As shown in FIG. 1, the arithmetic processing device 200 according to the present embodiment mainly includes an imaging control unit 201, an image processing unit 203, a display control unit 205, and a storage unit 207.
The imaging control unit 201 is realized by, for example, a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a communication device, and the like. The imaging control unit 201 controls the imaging of the inspection object 1 by the inspection object imaging apparatus 100 according to the present embodiment. More specifically, when imaging of the inspection object 1 is to be started, the imaging control unit 201 sends the illumination device 101 a control signal for starting oscillation of the laser light.
As mentioned earlier, a PLG signal is periodically sent from the conveyance line of the inspection object 1 (for example, one pulse of the PLG signal is sent every time the inspection object 1 moves 0.25 mm). Every time it acquires a PLG signal, the imaging control unit 201 sends the area cameras 103 and 105 a trigger signal for starting imaging.
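The timing relationships implied by the example configuration (one PLG pulse per 0.25 mm of travel, one capture every 5 msec, a 200 fps camera) can be checked with a short calculation. The variable names are our own illustration:

```python
# Illustrative consistency check of the example acquisition timing:
# one PLG pulse (and hence one trigger) per 0.25 mm of travel,
# one capture every 5 msec, camera frame rate 200 fps.
capture_pitch_mm = 0.25
capture_interval_s = 5e-3
camera_frame_rate_fps = 200.0

# Implied speed of the conveyance line.
line_speed_mm_s = capture_pitch_mm / capture_interval_s  # 50 mm/s

# The trigger rate must not exceed what the camera can deliver.
trigger_rate_hz = 1.0 / capture_interval_s  # 200 triggers per second

assert trigger_rate_hz <= camera_frame_rate_fps
print(line_speed_mm_s, trigger_rate_hz)
```

That is, the example setup corresponds to a line speed of 50 mm/s, with the trigger rate exactly at the camera's 200 fps limit.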
The image processing unit 203 is realized by, for example, a CPU, a ROM, a RAM, a communication device, and the like. The image processing unit 203 uses the light-section image data acquired from the inspection object imaging apparatus 100 (more specifically, from the area cameras 103 and 105 of the inspection object imaging apparatus 100) to generate fringe image frames, described later. The image processing unit 203 then performs the image processing described below on the generated fringe image frames to inspect the surface properties of the machined product serving as the inspection object 1 and to detect various defects that may exist on the surface. When the defect detection processing for the surface of the inspection object 1 is completed, the image processing unit 203 transmits information on the obtained detection results to the display control unit 205.
The image processing unit 203 will be described again in detail below.
The display control unit 205 is realized by, for example, a CPU, a ROM, a RAM, an output device, and the like. The display control unit 205 performs display control when displaying the surface property inspection results of the machined product serving as the inspection object 1, transmitted from the image processing unit 203, on an output device such as a display provided in the arithmetic processing device 200 or on an output device provided outside the arithmetic processing device 200. This allows the user of the surface property inspection apparatus 10 to grasp on the spot the inspection results concerning the surface properties of the machined product serving as the inspection object 1.
The storage unit 207 is realized by, for example, a RAM or a storage device provided in the arithmetic processing device 200 according to the present embodiment. In the storage unit 207, various parameters and intermediate results that the arithmetic processing device 200 according to the present embodiment needs to save when performing some processing, as well as various databases, programs, and the like, are recorded as appropriate. The imaging control unit 201, the image processing unit 203, the display control unit 205, and the like can execute read/write processing on the storage unit 207.
<About the image processing unit 203>
 Next, the image processing unit 203 provided in the arithmetic processing device 200 according to the present embodiment will be described in detail with reference to FIG. 5. FIG. 5 is a block diagram showing an example of the configuration of the image processing unit of the arithmetic processing device according to the present embodiment.
As shown in FIG. 5, the image processing unit 203 according to the present embodiment mainly includes a data acquisition unit 211, a fringe image frame generation unit 213, an image calculation unit 215, and a detection processing unit 225.
The data acquisition unit 211 is realized by, for example, a CPU, a ROM, a RAM, a communication device, and the like. The data acquisition unit 211 acquires the image data (digital multi-valued image data) of the light-section images output from the inspection object imaging apparatus 100 (more specifically, the area cameras 103 and 105) and sequentially stores it in an image memory provided in the storage unit 207 or the like. By sequentially using these digital multi-valued image data along the movement direction of the inspection object 1, fringe image frames as described later are generated.
As explained earlier, a light-section image acquired by the data acquisition unit 211 is an image of the linear laser light L irradiating the surface of the inspection object 1 at a certain position along the movement direction of the surface. By appropriately setting the gain of the area camera and the aperture of the lens in advance, the light-section image can be made a grayscale image in which, for example, the portion irradiated with the linear laser light L appears white and the other portions appear black. The irregularities superimposed on the light section line in the light-section image, and the line width of the light section line itself, contain information on the cross-sectional shape of the surface of the inspection object 1 and on various defects, including rough surface portions, existing on the surface.
The fringe image frame generation unit 213 is realized by, for example, a CPU, a ROM, a RAM, and the like. The fringe image frame generation unit 213 sequentially acquires, from the image memory provided in the storage unit 207 or the like, the light-section images stored along the movement direction of the inspection object 1. From each acquired light-section image, the fringe image frame generation unit 213 then uses the region containing the light section line, and generates a fringe image frame by arranging the images of these regions in order along the movement direction of the inspection object 1.
The number of light-section images constituting one fringe image frame may be set as appropriate; for example, one fringe image frame may be composed of 256 light-section images. As described above, the light-section images exist at every capture interval (for example, every 0.25 mm). Therefore, one fringe image frame composed of 256 light-section images captured at 0.25 mm intervals corresponds to the result of imaging the surface of the inspection object 1 over a range of 64 mm (= 256 × 0.25 mm) along the movement direction.
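A minimal NumPy sketch of the frame assembly described above. The function name and the crop window are our own illustration, not the embodiment's implementation:

```python
import numpy as np

def build_fringe_frame(light_cut_images, y_start, y_height):
    """Crop the region containing the light section line from each
    light-section image and stack the crops in travel order."""
    strips = [img[y_start:y_start + y_height, :] for img in light_cut_images]
    return np.vstack(strips)

# 256 captures at a 0.25 mm pitch cover 256 * 0.25 = 64 mm of travel.
images = [np.zeros((32, 2048), dtype=np.uint8) for _ in range(256)]
frame = build_fringe_frame(images, y_start=10, y_height=8)
print(frame.shape)  # 256 strips of 8 rows each, 2048 columns wide
```

With these illustrative dimensions, the resulting frame is a 2048 × 2048 array in which every 8-row band originates from one capture position.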
FIG. 6 shows an example of a fringe image frame generated by the fringe image frame generation unit 213. The fringe image frame shown in FIG. 6 shows 16 of the 256 light-section images. In the fringe image frame shown in FIG. 6, each line segment extending in the horizontal direction of the drawing corresponds to one light-section image, and the horizontal direction of the drawing corresponds to the x-axis direction in FIG. 2A and the like. In the fringe image frame shown in FIG. 6, the vertical direction of the drawing corresponds to the y-axis direction in FIG. 2A and the like (that is, the movement direction of the inspection object 1).
Upon generating a fringe image frame as shown in FIG. 6, the fringe image frame generation unit 213 outputs the generated fringe image frame to the image calculation unit 215 described later. The fringe image frame generation unit 213 may also associate the data corresponding to the generated fringe image frame with time information on the date and time of generation, and store it as history information in the storage unit 207 or the like.
The image calculation unit 215 is realized by, for example, a CPU, a ROM, a RAM, and the like. Based on the fringe image frame generated by the fringe image frame generation unit 213, the image calculation unit 215 calculates a depth image representing the uneven state of the surface of the inspection object 1, a luminance image representing the luminance distribution of the linear laser light L on the surface of the inspection object 1, and a line width image in which the distribution, along the movement direction, of the line width of the linear laser light L on the surface of the inspection object 1 is associated with luminance values. As shown in FIG. 5, the image calculation unit 215 includes a light section line processing unit 217, a depth image calculation unit 219, a luminance image calculation unit 221, and a line width image calculation unit 223.
The light section line processing unit 217 is realized by, for example, a CPU, a ROM, a RAM, and the like. For each light section line included in the fringe image frame, the light section line processing unit 217 calculates light section line feature quantities including the displacement of the light section line (the degree of bending of the bright line). The processing performed by the light section line processing unit 217 and the light section line feature quantities it calculates will be described in detail below with reference to FIGS. 7A and 7B.
 FIG. 7A is an explanatory diagram schematically showing a fringe image frame. FIG. 7B is an explanatory diagram for explaining the light section line processing performed by the light section line processing unit.
In FIG. 7A, it is assumed that N light section lines exist in one fringe image frame and that the horizontal length of the fringe image frame is M pixels. Each light-section image containing one light section line consists of y_n vertical pixels × M horizontal pixels. The number of vertical pixels y_n of each light-section image (in other words, the number of vertical pixels y_n cut out from one light-section image when the fringe image frame is generated) can be determined by roughly estimating in advance, on the basis of past operation data and the like, the range of heights of the concave and convex portions that may exist on the inspection object 1.
Here, for convenience of explanation, the X axis is taken along the x-axis direction orthogonal to the movement direction of the inspection object 1 (the horizontal direction of the fringe image frame in FIG. 7A), the Y axis is taken along the y-axis direction corresponding to the movement direction of the inspection object 1 (the vertical direction of the fringe image frame in FIG. 7A), and the position of a pixel in the fringe image frame is expressed in XY coordinates. The following description focuses on the position of the m-th pixel (1 ≤ m ≤ M) from the left of the j-th (1 ≤ j ≤ N) light section line in the fringe image frame (that is, the position denoted X_j,m).
When the light section line to be processed (hereinafter also simply referred to as a "line") and the X coordinate position to be processed (in this description, the position denoted X_j,m) are selected, the light section line processing unit 217 first refers to the distribution of the pixel values (that is, the luminance values of the light section line) associated with the pixels at that X coordinate position of that line, as shown in FIG. 7B. At this time, the light section line processing unit 217 does not perform the processing described below on all pixels at that X coordinate position in the light-section image, but only on the pixels within a range of W before and after the Y-coordinate reference position Y_s in the light-section image (that is, the pixels in the range Y_s − W to Y_s + W).
Here, the Y-coordinate reference position Y_s is a position in the y-axis direction designated in advance for the j-th light-section image of the fringe image frame. The parameter W defining the processing range can be determined as follows. That is, the range of heights of the concave and convex portions that may exist on the inspection object 1 is identified on the basis of past operation data and the like, the magnitude of W is roughly calculated in advance so that the range of W before and after the Y-coordinate reference position Y_s fits within the light-section image, and W may then be determined as appropriate. If the value of the parameter W can be made small, the processing load of the light section line processing unit 217, described below, can be reduced.
The light section line processing unit 217 first identifies, among the pixels in the range Y_s − W to Y_s + W, the pixels having a luminance value equal to or greater than a predetermined threshold Th for identifying pixels corresponding to the light section line. In the example shown in FIG. 7B, the three pixels at Y_j,k, Y_j,k+1, and Y_j,k+2 have luminance values I_j,k, I_j,k+1, and I_j,k+2, respectively, each equal to or greater than the threshold Th. The light section line processing unit 217 therefore sets p_j,m = 3 as the count, in the line width direction, of pixels having a luminance value equal to or greater than the predetermined threshold Th. This count p_j,m corresponds, so to speak, to the number of pixels of the bright line at the position (j, m), and is one of the light section line feature quantities. In the subsequent processing, the light section line processing unit 217 uses the information on the extracted pixels, (Y_j,k, I_j,k), (Y_j,k+1, I_j,k+1), and (Y_j,k+2, I_j,k+2) (hereinafter sometimes simply abbreviated as (Y, I)), to calculate further light section line feature quantities.
Using the parameter p_j,m and the information (Y, I) on the extracted pixels, the light section line processing unit 217 also calculates the luminance sum K_j,m of the extracted pixels. In the example shown in FIG. 7B, the luminance sum calculated by the light section line processing unit 217 is K_j,m = I_j,k + I_j,k+1 + I_j,k+2. This luminance sum K_j,m is also one of the light section line feature quantities.
Furthermore, using the information (Y, I) on the extracted pixels and the Y-coordinate reference position Y_s, the light section line processing unit 217 calculates the Y-direction centroid position Y_C(j,m) of the extracted pixels, as well as the displacement Δd_j,m = Y_s − Y_C(j,m) of the centroid position Y_C(j,m) from the reference position Y_s.
Here, denoting the set of extracted pixels by A, the centroid position Y_C(j,m) is given by the following Expression 101. Accordingly, in the example shown in FIG. 7B, the centroid position Y_C(j,m) is given by the following Expression 101a.
Y_C(j,m) = Σ_{k∈A} ( I_j,k · Y_j,k ) / Σ_{k∈A} I_j,k  … (Expression 101)

Y_C(j,m) = ( I_j,k · Y_j,k + I_j,k+1 · Y_j,k+1 + I_j,k+2 · Y_j,k+2 ) / ( I_j,k + I_j,k+1 + I_j,k+2 )  … (Expression 101a)
Here, the Y-axis position corresponding to a pixel is, so to speak, a value quantized by the capture pitch of the inspection object imaging apparatus 100 (for example, 0.25 mm). On the other hand, the centroid position Y_C(j,m) calculated by Expression 101 is obtained through a numerical operation involving division, and can therefore take values smaller than the capture pitch (the quantization unit, as it were) of the inspection object imaging apparatus 100. Accordingly, the displacement Δd_j,m calculated using the centroid position Y_C(j,m) can likewise take values smaller than the movement pitch. The displacement Δd_j,m calculated in this way is also one of the light section line feature quantities.
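The per-position computation described above, yielding the bright-line pixel count p_j,m, the luminance sum K_j,m, the centroid Y_C(j,m), and the displacement Δd_j,m, can be sketched for a single column as follows. The threshold, window, and luminance values are illustrative, and the function is our own sketch, not the embodiment's code:

```python
import numpy as np

def column_features(intensity, y_coords, y_ref, w, th):
    """Light-section-line feature quantities for one X position:
    count p of pixels at or above the threshold Th inside the window
    [y_ref - w, y_ref + w], their luminance sum K, their
    intensity-weighted centroid Y_C, and the displacement
    delta_d = Y_s - Y_C."""
    sel = (y_coords >= y_ref - w) & (y_coords <= y_ref + w) & (intensity >= th)
    p = int(sel.sum())
    K = float(intensity[sel].sum())
    if p == 0:
        return p, K, None, None  # no section-line pixel found in the window
    y_c = float((intensity[sel] * y_coords[sel]).sum()) / K
    return p, K, y_c, y_ref - y_c

# Three pixels exceed Th, as in the worked example of FIG. 7B.
y = np.arange(10, 20)
I = np.array([0, 0, 0, 50, 120, 200, 120, 50, 0, 0], dtype=float)
p, K, y_c, delta_d = column_features(I, y, y_ref=15, w=4, th=100)
print(p, K, y_c, delta_d)  # 3 440.0 15.0 0.0
```

Because the centroid is computed by division of sums, y_c resolves the line position more finely than the pixel grid, which is exactly the sub-pitch property noted above.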
The light section line processing unit 217 calculates the above three types of feature quantities for the M elements included in each light section line. As a result, as shown in FIGS. 8A to 8C, two-dimensional arrays of M columns × N rows are generated for the displacement Δd of the light section line, the luminance sum K, and the number of bright line pixels p.
 Of the calculated light section line feature quantities, the light section line processing unit 217 outputs the feature quantity relating to the displacement Δd of the light section line to the depth image calculation unit 219 described later, outputs the feature quantities relating to the luminance sum K and the bright-line pixel count p to the luminance image calculation unit 221 described later, and outputs the feature quantity relating to the bright-line pixel count p to the line width image calculation unit 223 described later.
 The depth image calculation unit 219 is realized by, for example, a CPU, a ROM, a RAM, and the like. Based on the light section line feature quantities generated by the light section line processing unit 217 (in particular, the feature quantity relating to the displacement Δd), the depth image calculation unit 219 calculates a depth image representing the uneven state of the surface of the inspection object 1.
 Specifically, the depth image calculation unit 219 calculates the depth image using the feature quantity (two-dimensional array) relating to the displacement Δd as shown in FIG. 8A and the angle formed by the linear laser beam L and the optical axis of the area camera (the angle φ in FIGS. 2A and 2B). The depth image is an image representing a two-dimensional distribution of the uneven state, in which the one-dimensional distributions of the uneven state at the respective positions in the Y-axis direction are arranged in order along the Y-axis direction.
 First, the relationship between the height of the unevenness present on the surface of the inspection object 1 and the displacement Δd of the light section line will be described with reference to FIG. 9. FIG. 9 is an explanatory diagram showing the relationship between the displacement of the light section line and the height of a defect.
 FIG. 9 schematically shows a case where a recess exists in the surface of the inspection object 1. Here, the difference between the height of the surface when no recess exists and the height of the bottom of the recess is denoted by Δh. Focusing on surface reflection of the normally incident linear laser beam L, the reflected light propagates as ray A in FIG. 9 when no recess exists in the surface of the inspection object 1, and as ray B in FIG. 9 when a recess exists. The deviation between ray A and ray B is what is observed in the present embodiment as the displacement Δd of the light section line. As is apparent from the geometric relationship, the displacement Δd of the light section line and the depth Δh of the recess satisfy Δd = Δh·sin φ.
 Although FIG. 9 illustrates the case where a recess exists in the surface of the inspection object 1, the same relationship holds when a protrusion exists in the surface of the inspection object 1.
 Using the relationship described above, the depth image calculation unit 219 calculates the quantity Δh relating to the unevenness of the surface of the inspection object 1 based on the feature quantity relating to the displacement Δd of the light section line calculated by the light section line processing unit 217.
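This conversion follows directly from the relation Δd = Δh·sin φ of FIG. 9; a short sketch with hypothetical names:

```python
import math

def height_from_displacement(dd, phi_deg):
    """Invert dd = dh * sin(phi) to recover the surface height
    deviation dh from the observed section-line displacement dd,
    where phi is the angle between the linear laser beam L and the
    optical axis of the area camera."""
    return dd / math.sin(math.radians(phi_deg))

# With phi = 30 deg (sin = 0.5), a 0.1 mm displacement means a 0.2 mm height deviation:
dh = height_from_displacement(0.1, 30.0)
```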
 Here, the displacement Δd of the light section line used for calculating the depth image is calculated based on the barycentric position of the light section line as described above, and can therefore take values smaller than the capture pitch. Accordingly, the depth image calculated by the depth image calculation unit 219 reproduces the unevenness with a resolution finer than the pixel size of the imaging element.
 Note that, as shown in FIG. 10, a change in the shape of the surface of the inspection object 1 may distort the light section line, for example by curving it. On the other hand, in the surface property inspection method according to the present embodiment, the unevenness superimposed on the light section line carries the information on the cross-sectional shape of the surface of the inspection object 1 and on the surface defects present on that surface. Therefore, when calculating the depth image based on the displacement Δd of the light section line, the depth image calculation unit 219 may perform distortion correction processing for each light section line to extract only the information on the unevenness superimposed on the light section line. By performing such distortion correction processing, it is possible to obtain only the information on the uneven flaws present on the surface of the inspection object 1 even when the light section line is distorted, for example by curving.
 Specific examples of such distortion correction processing include (i) performing fitting processing using a multidimensional function or various nonlinear functions and computing the difference between the obtained fitting curve and the observed light section line, and (ii) applying a low-pass filter such as a moving-average filter or a median filter, exploiting the fact that the information on the unevenness consists of high-frequency components. By performing such distortion correction processing, the light section line can be flattened while the information on the uneven flaws present on the surface of the inspection object 1 is preserved.
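Approach (i) can be sketched as follows, with a low-order polynomial standing in for the multidimensional fitting function (a simplified illustration under that assumption, not the actual production processing):

```python
import numpy as np

def flatten_section_line(profile, order=2):
    """Fit a low-order polynomial to the observed section line and
    subtract it: the fitted baseline captures the large-scale curvature
    of the surface, while the residual keeps the high-frequency relief
    (the unevenness information superimposed on the line)."""
    x = np.arange(len(profile))
    baseline = np.polyval(np.polyfit(x, profile, order), x)
    return np.asarray(profile, dtype=float) - baseline

# Curved section line with one small bump (a surface flaw) at position 10:
profile = 0.01 * np.arange(50.0) ** 2
profile[10] += 1.0
residual = flatten_section_line(profile)
```

After flattening, the quadratic curvature is absorbed into the baseline and only the bump at position 10 remains prominent in the residual.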
 The depth image calculation unit 219 outputs information on the depth image calculated as described above to the detection processing unit 225 described later.
 The luminance image calculation unit 221 is realized by, for example, a CPU, a ROM, a RAM, and the like. Based on the light section line feature quantities generated by the light section line processing unit 217 (in particular, the feature quantities relating to the luminance sum K and the bright-line pixel count p), the luminance image calculation unit 221 calculates a luminance image representing the luminance distribution of the linear laser beam L on the surface of the inspection object 1.
 Specifically, using the feature quantity (two-dimensional array) relating to the luminance sum K as shown in FIG. 8B and the feature quantity (two-dimensional array) relating to the bright-line pixel count p as shown in FIG. 8C, the luminance image calculation unit 221 calculates the average luminance KAVE(j, m) = Kj,m / pj,m (1 ≤ j ≤ N, 1 ≤ m ≤ M), which is the average of the summed luminance in the line width direction. The luminance image calculation unit 221 then takes the data array of the calculated average luminances KAVE(j, m) as the luminance image of the inspection object 1 of interest. The luminance image is an image representing a two-dimensional luminance distribution, in which the one-dimensional luminance distribution of the linear laser beam L at each position in the Y-axis direction is arranged in order along the Y-axis direction.
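In code, the per-position average is simply the elementwise ratio of the two feature arrays (a sketch with hypothetical names):

```python
def average_luminance(k_sum, p_count):
    """K_AVE(j, m) = K(j, m) / p(j, m): total bright-line luminance at
    each position divided by the number of bright-line pixels there,
    i.e. the mean luminance across the line width."""
    return [[k / p for k, p in zip(k_row, p_row)]
            for k_row, p_row in zip(k_sum, p_count)]

# 2 x 2 arrays of summed luminance K and bright-line pixel counts p:
k_ave = average_luminance([[600, 450], [300, 800]], [[3, 3], [2, 4]])
```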
 The luminance image calculation unit 221 outputs information on the luminance image calculated as described above to the detection processing unit 225 described later.
 The line width image calculation unit 223 is realized by, for example, a CPU, a ROM, a RAM, and the like. Based on the light section line feature quantities generated by the light section line processing unit 217 (in particular, the feature quantity relating to the bright-line pixel count p), the line width image calculation unit 223 calculates a line width image in which the distribution of the line width of the linear laser beam L in its moving direction on the surface of the inspection object 1 is associated with luminance values.
 Hereinafter, the line width image calculation processing in the line width image calculation unit 223 will be described with reference to FIGS. 11 to 13. FIG. 11 is an explanatory diagram for explaining the line width of the light section line in a normal portion and in a rough surface portion. FIG. 12A is an explanatory diagram for explaining the line width of the light section line in a normal portion, and FIG. 12B is an explanatory diagram for explaining the line width of the light section line in a rough surface portion. FIG. 13 is an explanatory diagram for explaining the line width image according to the present embodiment.
 As described earlier, in a rough surface portion of interest in the present embodiment, such as a remaining base-surface portion or rust, the rough surface functions as a scattering surface, so that the linear laser beam L irradiating the rough surface portion is scattered by it. Therefore, as schematically shown in FIG. 4C and FIG. 11, a widening of the line width of the light section line is observed at the position of the light section image corresponding to the rough surface portion.
 Assuming that a light section line as schematically shown in FIG. 11 is obtained, the position corresponding to section line A-A' corresponds to a normal portion, and the position corresponding to section line B-B' corresponds to a rough surface portion. In this case, assume that, in the luminance distribution of the normal portion along section line A-A', there are three pixels having luminance values equal to or greater than a predetermined threshold Th for identifying the pixels corresponding to the light section line, as schematically shown in FIG. 12A. Similarly, assume that, in the luminance distribution of the rough surface portion along section line B-B', there are eight pixels having luminance values equal to or greater than the threshold Th, as schematically shown in FIG. 12B.
 Suppose that, in the inspection object imaging apparatus 100, the line width of the linear laser beam L is set to correspond to 2 to 4 pixels, for example. Since a widening of the line width of the linear laser beam L is observed in a rough surface portion, the line width of the linear laser beam L in a rough surface portion exceeds 4 pixels, the set value in the inspection object imaging apparatus 100. Therefore, by presetting a predetermined threshold Th2 on the line width of the linear laser beam L for identifying the pixels corresponding to a rough surface portion, the normal portion and the rough surface portion can be distinguished. For example, suppose that, in the inspection object imaging apparatus 100, the line width of the linear laser beam L in a normal portion is set to 2 to 4 pixels. In this case, by setting the threshold Th2 to 5 pixels with some margin, a portion whose line width is 5 pixels or more can be distinguished as a rough surface portion.
 From the above viewpoint, the line width image calculation unit 223 can identify the pixel positions corresponding to a rough surface portion by comparing the feature quantity relating to the bright-line pixel count p generated by the light section line processing unit 217 against the threshold Th2.
 Next, a specific method by which the line width image calculation unit 223 calculates the line width image will be described with reference to FIG. 13.
 While setting the threshold Th2 as described above makes it possible to identify the positions of rough surface portions, it is important, in the line width image generated by the line width image calculation unit 223, to determine the luminance values constituting the line width image so that normal portions are not made conspicuous. Therefore, for each light section line in the striped image frame, the line width image calculation unit 223 according to the present embodiment calculates the difference between the line width at each position along the extending direction (Y-axis direction) of the light section line and a predetermined threshold line width, and calculates the line width image by assigning a luminance value according to the magnitude of the calculated difference.
 More specifically, the line width image calculation unit 223 refers to the feature quantity relating to the bright-line pixel count p generated by the light section line processing unit 217, and calculates the difference between the bright-line pixel count p at each position and the threshold Th2 corresponding to the threshold line width. Note that, for a normal portion, the difference from the threshold Th2 may come out negative; in that case, the difference value is preferably set to zero rather than left negative. The line width image calculation unit 223 then determines the luminance value at each pixel position so that the luminance value of the pixels constituting the line width image increases as the calculated difference value increases. By determining the luminance values from the difference between the bright-line pixel count p at each position and the threshold Th2 in this way, minute changes in line width can be imaged with higher definition, and the accuracy of the detection processing performed in a later stage can be improved.
 For example, consider a case where the threshold Th2 is 5 pixels and the line width image is generated as an 8-bit image. In this case, to use the entire range of luminance values (0 to 255) of the 8-bit image, the line width image calculation unit 223 may determine the luminance value at each pixel position as follows, for example. Although the following example uses the full 8-bit scale, the full 8-bit scale need not be used; however, setting the luminance values over the full-scale range improves the visibility of the generated line width image.
  Difference value = 0 (or 0 or less): luminance value 0
  Difference value = 1: luminance value 95
  Thereafter, the luminance value is incremented by 32 each time the difference value increases by 1 (for example, for a difference value of 2, the luminance value is 95 + 32 = 127).
  Difference values of 5 or more are fixed at 255 (full scale).
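The example assignment above can be sketched as a small mapping function (assuming, as one reading of the text, that the difference is p − Th2 clamped at zero; names are hypothetical):

```python
def width_to_luminance(p_count, th2=5):
    """Map the bright-line pixel count p at one position to an 8-bit
    luminance of the line width image: difference d = p - th2 clamped
    at 0; d = 1 gives 95, each further +1 adds 32, and d >= 5 is fixed
    at 255 (full scale)."""
    d = max(0, p_count - th2)
    if d == 0:
        return 0
    if d >= 5:
        return 255
    return 95 + 32 * (d - 1)

# e.g. a normal 3-pixel-wide line maps to 0; an 8-pixel-wide one to 95 + 32*2 = 159
```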
 Note that the method of assigning luminance values according to the difference value is not limited to the above example; any assignment method may be adopted as long as the luminance value increases as the difference value increases. Moreover, although the above example describes the case where the line width image is generated as an image having 8-bit luminance values, it goes without saying that the line width image may be generated as an image having luminance values exceeding 8 bits.
 By performing such processing, the line width image calculation unit 223 can generate, for example, from the light section line shown on the left side of FIG. 13 (and the feature quantity relating to the bright-line pixel count p generated from that light section line), the line width image shown on the right side of FIG. 13. In the example shown in FIG. 13, the line width image is drawn black at positions where the luminance value is 0 (that is, normal portions), and whiter as the luminance value increases (that is, at positions corresponding to rough surface portions, and more so as the surface unevenness in the rough surface portion becomes larger). The line width image generated by this processing is an image representing a two-dimensional line width distribution, in which the one-dimensional distribution of the line width of the linear laser beam L at each position in the Y-axis direction is arranged in order along the Y-axis direction.
 The line width image calculation unit 223 outputs information on the line width image calculated as described above to the detection processing unit 225 described later.
 Returning to FIG. 5, the detection processing unit 225 will be described.
 The detection processing unit 225 is realized by, for example, a CPU, a ROM, a RAM, and the like. Based on the depth image, the luminance image, and the line width image calculated by the depth image calculation unit 219, the luminance image calculation unit 221, and the line width image calculation unit 223, respectively, the detection processing unit 225 detects the surface properties of the inspection object 1.
 The detection processing unit 225 has a defect site identification function of identifying defect sites based on the depth image, the luminance image, and the line width image; a feature quantity extraction function of extracting feature quantities relating to the form and pixel values of the identified defect sites; and a defect discrimination function of discriminating the type, harmfulness, and the like of a defect based on the extracted feature quantities. These functions will be briefly described below.
○ Defect site identification function
 For each pixel of the acquired line width image, the detection processing unit 225 determines whether the obtained value is equal to or greater than a first threshold for identifying rough surface portions, while emphasizing the rough-surface-portion regions as necessary by filter processing that takes a linear sum of the luminance values of neighboring pixels. By performing such filter processing and the determination processing based on its result, the detection processing unit 225 can generate a binarized image for identifying the sites of rough surface portions. In this binarized image, pixels whose calculated value is less than the first threshold correspond to normal portions (that is, a binarized-image pixel value of 0), and pixels whose calculated value is equal to or greater than the first threshold correspond to rough surface portions (that is, a binarized-image pixel value of 1). The detection processing unit 225 further identifies the position of each individual rough surface portion (that is, each remaining base-surface portion or rust) by joining contiguously occurring rough-surface-portion sites.
 Similarly, for each pixel of the acquired depth image and luminance image, the detection processing unit 225 emphasizes regions such as vertical linear flaws, horizontal linear flaws, and minute flaws by filter processing that takes a linear sum of the pixel values (values representing depth, or luminance values) of neighboring pixels, and determines whether the obtained value is equal to or greater than a second threshold for identifying defect sites. By performing such filter processing and the determination processing based on its result, the detection processing unit 225 can generate a binarized image for identifying defect sites. In this binarized image, pixels whose calculated value is less than the second threshold correspond to normal locations (that is, a binarized-image pixel value of 0), and pixels whose calculated value is equal to or greater than the second threshold correspond to defect locations (that is, a binarized-image pixel value of 1). The detection processing unit 225 further identifies each individual defect site by joining contiguously occurring defect locations.
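A minimal sketch of this thresholding step (the neighbor linear sum is shown as an optional 3 × 3 kernel; the coefficients and names are illustrative assumptions, not the actual production filter):

```python
def binarize(image, threshold, kernel=None):
    """Optionally emphasize regions with a 3x3 linear sum over
    neighboring pixels (zero-padded at the borders), then mark each
    pixel as 1 (candidate site) if its value is >= threshold, else 0
    (normal)."""
    h, w = len(image), len(image[0])
    if kernel is not None:
        image = [[sum(kernel[dy + 1][dx + 1] * image[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                      if 0 <= y + dy < h and 0 <= x + dx < w)
                  for x in range(w)] for y in range(h)]
    return [[1 if v >= threshold else 0 for v in row] for row in image]

# A small feature image with two high-valued pixels:
binary = binarize([[0, 10, 0], [5, 200, 180], [0, 3, 0]], threshold=100)
```

The contiguous 1-pixels of `binary` would then be joined (for example by connected-component labeling) into individual defect sites.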
○ Feature quantity extraction function
 When the defect sites (that is, rough surface portions) of the line width image have been identified by the defect site identification function, the detection processing unit 225 extracts, for each identified defect site, feature quantities relating to the form and luminance values of the defect site. Examples of the feature quantities relating to the form of a defect site include the width of the defect site, the length of the defect site, the peripheral length of the defect site, the area of the defect site, and the area of the circumscribed rectangle of the defect site. Examples of the feature quantities relating to the luminance values of a defect site include the maximum, minimum, and average luminance of the defect site.
 Likewise, when the defect sites of the depth image and the luminance image have been identified by the defect site identification function, the detection processing unit 225 extracts, for each identified defect site, feature quantities relating to the form and pixel values of the defect site. Examples of the feature quantities relating to the form of a defect site include the width of the defect site, the length of the defect site, the peripheral length of the defect site, the area of the defect site, and the area of the circumscribed rectangle of the defect site. As for the feature quantities relating to the pixel values of a defect site, examples for the depth image include the maximum, minimum, and average depth of the defect site, and examples for the luminance image include the maximum, minimum, and average luminance of the defect site.
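Several of the form features named above can be sketched for one connected defect site as follows (a simplified illustration; the peripheral length is omitted here):

```python
def shape_features(pixels):
    """Compute form feature quantities for one defect site given the
    (row, col) coordinates of its pixels: width and length from the
    circumscribed (bounding) rectangle, that rectangle's area, and the
    site area (pixel count)."""
    rows = [r for r, _ in pixels]
    cols = [c for _, c in pixels]
    width = max(cols) - min(cols) + 1
    length = max(rows) - min(rows) + 1
    return {"width": width, "length": length,
            "bbox_area": width * length, "area": len(pixels)}

# An L-shaped defect site of four pixels:
feats = shape_features([(0, 0), (0, 1), (1, 0), (2, 0)])
```

The pixel-value features (maximum, minimum, average depth or luminance) would be computed analogously by indexing the depth or luminance image at the same coordinates.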
○ Defect discrimination function
 When the feature quantities of each defect site have been extracted by the feature quantity extraction function, the detection processing unit 225 discriminates, for each defect site, the type, harmfulness, and the like of the defect based on the extracted feature quantities. The discrimination of the defect type, harmfulness, and the like based on the feature quantities is performed using, for example, a logic table as shown in FIG. 14. That is, the detection processing unit 225 discriminates the type and harmfulness of the defect based on the discrimination conditions represented by a logic table such as the one illustrated in FIG. 14.
 As illustrated in FIG. 14, the defect types (defect A1 to defect An) are listed as the vertical items of the logic table, and the kinds of feature quantities (feature quantity B1 to feature quantity Bm) are listed as its horizontal items. In each cell of the table defined by a defect type and a feature quantity, a discrimination conditional expression on the magnitude of the corresponding feature quantity (conditional expressions C11 to Cnm) is described. Each row of the logic table forms one set, constituting the discrimination conditions for one defect type. The discrimination processing is performed in order, starting from the type described in the top row, and ends when all the discrimination conditions described in any one row are satisfied.
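The row-by-row judgment can be sketched as follows, with made-up defect types and conditional expressions standing in for A1…An and C11…Cnm (the actual table contents are not given in the source):

```python
def discriminate(features, logic_table):
    """Try the rows of the logic table from the top; the first row
    whose conditional expressions are all satisfied by the extracted
    feature quantities gives the defect type (and would likewise give
    its harmfulness). Returns None if no row matches."""
    for defect_type, conditions in logic_table:
        if all(cond(features[name]) for name, cond in conditions.items()):
            return defect_type
    return None

# Illustrative table: a long, deep flaw is judged first, then a rough
# surface portion by area.
table = [
    ("vertical linear flaw", {"length": lambda v: v >= 10,
                              "max_depth": lambda v: v >= 0.5}),
    ("rough surface portion", {"area": lambda v: v >= 4}),
]
```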
 Such a logic table can be generated by a known method using a database constructed through learning processing in which past operation data, together with the defect types and harmfulness identified by inspectors based on that operation data, are used as teacher data.
 The detection processing unit 225 identifies the defect type and harmfulness for each defect site detected in this way, and outputs the obtained detection results to the display control unit 205. As a result, information on the defects present on the surface of the machined product, the inspection object 1, is output to a display unit (not shown). The detection processing unit 225 may also output the obtained detection results to an external device such as a process computer for manufacturing management, and may create a defect report for the product using the obtained detection results. The detection processing unit 225 may further store the information on the defect site detection results in the storage unit 207 or the like as history information, in association with time information such as the date and time when the information was calculated.
 Although the above description deals with the case where the defect type and harmfulness are discriminated using a logic table, the method of discriminating the defect type and harmfulness is not limited to this example. For example, a discriminator such as a neural network or a support vector machine (SVM) may be generated by learning processing in which past operation data, together with the defect types and harmfulness identified by inspectors based on that operation data, are used as teacher data, and such a discriminator may be used to discriminate the defect type and harmfulness.
 以下の実施例で示すように、従来の光切断法で生成される輝度画像に着目することで、粗面部の候補となる部位を見出すことは可能である。しかしながら、粗面部が微小な凹凸から構成されているがために、かかる輝度画像に着目したとしても、ノイズや埃等に起因する輝度画像の領域と粗面部との区別が困難であり、候補となる部位のうち、どの部分が粗面部に対応するかを判別することができなかった。一方、以上説明したように、本実施形態に係る検出処理部225では、線幅画像を用いて所定の閾値判定に基づく欠陥検出処理を行うことにより、従来の光切断法では区別が困難であった粗面部(すなわち、地肌残存部及び錆)の部位を、容易に顕在化させることができる。 As shown in the examples below, it is possible to find candidate sites for rough surface portions by focusing on the luminance image generated by the conventional light section method. However, because a rough surface portion consists of minute irregularities, even when such a luminance image is examined, it is difficult to distinguish rough surface portions from regions of the luminance image caused by noise, dust, and the like, and it was not possible to determine which of the candidate sites actually corresponded to rough surface portions. In contrast, as described above, the detection processing unit 225 according to the present embodiment performs defect detection processing based on a predetermined threshold determination using the line width image, and can thereby easily reveal the rough surface portions (that is, remaining raw surface portions and rust) that are difficult to distinguish with the conventional light section method.
 また、検出処理部225は、深さ画像及び輝度画像に基づく欠陥検出処理の際に、線幅画像に基づき粗面部(すなわち、地肌残存部及び錆)が検出された部位以外に対して、上記のような欠陥部位の検出処理を実施してもよい。これにより、深さ画像及び輝度画像に基づく欠陥検出処理の高速化を図ることが可能となる。 In the defect detection processing based on the depth image and the luminance image, the detection processing unit 225 may perform the defect site detection processing described above only on portions other than those where rough surface portions (that is, remaining raw surface portions and rust) have been detected based on the line width image. This makes it possible to speed up the defect detection processing based on the depth image and the luminance image.
 また、上記のように、粗面部に対応する部位は、輝度画像においても、欠陥候補部位として抽出される。従って、検出処理部225は、線幅画像に基づき特定された粗面部の部位が、輝度画像においても欠陥候補部位として抽出されているか否かを確認して、検出結果のダブルチェックを行うようにしてもよい。 Also, as described above, sites corresponding to rough surface portions are extracted as defect candidate sites in the luminance image as well. Therefore, the detection processing unit 225 may double-check the detection result by confirming whether a rough surface portion identified based on the line width image has also been extracted as a defect candidate site in the luminance image.
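The line-width-based detection described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names, the scaling of the width excess to 8-bit luminance, and the run-detection threshold are all assumptions introduced for the example; only the idea of comparing each bright-line width against a line width threshold Th2 and revealing widened regions comes from the document.

```python
# Minimal sketch of line-width-based rough-surface detection.
# For one light section line, the bright-line width at each transverse
# position is compared against a line width threshold Th2; positions whose
# width exceeds Th2 are mapped to a nonzero 8-bit luminance in the line
# width image, and connected runs of such positions are reported as
# rough-surface (remaining raw surface / rust) candidates.

def line_width_image_row(widths, th2=5, full_scale=255, max_excess=8):
    """Map per-position bright-line widths (pixels) to 8-bit luminance."""
    row = []
    for w in widths:
        excess = max(0, w - th2)                     # difference from Th2
        v = min(full_scale, excess * full_scale // max_excess)
        row.append(v)
    return row

def detect_rough_runs(row, lum_th=32):
    """Return (start, end) index pairs of runs whose luminance >= lum_th."""
    runs, start = [], None
    for i, v in enumerate(row):
        if v >= lum_th and start is None:
            start = i
        elif v < lum_th and start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, len(row) - 1))
    return runs

# Normal surface: width 2-4 px; a rough surface locally widens the line.
widths = [3, 3, 4, 9, 11, 10, 3, 2]
row = line_width_image_row(widths)       # [0, 0, 0, 127, 191, 159, 0, 0]
rough = detect_rough_runs(row)           # [(3, 5)]
```

Because noise and dust perturb luminance far more than they widen the line, thresholding the width map in this way isolates rough-surface candidates that the luminance image alone cannot separate from noise.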
 以上、本実施形態に係る演算処理装置200が有する画像処理部203の構成について、詳細に説明した。 The configuration of the image processing unit 203 included in the arithmetic processing device 200 according to the present embodiment has been described in detail above.
 なお、上述の説明では、深さ画像算出部219が深さ画像を算出する際に、差分演算処理やローパスフィルタ処理等の近似補正処理を実施する場合について説明した。しかしながら、かかる近似補正処理は、光切断線処理部217が光切断線特徴量を算出するに先立って、当該光切断線処理部217が実施してもよい。 In the above description, the case where the depth image calculation unit 219 performs approximate correction processing, such as difference calculation processing and low-pass filter processing, when calculating the depth image has been described. However, such approximate correction processing may instead be performed by the light section line processing unit 217 before the light section line processing unit 217 calculates the light section line feature amounts.
 以上、本実施形態に係る演算処理装置200の機能の一例を示した。上記の各構成要素は、汎用的な部材や回路を用いて構成されていてもよいし、各構成要素の機能に特化したハードウェアにより構成されていてもよい。また、各構成要素の機能を、CPU等が全て行ってもよい。従って、本実施形態を実施する時々の技術レベルに応じて、適宜、利用する構成を変更することが可能である。 Heretofore, an example of the function of the arithmetic processing device 200 according to the present embodiment has been shown. Each component described above may be configured using a general-purpose member or circuit, or may be configured by hardware specialized for the function of each component. In addition, the CPU or the like may perform all functions of each component. Therefore, it is possible to appropriately change the configuration to be used according to the technical level at the time of carrying out the present embodiment.
 なお、上述のような本実施形態に係る演算処理装置の各機能を実現するためのコンピュータプログラムを作製し、パーソナルコンピュータ等に実装することが可能である。また、このようなコンピュータプログラムが格納された、コンピュータで読み取り可能な記録媒体も提供することができる。記録媒体は、例えば、磁気ディスク、光ディスク、光磁気ディスク、フラッシュメモリなどである。また、上記のコンピュータプログラムは、記録媒体を用いずに、例えばネットワークを介して配信してもよい。 It should be noted that a computer program for realizing each function of the arithmetic processing apparatus according to the present embodiment as described above can be produced and mounted on a personal computer or the like. In addition, a computer-readable recording medium storing such a computer program can be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. Further, the above computer program may be distributed via a network, for example, without using a recording medium.
 以上、本実施形態に係る表面性状検査装置10の構成について、詳細に説明した。本実施形態に係る表面性状検査装置10を利用することで、従来の光切断法に基づく装置と同様の装置構成で、機械加工面に残存する地肌残存部及び錆を検出することが可能となる。加えて、凹凸疵、表面模様等といった従来の光切断法に基づく検出も可能であることから、機械加工品の機械加工面全面に渡って、多岐にわたる不良部の検査が可能となる。 The configuration of the surface property inspection apparatus 10 according to the present embodiment has been described in detail above. By using the surface property inspection apparatus 10 according to the present embodiment, it is possible to detect remaining raw surface portions and rust on a machined surface with substantially the same apparatus configuration as an apparatus based on the conventional light section method. In addition, since detection based on the conventional light section method, such as of concavo-convex flaws and surface patterns, is also possible, a wide variety of defective parts can be inspected over the entire machined surface of a machined product.
 更に、本実施形態に係る表面性状検査装置10の適用先は、上記のような特定の機械加工品に限定されるものではなく、不良部において、正常部と比較して粗度に違いが生じるような製品であれば、本実施形態に係る表面性状検査装置10を適用することができる。特に、視認性の低い、地肌残存部及び錆部の分布状況を2次元画像化することで、画像処理による不良部位の特定が容易となり、目視による検査においても視認性の良好な画像を利用することが可能となる。 Furthermore, the application of the surface property inspection apparatus 10 according to the present embodiment is not limited to the specific machined products described above; the apparatus can be applied to any product in which defective portions differ in roughness from normal portions. In particular, by converting the distribution of remaining raw surface portions and rust portions, which have low visibility, into a two-dimensional image, defective sites can easily be identified by image processing, and an image with good visibility can also be used for visual inspection.
<表面性状検査方法について>
 続いて、図15を参照しながら、本実施形態に係る表面性状検査方法の流れを簡単に説明する。図15は、本実施形態に係る表面性状検査方法の流れの一例を示した流れ図である。
<About surface texture inspection method>
Next, the flow of the surface property inspection method according to the present embodiment will be briefly described with reference to FIG. FIG. 15 is a flowchart showing an example of the flow of the surface texture inspection method according to the present embodiment.
 まず、表面性状検査装置10の被検査体撮像装置100は、演算処理装置200の撮像制御部201による制御のもとで線状のレーザ光Lを用いて被検査体1である機械加工品の表面を撮像して、光切断画像を生成する(ステップS101)。その後、被検査体撮像装置100は、生成した光切断画像を演算処理装置200に出力する。演算処理装置200が備える画像処理部203のデータ取得部211は、かかる光切断画像の画像データを取得すると、取得した光切断画像を、被検査体1の移動方向に沿って、記憶部207等に設けられた画像メモリに順次格納していく。 First, under the control of the imaging control unit 201 of the arithmetic processing device 200, the inspection object imaging device 100 of the surface property inspection apparatus 10 images the surface of the machined product serving as the inspection object 1 using the linear laser beam L, and generates light section images (step S101). Thereafter, the inspection object imaging device 100 outputs the generated light section images to the arithmetic processing device 200. When the data acquisition unit 211 of the image processing unit 203 included in the arithmetic processing device 200 acquires the image data of the light section images, it sequentially stores the acquired light section images, along the moving direction of the inspection object 1, in an image memory provided in the storage unit 207 or the like.
 次に、演算処理装置200が備える画像処理部203の縞画像フレーム生成部213は、取得した光切断画像を被検査体1の移動方向に沿って順に配列して、縞画像フレームを生成する(ステップS103)。縞画像フレーム生成部213は、生成した縞画像フレームを、光切断線処理部217に出力する。 Next, the stripe image frame generation unit 213 of the image processing unit 203 included in the arithmetic processing device 200 generates a stripe image frame by arranging the acquired light section images in order along the moving direction of the inspection object 1 (step S103). The stripe image frame generation unit 213 outputs the generated stripe image frame to the light section line processing unit 217.
 光切断線処理部217は、生成された縞画像フレームを利用し、各光切断線について、所定の閾値Th以上の輝度を有する画素の画素数、当該画素の輝度の総和及び光切断線の変位量を算出する(ステップS105)。これら算出結果が、光切断線特徴量として利用される。算出された光切断線特徴量は、深さ画像算出部219、輝度画像算出部221及び線幅画像算出部223にそれぞれ出力される。 Using the generated stripe image frame, the light section line processing unit 217 calculates, for each light section line, the number of pixels having a luminance equal to or higher than a predetermined threshold Th, the sum of the luminance of those pixels, and the displacement of the light section line (step S105). These calculation results are used as the light section line feature amounts. The calculated light section line feature amounts are output to the depth image calculation unit 219, the luminance image calculation unit 221, and the line width image calculation unit 223, respectively.
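The three per-line feature amounts of step S105 can be illustrated for a single image column crossing the light section line. This is a hedged sketch: the function name, the centroid-based displacement estimate, and the reference position are assumptions introduced for the example; the document itself only specifies which three quantities are computed (pixel count above Th, luminance sum, line displacement), not how.

```python
# Sketch of the per-column light-section-line feature amounts (step S105).
# For one image column crossing the bright line, count the pixels whose
# luminance is at or above the threshold Th, sum their luminance, and
# estimate the line's displacement from a reference position via the
# luminance-weighted centroid (one common choice; the patent does not
# prescribe the estimator).

def section_line_features(column, th, ref_pos):
    """Return (pixel count >= th, luminance sum of those pixels,
    displacement of the line's luminance centroid from ref_pos)."""
    bright = [(i, v) for i, v in enumerate(column) if v >= th]
    count = len(bright)                      # bright-line width in pixels
    lum_sum = sum(v for _, v in bright)      # luminance total
    if lum_sum > 0:
        centroid = sum(i * v for i, v in bright) / lum_sum
        disp = centroid - ref_pos            # line displacement (height cue)
    else:
        disp = 0.0
    return count, lum_sum, disp

# One column: background noise plus a bright line centered near index 5.
col = [2, 3, 1, 40, 200, 250, 180, 5, 2, 1]
count, lum_sum, disp = section_line_features(col, th=30, ref_pos=5.0)
# count = 4 (pixels >= 30), lum_sum = 670, disp slightly negative
```

The width count feeds the line width image, the luminance sum feeds the luminance image, and the displacement feeds the depth image, as described in the following step.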
 深さ画像算出部219は、算出された光切断線特徴量(特に、光切断線の変位量に関する特徴量)を利用して、深さ画像を算出する(ステップS107)。また、輝度画像算出部221は、算出された光切断線特徴量(特に、輝線の画素数に関する特徴量、及び、輝度の総和に関する特徴量)を利用して、輝度画像を算出する(ステップS107)。更に、線幅画像算出部223は、算出された光切断線特徴量(特に、輝線の画素数に関する特徴量)を利用して、線幅画像を算出する(ステップS107)。深さ画像算出部219、輝度画像算出部221及び線幅画像算出部223は、算出した各画像を、検出処理部225に出力する。 The depth image calculation unit 219 calculates the depth image using the calculated light section line feature amounts (in particular, the feature amount related to the displacement of the light section line) (step S107). The luminance image calculation unit 221 calculates the luminance image using the calculated light section line feature amounts (in particular, the feature amount related to the number of bright-line pixels and the feature amount related to the luminance sum) (step S107). Further, the line width image calculation unit 223 calculates the line width image using the calculated light section line feature amounts (in particular, the feature amount related to the number of bright-line pixels) (step S107). The depth image calculation unit 219, the luminance image calculation unit 221, and the line width image calculation unit 223 output the calculated images to the detection processing unit 225.
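How the three images of step S107 are assembled from the per-line feature amounts can be sketched as follows. The data layout, the depth scaling factor, and the use of mean brightness for the luminance image are illustrative assumptions; the document only states which feature amount feeds which image.

```python
# Sketch: assembling the depth, luminance, and line width images from the
# per-line feature amounts (step S107). Each light section image contributes
# one row; columns correspond to positions along the laser line.

def build_images(feature_frames, depth_scale=10.0):
    """feature_frames: list of rows, each row a list of
    (width_px, lum_sum, displacement) tuples per transverse position."""
    depth, luminance, linewidth = [], [], []
    for row in feature_frames:
        depth.append([d * depth_scale for (_, _, d) in row])     # height map
        luminance.append([s // max(w, 1) for (w, s, _) in row])  # mean brightness
        linewidth.append([w for (w, _, _) in row])               # width map
    return depth, luminance, linewidth

frames = [
    [(3, 600, 0.0), (4, 640, 0.2)],
    [(9, 900, 0.0), (3, 630, -0.1)],   # widened line: rough-surface cue
]
depth, luminance, linewidth = build_images(frames)
```

Note how the rough-surface cue (width 9 in the second row) is invisible in the depth map (displacement 0.0) but stands out in the line width map, which is exactly the separation the detection processing unit 225 exploits.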
 続いて、検出処理部225は、算出された深さ画像、輝度画像及び線幅画像を利用して、被検査体1の表面性状の検出処理を実施する(ステップS109)。これにより、検出処理部225は、被検査体1の表面に存在する各種の欠陥部位について、欠陥の種別及び有害度を特定することが可能となる。以上のような流れにより、被検査体1である機械加工品の表面に存在する各種の欠陥が検出されることとなる。 Subsequently, the detection processing unit 225 performs a surface property detection process of the inspection object 1 using the calculated depth image, luminance image, and line width image (step S109). As a result, the detection processing unit 225 can specify the type of defect and the degree of harmfulness for various types of defective parts existing on the surface of the inspection object 1. Through the flow as described above, various types of defects existing on the surface of the machined product that is the inspection object 1 are detected.
 以上、本実施形態に係る被検査体検査方法の流れの一例について、簡単に説明した。 Heretofore, an example of the flow of the inspection object inspection method according to this embodiment has been briefly described.
<演算処理装置200のハードウェア構成について>
 次に、図16を参照しながら、本実施形態に係る演算処理装置200のハードウェア構成について、詳細に説明する。図16は、本発明の実施形態に係る演算処理装置200のハードウェア構成を説明するためのブロック図である。
<Hardware Configuration of Arithmetic Processing Device 200>
Next, the hardware configuration of the arithmetic processing apparatus 200 according to the present embodiment will be described in detail with reference to FIG. FIG. 16 is a block diagram for explaining a hardware configuration of the arithmetic processing device 200 according to the embodiment of the present invention.
 演算処理装置200は、主に、CPU901と、ROM903と、RAM905と、を備える。また、演算処理装置200は、更に、バス907と、入力装置909と、出力装置911と、ストレージ装置913と、ドライブ915と、接続ポート917と、通信装置919とを備える。 The arithmetic processing apparatus 200 mainly includes a CPU 901, a ROM 903, and a RAM 905. The arithmetic processing device 200 further includes a bus 907, an input device 909, an output device 911, a storage device 913, a drive 915, a connection port 917, and a communication device 919.
 CPU901は、中心的な処理装置及び制御装置として機能し、ROM903、RAM905、ストレージ装置913、又はリムーバブル記録媒体921に記録された各種プログラムに従って、演算処理装置200内の動作全般又はその一部を制御する。ROM903は、CPU901が使用するプログラムや演算パラメータ等を記憶する。RAM905は、CPU901が使用するプログラムや、プログラムの実行において適宜変化するパラメータ等を一次記憶する。これらはCPUバス等の内部バスにより構成されるバス907により相互に接続されている。 The CPU 901 functions as a central processing unit and a control unit, and controls all or part of the operations in the arithmetic processing device 200 in accordance with various programs recorded in the ROM 903, the RAM 905, the storage device 913, or the removable recording medium 921. The ROM 903 stores programs, calculation parameters, and the like used by the CPU 901. The RAM 905 temporarily stores programs used by the CPU 901, parameters that change as appropriate during execution of those programs, and the like. These components are connected to each other by a bus 907 configured by an internal bus such as a CPU bus.
 バス907は、ブリッジを介して、PCI(Peripheral Component Interconnect/Interface)バスなどの外部バスに接続されている。 The bus 907 is connected to an external bus such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge.
 入力装置909は、例えば、マウス、キーボード、タッチパネル、ボタン、スイッチ及びレバーなどユーザが操作する操作手段である。また、入力装置909は、例えば、赤外線やその他の電波を利用したリモートコントロール手段(いわゆる、リモコン)であってもよいし、演算処理装置200の操作に対応したPDA等の外部接続機器923であってもよい。更に、入力装置909は、例えば、上記の操作手段を用いてユーザにより入力された情報に基づいて入力信号を生成し、CPU901に出力する入力制御回路などから構成されている。ユーザは、この入力装置909を操作することにより、演算処理装置200に対して各種のデータを入力したり処理動作を指示したりすることができる。 The input device 909 is operation means operated by the user, such as a mouse, a keyboard, a touch panel, buttons, switches, and levers. The input device 909 may also be, for example, remote control means (a so-called remote controller) using infrared rays or other radio waves, or an externally connected device 923, such as a PDA, that supports operation of the arithmetic processing device 200. Furthermore, the input device 909 includes, for example, an input control circuit that generates an input signal based on information input by the user using the above operation means and outputs the input signal to the CPU 901. By operating the input device 909, the user can input various data to the arithmetic processing device 200 and instruct it to perform processing operations.
 出力装置911は、取得した情報をユーザに対して視覚的又は聴覚的に通知することが可能な装置で構成される。このような装置として、CRTディスプレイ装置、液晶ディスプレイ装置、プラズマディスプレイ装置、ELディスプレイ装置及びランプなどの表示装置や、スピーカ及びヘッドホンなどの音声出力装置や、プリンタ装置、携帯電話、ファクシミリなどがある。出力装置911は、例えば、演算処理装置200が行った各種処理により得られた結果を出力する。具体的には、表示装置は、演算処理装置200が行った各種処理により得られた結果を、テキスト又はイメージで表示する。他方、音声出力装置は、再生された音声データや音響データ等からなるオーディオ信号をアナログ信号に変換して出力する。 The output device 911 is configured by a device that can notify the user of the acquired information visually or audibly. Such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices and lamps, audio output devices such as speakers and headphones, printer devices, mobile phones, and facsimiles. The output device 911 outputs results obtained by various processes performed by the arithmetic processing device 200, for example. Specifically, the display device displays the results obtained by various processes performed by the arithmetic processing device 200 as text or images. On the other hand, the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs the analog signal.
 ストレージ装置913は、演算処理装置200の記憶部の一例として構成されたデータ格納用の装置である。ストレージ装置913は、例えば、HDD(Hard Disk Drive)等の磁気記憶デバイス、半導体記憶デバイス、光記憶デバイス、又は光磁気記憶デバイス等により構成される。このストレージ装置913は、CPU901が実行するプログラムや各種データ、及び外部から取得した各種のデータなどを格納する。 The storage device 913 is a data storage device configured as an example of a storage unit of the arithmetic processing device 200. The storage device 913 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 913 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
 ドライブ915は、記録媒体用リーダライタであり、演算処理装置200に内蔵、あるいは外付けされる。ドライブ915は、装着されている磁気ディスク、光ディスク、光磁気ディスク、又は半導体メモリ等のリムーバブル記録媒体921に記録されている情報を読み出して、RAM905に出力する。また、ドライブ915は、装着されている磁気ディスク、光ディスク、光磁気ディスク、又は半導体メモリ等のリムーバブル記録媒体921に記録を書き込むことも可能である。リムーバブル記録媒体921は、例えば、CDメディア、DVDメディア、Blu-ray(登録商標)メディア等である。また、リムーバブル記録媒体921は、コンパクトフラッシュ(登録商標)(CompactFlash:CF)、フラッシュメモリ、又は、SDメモリカード(Secure Digital memory card)等であってもよい。また、リムーバブル記録媒体921は、例えば、非接触型ICチップを搭載したICカード(Integrated Circuit card)又は電子機器等であってもよい。 The drive 915 is a recording medium reader / writer, and is built in or externally attached to the arithmetic processing unit 200. The drive 915 reads information recorded on a removable recording medium 921 such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 905. The drive 915 can also write a record on a removable recording medium 921 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory. The removable recording medium 921 is, for example, a CD medium, a DVD medium, a Blu-ray (registered trademark) medium, or the like. Further, the removable recording medium 921 may be a compact flash (registered trademark) (CompactFlash: CF), a flash memory, an SD memory card (Secure Digital memory card), or the like. Further, the removable recording medium 921 may be, for example, an IC card (Integrated Circuit card) on which a non-contact IC chip is mounted, an electronic device, or the like.
 接続ポート917は、機器を演算処理装置200に直接接続するためのポートである。接続ポート917の一例として、USB(Universal Serial Bus)ポート、IEEE1394ポート、SCSI(Small Computer System Interface)ポート、RS-232Cポート、HDMI(登録商標)(High-Definition Multimedia Interface)ポート等がある。この接続ポート917に外部接続機器923を接続することで、演算処理装置200は、外部接続機器923から直接各種のデータを取得したり、外部接続機器923に各種のデータを提供したりする。 The connection port 917 is a port for directly connecting a device to the arithmetic processing device 200. Examples of the connection port 917 include a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, an RS-232C port, and an HDMI (registered trademark) (High-Definition Multimedia Interface) port. By connecting an externally connected device 923 to the connection port 917, the arithmetic processing device 200 can acquire various data directly from the externally connected device 923 and provide various data to it.
 通信装置919は、例えば、通信網925に接続するための通信デバイス等で構成された通信インターフェースである。通信装置919は、例えば、有線もしくは無線LAN(Local Area Network)、Bluetooth(登録商標)、又はWUSB(Wireless USB)用の通信カード等である。また、通信装置919は、光通信用のルータ、ADSL(Asymmetric Digital Subscriber Line)用のルータ、又は、各種通信用のモデム等であってもよい。この通信装置919は、例えば、インターネットや他の通信機器との間で、例えばTCP/IP等の所定のプロトコルに則して信号等を送受信することができる。また、通信装置919に接続される通信網925は、有線又は無線によって接続されたネットワーク等により構成され、例えば、インターネット、家庭内LAN、社内LAN、赤外線通信、ラジオ波通信又は衛星通信等であってもよい。 The communication device 919 is, for example, a communication interface configured by a communication device for connecting to the communication network 925. The communication device 919 is, for example, a communication card for a wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB). The communication device 919 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various kinds of communication. The communication device 919 can transmit and receive signals and the like to and from, for example, the Internet and other communication devices in accordance with a predetermined protocol such as TCP/IP. The communication network 925 connected to the communication device 919 is configured by a network or the like connected by wire or wirelessly, and may be, for example, the Internet, a home LAN, an in-house LAN, infrared communication, radio wave communication, or satellite communication.
 以上、本発明の実施形態に係る演算処理装置200の機能を実現可能なハードウェア構成の一例を示した。上記の各構成要素は、汎用的な部材を用いて構成されていてもよいし、各構成要素の機能に特化したハードウェアにより構成されていてもよい。従って、本実施形態を実施する時々の技術レベルに応じて、適宜、利用するハードウェア構成を変更することが可能である。 Heretofore, an example of the hardware configuration capable of realizing the function of the arithmetic processing device 200 according to the embodiment of the present invention has been shown. Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Therefore, it is possible to change the hardware configuration to be used as appropriate according to the technical level at the time of carrying out this embodiment.
 以下では、実施例を示しながら、本発明に係る表面性状検査装置及び表面性状検査方法について、具体的に説明する。なお、以下に示す実施例は、本発明に係る表面性状検査装置及び表面性状検査方法のあくまでも一例にすぎず、本発明に係る表面性状検査装置及び表面性状検査方法が下記の例に限定されるものではない。 Hereinafter, the surface property inspection apparatus and the surface property inspection method according to the present invention will be specifically described with reference to examples. Note that the examples shown below are merely examples of the surface property inspection apparatus and surface property inspection method according to the present invention, and the surface property inspection apparatus and surface property inspection method according to the present invention are not limited to the examples below.
 以下に示す実施例では、図2Bに示した構成を有する被検査体撮像装置100を有する表面性状検査装置10を利用し、被検査体の表面性状の検査を行った。 In the following example, the surface property inspection device 10 having the inspection object imaging device 100 having the configuration shown in FIG. 2B was used to inspect the surface property of the inspection object.
 本実施例では、被検査体1として、Feを主成分とする中間素材であるビレットを熱間成形した後の材料を研削することで製造された円盤状加工品を利用し、円盤状加工品の研削面(円盤の径方向に広がる面)を検査対象面とした。かかる円盤状加工品の直径は、860mmである。 In this example, a disk-shaped machined product manufactured by grinding a material obtained by hot-forming a billet, an intermediate material composed mainly of Fe, was used as the inspection object 1, and the ground surface of the disk-shaped machined product (the surface extending in the radial direction of the disk) was taken as the surface to be inspected. The diameter of this disk-shaped machined product is 860 mm.
 被検査体撮像装置100の照明装置として、赤色レーザ光を出射する赤色LDモジュールを利用し、かかるモジュールから照射される点状レーザ光(出力:100mW)をシリンドリカルレンズに入射させることで、線状のレーザ光Lを得た。かかる線状のレーザ光Lは、円盤状加工品の半径(430mm)に対応する程度の長さ(図2Bにおけるx軸方向の長さ)となるように制御されて、回転している円盤状加工品の研削面に照射された。なお、被検査体1の表面に照射される線状のレーザ光Lの線幅を、0.25mm(250μm)とした。 As the illumination device of the inspection object imaging device 100, a red LD module that emits red laser light was used, and the point-like laser beam (output: 100 mW) emitted from the module was made incident on a cylindrical lens to obtain the linear laser beam L. The linear laser beam L was controlled to have a length (the length in the x-axis direction in FIG. 2B) corresponding to the radius (430 mm) of the disk-shaped machined product, and was irradiated onto the ground surface of the rotating disk-shaped machined product. The line width of the linear laser beam L irradiated onto the surface of the inspection object 1 was set to 0.25 mm (250 μm).
 エリアカメラ103,105として、2048画素×2048画素のCCD(画素サイズ:5.5μm×5.5μm)が撮像素子として搭載された市販のカメラを利用した。かかる撮像素子のフレームレートは、200fpsである。また、かかるカメラに装着されたレンズの焦点距離は24mmであり、画角は26°である。撮像される画像の画素サイズは0.25mm×0.25mmであり、線状のレーザ光Lの線幅は、撮像画像上では、2~4画素の輝線の幅で撮像される。 As the area cameras 103 and 105, commercially available cameras equipped with a 2048 × 2048 pixel CCD (pixel size: 5.5 μm × 5.5 μm) as the image sensor were used. The frame rate of this image sensor is 200 fps. The focal length of the lens mounted on each camera is 24 mm, and the angle of view is 26°. The pixel size of the captured image is 0.25 mm × 0.25 mm, and the line width of the linear laser beam L appears in the captured image as a bright line 2 to 4 pixels wide.
 円盤状加工品が、径方向に直交し、かつ、研削面に平行な方向に回転している際に、PLGから出力される信号に同期して、2台のエリアカメラにより連続的に画像を取得した。具体的には、円盤状加工品が5msecの間で回転する毎に、撮像を行った。なお、照明装置101は、鉛直方向上方に設け、鉛直方向下向きに線状のレーザ光Lを照射し、2台のエリアカメラの設置角度φは、±45度とした。なお、画像の取りこみピッチを0.25mmとして、撮像素子の画素サイズに一致させた。 While the disk-shaped machined product was rotating in a direction orthogonal to the radial direction and parallel to the ground surface, images were acquired continuously by the two area cameras in synchronization with the signal output from the PLG. Specifically, an image was captured every 5 msec of rotation of the disk-shaped machined product. The illumination device 101 was installed above in the vertical direction and irradiated the linear laser beam L vertically downward, and the installation angles φ of the two area cameras were set to ±45 degrees. The image capture pitch was set to 0.25 mm to match the pixel size of the image sensor.
 上記のような被検査体撮像装置100により得られた光切断画像を、先だって説明したような構成を有する演算処理装置200により画像処理し、円盤状加工品の研削面の表面性状を検査した。なお、線幅画像の算出に際し、線幅閾値Th2は5画素に設定し、先だって言及したような、8bitフルスケールを利用した輝度値の割り当てを行った。 The light section images obtained by the inspection object imaging device 100 described above were subjected to image processing by the arithmetic processing device 200 having the configuration described earlier, and the surface properties of the ground surface of the disk-shaped machined product were inspected. In calculating the line width image, the line width threshold Th2 was set to 5 pixels, and luminance values were assigned using the 8-bit full scale as mentioned earlier.
 得られた光切断画像のうち、正常部に対応する部分の光切断画像、凹部に対応する部分の光切断画像、及び、地肌残存部に対応する部分の光切断画像を、図17にまとめて示した。図17から明らかなように、正常部に対応する光切断画像では、光切断線の線幅がほぼ一定のまま、ほぼ真っすぐな線分が撮像されており、凹部に対応する光切断画像では、光切断線の線幅はほぼ変わることなく、凹部に対応する位置に光切断線の屈曲が生じていることがわかる。一方、地肌残存部に対応する光切断画像では、正常部に対応する光切断画像と比較すると明らかなように、光切断線の線幅が拡大しているのが確認された。 Among the obtained light section images, the light section image of a portion corresponding to a normal part, the light section image of a portion corresponding to a concave part, and the light section image of a portion corresponding to a remaining raw surface part are shown together in FIG. 17. As is clear from FIG. 17, in the light section image corresponding to the normal part, a nearly straight line segment is captured with an almost constant line width; in the light section image corresponding to the concave part, the line width of the light section line remains almost unchanged, but the light section line bends at the position corresponding to the concave part. In contrast, in the light section image corresponding to the remaining raw surface part, the line width of the light section line was confirmed to be enlarged, as is clear from a comparison with the light section image corresponding to the normal part.
 それぞれのエリアカメラから得られた光切断画像を利用して、演算処理装置200により算出された輝度画像、深さ画像及び線幅画像を、まとめて図18に示した。なお、図18において、「エリアカメラ1」は、円盤状加工品の回転方向の下流側に設けたエリアカメラであり、「エリアカメラ2」は、円盤状加工品の回転方向の上流側に設けたエリアカメラである。 The luminance images, depth images, and line width images calculated by the arithmetic processing device 200 using the light section images obtained from the respective area cameras are shown together in FIG. 18. In FIG. 18, "area camera 1" is the area camera provided on the downstream side in the rotation direction of the disk-shaped machined product, and "area camera 2" is the area camera provided on the upstream side in the rotation direction of the disk-shaped machined product.
 図18に示した2種類の輝度画像に着目すると明らかなように、輝度画像では、埃や微小凹凸に起因する周辺のノイズからの反射光の影響を受け、欠陥候補部位となる白く見える部分が多数存在していることがわかる。かかる輝度画像を参照したのみでは、本発明で着目している地肌残存部を顕在化させることが困難であることがわかる。 As is clear from the two types of luminance images shown in FIG. 18, the luminance images contain many white-looking portions that become defect candidate sites, owing to the influence of light reflected from surrounding noise caused by dust and minute irregularities. It can be seen that it is difficult to reveal the remaining raw surface parts of interest in the present invention merely by referring to such luminance images.
 同様に、図18に示した2種類の深さ画像に注目した場合、深さ画像では、欠陥候補部位となる灰色の部分が多数存在し、かかる深さ画像を参照したのみでは、本発明で着目している地肌残存部を顕在化させることが困難であることがわかる。 Similarly, when attention is paid to the two types of depth images shown in FIG. 18, the depth images contain many gray portions that become defect candidate sites, and it can be seen that it is difficult to reveal the remaining raw surface parts of interest in the present invention merely by referring to such depth images.
 一方、図18に示した2種類の線幅画像では、周辺ノイズの影響はあるものの、地肌残存部とそれ以外の部分とで画像の状態が明確に異なっており、地肌残存部を顕在化させることが可能であることが明らかとなった。かかる図からも明らかなように、線幅画像を利用することで、地肌残存部と周辺ノイズとの識別性も高いことが確認できた。 In contrast, in the two types of line width images shown in FIG. 18, although there is some influence of surrounding noise, the state of the image clearly differs between the remaining raw surface parts and the other portions, and it became clear that the remaining raw surface parts can be revealed. As is also clear from this figure, it was confirmed that using the line width image provides high discriminability between the remaining raw surface parts and the surrounding noise.
 また、2つのエリアカメラから算出した2種類の線幅画像を比較した結果、エリアカメラ1からの光切断画像に基づく線幅画像の方が、エリアカメラ2からの光切断画像に基づく線幅画像よりも鮮明であることがわかる。かかる結果は、被検査体1とした円盤状加工品の地肌残存部には、表面の方向性があることを示唆している。このような場合であっても、線状のレーザ光Lを複数の方向から撮像することで、より確実に地肌残存部を顕在化することが可能であり、地肌残存部や錆を過小評価してしまう可能性を回避できることが確認できた。 As a result of comparing the two types of line width images calculated from the two area cameras, the line width image based on the light section images from area camera 1 was found to be clearer than the line width image based on the light section images from area camera 2. This result suggests that the remaining raw surface parts of the disk-shaped machined product used as the inspection object 1 have surface directionality. Even in such a case, by imaging the linear laser beam L from a plurality of directions, the remaining raw surface parts can be revealed more reliably, and it was confirmed that the possibility of underestimating remaining raw surface parts and rust can be avoided.
 以上、添付図面を参照しながら本発明の好適な実施形態について詳細に説明したが、本発明はかかる例に限定されない。本発明の属する技術の分野における通常の知識を有する者であれば、特許請求の範囲に記載された技術的思想の範疇内において、各種の変更例または修正例に想到し得ることは明らかであり、これらについても、当然に本発明の技術的範囲に属するものと了解される。 The preferred embodiments of the present invention have been described in detail above with reference to the accompanying drawings, but the present invention is not limited to such examples. It is obvious that a person having ordinary knowledge in the technical field to which the present invention pertains can come up with various changes or modifications within the scope of the technical idea described in the claims. Of course, it is understood that these also belong to the technical scope of the present invention.
  10  表面性状検査装置
 100  被検査体撮像装置
 101  照明装置
 103,105  エリアカメラ
 200  演算処理装置
 201  撮像制御部
 203  画像処理部
 205  表示制御部
 207  記憶部
 211  データ取得部
 213  縞画像フレーム生成部
 215  画像算出部
 217  光切断線処理部
 219  深さ画像算出部
 221  輝度画像算出部
 223  線幅画像算出部
 225  検出処理部
DESCRIPTION OF SYMBOLS 10 Surface texture inspection apparatus 100 Test object imaging device 101 Illumination device 103,105 Area camera 200 Arithmetic processing device 201 Imaging control part 203 Image processing part 205 Display control part 207 Storage part 211 Data acquisition part 213 Stripe image frame generation part 215 Image Calculation unit 217 Optical section line processing unit 219 Depth image calculation unit 221 Luminance image calculation unit 223 Line width image calculation unit 225 Detection processing unit

Claims (18)

  1.  A surface property inspection device whose inspection object is a machined product made from an intermediate material, the device comprising:
     an illumination device that irradiates the surface of the moving inspection object with a linear laser beam;
     an imaging device that images the surface irradiated with the linear laser beam, thereby generating a plurality of light-section images, which are captured images of the linear laser beam on the surface, along the movement direction of the inspection object;
     an image calculation unit that calculates, based on a fringe image frame in which light-section lines (line segments corresponding to the portions irradiated by the linear laser beam in each of the plurality of light-section images) are arranged in order along the movement direction, a depth image representing the unevenness of the surface of the inspection object, a luminance image representing the luminance distribution of the linear laser beam on the surface of the inspection object, and a line width image in which the distribution of the line width of the linear laser beam in the movement direction on the surface of the inspection object is mapped to luminance values; and
     a detection processing unit that detects the surface properties of the inspection object based on the calculated depth image, luminance image, and line width image,
     wherein the image calculation unit calculates the line width image by computing, for each light-section line in the fringe image frame, the difference between the line width at each position along the extension direction of that light-section line and a predetermined threshold line width, and assigning luminance values according to the magnitude of the calculated differences, and
     wherein the detection processing unit detects, based on the line width image, whether a residual base-surface portion of the intermediate material or rust is present on the surface of the inspection object.
  2.  The surface property inspection device according to claim 1, wherein the detection processing unit detects the residual base-surface portion and the rust based on whether the luminance value of the line width image is equal to or greater than a first threshold for detecting the residual base-surface portion and rust.
  3.  The surface property inspection device according to claim 1 or 2, wherein the image calculation unit calculates the centroid position of the light-section line in its line width direction along the movement direction of the inspection object, and calculates the depth image based on the displacement between the centroid position and a reference position, which is a position in the movement direction designated in advance for the light-section image.
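The centroid-displacement computation of claim 3 can be sketched as below. The array layout of `fringe_frame`, the intensity-weighted centroid, and the calibration factor `mm_per_pixel` are assumptions for illustration; the actual conversion from pixel displacement to depth depends on the optical geometry of the device.

```python
import numpy as np

def depth_image(fringe_frame, reference_row, mm_per_pixel=0.05):
    """Depth from light-section line displacement (claim 3 sketch).

    fringe_frame  : 3-D array (n_lines, height, width) of light-section
                    images; each image contains one bright laser line.
    reference_row : pre-designated reference position (row index) of the
                    laser line in the movement direction.
    mm_per_pixel  : hypothetical calibration factor from pixel
                    displacement to depth.
    """
    frame = np.asarray(fringe_frame, dtype=float)
    rows = np.arange(frame.shape[1])[None, :, None]
    total = frame.sum(axis=1)
    # Intensity-weighted centroid of the line in the width (row) direction,
    # computed for every column of every light-section image.
    centroid = (frame * rows).sum(axis=1) / np.where(total > 0, total, 1)
    # Displacement from the reference position maps to surface depth.
    return (centroid - reference_row) * mm_per_pixel

# Example: one light-section image, 5 rows x 2 columns, with the line at
# the reference row in column 0 and displaced by one row in column 1.
frame = np.zeros((1, 5, 2))
frame[0, 2, 0] = 1.0
frame[0, 3, 1] = 1.0
depth = depth_image(frame, reference_row=2, mm_per_pixel=1.0)
```

Column 0 yields zero depth and column 1 a depth proportional to the one-pixel displacement, matching the triangulation principle the claim describes.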
  4.  The surface property inspection device according to any one of claims 1 to 3, wherein the detection processing unit:
     identifies a defect site based on whether the luminance values of the depth image and the luminance image are equal to or greater than a second threshold for defect site identification;
     extracts, for the identified defect site, feature quantity information on the shape and luminance values of that defect site; and
     discriminates defects present on the surface of the inspection object based on the extracted feature quantity information.
  5.  The surface property inspection device according to claim 4, wherein the detection processing unit performs the defect site detection in the depth image and the luminance image only in regions other than those where the residual base-surface portion or the rust has been detected based on the line width image.
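The thresholding and masking logic of claims 4 and 5 can be sketched together. This is a reduced illustration: the patent does not specify the feature set, so `area` and `mean_luminance` are stand-in features chosen for the example, and real implementations would typically label connected components rather than treat pixels independently.

```python
import numpy as np

def detect_defects(depth_img, luminance_img, second_threshold, mask=None):
    """Defect-site identification sketch for claims 4 and 5.

    A pixel is treated as a candidate defect site when either the depth
    image or the luminance image meets the second threshold.  The optional
    mask excludes regions already flagged (from the line width image) as
    residual base surface or rust, as in claim 5.
    """
    depth = np.asarray(depth_img, dtype=float)
    lum = np.asarray(luminance_img, dtype=float)
    # Claim 4: identify defect sites by comparing both images against the
    # second threshold.
    candidate = (depth >= second_threshold) | (lum >= second_threshold)
    if mask is not None:
        # Claim 5: skip regions where residual base surface or rust was
        # already detected from the line width image.
        candidate &= ~np.asarray(mask, dtype=bool)
    # Illustrative feature quantities for the identified sites.
    features = {
        "area": int(candidate.sum()),
        "mean_luminance": float(lum[candidate].mean()) if candidate.any() else 0.0,
    }
    return candidate, features

# Example: a 2x2 region with one depth anomaly and one bright spot.
depth = [[0, 0], [10, 0]]
lum = [[0, 20], [0, 0]]
cand, feats = detect_defects(depth, lum, second_threshold=10)
```

Passing the residual-base-surface map from the line width image as `mask` would implement the exclusion of claim 5; the extracted features would then feed a defect classifier.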
  6.  The surface property inspection device according to any one of claims 1 to 5, comprising, as the imaging device, at least two imaging devices that image the surface irradiated with the linear laser beam from the upstream side and the downstream side of the movement direction, respectively.
  7.  A surface property inspection method whose inspection object is a machined product made from an intermediate material, the method comprising:
     a light-section image generation step of irradiating the surface of the moving inspection object with a linear laser beam from an illumination device and imaging the surface irradiated with the linear laser beam with an imaging device, thereby generating a plurality of light-section images, which are captured images of the linear laser beam on the surface, along the movement direction of the inspection object;
     an image calculation step of calculating, based on a fringe image frame in which light-section lines (line segments corresponding to the portions irradiated by the linear laser beam in each of the plurality of light-section images) are arranged in order along the movement direction, a depth image representing the unevenness of the surface of the inspection object, a luminance image representing the luminance distribution of the linear laser beam on the surface of the inspection object, and a line width image in which the distribution of the line width of the linear laser beam in the movement direction on the surface of the inspection object is mapped to luminance values; and
     a detection processing step of detecting the surface properties of the inspection object based on the calculated depth image, luminance image, and line width image,
     wherein the image calculation step calculates the line width image by computing, for each light-section line in the fringe image frame, the difference between the line width at each position along the extension direction of that light-section line and a predetermined threshold line width, and assigning luminance values according to the magnitude of the calculated differences, and
     wherein the detection processing step detects, based on the line width image, whether a residual base-surface portion of the intermediate material or rust is present on the surface of the inspection object.
  8.  The surface property inspection method according to claim 7, wherein the detection processing step detects the residual base-surface portion and the rust based on whether the luminance value of the line width image is equal to or greater than a first threshold for detecting the residual base-surface portion and rust.
  9.  The surface property inspection method according to claim 7 or 8, wherein the image calculation step calculates the centroid position of the light-section line in its line width direction along the movement direction of the inspection object, and calculates the depth image based on the displacement between the centroid position and a reference position, which is a position in the movement direction designated in advance for the light-section image.
  10.  The surface property inspection method according to any one of claims 7 to 9, wherein the detection processing step:
     identifies a defect site based on whether the luminance values of the depth image and the luminance image are equal to or greater than a second threshold for defect site identification;
     extracts, for the identified defect site, feature quantity information on the shape and luminance values of that defect site; and
     discriminates defects present on the surface of the inspection object based on the extracted feature quantity information.
  11.  The surface property inspection method according to claim 10, wherein the detection processing step performs the defect site detection in the depth image and the luminance image only in regions other than those where the residual base-surface portion or the rust has been detected based on the line width image.
  12.  The surface property inspection method according to any one of claims 7 to 11, using, as the imaging device, at least two imaging devices that image the surface irradiated with the linear laser beam from the upstream side and the downstream side of the movement direction, respectively.
  13.  A program that causes a computer, capable of communicating with each of an illumination device that irradiates the surface of a moving inspection object, which is a machined product made from an intermediate material, with a linear laser beam, and an imaging device that images the surface irradiated with the linear laser beam to generate a plurality of light-section images, which are captured images of the linear laser beam on the surface, along the movement direction of the inspection object, to realize:
     an image calculation function of calculating, based on a fringe image frame in which light-section lines (line segments corresponding to the portions irradiated by the linear laser beam in each of the generated plurality of light-section images) are arranged in order along the movement direction, a depth image representing the unevenness of the surface of the inspection object, a luminance image representing the luminance distribution of the linear laser beam on the surface of the inspection object, and a line width image in which the distribution of the line width of the linear laser beam in the movement direction on the surface of the inspection object is mapped to luminance values; and
     a detection processing function of detecting the surface properties of the inspection object based on the calculated depth image, luminance image, and line width image,
     wherein the image calculation function calculates the line width image by computing, for each light-section line in the fringe image frame, the difference between the line width at each position along the extension direction of that light-section line and a predetermined threshold line width, and assigning luminance values according to the magnitude of the calculated differences, and
     wherein the detection processing function detects, based on the line width image, whether a residual base-surface portion of the intermediate material or rust is present on the surface of the inspection object.
  14.  The program according to claim 13, wherein the detection processing function detects the residual base-surface portion and the rust based on whether the luminance value of the line width image is equal to or greater than a first threshold for detecting the residual base-surface portion and rust.
  15.  The program according to claim 13 or 14, wherein the image calculation function calculates the centroid position of the light-section line in its line width direction along the movement direction of the inspection object, and calculates the depth image based on the displacement between the centroid position and a reference position, which is a position in the movement direction designated in advance for the light-section image.
  16.  The program according to any one of claims 13 to 15, wherein the detection processing function:
     identifies a defect site based on whether the luminance values of the depth image and the luminance image are equal to or greater than a second threshold for defect site identification;
     extracts, for the identified defect site, feature quantity information on the shape and luminance values of that defect site; and
     discriminates defects present on the surface of the inspection object based on the extracted feature quantity information.
  17.  The program according to claim 16, wherein the detection processing function performs the defect site detection in the depth image and the luminance image only in regions other than those where the residual base-surface portion or the rust has been detected based on the line width image.
  18.  The program according to any one of claims 13 to 17, using, as the imaging device, at least two imaging devices that image the surface irradiated with the linear laser beam from the upstream side and the downstream side of the movement direction, respectively.
PCT/JP2018/008605 2018-03-06 2018-03-06 Surface property inspection device, surface property inspection method, and program WO2019171474A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/008605 WO2019171474A1 (en) 2018-03-06 2018-03-06 Surface property inspection device, surface property inspection method, and program


Publications (1)

Publication Number Publication Date
WO2019171474A1 true WO2019171474A1 (en) 2019-09-12

Family

ID=67845926

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/008605 WO2019171474A1 (en) 2018-03-06 2018-03-06 Surface property inspection device, surface property inspection method, and program

Country Status (1)

Country Link
WO (1) WO2019171474A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113211190A (en) * 2021-06-03 2021-08-06 洛阳迈锐网络科技有限公司 Numerical control machining center cutter damage and wear online detection device and detection method
CN114979460A (en) * 2021-02-26 2022-08-30 宝元数控股份有限公司 Method and device for capturing image of plate

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004003930A (en) * 2002-04-04 2004-01-08 Nippon Steel Corp Optical shape measuring device and optical shape measuring method
JP2005030812A (en) * 2003-07-08 2005-02-03 Nippon Steel Corp Method for inspecting surface of steel panel, inspection system, image processor, and computer program
JP2012159491A (en) * 2011-01-14 2012-08-23 Nippon Steel Corp Defect detecting device and defect detecting method
US20150116727A1 (en) * 2012-04-04 2015-04-30 Siemens Vai Metals Technologies Gmbh Method and device for measuring the flatness of a metal product
JP2018048979A (en) * 2016-09-23 2018-03-29 新日鐵住金株式会社 Surface property inspection apparatus, surface property insection method, and program




Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18908982; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 18908982; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: JP)