WO2022137579A1 - Inspection system and inspection method - Google Patents

Inspection system and inspection method

Info

Publication number
WO2022137579A1
Authority
WO
WIPO (PCT)
Prior art keywords
illumination
image
polarizing
polarized
light
Application number
PCT/JP2021/009554
Other languages
French (fr)
Japanese (ja)
Inventor
泰之 池田
Original Assignee
オムロン株式会社
Application filed by オムロン株式会社
Publication of WO2022137579A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/89 Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles

Definitions

  • This disclosure relates to an inspection system and an inspection method.
  • the normal of the surface of an object is estimated using a plurality of images obtained by taking multiple images while changing the direction of the light source.
  • the unevenness of the surface is inspected without being affected by the dirt on the surface of the object.
  • Patent Document 1 discloses an image processing device that acquires an image of the return light from an object each time the object is sequentially irradiated with each of P types of illumination light.
  • When the object is moving, the positions of the object in the plurality of images obtained by the multiple imaging differ from each other. Therefore, the above-mentioned conventional method cannot be applied to a moving object.
  • The present disclosure has been made in view of the above problems, and an object thereof is to provide an inspection system and an inspection method capable of inspecting a moving object using a plurality of images having different lighting conditions.
  • the inspection system includes a plurality of lighting units for illuminating an object, a polarizing camera in which unit regions including a plurality of polarizing elements are repeatedly arranged, and an inspection device.
  • the plurality of lighting units irradiate illumination light having different polarization states from each other.
  • the plurality of polarizing elements transmit light in different polarization directions from each other.
  • the polarizing camera outputs a plurality of polarized images corresponding to the plurality of polarizing elements by taking an image in a state where the plurality of lighting units are lit at the same time.
  • the inspection device inspects the object using a plurality of polarized images.
  • each of the plurality of polarized images is an image captured under illumination conditions mainly including the illumination condition of the illumination unit that irradiates the polarized light having the smallest angle with the polarization direction of the corresponding polarizing element.
  • the plurality of lighting units are arranged so that the azimuth angles around the optical axis of the polarizing camera are different from each other.
  • the plurality of lighting units include the first to Nth lighting units.
  • the plurality of polarizing elements include first to Nth polarizing elements.
  • N is an integer of 2 or more.
  • the polarization direction of the illumination light of the first to Nth illumination units is parallel to the polarization direction of the light transmitted through the first to Nth polarizing elements.
  • the inspection device generates a normal image showing the normal direction of the surface of the object from a plurality of polarized images, and inspects the object based on the normal image.
  • the polarized image corresponding to each of the first to Nth polarizing elements is imaged so that the light of the illuminating unit that irradiates the illumination light parallel to the polarization direction of the polarizing element is the strongest.
  • the illumination units having the greatest influence on the polarized image are different from each other in the plurality of polarized images.
  • the first to Nth illumination units are arranged so that the azimuth angles around the optical axis of the polarizing camera are different from each other. Therefore, a plurality of polarized images having different conditions regarding the azimuth angle of the illumination light can be obtained by one imaging.
  • a normal image is generated from the plurality of polarized images. By using the normal image, the unevenness of the surface of the object is inspected with high accuracy.
  • N is 4.
  • the first illumination unit and the third illumination unit are arranged at positions symmetrical with respect to the optical axis of the polarizing camera.
  • the second illumination unit and the fourth illumination unit are arranged at positions symmetrical with respect to the optical axis of the polarizing camera.
  • the difference between the first azimuth in which the first illumination unit is arranged and the second azimuth in which the second illumination unit is arranged is 90 °.
  • the plurality of polarized images include first to fourth polarized images corresponding to the first to fourth polarizing elements, respectively.
  • the inspection device generates a first normal image showing the magnitude of a component along the direction of the first azimuth in the normal vector of the surface of the object based on the first to fourth polarized images.
  • the inspection device generates a second normal image showing the magnitude of the component along the direction of the second azimuth in the normal vector of the surface of the object based on the first to fourth polarized images.
  • the inspection device generates a shape image showing the shape of the surface of the object based on the first normal image and the second normal image, and inspects the object based on the shape image.
  • A first normal image showing the magnitude of the component along the direction of the first azimuth and a second normal image showing the magnitude of the component along the direction of the second azimuth are generated.
  • the angles formed by the polarization directions of the illumination light of the second illumination unit, the third illumination unit, and the fourth illumination unit with the polarization direction of the illumination light of the first illumination unit are 45°, 90°, and 135°, respectively.
  • the first to fourth polarized images correspond to images captured under illumination conditions mainly including the illumination conditions of the first to fourth illumination units, respectively. The influence of the illumination light of the third illumination unit on the first polarized image can thereby be minimized. Similarly, the influence of the illumination light of the fourth, first, and second illumination units on the second, third, and fourth polarized images, respectively, can be minimized.
  • the plurality of lighting units are arranged so that the elevation angles with respect to the object are different from each other.
  • a plurality of polarized images having different elevation angles with respect to an object can be obtained by one imaging.
  • the plurality of illumination units include a first illumination unit that irradiates the illumination light along the optical axis of the polarizing camera, and a ring-shaped second illumination unit centered on the optical axis of the polarizing camera.
  • the plurality of polarizing elements include a first polarizing element and a second polarizing element. The polarization directions of the illumination light emitted from the first illumination unit and the second illumination unit coincide with the polarization directions of the light transmitted through the first and second polarizing elements, respectively.
  • the first polarized image corresponding to the first polarizing element shows the brightness of the light that is radiated from the first illumination unit and is specularly reflected on the surface of the object. That is, the first polarized image corresponds to an image obtained under bright-field illumination conditions. In an image obtained under bright-field illumination conditions, the brightness of the scratched portion is reduced. Therefore, by using the first polarized image, scratches can be inspected with high accuracy.
  • the second polarized image corresponding to the second polarizing element shows the brightness of the light emitted from the second illumination unit and diffusely reflected on the surface of the object. That is, the second polarized image corresponds to the image obtained under the conditions of dark field illumination. In the image obtained under the condition of dark field illumination, the brightness of the portion where the stain is present is different from the brightness of the surrounding portion. Therefore, by using the second polarized image, dirt can be inspected with high accuracy.
  • the first polarized image suitable for the inspection of scratches and the second polarized image suitable for the inspection of stains are acquired by one imaging.
  • the plurality of illumination units include a plurality of concentric ring-shaped illumination units centered on the optical axis of the polarizing camera.
  • the inspection device generates a phase image showing an angle formed by the normal direction of the surface of the object and the optical axis direction of the polarized camera based on a plurality of polarized images. Inspect the object based on the phase image.
  • Since the polarization states of the illumination light of the plurality of ring-shaped illumination units are different from each other, the amounts of light emitted from the plurality of ring-shaped illumination units, specularly reflected on the surface of the object, and transmitted through the respective polarizing elements are different from each other. Therefore, the plurality of polarized images show fringe patterns having different phases from each other. That is, a plurality of polarized images in which fringe patterns having different phases are captured can be obtained by one imaging.
  • the phase image generated from the plurality of polarized images indicates the normal direction of the surface of the object. Therefore, by using the phase image, the unevenness of the surface of the object can be inspected with high accuracy.
  • the plurality of illumination units further include an illumination unit that irradiates the illumination light along the optical axis of the polarizing camera.
  • the plurality of illumination units include a first illumination unit that irradiates the illumination light along the optical axis of the polarizing camera, and a second illumination unit that irradiates the illumination light from the back side of the object.
  • the plurality of polarizing elements include a first polarizing element and a second polarizing element. The polarization directions of the illumination light emitted from the first illumination unit and the second illumination unit coincide with the polarization directions of the light transmitted through the first and second polarizing elements, respectively.
  • the first polarized image corresponding to the first polarizing element shows the brightness of the light that is radiated from the first illumination unit and is specularly reflected on the surface of the object.
  • the first polarized image shows the brightness of the portion where the scratch is present. Therefore, by using the first polarized image, scratches can be inspected with high accuracy.
  • the second polarized image corresponding to the second polarizing element shows the brightness of the light emitted from the second illumination unit and incident on the polarizing camera. Therefore, when the object has a light-shielding property, the shape of the outer periphery of the object is clearly recognized in the second polarized image. Therefore, by using the second polarized image, burrs or chips on the outer periphery of the object can be inspected with high accuracy.
  • the first polarized image suitable for the inspection of scratches and the second polarized image suitable for the inspection of burrs or chips on the outer periphery of the object are acquired by one imaging.
  • the plurality of illumination units include a first illumination unit that irradiates linearly polarized illumination light and a second illumination unit that irradiates non-polarized illumination light.
  • the plurality of polarizing elements include first to third polarizing elements, and the first polarizing element transmits light in the same polarization direction as the polarization direction of the illumination light of the first illumination unit.
  • the second polarizing element transmits light in the polarization direction having an angle of 45 ° with the polarization direction of the light transmitted through the first polarizing element.
  • the third polarizing element transmits light in the polarization direction having an angle of 90 ° with the polarization direction of the light transmitted through the first polarizing element.
  • the plurality of polarized images include first to third polarized images corresponding to the first to third polarizing elements, respectively.
  • the inspection device synthesizes the first to third polarized images in a high dynamic range to generate a composite image, and inspects the object based on the composite image.
  • the non-polarized light emitted from the second illumination unit passes through the first to third polarizing elements uniformly.
  • the linearly polarized light emitted from the first illumination unit passes through the first polarizing element but does not pass through the third polarizing element.
  • the intensity of the light emitted from the first illumination unit and transmitted through the second polarizing element is 1/2 of the intensity of the light emitted from the first illumination unit and transmitted through the first polarizing element. Therefore, the first to third polarized images correspond to a plurality of images captured under illumination conditions in which the illumination intensities differ from each other. That is, first to third polarized images having different illumination intensities can be obtained by one imaging. The first to third polarized images are then combined into a high-dynamic-range composite image. By using the composite image, the inspection accuracy of the object is improved.
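  • As an illustrative sketch only (the composition algorithm is not spelled out here), the high-dynamic-range composition could weight each polarized image by a well-exposedness term and normalize it by its effective illumination gain; the gains, the weighting function, and all names below are assumptions, not the patented implementation.

```python
import numpy as np

def hdr_composite(polarized_images, relative_gains, white_level=255.0):
    """Merge polarized images captured through differently oriented polarizing
    elements, and therefore at different effective illumination intensities,
    into one high-dynamic-range composite image (a sketch, not the patent's method).

    polarized_images: list of 2-D arrays (e.g. the first to third polarized images).
    relative_gains:   assumed relative illumination intensity seen by each image,
                      e.g. [1.0, 0.5, 0.1] for polarizing elements at 0, 45 and 90
                      degrees to the linearly polarized first illumination unit.
    """
    acc = np.zeros(polarized_images[0].shape, dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for img, gain in zip(polarized_images, relative_gains):
        img = img.astype(np.float64)
        # Hat-shaped weight: favour mid-range pixels, discount under/over-exposed ones.
        w = 1.0 - np.abs(img / white_level - 0.5) * 2.0
        acc += w * (img / gain)   # bring each image to a common radiance scale
        weight_sum += w
    return acc / np.maximum(weight_sum, 1e-9)
```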
  • the inspection method uses a plurality of lighting units for illuminating an object and a polarizing camera in which unit regions including a plurality of polarizing elements are repeatedly arranged.
  • the plurality of lighting units irradiate illumination light having different polarization states from each other.
  • the plurality of polarizing elements transmit light in different polarization directions from each other.
  • the inspection method includes a step of acquiring a plurality of polarized images corresponding to the plurality of polarizing elements by imaging the object using the polarizing camera in a state where the plurality of lighting units are lit at the same time, and a step of inspecting the object using the plurality of polarized images.
  • a moving object can be inspected using a plurality of images having different lighting conditions.
  • A diagram showing the structure of the illumination unit 10e. A diagram showing an example of the plurality of polarized images obtained in the second inspection example. A diagram showing an example of the arrangement of the plurality of lighting units according to the third inspection example. A diagram showing the amount of light that is emitted from the illumination units 10e and 10g to 10i, specularly reflected by the object W, and transmitted through each polarizing element. A diagram showing another example of the arrangement of the plurality of lighting units according to the third inspection example.
  • A diagram showing an example of polarized images 50a to 50d obtained by using the plurality of illumination units shown in FIG. 28. A diagram showing the arrangement of the plurality of lighting units according to the fourth inspection example. A diagram showing an example of the plurality of polarized images obtained in the fourth inspection example. A diagram showing the arrangement of the plurality of lighting units according to the fifth inspection example. A diagram showing an example of the plurality of polarized images obtained in the fifth inspection example.
  • FIG. 1 is a schematic view showing the overall configuration of the inspection system 1 according to the present embodiment.
  • the inspection system 1 inspects the object W by using an image obtained by imaging the object W.
  • the object W has a glossy surface such as metal or glass, that is, a surface on which incident light is mainly specularly reflected.
  • the inspection system 1 is incorporated in, for example, a production line, and inspects the object W, which is moved by the transport belt 2, for defects.
  • Defects include scratches, unevenness, dirt, dust adhesion, burrs, chips, etc.
  • the lighting conditions that make the defective part stand out differ depending on the type of defect. Therefore, when it is desired to inspect a plurality of types of defects, a plurality of images captured under a plurality of lighting conditions are required.
  • a defective portion may be conspicuous in an image obtained by synthesizing a plurality of images captured under a plurality of illumination conditions. Even when it is desired to inspect such a defect, a plurality of images captured under a plurality of illumination conditions are required.
  • the inspection system 1 shown in FIG. 1 acquires a plurality of images corresponding to a plurality of lighting conditions for the moving object W by one imaging (one-shot imaging), and inspects the object W using the plurality of acquired images.
  • the inspection system 1 includes a plurality of lighting units 10 for illuminating the object W, a polarizing camera 20, and an inspection device 30.
  • the plurality of lighting units 10 irradiate illumination light having different polarization states from each other.
  • the inspection system 1 shown in FIG. 1 includes lighting units 10a, 10b, 10c, 10d, ....
  • When the illumination units 10a, 10b, 10c, 10d, ... are not particularly distinguished, each of them is described as the "illumination unit 10".
  • each of the plurality of illumination units 10 includes a light emitting unit that emits non-polarized light and a linear polarization filter arranged between the light emitting unit and the object W.
  • the polarization directions of the light transmitted through the linear polarization filter are different from each other in the plurality of illumination units 10. It should be noted that one of the plurality of illumination units 10 does not have to include a linear polarization filter. In this case, one lighting unit 10 irradiates the object W with non-polarized light.
  • the polarizing camera 20 has a plurality of photodetection sensors arranged in a matrix, and generates image data (hereinafter simply referred to as an "image") whose pixel values are the amounts of received light (luminance) detected by the photodetection sensors.
  • the sizes of the plurality of photodetection sensors are the same.
  • the photodetection sensor is, for example, a CCD (Charge-Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor.
  • a polarizing element is provided on the light incident side of each photodetector sensor.
  • FIG. 2 is a diagram showing an example of the arrangement of the polarizing elements of the polarizing camera 20.
  • the unit region 21 including the plurality of polarizing elements 22 is repeatedly arranged in the polarizing camera 20.
  • the plurality of unit regions 21 are arranged in a matrix.
  • the sizes of the plurality of polarizing elements 22 are the same.
  • the unit region 21 includes four polarizing elements 22a to 22d overlapping with four photodetector sensors as a plurality of polarizing elements 22.
  • When the polarizing elements 22a to 22d are not particularly distinguished, each of them is referred to as the "polarizing element 22".
  • Each of the polarizing elements 22a to 22d transmits linearly polarized light in a predetermined polarization direction.
  • the polarization angles of the polarizing elements 22a to 22d are 0°, 45°, 90°, and 135°, respectively.
  • the polarization angle may include manufacturing errors, mounting errors, and the like. For example, if the maximum sum of the manufacturing and mounting errors is Δ°, the "polarization angle of 45°" includes the range 45° ± Δ°.
  • the polarizing camera 20 outputs a plurality of polarized images 50 (see FIG. 1) corresponding to the plurality of polarizing elements 22 by taking an image in a state where the plurality of lighting units 10 are lit at the same time.
  • the plurality of polarized images 50 include polarized images 50a, 50b, 50c, 50d, ....
  • When the polarized images 50a, 50b, 50c, 50d, ... are not particularly distinguished, each of them is described as the "polarized image 50".
  • Each pixel value of the plurality of polarized images 50 indicates the brightness of the light transmitted through the corresponding polarizing element 22 among the plurality of polarizing elements 22.
  • the polarized images 50a, 50b, 50c, and 50d corresponding to the polarizing elements 22a to 22d are output.
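  • For reference, the sketch below shows one way the four polarized images could be separated from a single raw frame of such a polarization-mosaic sensor; the position of each polarization angle inside the 2 × 2 unit region is an assumption and may differ on an actual sensor.

```python
import numpy as np

def split_polarized_images(raw):
    """Split one raw frame of the polarizing camera 20 into the four polarized
    images 50a-50d (assumed 2x2 unit region layout; see FIG. 2)."""
    img_50a = raw[0::2, 0::2]   # assumed polarizing element 22a (0 deg)
    img_50b = raw[0::2, 1::2]   # assumed polarizing element 22b (45 deg)
    img_50c = raw[1::2, 1::2]   # assumed polarizing element 22c (90 deg)
    img_50d = raw[1::2, 0::2]   # assumed polarizing element 22d (135 deg)
    return img_50a, img_50b, img_50c, img_50d
```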
  • the inspection device 30 inspects the object W using a plurality of polarized images 50 received from the polarizing camera 20.
  • Each of the plurality of polarizing elements 22 transmits more illumination light from the illumination unit 10 that irradiates linearly polarized light having the smallest angle with the polarization direction of the polarizing element 22 among the plurality of illumination units 10.
  • A case will be described in which the polarization direction of the polarizing element 22a is parallel to the polarization direction of the illumination unit 10a, and the polarization direction of the polarizing element 22c is parallel to the polarization direction of the illumination unit 10c.
  • the polarization directions of the polarizing elements 22a and 22c are orthogonal to each other.
  • the light irradiated from the illumination unit 10a and specularly reflected on the surface of the object W passes through the polarizing element 22a but does not pass through the polarizing element 22c.
  • the light irradiated from the illumination unit 10c and specularly reflected on the surface of the object W does not pass through the polarizing element 22a, but passes through the polarizing element 22c. Therefore, in the polarized image 50a corresponding to the polarizing element 22a, the light of the illumination unit 10a is imaged so as to be the strongest. In the polarized image 50c corresponding to the polarizing element 22c, the light of the illumination unit 10c is imaged so as to be the strongest.
  • each of the plurality of polarized images 50 is imaged so that the light of the illumination unit 10 that irradiates the linearly polarized light having the smallest angle with the polarization direction of the corresponding polarizing element 22 becomes the strongest. That is, a plurality of polarized images 50 corresponding to a plurality of lighting conditions are acquired by one-shot imaging. Therefore, the moving object W can be inspected by using a plurality of polarized images 50 having different illumination conditions.
  • FIG. 3 is a schematic diagram showing the hardware configuration of the inspection device.
  • the inspection device 30 typically has a structure that follows a general-purpose computer architecture, and realizes various processes, as described later, by a processor executing a pre-installed program.
  • the inspection device 30 includes a processor 310 such as a CPU (Central Processing Unit) or an MPU (Micro-Processing Unit), a RAM (Random Access Memory) 312, a display controller 314, a system controller 316, an I/O (Input/Output) controller 318, a hard disk 320, a camera interface 322, an input interface 324, a communication interface 328, and a memory card interface 330. These parts are connected to each other so as to be capable of data communication, centering on the system controller 316.
  • the processor 310 realizes the target arithmetic processing by exchanging programs (codes) and the like with the system controller 316 and executing them in a predetermined order.
  • the system controller 316 is connected to the processor 310, the RAM 312, the display controller 314, and the I/O controller 318 via a bus, exchanges data with each part, and controls the processing of the entire inspection device 30.
  • the RAM 312 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and holds the program read from the hard disk 320, the polarized images 50 acquired by the polarizing camera 20, processing results for the polarized images 50, and so on.
  • the display controller 314 is connected to the display unit 302, and outputs signals for displaying various information to the display unit 302 according to an internal command from the system controller 316.
  • the display unit 302 includes, for example, a liquid crystal display, an organic EL (Electro Luminescence) display, and the like.
  • the I / O controller 318 controls data exchange with a recording medium or an external device connected to the inspection device 30. More specifically, the I / O controller 318 is connected to the hard disk 320, the camera interface 322, the input interface 324, the communication interface 328, and the memory card interface 330.
  • the hard disk 320 is typically a non-volatile magnetic storage device that stores the inspection program 350 executed by the processor 310.
  • the inspection program 350 installed on the hard disk 320 is distributed in a state of being stored in a memory card 306 or the like. Further, a camera image is stored in the hard disk 320.
  • Instead of the hard disk 320, a semiconductor storage device such as a flash memory or an optical storage device such as a DVD-RAM (Digital Versatile Disk Random Access Memory) may be adopted.
  • the camera interface 322 corresponds to an input unit that receives the polarized images 50 generated by imaging the object W, and mediates data transmission between the processor 310 and the polarizing camera 20. More specifically, the camera interface 322 can be connected to the polarizing camera 20, and an imaging instruction is output from the processor 310 to the polarizing camera 20 via the camera interface 322. As a result, the polarizing camera 20 takes an image of the object W and outputs the plurality of generated polarized images 50 to the processor 310 via the camera interface 322.
  • the input interface 324 mediates data transmission between the processor 310 and an input device such as a keyboard 304, a mouse, a touch panel, and a dedicated console. That is, the input interface 324 receives an operation command given by the user operating the input device.
  • the communication interface 328 mediates data transmission between the processor 310 and another personal computer, server device, or the like (not shown).
  • the communication interface 328 typically comprises Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
  • the program downloaded from a distribution server or the like may be installed in the inspection device 30 via the communication interface 328.
  • the memory card interface 330 mediates data transmission between the processor 310 and the memory card 306 which is a recording medium. That is, the memory card 306 is distributed in a state in which the inspection program 350 or the like executed by the inspection device 30 is stored, and the memory card interface 330 reads the inspection program 350 from the memory card 306.
  • the memory card 306 is, for example, a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, or an optical recording medium such as a CD-ROM (Compact Disk Read Only Memory).
  • the optimum lighting conditions differ depending on the type of defect to be inspected. Therefore, the positions, illumination intensities, polarization directions, and the like of the plurality of illumination units 10 are appropriately set according to the type of defect to be inspected.
  • the first to fifth inspection examples using the inspection system 1 according to the present embodiment will be described below.
  • the inspection system 1 is not limited to the following first to fifth inspection examples, and may be used for other inspection examples.
  • the photometric stereo method is a method of estimating the inclination (normal vector) of the surface of an object by using a plurality of images obtained under conditions where the incident direction of the illumination light is different.
  • FIG. 4 is a diagram showing an example of a conventional lighting device used in the photometric stereo method. As shown in FIG. 4, the four arc regions 110a to 110d of the ring-shaped lighting device 110 are sequentially turned on and imaged using a non-polarized camera. The center of the illuminating device 110 is located on the optical axis 225 of the unpolarized camera.
  • the first imaging is performed with only the arc region 110a lit.
  • the second imaging is performed with only the arc region 110b lit.
  • the third imaging is performed with only the arc region 110c lit.
  • the fourth imaging is performed with only the arc region 110d lit.
  • FIG. 5 is a diagram showing the arrangement of a plurality of lighting units according to the first inspection example. As shown in FIG. 5, the plurality of lighting units 10 include the lighting units 10a to 10d.
  • the illumination units 10a and 10c are arranged at positions symmetrical with respect to the optical axis 25 of the polarizing camera 20.
  • the illumination units 10b and 10d are arranged at positions symmetrical with respect to the optical axis 25 of the polarizing camera 20.
  • the difference between the azimuth angle in which the illumination unit 10a is arranged and the azimuth angle in which the illumination unit 10b is arranged is 90 °.
  • the direction of the orientation in which the illumination unit 10a is arranged is defined as the X direction with respect to the optical axis 25 of the polarizing camera 20, and the direction of the orientation in which the illumination unit 10b is arranged is defined as the Y direction.
  • the illumination units 10a to 10d have light emitting units 11a to 11d and linear polarizing filters 12a to 12d, respectively.
  • the light emitting units 11a to 11d emit unpolarized light.
  • Each of the light emitting portions 11a to 11d has an arc shape with a central angle of 90 °.
  • the arc-shaped light emitting portions 11a to 11d are arranged so that their centers coincide with each other and do not overlap with each other. Specifically, one end of the light emitting portions 11a to 11d is in contact with the other ends of the light emitting portions 11b to 11d and 11a, respectively. That is, one ring-shaped illumination is configured by combining the light emitting units 11a to 11d.
  • the light emitting units 11a to 11d are four arc regions having a central angle of 90 ° in one ring-shaped illumination.
  • the center of the arc-shaped light emitting portions 11a to 11d is located on the optical axis 25 of the polarizing camera 20.
  • the linear polarizing filters 12a to 12d are attached to the light emitting surfaces of the light emitting units 11a to 11d, respectively.
  • FIG. 6 is a diagram showing an example of the polarization direction of the linear polarization filters 12a to 12d.
  • the polarization angles of the linear polarization filters 12a to 12d are 0°, 45°, 90°, and 135°, respectively.
  • the polarization angle may include manufacturing errors, mounting errors, and the like. For example, if the maximum sum of the manufacturing and mounting errors is Δ°, the "polarization angle of 45°" includes the range 45° ± Δ°.
  • the angles formed by the polarization direction of the illumination light of the illumination units 10b to 10d and the polarization direction of the illumination light of the illumination unit 10a are 45 °, 90 °, and 135 °, respectively.
  • the inspection system 1 includes a polarizing camera 20 including the polarizing elements 22a to 22d shown in FIG. 2.
  • the polarization directions of the polarizing elements 22a to 22d are parallel to the polarization directions of the linear polarization filters 12a to 12d, respectively. Therefore, the light emitted from the illumination units 10a to 10d and specularly reflected by the object W passes through the polarizing elements 22a to 22d, respectively.
  • the light emitted from the illumination unit 10a has components parallel to the polarization directions of the polarizing elements 22b and 22d. Therefore, some of the components of the light that is irradiated from the illumination unit 10a and specularly reflected by the object W pass through the polarizing elements 22b and 22d.
  • the amount of light emitted from the illumination unit 10a and transmitted through each of the polarizing elements 22b and 22d is about 1/2 of the amount of light emitted from the illumination unit 10a and transmitted through the polarizing element 22a.
  • Some of the components of the light emitted from the illumination unit 10b and specularly reflected by the object W pass through the polarizing elements 22a and 22c.
  • Some of the components of the light emitted from the illumination unit 10c and specularly reflected by the object W pass through the polarizing elements 22b and 22d.
  • Some of the components of the light emitted from the illumination unit 10d and specularly reflected by the object W pass through the polarizing elements 22a and 22c.
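  • The factor of about 1/2 follows from Malus's law, a standard optics relation: linearly polarized light passing through a polarizing element whose transmission axis is rotated by Δ retains a fraction cos²Δ of its intensity, i.e. 1 at 0°, 1/2 at 45°, and 0 at 90°. A quick numerical check:

```python
import numpy as np

def transmitted_fraction(delta_deg):
    """Malus's law: intensity fraction of linearly polarized light transmitted
    through a polarizing element whose axis is rotated by delta_deg degrees."""
    return np.cos(np.deg2rad(delta_deg)) ** 2

# 0, 45, 90, 135 degrees -> 1.0, 0.5, 0.0, 0.5
print([round(transmitted_fraction(d), 3) for d in (0, 45, 90, 135)])
```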
  • FIG. 7 is a flowchart showing an example of the flow of the inspection process in the first inspection example.
  • the polarizing camera 20 takes an image of the object W in a state where the illumination units 10a to 10d are lit at the same time, and outputs polarized images 50a to 50d corresponding to the polarizing elements 22a to 22d (step S1).
  • After step S1, the inspection system 1 starts steps S2, S3, and S10 in parallel.
  • In step S2, the processor 310 of the inspection device 30 uses the polarized images 50a to 50d to generate an X-direction normal image showing the magnitude of the component along the X direction in the normal vector of the surface of the object W. Similarly, in step S3, the processor 310 uses the polarized images 50a to 50d to generate a Y-direction normal image showing the magnitude of the Y-direction component in the normal vector of the surface of the object W. Details of the X-direction normal image, the Y-direction normal image, and the method of generating them will be described later.
  • After steps S2 and S3, steps S4 and S5 are carried out, respectively.
  • In steps S2 and S3, as will be described later, the X-direction normal image and the Y-direction normal image are generated on the assumption that the glossiness σ of the surface of the object W is 1.
  • However, the glossiness σ of the surface of the object W is not always 1.
  • the ratio of the specularly reflected component of the incident light differs depending on the glossiness σ. Therefore, in step S4, the processor 310 corrects the X-direction normal image according to the glossiness σ of the surface of the object W.
  • In step S5, the processor 310 corrects the Y-direction normal image according to the glossiness σ of the surface of the object W. Details of the correction method for the X-direction normal image and the Y-direction normal image will be described later.
  • After steps S4 and S5, the processor 310 starts steps S6 and S8 in parallel.
  • In step S6, the processor 310 generates a shape image showing the shape of the surface of the object W based on the X-direction normal image and the Y-direction normal image. Details of the shape image generation method will be described later.
  • In step S7, the processor 310 inspects the surface of the object W based on the shape image. For example, the processor 310 inspects for defects having unevenness, such as scratches and dents.
  • the processor 310 may generate a binary image by binarizing the shape image and inspect the presence or absence of defects based on the binary image.
  • In step S8, the processor 310 generates an albedo image showing the ratio of specularly reflected light to incident light, that is, the albedo (reflectance).
  • the albedo image is generated using the polarized images 50a to 50d, and the X-direction normal image and the Y-direction normal image corrected by steps S4 and S5, respectively. Details of the albedo image generation method will be described later.
  • In step S9, the processor 310 inspects the surface of the object W based on the albedo image. For example, the processor 310 inspects for defects that cause changes in the albedo, such as dirt.
  • the processor 310 may generate a binary image by binarizing the albedo image and inspect the presence or absence of defects based on the binary image.
  • In step S10, the processor 310 generates an average image by averaging the polarized images 50a to 50d.
  • the value of each pixel of the average image is the average value of the values of the pixels of the polarized images 50a to 50d.
  • the average image corresponds to an image obtained by simultaneously lighting the light emitting units 11a to 11d with the linear polarization filters 12a to 12d removed and taking an image using a non-polarized camera instead of the polarized camera 20.
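  • A minimal sketch of step S10 (the function and variable names are placeholders, and four polarized images are assumed as in this inspection example):

```python
import numpy as np

def average_image(img_50a, img_50b, img_50c, img_50d):
    """Average image used for positioning (steps S10 and S11): each pixel is the
    mean of the corresponding pixels of the polarized images 50a to 50d."""
    stack = np.stack([img_50a, img_50b, img_50c, img_50d]).astype(np.float64)
    return stack.mean(axis=0)
```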
  • In step S11, the processor 310 performs positioning of the object W using the average image.
  • After the completion of steps S7, S9, and S11, the processor 310 causes the display unit 302 to display the inspection result (step S12). After step S12, the processor 310 ends the inspection process.
  • FIG. 8 is a diagram showing the relationship between the illumination units 10a to 10d and the normal vector n at the point P on the surface of the object W.
  • the optical axis 25 of the polarizing camera 20 coincides with the Z axis.
  • the normal vector n is represented by (nx, ny, nz).
  • nx, ny, and nz are the X component, the Y component, and the Z component of the normal vector n, respectively.
  • FIG. 9 is a diagram showing the relationship between the traveling direction and the normal direction when the traveling direction of the light emitted from the illumination unit 10c and the normal direction of the point P are projected onto the XZ plane.
  • the normal direction is represented by a vector (nx, nz). Assuming that the angle formed by the normal direction and the Z axis is θx, θx is expressed by the following equation.
  • θx = arctan(nx / nz)
  • FIG. 10 is a diagram showing the relationship between the traveling direction and the normal direction when the traveling direction of the light emitted from the illumination unit 10c and the normal direction of the point P are projected onto the YZ plane.
  • the normal direction is represented by a vector (ny, nz). Assuming that the angle formed by the normal direction and the Z axis is θy, θy is expressed by the following equation.
  • θy = arctan(ny / nz)
  • The angles formed by the Z axis and the projections onto the XZ plane of the traveling directions (specular reflection directions) of the light that is irradiated from the illumination units 10a, 10b, and 10d and specularly reflected at the point P are φ − 2θx, 2θx, and 2θx, respectively, where φ is the incident angle of the illumination light.
  • Similarly, the angles formed by the Z axis and the projections onto the YZ plane of the traveling directions of the light that is emitted from the illumination units 10a, 10b, and 10d and specularly reflected at the point P are 2θy, φ − 2θy, and φ + 2θy, respectively.
  • The luminance values Ia to Id of the light that is emitted from the illumination units 10a to 10d and specularly reflected at the point P are expressed by the following equations, where σ represents the glossiness of the surface of the object W and α represents the albedo (reflectance) of the surface of the object W:
  • Ia = α · cos^σ(φ − 2θx) · cos^σ(2θy)
  • Ib = α · cos^σ(2θx) · cos^σ(φ − 2θy)
  • Ic = α · cos^σ(φ + 2θx) · cos^σ(2θy)
  • Id = α · cos^σ(2θx) · cos^σ(φ + 2θy)
  • The pixel values Ja to Jd of the polarized images 50a to 50d at the pixel corresponding to the point P are represented by:
  • Ja = Ia + (Ib + Id) / 2
  • Jb = Ib + (Ia + Ic) / 2
  • Jc = Ic + (Id + Ib) / 2
  • Jd = Id + (Ic + Ia) / 2
  • Equation (4) is derived from the above equations (1) and (2).
  • equation (5) is derived from the above equations (1) and (3).
  • tan(2θx) = {4 (Ja − Jc)} / {(Ja + Jb + Jc + Jd) · tan(φ)} ... Equation (4)
  • tan(2θy) = {4 (Jb − Jd)} / {(Ja + Jb + Jc + Jd) · tan(φ)} ... Equation (5)
  • The incident angle φ is determined in advance according to the positions of the illumination units 10a to 10d. Therefore, the processor 310 calculates the right side of Equation (4) using the pixel values of the polarized images 50a to 50d and the incident angle φ, and generates an X-direction normal image having the result Nx as its pixel value. Similarly, the processor 310 calculates the right side of Equation (5) using the pixel values of the polarized images 50a to 50d and the incident angle φ, and generates a Y-direction normal image having the result Ny as its pixel value.
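  • A minimal numpy sketch of this step, assuming the polarized images 50a to 50d are given as float arrays of pixel values Ja to Jd and the incident angle φ is given in radians (the names are mine):

```python
import numpy as np

def normal_images(j_a, j_b, j_c, j_d, phi):
    """Compute the X- and Y-direction normal images Nx and Ny from the polarized
    images 50a-50d using Equations (4) and (5).

    j_a..j_d: polarized images as float arrays (pixel values Ja..Jd).
    phi:      incident angle of the illumination light, in radians.
    """
    total = j_a + j_b + j_c + j_d
    denom = np.maximum(total * np.tan(phi), 1e-9)   # avoid division by zero
    n_x = 4.0 * (j_a - j_c) / denom                 # Equation (4): tan(2*theta_x)
    n_y = 4.0 * (j_b - j_d) / denom                 # Equation (5): tan(2*theta_y)
    return n_x, n_y
```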
  • tan(2θx) (= Nx) represented by Equation (4) is a value depending on nx, which is the X-direction component of the normal vector.
  • tan(2θy) (= Ny) represented by Equation (5) is a value depending on ny, which is the Y-direction component of the normal vector. Therefore, the X-direction normal image and the Y-direction normal image represent the normal direction of the surface of the object W.
  • the X-direction normal image and the Y-direction normal image are generated according to Equations (4) and (5), respectively, assuming that the glossiness σ of the surface of the object W is 1. Therefore, the correction of the X-direction normal image and the Y-direction normal image is executed according to the glossiness σ of the object W. As a result, a normal image corresponding to the glossiness σ is generated with high accuracy.
  • the glossiness σ of the object W is determined in advance according to the following pre-inspection procedure.
  • the defect-free object W is placed on the 2-axis goniometer stage.
  • the biaxial goniometer stage is a stage that can be tilted in the X and Y directions.
  • The object W, which has a horizontal and flat upper surface (hereinafter referred to as the "target area"), is placed on the biaxial goniometer stage. Therefore, the angle θx formed by the X-direction component of the normal vector of the target area and the Z axis coincides with the tilt angle of the biaxial goniometer stage in the X direction. Similarly, the angle θy formed by the Y-direction component of the normal vector of the target area and the Z axis coincides with the tilt angle of the biaxial goniometer stage in the Y direction.
  • An image is taken M times using the inspection system 1 while the tilt of the biaxial goniometer stage is changed.
  • An X-direction normal image and a Y-direction normal image are generated for each of the M times of imaging.
  • the processor 310 stores the data set [θxm, θym, Nxm, Nym] in the RAM 312 for any one pixel in the target area in the X-direction normal image and the Y-direction normal image.
  • m indicates an imaging number and is an integer of 1 to M.
  • θxm is the tilt angle of the biaxial goniometer stage in the X direction in the mth imaging.
  • θym is the tilt angle of the biaxial goniometer stage in the Y direction in the mth imaging.
  • Nxm is a pixel value of an X-direction normal image.
  • Nym is a pixel value of a Y-direction normal image.
  • the processor 310 uses the following theoretical formula to determine the glossiness σ that best fits the data sets [θxm, θym, Nxm, Nym] stored in the RAM 312.
  • <Theoretical formula>
  • Ia = cos^σ(φ − 2θxm) · cos^σ(2θym)
  • Ib = cos^σ(2θxm) · cos^σ(φ − 2θym)
  • Ic = cos^σ(φ + 2θxm) · cos^σ(2θym)
  • Id = cos^σ(2θxm) · cos^σ(φ + 2θym)
  • the processor 310 uses a nonlinear least-squares method to determine the glossiness σ. By such a pre-inspection procedure, the glossiness σ of the object W is determined in advance.
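  • One way such a fit could be carried out, sketched under the assumption that scipy is available (the description above only states that a nonlinear least-squares method is used): predict Nxm and Nym from (θxm, θym) with the theoretical formula and Equations (4) and (5) for a trial σ, then minimize the residuals against the measured values.

```python
import numpy as np
from scipy.optimize import least_squares

def predict_nx_ny(theta_x, theta_y, sigma, phi):
    """Theoretical Nx, Ny for given surface tilt, glossiness sigma and incident
    angle phi; the albedo cancels out of Equations (4) and (5). Assumes tilt
    angles small enough that every cosine stays positive."""
    i_a = np.cos(phi - 2 * theta_x) ** sigma * np.cos(2 * theta_y) ** sigma
    i_b = np.cos(2 * theta_x) ** sigma * np.cos(phi - 2 * theta_y) ** sigma
    i_c = np.cos(phi + 2 * theta_x) ** sigma * np.cos(2 * theta_y) ** sigma
    i_d = np.cos(2 * theta_x) ** sigma * np.cos(phi + 2 * theta_y) ** sigma
    j_a, j_b = i_a + (i_b + i_d) / 2, i_b + (i_a + i_c) / 2
    j_c, j_d = i_c + (i_b + i_d) / 2, i_d + (i_a + i_c) / 2
    total = j_a + j_b + j_c + j_d
    return (4 * (j_a - j_c) / (total * np.tan(phi)),
            4 * (j_b - j_d) / (total * np.tan(phi)))

def fit_glossiness(theta_xm, theta_ym, nxm, nym, phi, sigma0=1.0):
    """Glossiness sigma that best fits the M data sets [theta_xm, theta_ym, Nxm, Nym]."""
    def residual(params):
        nx_pred, ny_pred = predict_nx_ny(theta_xm, theta_ym, params[0], phi)
        return np.concatenate([nx_pred - nxm, ny_pred - nym])
    return least_squares(residual, x0=[sigma0], bounds=(0.1, 100.0)).x[0]
```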
  • The relationship between the angles θx and θy, which the X- and Y-direction components of the normal vector form with the Z axis, and Ny, which is the calculation result of the right side of Equation (5), shows a monotonic change. That is, as θy increases, Ny also increases. Further, when θy > 0, Ny decreases as θx increases, and when θy < 0, Ny increases as θx increases. However, when the glossiness σ is close to 1, the relationship between Ny and θy is linear, whereas when the glossiness σ is large, the relationship between Ny and θy becomes non-linear. Therefore, the processor 310 corrects the normal image according to the glossiness σ.
  • the processor 310 back-calculates (θx, θy) from (Nx, Ny) using the above theoretical formula into which the predetermined glossiness σ is substituted.
  • the processor 310 corrects the X-direction normal image by replacing each pixel value Nx in the X-direction normal image with θx.
  • the processor 310 corrects the Y-direction normal image by replacing each pixel value Ny in the Y-direction normal image with θy.
  • Alternatively, the processor 310 may select one look-up table corresponding to the glossiness σ from a group of look-up tables created in advance, and use the selected look-up table to convert (Nx, Ny) into (θx, θy).
  • each look-up table is created in advance using the above theoretical formula into which the corresponding glossiness σ is substituted. By using the look-up table, the time required for the correction process of the normal images is shortened.
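  • A sketch of one possible look-up-table construction and use, assuming the table is built by sampling (θx, θy) on a grid with the theoretical model (predict_nx_ny from the previous sketch) and inverted by nearest-neighbour search; the grid range, resolution, and search strategy are assumptions.

```python
import numpy as np

def build_lut(sigma, phi, max_angle=np.deg2rad(20), steps=201):
    """Precompute (Nx, Ny) over a grid of (theta_x, theta_y) for one glossiness."""
    angles = np.linspace(-max_angle, max_angle, steps)
    tx, ty = np.meshgrid(angles, angles, indexing="ij")
    nx, ny = predict_nx_ny(tx, ty, sigma, phi)   # from the previous sketch
    return nx.ravel(), ny.ravel(), tx.ravel(), ty.ravel()

def correct_with_lut(nx_img, ny_img, lut):
    """Replace each (Nx, Ny) pixel pair by the (theta_x, theta_y) of the nearest
    grid point in the look-up table (brute-force nearest neighbour)."""
    lut_nx, lut_ny, lut_tx, lut_ty = lut
    flat_nx, flat_ny = nx_img.ravel(), ny_img.ravel()
    theta_x = np.empty_like(flat_nx)
    theta_y = np.empty_like(flat_ny)
    for i, (vx, vy) in enumerate(zip(flat_nx, flat_ny)):
        k = np.argmin((lut_nx - vx) ** 2 + (lut_ny - vy) ** 2)
        theta_x[i], theta_y[i] = lut_tx[k], lut_ty[k]
    return theta_x.reshape(nx_img.shape), theta_y.reshape(ny_img.shape)
```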
  • the relationship between the albedo α and the angles θx and θy shows a monotonic change.
  • the processor 310 calculates the albedo α for each pixel from the pixel values θx and θy of the X-direction normal image and the Y-direction normal image corrected according to the glossiness σ and from the pixel values Ja to Jd of the polarized images 50a to 50d, and generates the albedo image having α as its pixel value.
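  • The exact formula is not given here, but one inversion consistent with the model above is sketched below: with the corrected θx and θy fixed, the theoretical sum Ja + Jb + Jc + Jd is proportional to the albedo α, so α can be taken as the ratio of the measured sum to the model sum evaluated with α = 1 (this particular inversion is an assumption).

```python
import numpy as np

def albedo_image(j_a, j_b, j_c, j_d, theta_x, theta_y, sigma, phi):
    """Estimate the albedo (reflectance) per pixel by comparing the measured sum
    Ja+Jb+Jc+Jd with the value the theoretical model predicts for albedo 1."""
    cx_m = np.cos(phi - 2 * theta_x) ** sigma
    cx_p = np.cos(phi + 2 * theta_x) ** sigma
    cy_m = np.cos(phi - 2 * theta_y) ** sigma
    cy_p = np.cos(phi + 2 * theta_y) ** sigma
    cx0 = np.cos(2 * theta_x) ** sigma
    cy0 = np.cos(2 * theta_y) ** sigma
    # Ja+Jb+Jc+Jd equals twice the sum Ia+Ib+Ic+Id, here evaluated for albedo 1.
    model_sum = 2.0 * ((cx_m + cx_p) * cy0 + (cy_m + cy_p) * cx0)
    return (j_a + j_b + j_c + j_d) / np.maximum(model_sum, 1e-9)
```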
  • the shape image is generated based on the X-direction normal image and the Y-direction normal image corrected according to the glossiness σ.
  • FIG. 17 is a diagram showing a process executed for an X-direction normal image for generating a shape image.
  • the pixel value of the X-direction normal image indicates the magnitude of the X-direction component in the normal vector on the surface of the object W.
  • the processor 310 selects one point surrounded by four pixels in the X-direction normal image as the point of interest Q.
  • the lateral direction of the X-direction normal image corresponds to the X-direction. Therefore, as shown in FIG. 17, the processor 310 sets rectangular regions R1 and R2 on the left side and the right side of the point of interest Q, respectively.
  • the height of the rectangular regions R1 and R2 is a predetermined length L, and the width of the rectangular regions R1 and R2 is L / 2.
  • the processor 310 calculates the sum Sx1 of the values of the pixels included in the rectangular area R1. Similarly, the processor 310 calculates the sum Sx2 of the values of the pixels included in the rectangular region R2. The processor 310 calculates the difference Sx between the sum Sx1 and the sum Sx2. The processor 310 calculates the difference Sx for all the points surrounded by the four pixels in the X-direction normal image.
  • FIG. 18 is a diagram showing a process executed for a Y-direction normal image for generating a shape image.
  • the pixel value of the Y-direction normal image indicates the magnitude of the Y-direction component of the normal vector on the surface of the object W.
  • the processor 310 selects one point surrounded by four pixels in the Y-direction normal image as the point of interest Q.
  • the vertical direction of the Y-direction normal image corresponds to the Y-direction. Therefore, as shown in FIG. 18, the processor 310 sets rectangular regions R3 and R4 on the upper side and the lower side of the point of interest Q, respectively.
  • the width of the rectangular regions R3 and R4 is L, and the height of the rectangular regions R3 and R4 is L / 2.
  • the processor 310 calculates the sum Sy1 of the values of the pixels included in the rectangular region R3. Similarly, the processor 310 calculates the sum Sy2 of the values of the pixels included in the rectangular region R4. The processor 310 calculates the difference Sy between the sum Sy1 and the sum Sy2. The processor 310 calculates the difference Sy for all the points surrounded by four pixels in the Y-direction normal image.
  • When a defect such as a scratch or a dent is present on the surface of the object W, the normal vector changes significantly at the defect.
  • the difference Sx increases as the X-direction component of the normal vector changes significantly.
  • the difference Sy increases as the Y-direction component of the normal vector changes significantly. That is, the differences Sx and Sy take large values at a place where a defect having unevenness, such as a scratch or a dent, is present. Therefore, the processor 310 generates a shape image whose pixel value S is obtained by the following equation.
  • S = A · (Sx + Sy) + B
  • A is a parameter for determining the contrast of the shape image.
  • B is a parameter for determining the level of the overall pixel value of the shape image.
  • the processor 310 sets the values of the parameters A and B so that the maximum and minimum of the pixel values in the shape image and the difference between the maximum and the minimum are within the specified range.
  • the defined range is predetermined according to the contrast and level suitable for defect inspection. As a result, a shape image suitable for inspection of defects having irregularities is generated.
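  • A sketch of this computation using an integral image so that Sx and Sy can be evaluated everywhere; for simplicity it evaluates at pixel positions rather than at the corner points between four pixels, and the default L, A, B values and border handling are assumptions.

```python
import numpy as np

def _box_sum(integral, top, left, bottom, right):
    """Sum of values in the half-open box [top:bottom, left:right]."""
    return (integral[bottom, right] - integral[top, right]
            - integral[bottom, left] + integral[top, left])

def shape_image(normal_x, normal_y, L=8, A=1.0, B=128.0):
    """Shape image S = A*(Sx + Sy) + B built from the X- and Y-direction normal
    images, following the rectangular regions of FIG. 17 and FIG. 18."""
    h, w = normal_x.shape
    ix = np.pad(normal_x, ((1, 0), (1, 0))).cumsum(axis=0).cumsum(axis=1)
    iy = np.pad(normal_y, ((1, 0), (1, 0))).cumsum(axis=0).cumsum(axis=1)
    half = L // 2
    s = np.full((h, w), B, dtype=np.float64)
    for r in range(half, h - half):
        for c in range(half, w - half):
            # R1 (left of Q) and R2 (right of Q): height L, width L/2 each.
            sx1 = _box_sum(ix, r - half, c - half, r + half, c)
            sx2 = _box_sum(ix, r - half, c, r + half, c + half)
            # R3 (above Q) and R4 (below Q): width L, height L/2 each.
            sy1 = _box_sum(iy, r - half, c - half, r, c + half)
            sy2 = _box_sum(iy, r, c - half, r + half, c + half)
            s[r, c] = A * ((sx1 - sx2) + (sy1 - sy2)) + B
    return s
```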
  • FIG. 19 is a diagram showing a polarized image obtained by imaging a button battery and various images generated from the polarized image.
  • the processor 310 generates an X-direction normal image 51 and a Y-direction normal image 52 from the polarized images 50a to 50d.
  • the processor 310 generates the shape image 53 based on the X-direction normal image 51 and the Y-direction normal image 52.
  • the processor 310 generates the binary image 54 by binarizing the shape image 53. Further, the processor 310 generates an albedo image 55 and an average image 56 from the polarized images 50a to 50d.
  • As shown in FIG. 19, there are dents in the area surrounded by the frame line F1 on the surface of the button battery. Further, there is dirt in the area surrounded by the frame line F2 on the surface of the button battery.
  • In the shape image 53, a change in the pixel value corresponding to the dented portion is observed within the frame line F1.
  • In the binary image 54, the dented portion is represented by white.
  • By using the shape image 53 or the binary image 54, defects having unevenness such as scratches or dents are inspected with high accuracy.
  • In the X-direction normal image 51 and the Y-direction normal image 52, the pixel values differ between the dented portion and its surroundings. Therefore, defects having unevenness can also be inspected using the X-direction normal image 51 and the Y-direction normal image 52.
  • In the albedo image 55, a change in the pixel value corresponding to the stain is observed within the frame line F2. This is because the albedo (reflectance) changes where dirt is present. In this way, by using the albedo image 55, defects that cause changes in the albedo (reflectance), such as stains, are inspected with high accuracy.
  • FIG. 20 is a diagram showing a polarized image obtained by imaging the side surface of a dry cell and various images generated from the polarized image.
  • FIG. 20 shows the polarized images 50a to 50d; the X-direction normal image 51, the Y-direction normal image 52, and the average image 56 generated from the polarized images 50a to 50d; and the shape image 53 and the binary image 54 generated from the X-direction normal image 51 and the Y-direction normal image 52.
  • In the shape image 53, a change in the pixel value corresponding to the dented portion is observed within the frame line F1.
  • In the binary image 54, the dented portion is represented by white.
  • FIG. 21 is a diagram showing a polarized image obtained by imaging the bottom surface of a dry cell and various images generated from the polarized image.
  • FIG. 21 shows the polarized images 50a to 50d; the X-direction normal image 51, the Y-direction normal image 52, and the average image 56 generated from the polarized images 50a to 50d; and the shape image 53 generated from the X-direction normal image 51 and the Y-direction normal image 52.
  • In the shape image 53, a change in the pixel value corresponding to the dented portion is observed within the frame line F1.
  • defects with irregularities such as scratches or dents can be inspected with high accuracy.
  • FIG. 22 is a diagram showing a polarized image obtained by imaging the textured resin surface and various images generated from the polarized image.
  • FIG. 22 shows the polarized images 50a to 50d; the X-direction normal image 51, the Y-direction normal image 52, and the average image 56 generated from the polarized images 50a to 50d; and the shape image 53 and the binary image 54 generated from the X-direction normal image 51 and the Y-direction normal image 52.
  • dents are present in the region surrounded by the frame line F1 on the resin surface. Further, there is dirt in the region surrounded by the frame line F2 on the resin surface.
  • As shown in FIG. 22, in the shape image 53, a change in the pixel value corresponding to the dented portion is observed within the frame line F1. In the binary image 54, the dented portion is represented by white. Thereby, by using the shape image 53 or the binary image 54, defects having unevenness such as scratches or dents are inspected with high accuracy.
  • Bright-field illumination is an illumination method in which specularly reflected light on the surface of the object W is incident on the polarizing camera 20.
  • the dark field illumination is an illumination method in which the specularly reflected light on the surface of the object W does not enter the polarized camera 20 and only the diffuse reflected light is incident on the polarized camera 20.
  • Light is difficult to specularly reflect in areas with scratches. Therefore, in the image obtained under the condition of bright field illumination, the brightness of the portion where the scratch is present is lowered. As a result, scratches can be inspected with high accuracy by using the image obtained under the condition of bright field illumination.
  • FIG. 23 is a diagram showing the arrangement of a plurality of lighting units according to the second inspection example.
  • The plurality of lighting units 10 include the lighting units 10e and 10f.
  • The illumination units 10e and 10f are arranged so that their elevation angles with respect to the object W are different from each other.
  • The lighting unit 10f is a ring-shaped lighting unit centered on the optical axis 25 of the polarizing camera 20.
  • The illumination unit 10f has a ring-shaped light emitting unit 11f centered on the optical axis 25 of the polarizing camera 20, and a linear polarizing filter 12f attached to the light emitting surface of the light emitting unit 11f.
  • The elevation angle of the illumination unit 10f with respect to the object W is set so that light specularly reflected by the object W does not enter the polarizing camera 20.
  • FIG. 24 is a diagram showing the configuration of the lighting unit 10e.
  • The illumination unit 10e is a coaxial illumination that irradiates the object W with illumination light along the optical axis 25 of the polarizing camera 20.
  • The illumination unit 10e includes a light emitting unit 11e, a linear polarization filter 12e, and a half mirror 13e.
  • The linear polarization filter 12e is arranged between the light emitting surface of the light emitting unit 11e and the half mirror 13e, and transmits, out of the unpolarized light emitted from the light emitting unit 11e, linearly polarized light along a predetermined direction.
  • The half mirror 13e directs the linearly polarized light transmitted through the linear polarization filter 12e along the optical axis 25 of the polarizing camera 20.
  • The linear polarization filters 12e and 12f are arranged so that the polarization directions of the linearly polarized light irradiated onto the object W from the illumination units 10e and 10f are orthogonal to each other.
  • The inspection system 1 includes a polarizing camera 20 including the polarizing elements 22a to 22d shown in FIG. 2.
  • The polarization direction of the polarizing element 22a coincides with (is parallel to) the polarization direction of the linearly polarized light irradiated onto the object W from the illumination unit 10e.
  • The polarization direction of the polarizing element 22c coincides with (is parallel to) the polarization direction of the linearly polarized light emitted from the illumination unit 10f. Therefore, the light irradiated from the illumination unit 10e and specularly reflected by the object W passes through the polarizing element 22a and does not pass through the polarizing element 22c.
  • A part of the light emitted from the illumination unit 10f and diffusely reflected by the object W passes through the polarizing element 22c.
  • The light emitted from the illumination unit 10f and specularly reflected by the object W does not enter the polarizing camera 20.
  • FIG. 25 is a diagram showing an example of a plurality of polarized images obtained in the second inspection example.
  • The polarized images 50a to 50d correspond to the polarizing elements 22a to 22d, respectively.
  • The polarized image 50a shows the brightness of the light emitted from the illumination unit 10e and specularly reflected by the object W. That is, the polarized image 50a corresponds to an image captured under bright-field illumination conditions. Therefore, scratches can be easily confirmed within the frame line F1 of the polarized image 50a. In this way, by using the polarized image 50a, scratches can be inspected with high accuracy.
  • The polarized image 50c shows the brightness of the light that is irradiated from the illumination unit 10f, diffusely reflected by the object W, and transmitted through the polarizing element 22c. That is, the polarized image 50c corresponds to an image captured under dark-field illumination conditions. Therefore, dirt can be easily confirmed within the frame line F2 of the polarized image 50c. In this way, by using the polarized image 50c, stains can be inspected with high accuracy.
  • The polarized images 50b and 50d show a brightness intermediate between those of the polarized images 50a and 50c. Therefore, the polarized images 50b and 50d need not be used for inspecting the object W.
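As a rough illustration of how the bright-field image 50a and the dark-field image 50c could be used together, the following sketch thresholds the two images to obtain candidate scratch and stain masks. The threshold values and the function name are hypothetical; the text only states that scratches appear dark under bright-field illumination and that stains stand out under dark-field illumination.

```python
import numpy as np

def detect_defects(img_50a, img_50c, scratch_thr=60, stain_thr=180):
    """Candidate defect masks from one simultaneous capture.

    img_50a: bright-field polarized image (scratches appear dark).
    img_50c: dark-field polarized image (stains appear bright).
    The threshold values are assumptions.
    """
    bright_field = np.asarray(img_50a, dtype=np.float64)
    dark_field = np.asarray(img_50c, dtype=np.float64)
    scratch_mask = bright_field < scratch_thr   # low brightness under bright-field illumination
    stain_mask = dark_field > stain_thr         # high brightness under dark-field illumination
    return scratch_mask, stain_mask
```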
  • The third inspection example uses a phase shift method.
  • The phase shift method is a method of measuring the three-dimensional shape of the object W from a plurality of images captured while a fringe pattern whose irradiation intensity is sinusoidally modulated is projected with its phase shifted.
  • FIG. 26 is a diagram showing an example of the arrangement of a plurality of lighting units according to the third inspection example.
  • The plurality of lighting units 10 include the lighting units 10e and 10g to 10i.
  • The illumination unit 10e irradiates the object W with illumination light coaxial with the optical axis 25 of the polarizing camera 20 (see FIG. 24).
  • Each of the lighting units 10g to 10i is a ring-type illumination.
  • The illumination units 10g to 10i respectively have ring-shaped light emitting units 11g to 11i centered on the optical axis 25 of the polarizing camera 20 and linear polarizing filters 12g to 12i attached to the light emitting surfaces of the light emitting units.
  • The linear polarization filters 12e and 12g to 12i are arranged so that the angles formed by the polarization directions of the light emitted from the illumination units 10e and 10g to 10i and a reference direction are 0°, 45°, 90°, and 135°, respectively.
  • The lighting units 10e and 10g to 10i are arranged so that their elevation angles with respect to the object W are different from each other. Specifically, the illumination units 10e and 10g to 10i are arranged so that the elevation angle with respect to the object W becomes smaller in this order. In other words, the incident angle of the illumination light from the illumination units 10e and 10g to 10i onto the object W increases in this order.
  • The elevation angle of the illumination unit 10e with respect to the object W is 90°, and the incident angle of the illumination light from the illumination unit 10e onto the object W is 0°.
  • The inspection system 1 includes a polarizing camera 20 including the polarizing elements 22a to 22d shown in FIG. 2.
  • The polarization directions of the polarizing elements 22a to 22d are parallel to the polarization directions of the light emitted from the illumination units 10e and 10g to 10i, respectively. Therefore, the light emitted from the illumination units 10e and 10g to 10i and specularly reflected by the object W passes through the polarizing elements 22a to 22d, respectively.
  • The light emitted from the illumination unit 10e also has components parallel to the polarization directions of the polarizing elements 22b and 22d.
  • Therefore, some components of the light irradiated from the illumination unit 10e and specularly reflected by the object W pass through the polarizing elements 22b and 22d.
  • The amount of light emitted from the illumination unit 10e and transmitted through each of the polarizing elements 22b and 22d is about 1/2 of the amount of light emitted from the illumination unit 10e and transmitted through the polarizing element 22a.
  • Similarly, a part of the light irradiated from the illumination unit 10g and specularly reflected by the object W passes through the polarizing elements 22a and 22c.
  • Some components of the light irradiated from the illumination unit 10h and specularly reflected by the object W pass through the polarizing elements 22b and 22d.
  • Some components of the light emitted from the illumination unit 10i and specularly reflected by the object W pass through the polarizing elements 22a and 22c.
  • The light emitted from the illumination units 10e and 10g to 10i and specularly reflected by the object W does not pass through the polarizing elements 22c, 22d, 22a, and 22b, respectively.
  • FIG. 27 is a diagram showing the amount of light that is emitted from the illumination units 10e and 10g to 10i, specularly reflected by the object W, and transmitted through each polarizing element.
  • As shown in FIG. 27, the amount of light transmitted through each of the polarizing elements 22a to 22d follows a sine wave. That is, the polarized images 50a to 50d corresponding to the polarizing elements 22a to 22d correspond to images captured under the condition that the object W is irradiated with light of a concentric fringe pattern. Further, as shown in FIG. 27, the phases of the fringe patterns corresponding to the polarizing elements 22a to 22d are shifted from each other by π/2.
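One way to see why the transmitted amounts trace phase-shifted sinusoids is Malus's law: an ideal linear polarizer at angle p transmits a fraction proportional to cos²(p − r) of linearly polarized light at angle r, and cos²(x) = (1 + cos 2x)/2. The short sketch below evaluates this for the polarizer angles 0°, 45°, 90°, and 135° and the ring polarization angles of FIG. 26; it is an illustrative model, not code from the patent.

```python
import numpy as np

# Polarization directions of the polarizing elements 22a to 22d (degrees).
polarizer_deg = np.array([0.0, 45.0, 90.0, 135.0])

# Polarization directions of the coaxial and ring illuminations of FIG. 26,
# ordered from the center of the field of view outward.
ring_deg = np.array([0.0, 45.0, 90.0, 135.0])

# Malus's law: transmission is proportional to cos^2 of the angle difference.
delta = np.deg2rad(polarizer_deg[:, None] - ring_deg[None, :])
transmission = np.cos(delta) ** 2  # rows: polarizing elements, columns: rings

# Because cos^2(x) = (1 + cos(2x)) / 2, each row samples a sinusoid whose
# phase is offset by pi/2 from the previous row, as in FIG. 27.
print(np.round(transmission, 3))
```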
  • FIG. 28 is a diagram showing another example of the arrangement of the plurality of lighting units according to the third inspection example.
  • The plurality of illumination units 10 shown in FIG. 28 differ from the plurality of illumination units 10 shown in FIG. 26 in that they further include the illumination units 10j to 10l.
  • Each of the lighting units 10j to 10l is a ring-type illumination.
  • The illumination units 10j to 10l respectively have ring-shaped light emitting units 11j to 11l centered on the optical axis 25 of the polarizing camera 20 and linear polarizing filters 12j to 12l attached to the light emitting surfaces of the light emitting units 11j to 11l.
  • When the polarization direction of the light emitted from the illumination unit 10e is taken as the reference direction, the linear polarization filters 12j to 12l are arranged so that the angles formed by the polarization directions of the light emitted from the illumination units 10j to 10l and the reference direction are 22.5°, 67.5°, and 112.5°, respectively.
  • The lighting units 10e, 10j, 10g, 10k, 10h, 10l, and 10i are arranged so that the elevation angle with respect to the object W becomes smaller in this order.
  • In other words, the incident angle of the illumination light from the illumination units 10e, 10j, 10g, 10k, 10h, 10l, and 10i onto the object W increases in this order.
  • FIG. 29 is a diagram showing the amount of light that is emitted from the illumination units 10e and 10g to 10l, specularly reflected by the object W, and transmitted through each polarizing element.
  • As shown in FIG. 29, the amount of light transmitted through each of the polarizing elements 22a to 22d follows a sine wave. That is, the polarized images 50a to 50d corresponding to the polarizing elements 22a to 22d correspond to images captured under the condition that the object W is irradiated with light of a concentric fringe pattern. Further, as shown in FIG. 29, the phases of the fringe patterns corresponding to the polarizing elements 22a to 22d are shifted from each other by π/2.
  • The processor 310 inspects the object W based on the polarized images 50a to 50d.
  • Specifically, the processor 310 generates a phase image and a specular reflection image from the polarized images 50a to 50d by the phase shift method.
  • The two-dimensional coordinates of an image are represented as (x, y), and the values of the pixel (x, y) in the polarized images 50a to 50d are denoted by Ja(x, y), Jb(x, y), Jc(x, y), and Jd(x, y), respectively.
  • The range of normal angles of the object W over which the fringe pattern can be imaged is 0° to 45°.
  • The phase value θ(x, y) is proportional to the angle formed by the normal direction of the surface of the object W and the optical axis 25 of the polarizing camera 20.
  • In the phase image, a pixel is white (256) when the angle formed by the normal direction of the object W and the optical axis 25 is 0°, and black (0) when that angle is 45°. Therefore, the unevenness of the surface of the object W can be inspected by using the phase image.
  • a(x, y) = [{Ja(x, y) − Jc(x, y)}^2 + {Jb(x, y) − Jd(x, y)}^2]^(1/2)
  • The specular reflection image represents the intensity of light specularly reflected on the surface of the object W.
  • The specular reflection component becomes relatively weak at a scratched portion. Therefore, the specular reflection image can also be used to inspect the unevenness of the surface of the object W.
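A minimal sketch of the phase-shift reconstruction implied by the above description is given below. It assumes the standard four-bucket relations J_k = A + a·cos(θ + kπ/2) for the polarized images 50a to 50d, which is consistent with the π/2 phase offsets of FIGS. 27 and 29; the exact normalization and the mapping of the phase to the 0 to 255 gray levels are not specified in the text and are left as assumptions.

```python
import numpy as np

def phase_and_amplitude(ja, jb, jc, jd):
    """Four-bucket phase-shift reconstruction from the polarized images 50a to 50d.

    Assumes J_k = A + a * cos(theta + k * pi / 2) for k = 0..3, so that
    Ja - Jc = 2a cos(theta) and Jd - Jb = 2a sin(theta).
    """
    ja, jb, jc, jd = (np.asarray(x, dtype=np.float64) for x in (ja, jb, jc, jd))
    # Amplitude of the specular fringe component; up to a constant factor this
    # matches a(x, y) = sqrt((Ja - Jc)^2 + (Jb - Jd)^2).
    amplitude = 0.5 * np.hypot(ja - jc, jb - jd)
    # Wrapped phase; under the stated assumption it reflects the angle between
    # the surface normal and the optical axis 25.
    theta = np.arctan2(jd - jb, ja - jc)
    return theta, amplitude
```

Linearly rescaling the phase so that 0° of normal tilt maps to white and 45° to black would then give the phase image described above; the exact scaling is not stated in the text.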
  • FIG. 30 is a diagram showing an example of polarized images 50a to 50d obtained by using the plurality of illumination units 10 shown in FIG. 26.
  • FIG. 31 is a diagram showing an example of polarized images 50a to 50d obtained by using the plurality of illumination units 10 shown in FIG. 28.
  • As shown in FIGS. 30 and 31, a fringe pattern of specularly reflected light is observed on the surface of the object W, and the phase of the fringe pattern is shifted by π/2 between the polarized images 50a to 50d.
  • When the plurality of illumination units 10 shown in FIG. 28 are used, the fringe pattern becomes smoother. This improves the resolution of the normal direction of the surface of the object W.
  • The fourth inspection example uses backlight illumination.
  • When the object W has a light-shielding property, the brightness of the region corresponding to the object W becomes 0 in an image obtained under backlight conditions. Therefore, the outer periphery of the object W can be easily confirmed. Thus, by using an image obtained under backlight conditions, burrs or chips on the outer periphery of the object W can be inspected with high accuracy.
  • FIG. 32 is a diagram showing the arrangement of a plurality of lighting units according to the fourth inspection example.
  • The plurality of lighting units 10 include the lighting units 10e and 10m.
  • The illumination unit 10e irradiates the object W with illumination light along the optical axis 25 of the polarizing camera 20 (see FIG. 24).
  • The lighting unit 10m is arranged as a backlight on the side of the object W opposite to the polarizing camera 20. That is, the illumination unit 10m irradiates illumination light from the back surface side of the object W.
  • The illumination unit 10m has a flat light emitting unit 11m and a linear polarizing filter 12m attached to the light emitting surface of the light emitting unit 11m.
  • The linear polarization filter 12e of the illumination unit 10e (see FIG. 24) and the linear polarization filter 12m are arranged so that the polarization directions of the linearly polarized light irradiated from the illumination units 10e and 10m onto the object W are orthogonal to each other.
  • The inspection system 1 includes a polarizing camera 20 including the polarizing elements 22a to 22d shown in FIG. 2.
  • The polarization direction of the polarizing element 22a coincides with (is parallel to) the polarization direction of the linearly polarized light irradiated onto the object W from the illumination unit 10e.
  • The polarization direction of the polarizing element 22c coincides with (is parallel to) the polarization direction of the linearly polarized light emitted from the illumination unit 10m. Therefore, the light irradiated from the illumination unit 10e and specularly reflected by the object W passes through the polarizing element 22a and does not pass through the polarizing element 22c.
  • The light emitted from the illumination unit 10m and traveling around the object W passes through the polarizing element 22c and does not pass through the polarizing element 22a.
  • FIG. 33 is a diagram showing an example of a plurality of polarized images obtained in the fourth inspection example.
  • The polarized images 50a to 50d correspond to the polarizing elements 22a to 22d, respectively.
  • The polarized image 50a shows the brightness of the light emitted from the illumination unit 10e and specularly reflected by the object W. Therefore, the decrease in brightness at the scratched portion can be easily confirmed within the frame line F1 of the polarized image 50a, and the processor 310 can accurately inspect scratches by using the polarized image 50a.
  • The polarized image 50c shows the brightness of the light emitted from the illumination unit 10m and traveling around the object W. That is, the polarized image 50c corresponds to an image obtained under backlight conditions. Therefore, burrs on the outer periphery of the object W can be easily confirmed within the frame line F3 of the polarized image 50c, and the processor 310 can accurately inspect burrs or chips on the outer periphery of the object W by using the polarized image 50c.
  • The polarized images 50b and 50d show a brightness intermediate between those of the polarized images 50a and 50c. Therefore, the polarized images 50b and 50d need not be used for inspecting the object W.
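The following sketch illustrates one possible way to extract the outer periphery of the object W from the backlit image 50c by exploiting the fact that the light-shielding object appears with brightness near 0. The threshold and the comparison against a reference silhouette are assumptions, since the patent does not describe how burrs or chips are actually judged.

```python
import numpy as np

def outer_silhouette(img_50c, threshold=10):
    """Silhouette of the light-shielding object W from the backlit image 50c.

    Pixels belonging to the object have brightness near 0, so a simple
    threshold separates the object from the illuminated background.
    The threshold value is an assumption.
    """
    backlit = np.asarray(img_50c, dtype=np.float64)
    return backlit <= threshold  # True inside the object

# Comparing the silhouette with a registered master silhouette (e.g., by XOR)
# would highlight burrs or chips on the periphery; the actual judgment
# criterion is not described in the text.
```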
  • The fifth inspection example uses a high dynamic range technique.
  • The high dynamic range technique is a technique for generating an image with a wide dynamic range (a high dynamic range image), with little overexposure and underexposure, by combining a plurality of images captured under different illumination intensity conditions.
  • FIG. 34 is a diagram showing the arrangement of a plurality of lighting units according to the fifth inspection example. As shown in FIG. 34, the plurality of lighting units 10 include the lighting units 10n and 10o.
  • The illumination unit 10n has a light emitting unit 11n that emits non-polarized light, and a linear polarization filter 12n arranged on the object W side of the light emitting unit 11n. Therefore, the object W is irradiated with linearly polarized light from the illumination unit 10n.
  • The illumination unit 10o has a light emitting unit 11o that emits non-polarized light and does not have a linear polarization filter. Therefore, the object W is irradiated with non-polarized light from the illumination unit 10o.
  • The lighting units 10n and 10o are arranged close to each other.
  • Therefore, the irradiation directions from the illumination units 10n and 10o to the object W are substantially the same, and, as will be described later, a plurality of polarized images 50 having different illumination intensities and substantially the same illumination direction are obtained in a single imaging.
  • The inspection system 1 includes a polarizing camera 20 including the polarizing elements 22a to 22d shown in FIG. 2.
  • The polarization direction of the linearly polarized light emitted from the illumination unit 10n is parallel to the polarization direction of the polarizing element 22a and orthogonal to the polarization direction of the polarizing element 22c. Therefore, the light irradiated from the illumination unit 10n and specularly reflected by the object W passes through the polarizing element 22a and does not pass through the polarizing element 22c.
  • The light emitted from the illumination unit 10n and specularly reflected by the object W has components parallel to the polarization directions of the polarizing elements 22b and 22d.
  • The amount of this light transmitted through each of the polarizing elements 22b and 22d is about 1/2 of the amount of light emitted from the illumination unit 10n and transmitted through the polarizing element 22a.
  • The non-polarized light emitted from the illumination unit 10o has components parallel to the polarization directions of the polarizing elements 22a to 22d evenly. Therefore, the amounts of light emitted from the illumination unit 10o and transmitted through the polarizing elements 22a to 22d are the same.
  • The intensities of the light emitted from the illumination units 10n and 10o, specularly reflected by the object W, and transmitted through the polarizing element 22a are denoted by In and Io, respectively.
  • The pixel values Ja to Jd of the polarized images 50a to 50d corresponding to the polarizing elements 22a to 22d are then expressed as follows: Ja = In + Io (polarized image 50a); Jb = In/2 + Io (polarized image 50b); Jc = Io (polarized image 50c); Jd = In/2 + Io (polarized image 50d).
  • That is, the polarized images 50a to 50c correspond to a plurality of images captured under conditions in which the illumination intensities differ from each other. Therefore, the processor 310 combines the polarized images 50a to 50c by high dynamic range composition to generate a composite image.
  • The processor 310 may generate the composite image using a known high dynamic range synthesis technique.
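The patent leaves the choice of high dynamic range synthesis technique open. As a placeholder, the sketch below fuses the polarized images 50a to 50c with simple well-exposedness weights (a Mertens-style Gaussian around mid-gray); the weighting function and its parameters are assumptions, not the method prescribed by the patent.

```python
import numpy as np

def fuse_hdr(images, sigma=0.2):
    """Exposure-fusion-style composite of the polarized images 50a to 50c.

    Each pixel is a weighted average of the input images, with weights that
    favor well-exposed values (a Gaussian around mid-gray). The weighting
    scheme and sigma are placeholder choices.
    """
    stack = np.stack([np.asarray(im, dtype=np.float64) / 255.0 for im in images])
    weights = np.exp(-((stack - 0.5) ** 2) / (2.0 * sigma ** 2)) + 1e-6
    fused = (weights * stack).sum(axis=0) / weights.sum(axis=0)
    return np.clip(fused * 255.0, 0, 255).astype(np.uint8)

# composite_57 = fuse_hdr([img_50a, img_50b, img_50c])
```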
  • FIG. 35 is a diagram showing an example of a plurality of polarized images obtained in the fifth inspection example.
  • As shown in FIG. 35, the luminance of the polarized image 50a is the highest, the luminance of the polarized image 50c is the lowest, and the luminance of the polarized images 50b and 50d is intermediate between them.
  • The processor 310 combines the polarized images 50a to 50c by high dynamic range composition to generate the composite image 57. The composite image 57 contains little overexposure and underexposure. Therefore, the processor 310 can inspect the object W with high accuracy by using the composite image 57.
  • The number of polarizing elements 22 included in the unit region 21 of the polarizing camera 20 is not limited to four, and may be any plural number. As described above, the polarized images 50b and 50d need not be used in the second inspection example and the fourth inspection example. Therefore, the unit region 21 may include only the polarizing elements 22a and 22c and need not include the polarizing elements 22b and 22d. Similarly, in the fifth inspection example, the polarized image 50d is not used. Therefore, the unit region 21 may include only the polarizing elements 22a to 22c and need not include the polarizing element 22d.
  • The plurality of lighting units 10 may include only the lighting units 10a and 10c and need not include the lighting units 10b and 10d.
  • In this case, the unit region 21 of the polarizing camera 20 may include only the polarizing elements 22a to 22c and need not include the polarizing element 22d.
  • Ja + Jc = 2μ · cos(θ) · cos(2φx) · cos(2φy) ... Equation (7)
  • Ja − Jc = 2μ · sin(θ) · sin(2φx) · cos(2φy) ... Equation (8)
  • The incident angle θ is determined in advance according to the positions of the illumination units 10a and 10c. Therefore, the processor 310 may calculate the right side of Equation (9) using the pixel values Ja and Jc of the polarized images 50a and 50c and the incident angle θ, and generate the X-direction normal image having the result Nx as the pixel value. tan(2φx) (= Nx) expressed by Equation (9) is a value that depends on nx, the x-direction component of the normal vector. Therefore, the X-direction normal image represents the normal direction of the surface of the object W.
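Assuming Equation (9) has the form implied by Equations (7) and (8), i.e. tan(2φx) = (Ja − Jc) / {(Ja + Jc) · tan(θ)}, the X-direction normal image could be computed per pixel as sketched below; the symbols and the exact form of Equation (9) are not fully visible here, so this is only an illustrative reconstruction.

```python
import numpy as np

def x_direction_normal_image(ja, jc, incident_angle_rad):
    """X-direction normal image Nx from Ja, Jc and the incident angle.

    Assumes tan(2 * phi_x) = (Ja - Jc) / ((Ja + Jc) * tan(theta)), which is
    what Equations (7) and (8) imply; this is an illustrative reconstruction,
    not the literal Equation (9).
    """
    ja = np.asarray(ja, dtype=np.float64)
    jc = np.asarray(jc, dtype=np.float64)
    denom = (ja + jc) * np.tan(incident_angle_rad)
    # Small epsilon avoids division by zero in dark regions.
    return (ja - jc) / np.where(np.abs(denom) < 1e-9, 1e-9, denom)
```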
  • The processor 310 sets rectangular regions R1 and R2 on the left side and the right side of a point of interest Q, respectively. Then, the processor 310 calculates the difference Sx between the sum Sx1 of the values of the pixels included in the rectangular region R1 and the sum Sx2 of the values of the pixels included in the rectangular region R2. The processor 310 calculates the difference Sx for all points each surrounded by four pixels in the X-direction normal image.
  • The processor 310 may generate a shape image having the difference Sx as a pixel value. As described above, the difference Sx takes a large value at a location where a defect having unevenness, such as a scratch or a dent, is present. Therefore, the processor 310 can accurately inspect defects having irregularities by using the shape image.
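A sketch of the box-difference computation is shown below; the sizes of the rectangular regions R1 and R2 are not given in the text, so the half-width and half-height parameters are placeholders, and pixel centers are used instead of the inter-pixel points of interest for simplicity.

```python
import numpy as np

def shape_image_sx(nx_image, half_width=5, half_height=5):
    """Difference Sx between box sums to the left and right of each pixel.

    half_width and half_height are placeholders for the unspecified sizes of
    the rectangular regions R1 and R2. An integral image keeps each box sum O(1).
    """
    nx = np.asarray(nx_image, dtype=np.float64)
    ii = np.pad(nx, ((1, 0), (1, 0))).cumsum(axis=0).cumsum(axis=1)

    def box_sum(y0, y1, x0, x1):
        # Sum of nx[y0:y1, x0:x1] via the integral image.
        return ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]

    h, w = nx.shape
    sx = np.zeros_like(nx)
    for y in range(half_height, h - half_height):
        for x in range(half_width, w - half_width - 1):
            s1 = box_sum(y - half_height, y + half_height, x - half_width, x)              # region R1 (left)
            s2 = box_sum(y - half_height, y + half_height, x + 1, x + 1 + half_width)      # region R2 (right)
            sx[y, x] = s1 - s2
    return sx
```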
  • However, the method of generating a normal image is not limited to this.
  • For example, a learning model for estimating a normal image from a plurality of polarized images may be constructed using a deep-learning-based image generation technique.
  • The processor 310 may then generate a normal image from a plurality of polarized images using the learning model.
  • The present embodiment includes the following disclosures.
  • (Configuration 1) An inspection system (1) including: a plurality of lighting units (10, 10a to 10o) that illuminate an object (W); a polarizing camera (20) in which a unit region (21) including a plurality of polarizing elements (22, 22a to 22d) is repeatedly arranged; and an inspection device (30). The plurality of lighting units (10, 10a to 10o) irradiate illumination light having mutually different polarization states. The plurality of polarizing elements (22, 22a to 22d) transmit light of mutually different polarization directions.
  • The polarizing camera (20) outputs a plurality of polarized images (50, 50a to 50d) respectively corresponding to the plurality of polarizing elements (22, 22a to 22d) by capturing an image in a state where the plurality of lighting units (10, 10a to 10o) are lit at the same time. The inspection device (30) inspects the object (W) using the plurality of polarized images (50, 50a to 50d).
  • (Configuration 2) The plurality of lighting units are arranged so that the azimuth angles around the optical axis (25) of the polarizing camera (20) are different from each other.
  • The plurality of lighting units include first to N-th lighting units (10a to 10d).
  • The plurality of polarizing elements include first to N-th polarizing elements (22a to 22d).
  • N is an integer of 2 or more.
  • The polarization directions of the illumination light of the first to N-th illumination units (10a to 10d) are parallel to the polarization directions of the light transmitted through the first to N-th polarizing elements (22a to 22d), respectively.
  • The inspection system (1) according to Configuration 1, wherein the inspection device (30) generates a normal image (51, 52) showing the normal direction of the surface of the object (W) from the plurality of polarized images (50a to 50d) and inspects the object (W) based on the normal image (51, 52).
  • (Configuration 3) N is 4.
  • The first lighting unit (10a) and the third lighting unit (10c) are arranged at positions symmetrical with respect to the optical axis (25) of the polarizing camera (20).
  • The second lighting unit (10b) and the fourth lighting unit (10d) are arranged at positions symmetrical with respect to the optical axis (25) of the polarizing camera (20).
  • Around the optical axis (25) of the polarizing camera (20), the difference between a first azimuth angle at which the first illumination unit (10a) is arranged and a second azimuth angle at which the second illumination unit (10b) is arranged is 90°.
  • The plurality of polarized images include first to fourth polarized images (50a to 50d) corresponding to the first to fourth polarizing elements (22a to 22d), respectively.
  • The inspection device (30) generates, based on the first to fourth polarized images (50a to 50d), a first normal image (51) showing the magnitude of a component of the normal vector of the surface of the object (W) along the direction of the first azimuth angle, generates, based on the first to fourth polarized images (50a to 50d), a second normal image (52) showing the magnitude of a component of the normal vector of the surface of the object (W) along the direction of the second azimuth angle, and generates, based on the first normal image (51) and the second normal image (52), a shape image (53) showing the shape of the surface of the object (W).
  • The inspection system (1) according to Configuration 2, which inspects the object (W) based on the shape image (53).
  • The plurality of lighting units include: a first illumination unit (10e) that irradiates illumination light along the optical axis of the polarizing camera; and a ring-shaped second illumination unit (10f) centered on the optical axis of the polarizing camera.
  • The plurality of polarizing elements include a first polarizing element (22a) and a second polarizing element (22c).
  • The inspection system (1) according to Configuration 5, wherein the polarization directions of the illumination light emitted from the first illumination unit (10e) and the second illumination unit (10f) coincide with the polarization directions of the light transmitted through the first polarizing element (22a) and the second polarizing element (22c), respectively.
  • The plurality of illumination units include a plurality of concentric ring-shaped illumination units (10g to 10l) centered on the optical axis of the polarizing camera.
  • The inspection system (1) according to Configuration 5, wherein the inspection device (30) generates, based on the plurality of polarized images (50a to 50d), a phase image showing the angle formed by the normal direction of the surface of the object (W) and the optical axis direction of the polarizing camera, and inspects the object (W) based on the phase image.
  • The plurality of lighting units include: a first illumination unit (10e) that irradiates illumination light along the optical axis of the polarizing camera; and a second illumination unit (10m) that irradiates illumination light from the back surface side of the object.
  • The plurality of polarizing elements include a first polarizing element (22a) and a second polarizing element (22c).
  • The inspection system (1) according to Configuration 1, wherein the polarization directions of the illumination light emitted from the first illumination unit (10e) and the second illumination unit (10m) coincide with the polarization directions of the light transmitted through the first polarizing element (22a) and the second polarizing element (22c), respectively.
  • The plurality of illumination units include a first illumination unit (10n) that irradiates linearly polarized illumination light and a second illumination unit (10o) that irradiates non-polarized illumination light.
  • The plurality of polarizing elements include: a first polarizing element (22a) that transmits light of the same polarization direction as the polarization direction of the illumination light of the first illumination unit (10n);
  • a second polarizing element (22b) that transmits light of a polarization direction forming an angle of 45° with the polarization direction of the light transmitted through the first polarizing element (22a); and
  • a third polarizing element (22c) that transmits light of a polarization direction forming an angle of 90° with the polarization direction of the light transmitted through the first polarizing element (22a).
  • The plurality of polarized images include first to third polarized images (50a to 50c) corresponding to the first to third polarizing elements (22a to 22c), respectively.
  • The inspection system (1) according to Configuration 1, wherein the inspection device (30) combines the first to third polarized images (50a to 50c) by high dynamic range composition to generate a composite image, and inspects the object (W) based on the composite image.
  • An inspection method including: a step of acquiring a plurality of polarized images (50, 50a to 50d) respectively corresponding to the plurality of polarizing elements (22, 22a to 22d) by imaging the object (W) with the polarizing camera (20) in a state where the plurality of lighting units (10, 10a to 10o) are lit at the same time; and
  • a step of inspecting the object (W) using the plurality of polarized images (50, 50a to 50d).
  • 1 inspection system, 2 conveyance belt, 10, 10a to 10o lighting unit, 11a to 11o light emitting unit, 12a to 12n linear polarization filter, 13e half mirror, 20 polarizing camera, 21 unit region, 22, 22a to 22d polarizing element, 25, 225 optical axis, 30 inspection device, 50, 50a to 50d polarized image, 51 X-direction normal image, 52 Y-direction normal image, 53 shape image, 54 binary image, 55 albedo image, 56 average image, 57 composite image, 110 lighting device, 110a to 110d arc area, 302 display unit, 304 keyboard, 306 memory card, 310 processor, 312 RAM, 314 display controller, 316 system controller, 318 controller, 320 hard disk, 322 camera interface, 324 input interface, 328 communication interface, 330 memory card interface, 350 inspection program, F1, F2 frame line, P point, Q point of interest, R1 to R4 rectangular area, W object, n normal vector.


Abstract

This inspection system comprises: a plurality of illumination parts that illuminate a subject; a polarization camera in which unit regions including a plurality of polarizers are repeatedly arranged; and an inspection device. The plurality of illumination parts emit illumination light of mutually different polarization states. The plurality of polarizers transmit light of mutually different polarization directions. The polarization camera captures an image while the plurality of illumination parts are lit at the same time, thereby outputting a plurality of polarization images corresponding to the respective polarizers. The inspection device inspects the subject using the plurality of polarization images. As a result, a moving subject can be inspected using a plurality of images having different illumination conditions.

Description

Inspection system and inspection method

This disclosure relates to an inspection system and an inspection method.

In the FA (Factory Automation) field and the like, an image of an object is captured while the object is illuminated, and the appearance of the object is inspected using the obtained image. Conventionally, in order to improve inspection performance, a method of capturing images multiple times while changing the illumination conditions is known.

For example, in the photometric stereo method, the normal of the surface of an object is estimated using a plurality of images obtained by imaging multiple times while changing the direction of the light source. As a result, the unevenness of the surface is inspected without being affected by dirt on the surface of the object.

Further, Japanese Patent Application Laid-Open No. 2016-105044 (Patent Document 1) discloses an image processing device that acquires an image of return light from an object each time the object is sequentially irradiated with P types of illumination light.

Japanese Unexamined Patent Publication No. 2016-105044

When the object is moving, the positions of the object in the plurality of images obtained by the above-mentioned conventional methods differ from each other. Therefore, the above-mentioned conventional methods cannot be applied to a moving object.

The present disclosure has been made in view of the above problem, and an object thereof is to provide an inspection system and an inspection method capable of inspecting a moving object using a plurality of images having different illumination conditions.

According to an example of the present disclosure, the inspection system includes a plurality of lighting units that illuminate an object, a polarizing camera in which unit regions each including a plurality of polarizing elements are repeatedly arranged, and an inspection device. The plurality of lighting units irradiate illumination light having mutually different polarization states. The plurality of polarizing elements transmit light of mutually different polarization directions. The polarizing camera outputs a plurality of polarized images respectively corresponding to the plurality of polarizing elements by capturing an image in a state where the plurality of lighting units are lit at the same time. The inspection device inspects the object using the plurality of polarized images.

According to the above disclosure, each of the plurality of polarized images corresponds to an image captured under illumination conditions dominated by the illumination condition of the illumination unit whose polarized light forms the smallest angle with the polarization direction of the corresponding polarizing element. That is, a plurality of polarized images corresponding to a plurality of illumination conditions are acquired by one-shot imaging. Therefore, a moving object can be inspected using a plurality of polarized images having different illumination conditions.
In the above disclosure, the plurality of lighting units are arranged so that the azimuth angles around the optical axis of the polarizing camera are different from each other. The plurality of lighting units include first to N-th lighting units. The plurality of polarizing elements include first to N-th polarizing elements. N is an integer of 2 or more. The polarization directions of the illumination light of the first to N-th illumination units are respectively parallel to the polarization directions of the light transmitted through the first to N-th polarizing elements. The inspection device generates a normal image showing the normal direction of the surface of the object from the plurality of polarized images, and inspects the object based on the normal image.

According to the above disclosure, the polarized image corresponding to each of the first to N-th polarizing elements is captured so that the light of the illumination unit that irradiates illumination light parallel to the polarization direction of that polarizing element is the strongest. That is, among the first to N-th illumination units, the illumination unit having the greatest influence on the polarized image differs among the plurality of polarized images. The first to N-th illumination units are arranged so that the azimuth angles around the optical axis of the polarizing camera are different from each other. Therefore, a plurality of polarized images having mutually different conditions regarding the azimuth angle of the illumination light are obtained by one imaging. Then, a normal image is generated from the plurality of polarized images. By using the normal image, the unevenness of the surface of the object is inspected with high accuracy.

In the above disclosure, N is 4. The first illumination unit and the third illumination unit are arranged at positions symmetrical with respect to the optical axis of the polarizing camera. The second illumination unit and the fourth illumination unit are arranged at positions symmetrical with respect to the optical axis of the polarizing camera. Around the optical axis of the polarizing camera, the difference between a first azimuth angle at which the first illumination unit is arranged and a second azimuth angle at which the second illumination unit is arranged is 90°. The plurality of polarized images include first to fourth polarized images respectively corresponding to the first to fourth polarizing elements. The inspection device generates, based on the first to fourth polarized images, a first normal image showing the magnitude of a component of the normal vector of the surface of the object along the direction of the first azimuth angle. The inspection device generates, based on the first to fourth polarized images, a second normal image showing the magnitude of a component of the normal vector of the surface of the object along the direction of the second azimuth angle. The inspection device generates a shape image showing the shape of the surface of the object based on the first normal image and the second normal image, and inspects the object based on the shape image.

According to the above disclosure, a first normal image showing the magnitude of the component along the direction of the first azimuth angle and a second normal image showing the magnitude of the component along the direction of the second azimuth angle are generated. As a result, even if, for example, the direction in which scratches are formed on the surface of the object is random, a change in shape corresponding to the scratches appears in the shape image. Therefore, scratches on the surface of the object are inspected with high accuracy.

In the above disclosure, the angles formed by the polarization directions of the illumination light of the second illumination unit, the third illumination unit, and the fourth illumination unit and the polarization direction of the illumination light of the first illumination unit are 45°, 90°, and 135°, respectively.

According to the above disclosure, the first to fourth polarized images correspond to images captured under illumination conditions dominated by the illumination conditions of the first to fourth illumination units, respectively. The influence of the illumination light of the third illumination unit on the first polarized image can thereby be minimized. Similarly, the influence of the illumination light of the fourth, first, and second illumination units on the second, third, and fourth polarized images, respectively, can be minimized.
In the above disclosure, the plurality of illumination units are arranged so that their elevation angles with respect to the object are different from each other.

According to the above disclosure, a plurality of polarized images whose illumination conditions differ in the elevation angle with respect to the object are obtained by one imaging.

In the above disclosure, the plurality of illumination units include a first illumination unit that irradiates illumination light along the optical axis of the polarizing camera, and a ring-shaped second illumination unit centered on the optical axis of the polarizing camera. The plurality of polarizing elements include a first polarizing element and a second polarizing element. The polarization directions of the illumination light emitted from the first illumination unit and the second illumination unit coincide with the polarization directions of the light transmitted through the first polarizing element and the second polarizing element, respectively.

According to the above disclosure, the first polarized image corresponding to the first polarizing element shows the brightness of light that is irradiated from the first illumination unit and specularly reflected on the surface of the object. That is, the first polarized image corresponds to an image obtained under bright-field illumination conditions. In an image obtained under bright-field illumination conditions, the brightness of a scratched portion is reduced. Therefore, by using the first polarized image, scratches can be inspected with high accuracy.

The second polarized image corresponding to the second polarizing element shows the brightness of light that is irradiated from the second illumination unit and diffusely reflected on the surface of the object. That is, the second polarized image corresponds to an image obtained under dark-field illumination conditions. In an image obtained under dark-field illumination conditions, the brightness of a portion where a stain is present differs from the brightness of its surroundings. Therefore, by using the second polarized image, stains can be inspected with high accuracy.

In this way, a first polarized image suitable for inspecting scratches and a second polarized image suitable for inspecting stains are acquired by one imaging.

In the above disclosure, the plurality of illumination units include a plurality of concentric ring-shaped illumination units centered on the optical axis of the polarizing camera. The inspection device generates, based on the plurality of polarized images, a phase image showing the angle formed by the normal direction of the surface of the object and the optical axis direction of the polarizing camera, and inspects the object based on the phase image.

According to the above disclosure, since the polarization states of the illumination light of the plurality of ring-shaped illumination units are different from each other, the amounts of light irradiated from the plurality of ring-shaped illumination units, specularly reflected on the surface of the object, and transmitted through the respective polarizing elements are different from each other. Therefore, fringe patterns having mutually different phases appear in the plurality of polarized images. That is, a plurality of polarized images in which fringe patterns having mutually different phases appear are obtained by one imaging. The phase image generated from the plurality of polarized images indicates the normal direction of the surface of the object. Therefore, by using the phase image, the unevenness of the surface of the object can be inspected with high accuracy.
In the above disclosure, the plurality of illumination units further include an illumination unit that irradiates illumination light along the optical axis of the polarizing camera.

According to the above disclosure, the loss of the fringe pattern in the central portion of the concentric circles can be reduced, and a more accurate inspection can be performed.

In the above disclosure, the plurality of illumination units include a first illumination unit that irradiates illumination light along the optical axis of the polarizing camera and a second illumination unit that irradiates illumination light from the back surface side of the object. The plurality of polarizing elements include a first polarizing element and a second polarizing element. The polarization directions of the illumination light emitted from the first illumination unit and the second illumination unit coincide with the polarization directions of the light transmitted through the first polarizing element and the second polarizing element, respectively.

According to the above disclosure, the first polarized image corresponding to the first polarizing element shows the brightness of light that is irradiated from the first illumination unit and specularly reflected on the surface of the object. When light is incident on a portion where a scratch is present, it is diffusely reflected. Therefore, in the first polarized image, the brightness of the scratched portion is reduced. Therefore, by using the first polarized image, scratches can be inspected with high accuracy.

The second polarized image corresponding to the second polarizing element shows the brightness of light that is emitted from the second illumination unit and incident on the polarizing camera. Therefore, when the object has a light-shielding property, the shape of the outer periphery of the object is clearly recognized in the second polarized image. Therefore, by using the second polarized image, burrs or chips on the outer periphery of the object can be inspected with high accuracy.

In this way, a first polarized image suitable for inspecting scratches and a second polarized image suitable for inspecting burrs or chips on the outer periphery of the object are acquired by one imaging.

In the above disclosure, the plurality of illumination units include a first illumination unit that irradiates linearly polarized illumination light and a second illumination unit that irradiates non-polarized light. The plurality of polarizing elements include first to third polarizing elements. The first polarizing element transmits light of the same polarization direction as the polarization direction of the illumination light of the first illumination unit. The second polarizing element transmits light of a polarization direction forming an angle of 45° with the polarization direction of the light transmitted through the first polarizing element. The third polarizing element transmits light of a polarization direction forming an angle of 90° with the polarization direction of the light transmitted through the first polarizing element. The plurality of polarized images include first to third polarized images respectively corresponding to the first to third polarizing elements. The inspection device combines the first to third polarized images by high dynamic range composition to generate a composite image, and inspects the object based on the composite image.

According to the above disclosure, the non-polarized light emitted from the second illumination unit is uniformly transmitted through the first to third polarizing elements. On the other hand, the linearly polarized light emitted from the first illumination unit is transmitted through the first polarizing element but is not transmitted through the third polarizing element. Further, the intensity of the light emitted from the first illumination unit and transmitted through the second polarizing element is 1/2 of the intensity of the light emitted from the first illumination unit and transmitted through the first polarizing element. Therefore, the first to third polarized images correspond to a plurality of images captured under illumination conditions in which the illumination intensities differ from each other. That is, first to third polarized images having mutually different illumination intensities are obtained by one imaging. Then, the first to third polarized images are combined by high dynamic range composition to generate a composite image. By using the composite image, the inspection accuracy of the object is improved.
According to an example of the present disclosure, the inspection method uses a plurality of lighting units that illuminate an object and a polarizing camera in which unit regions each including a plurality of polarizing elements are repeatedly arranged. The plurality of lighting units irradiate illumination light having mutually different polarization states. The plurality of polarizing elements transmit light of mutually different polarization directions. The inspection method includes a step of acquiring a plurality of polarized images respectively corresponding to the plurality of polarizing elements by imaging the object using the polarizing camera in a state where the plurality of lighting units are lit at the same time, and a step of inspecting the object using the plurality of polarized images.

Also according to the above disclosure, a moving object can be inspected using a plurality of images having different illumination conditions.

According to the present disclosure, a moving object can be inspected using a plurality of images having different illumination conditions.
FIG. 1 is a schematic diagram showing the overall configuration of the inspection system according to the present embodiment.
FIG. 2 is a diagram showing an example of the arrangement of the polarizing elements of the polarizing camera.
FIG. 3 is a schematic diagram showing the hardware configuration of the inspection device.
FIG. 4 is a diagram showing an example of a conventional illumination device used in the photometric stereo method.
FIG. 5 is a diagram showing the arrangement of the plurality of illumination units according to the first inspection example.
FIG. 6 is a diagram showing an example of the polarization directions of the linear polarization filters 12a to 12d.
FIG. 7 is a flowchart showing an example of the flow of the inspection processing in the first inspection example.
FIG. 8 is a diagram showing the relationship between the illumination units 10a to 10d and the normal vector n at a point P on the surface of the object W.
FIG. 9 is a diagram showing the relationship between the traveling direction of the light emitted from the illumination unit 10c and the normal direction at the point P when both are projected onto the XZ plane.
FIG. 10 is a diagram showing the relationship between the traveling direction of the light emitted from the illumination unit 10c and the normal direction at the point P when both are projected onto the YZ plane.
FIG. 11 is a diagram showing the relationship among Ny, φx, and φy when the glossiness α = 5.
FIG. 12 is a diagram showing the relationship among Ny, φx, and φy when the glossiness α = 10.
FIG. 13 is a diagram showing the relationship among Ny, φx, and φy when the glossiness α = 20.
FIG. 14 is a diagram showing the relationship among the albedo μ, φx, and φy when the glossiness α = 5.
FIG. 15 is a diagram showing the relationship among the albedo μ, φx, and φy when the glossiness α = 10.
FIG. 16 is a diagram showing the relationship among the albedo μ, φx, and φy when the glossiness α = 20.
FIG. 17 is a diagram showing the processing performed on the X-direction normal image to generate the shape image.
FIG. 18 is a diagram showing the processing performed on the Y-direction normal image to generate the shape image.
FIG. 19 is a diagram showing polarized images obtained by imaging a button battery and various images generated from the polarized images.
FIG. 20 is a diagram showing polarized images obtained by imaging the side surface of a dry cell and various images generated from the polarized images.
FIG. 21 is a diagram showing polarized images obtained by imaging the bottom surface of a dry cell and various images generated from the polarized images.
FIG. 22 is a diagram showing polarized images obtained by imaging a textured resin surface and various images generated from the polarized images.
FIG. 23 is a diagram showing the arrangement of the plurality of illumination units according to the second inspection example.
FIG. 24 is a diagram showing the configuration of the illumination unit 10e.
FIG. 25 is a diagram showing an example of the plurality of polarized images obtained in the second inspection example.
FIG. 26 is a diagram showing an example of the arrangement of the plurality of illumination units according to the third inspection example.
FIG. 27 is a diagram showing the amount of light that is emitted from the illumination units 10e and 10g to 10i, specularly reflected by the object W, and transmitted through each polarizing element.
FIG. 28 is a diagram showing another example of the arrangement of the plurality of illumination units according to the third inspection example.
FIG. 29 is a diagram showing the amount of light that is emitted from the illumination units 10e and 10g to 10l, specularly reflected by the object W, and transmitted through each polarizing element.
FIG. 30 is a diagram showing an example of the polarized images 50a to 50d obtained using the plurality of illumination units shown in FIG. 26.
FIG. 31 is a diagram showing an example of the polarized images 50a to 50d obtained using the plurality of illumination units shown in FIG. 28.
FIG. 32 is a diagram showing the arrangement of the plurality of illumination units according to the fourth inspection example.
FIG. 33 is a diagram showing an example of the plurality of polarized images obtained in the fourth inspection example.
FIG. 34 is a diagram showing the arrangement of the plurality of illumination units according to the fifth inspection example.
FIG. 35 is a diagram showing an example of the plurality of polarized images obtained in the fifth inspection example.
Embodiments of the present invention will be described in detail with reference to the drawings. The same or corresponding parts in the drawings are denoted by the same reference numerals, and description thereof will not be repeated.
§1 Application example
An application example of the present invention will be described with reference to FIGS. 1 and 2. FIG. 1 is a schematic diagram showing the overall configuration of an inspection system 1 according to the present embodiment. The inspection system 1 inspects an object W using images obtained by imaging the object W. The object W has a glossy surface, such as metal or glass, that is, a surface on which incident light is mainly specularly reflected. The inspection system 1 is incorporated into, for example, a production line, and inspects the object W, which is being moved by a conveyor belt 2, for defects.
Defects include scratches, unevenness, stains, adhering dust, burrs, chips, and the like. To inspect for defects with high accuracy, the defective portion must stand out in the image. The illumination conditions that make a defective portion stand out differ depending on the type of defect. Therefore, when a plurality of types of defects are to be inspected, a plurality of images each captured under a different illumination condition are required. In other cases, a defective portion stands out in an image obtained by combining a plurality of images captured under different illumination conditions; inspecting for such defects likewise requires a plurality of images captured under different illumination conditions. The inspection system 1 shown in FIG. 1 acquires, for the moving object W, a plurality of images respectively corresponding to a plurality of illumination conditions by a single imaging operation (one-shot imaging), and inspects the object W using the acquired images.
As shown in FIG. 1, the inspection system 1 includes a plurality of illumination units 10 for illuminating the object W, a polarizing camera 20, and an inspection device 30.
The plurality of illumination units 10 emit illumination light beams whose polarization states differ from one another. The inspection system 1 shown in FIG. 1 includes illumination units 10a, 10b, 10c, 10d, and so on. Hereinafter, when the illumination units 10a, 10b, 10c, 10d, ... need not be distinguished from one another, each of them is simply referred to as an "illumination unit 10".
Typically, each of the plurality of illumination units 10 includes a light emitting unit that emits unpolarized light and a linear polarizing filter arranged between the light emitting unit and the object W. The polarization directions of the light transmitted through the linear polarizing filters differ among the plurality of illumination units 10. Note that one of the plurality of illumination units 10 need not include a linear polarizing filter; in that case, that illumination unit 10 irradiates the object W with unpolarized light.
The polarizing camera 20 has a plurality of photodetectors arranged in a matrix and generates image data (hereinafter simply referred to as an "image") whose pixel values represent the amounts of received light (luminance) detected by the photodetectors. The photodetectors all have the same size and are, for example, CCD (Charge-Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensors. A polarizing element is provided on the light incident side of each photodetector.
FIG. 2 is a diagram showing an example of the arrangement of the polarizing elements of the polarizing camera 20. As shown in FIG. 2, in the polarizing camera 20, unit regions 21 each including a plurality of polarizing elements 22 are repeatedly arranged. The plurality of unit regions 21 are arranged in a matrix, and the plurality of polarizing elements 22 all have the same size.
In the polarizing camera 20 illustrated in FIG. 2, each unit region 21 includes, as the plurality of polarizing elements 22, four polarizing elements 22a to 22d that respectively overlap four photodetectors. Hereinafter, when the polarizing elements 22a to 22d need not be distinguished from one another, each of them is referred to as a "polarizing element 22".
Each of the polarizing elements 22a to 22d transmits linearly polarized light of a predetermined polarization direction. Taking the polarization direction of the polarizing element 22a as the reference direction and defining the polarization angle as the angle between the reference direction and a given polarization direction, the polarization angles of the polarizing elements 22a to 22d are 0°, 45°, 90°, and 135°, respectively. Note that these polarization angles may include manufacturing errors, mounting errors, and the like. For example, when the maximum total of the manufacturing and mounting errors is β°, a "polarization angle of 45°" covers the range of 45° ± β°.
The polarizing camera 20 captures an image while the plurality of illumination units 10 are lit simultaneously, and thereby outputs a plurality of polarized images 50 (see FIG. 1) respectively corresponding to the plurality of polarizing elements 22. In the inspection system 1 shown in FIG. 1, the plurality of polarized images 50 include polarized images 50a, 50b, 50c, 50d, and so on. Hereinafter, when the polarized images 50a, 50b, 50c, 50d, ... need not be distinguished from one another, each of them is referred to as a "polarized image 50".
Each pixel value of each polarized image 50 indicates the luminance of the light transmitted through the corresponding one of the plurality of polarizing elements 22. For example, when the polarizing camera 20 has the configuration shown in FIG. 2, the polarized images 50a, 50b, 50c, and 50d corresponding to the polarizing elements 22a to 22d, respectively, are output.
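Although the embodiment does not specify any particular software, the separation of the four polarized images 50a to 50d from the raw sensor frame can be sketched as follows. This is a minimal Python/NumPy illustration; the function name and the assumed placement of the 0°/45°/90°/135° polarizing elements within the 2×2 unit region 21 are hypothetical and must be matched to the actual sensor layout.

```python
import numpy as np

def split_polarized_images(raw: np.ndarray):
    """Split a raw polarization-mosaic frame into four polarized images.

    Assumes (for illustration only) that each 2x2 unit region is laid out as
        row 0: 0 deg, 45 deg
        row 1: 135 deg, 90 deg
    raw: (H, W) array with H and W even.
    """
    img_0   = raw[0::2, 0::2]   # pixels behind the 0-degree polarizing elements (image 50a)
    img_45  = raw[0::2, 1::2]   # 45-degree polarizing elements (image 50b)
    img_90  = raw[1::2, 1::2]   # 90-degree polarizing elements (image 50c)
    img_135 = raw[1::2, 0::2]   # 135-degree polarizing elements (image 50d)
    return img_0, img_45, img_90, img_135
```

Each of the four sub-images has half the resolution of the raw frame and is offset by one sensor pixel from the others; whether this offset is ignored or compensated by interpolation is an implementation choice outside the scope of the embodiment.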
The inspection device 30 inspects the object W using the plurality of polarized images 50 received from the polarizing camera 20.
Each of the plurality of polarizing elements 22 transmits a larger amount of the illumination light from the illumination unit 10 that, among the plurality of illumination units 10, emits linearly polarized light whose polarization direction makes the smallest angle with the polarization direction of that polarizing element 22.
For example, consider a case where the polarization direction of the polarizing element 22a is parallel to that of the illumination unit 10a and the polarization direction of the polarizing element 22c is parallel to that of the illumination unit 10c. As shown in FIG. 2, the polarization directions of the polarizing elements 22a and 22c are orthogonal to each other. In this case, the light emitted from the illumination unit 10a and specularly reflected on the surface of the object W passes through the polarizing element 22a but not through the polarizing element 22c. Conversely, the light emitted from the illumination unit 10c and specularly reflected on the surface of the object W passes through the polarizing element 22c but not through the polarizing element 22a. Therefore, in the polarized image 50a corresponding to the polarizing element 22a, the light of the illumination unit 10a is captured most strongly, and in the polarized image 50c corresponding to the polarizing element 22c, the light of the illumination unit 10c is captured most strongly.
In this way, each of the plurality of polarized images 50 is captured such that the light of the illumination unit 10 emitting linearly polarized light whose polarization direction makes the smallest angle with that of the corresponding polarizing element 22 is the strongest. That is, a plurality of polarized images 50 respectively corresponding to a plurality of illumination conditions are acquired by one-shot imaging. Therefore, the moving object W can be inspected using a plurality of polarized images 50 with different illumination conditions.
§2 Specific examples
<A. Hardware configuration of the inspection device>
FIG. 3 is a schematic diagram showing the hardware configuration of the inspection device. As shown in FIG. 3, the inspection device 30 typically has a structure based on a general-purpose computer architecture, and a processor executes pre-installed programs to realize the various kinds of processing described later.
More specifically, the inspection device 30 includes a processor 310 such as a CPU (Central Processing Unit) or an MPU (Micro-Processing Unit), a RAM (Random Access Memory) 312, a display controller 314, a system controller 316, an I/O (Input/Output) controller 318, a hard disk 320, a camera interface 322, an input interface 324, a communication interface 328, and a memory card interface 330. These units are connected to one another, centered on the system controller 316, so as to be capable of data communication.
The processor 310 exchanges programs (code) and the like with the system controller 316 and executes them in a predetermined order, thereby realizing the intended arithmetic processing.
The system controller 316 is connected to the processor 310, the RAM 312, the display controller 314, and the I/O controller 318 via buses, exchanges data with each unit, and supervises the processing of the inspection device 30 as a whole.
The RAM 312 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and holds programs read from the hard disk 320, the polarized images 50 acquired by the polarizing camera 20, processing results for the polarized images 50, and the like.
The display controller 314 is connected to a display unit 302 and outputs signals for displaying various kinds of information to the display unit 302 in accordance with internal commands from the system controller 316. The display unit 302 includes, for example, a liquid crystal display, an organic EL (Electro Luminescence) display, or the like.
The I/O controller 318 controls data exchange with recording media and external devices connected to the inspection device 30. More specifically, the I/O controller 318 is connected to the hard disk 320, the camera interface 322, the input interface 324, the communication interface 328, and the memory card interface 330.
The hard disk 320 is typically a nonvolatile magnetic storage device and stores an inspection program 350 executed by the processor 310. The inspection program 350 to be installed on the hard disk 320 is distributed in a state of being stored in a memory card 306 or the like. Camera images are also stored on the hard disk 320. Instead of the hard disk 320, a semiconductor storage device such as a flash memory or an optical storage device such as a DVD-RAM (Digital Versatile Disk Random Access Memory) may be used.
The camera interface 322 corresponds to an input unit that receives the polarized images 50 generated by imaging the object W, and mediates data transmission between the processor 310 and the polarizing camera 20. More specifically, the camera interface 322 can be connected to the polarizing camera 20, and an imaging instruction is output from the processor 310 to the polarizing camera 20 via the camera interface 322. In response, the polarizing camera 20 images the object W and outputs the generated plurality of polarized images 50 to the processor 310 via the camera interface 322.
The input interface 324 mediates data transmission between the processor 310 and input devices such as a keyboard 304, a mouse, a touch panel, or a dedicated console. That is, the input interface 324 receives operation commands given by a user operating an input device.
The communication interface 328 mediates data transmission between the processor 310 and other personal computers, server devices, and the like (not shown). The communication interface 328 typically uses Ethernet (registered trademark), USB (Universal Serial Bus), or the like. As described later, instead of installing a program stored in the memory card 306 onto the inspection device 30, a program downloaded from a distribution server or the like may be installed onto the inspection device 30 via the communication interface 328.
The memory card interface 330 mediates data transmission between the processor 310 and the memory card 306, which is a recording medium. That is, the memory card 306 is distributed in a state in which the inspection program 350 and the like to be executed by the inspection device 30 are stored, and the memory card interface 330 reads the inspection program 350 from the memory card 306. The memory card 306 is a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, an optical recording medium such as a CD-ROM (Compact Disk Read Only Memory), or the like.
<B. Inspection examples>
As described above, the optimum illumination conditions differ depending on the type of defect to be inspected. Therefore, the positions, illumination intensities, polarization directions, and the like of the plurality of illumination units 10 are set appropriately according to the type of defect to be inspected. First to fifth inspection examples using the inspection system 1 according to the present embodiment are described below. Note that the inspection system 1 is not limited to the following first to fifth inspection examples and may be used for other inspections.
(B-1. First inspection example)
The first inspection example uses the photometric stereo method (also called the illuminance-difference stereo method). The photometric stereo method estimates the inclination (normal vector) of the surface of an object using a plurality of images obtained under conditions in which the incident direction of the illumination light differs.
FIG. 4 is a diagram showing an example of a conventional illumination device used in the photometric stereo method. As shown in FIG. 4, four arc regions 110a to 110d of a ring-shaped illumination device 110 are lit in sequence, and images are captured with a non-polarizing camera. The center of the illumination device 110 is located on the optical axis 225 of the non-polarizing camera.
Specifically, the first image is captured with only the arc region 110a lit. Next, the second image is captured with only the arc region 110b lit. Next, the third image is captured with only the arc region 110c lit. Next, the fourth image is captured with only the arc region 110d lit. In this way, four images are obtained under conditions in which the incident direction of the illumination light differs. However, because four imaging operations are required, the illumination device 110 shown in FIG. 4 cannot be applied to a moving object W. Therefore, in the first inspection example, the inspection system 1 according to the present embodiment acquires the plurality of images used in the photometric stereo method by one-shot imaging.
(Arrangement of the illumination units)
FIG. 5 is a diagram showing the arrangement of the plurality of illumination units according to the first inspection example. As shown in FIG. 5, the plurality of illumination units 10 include illumination units 10a to 10d.
The illumination units 10a and 10c are arranged at positions symmetric with respect to the optical axis 25 of the polarizing camera 20, and the illumination units 10b and 10d are likewise arranged at positions symmetric with respect to the optical axis 25. Around the optical axis 25 of the polarizing camera 20, the difference between the azimuth angle at which the illumination unit 10a is arranged and the azimuth angle at which the illumination unit 10b is arranged is 90°. Hereinafter, with respect to the optical axis 25 of the polarizing camera 20, the azimuth direction in which the illumination unit 10a is arranged is referred to as the X direction, and the azimuth direction in which the illumination unit 10b is arranged is referred to as the Y direction.
The illumination units 10a to 10d have light emitting units 11a to 11d and linear polarizing filters 12a to 12d, respectively.
The light emitting units 11a to 11d emit unpolarized light. Each of the light emitting units 11a to 11d has an arc shape with a central angle of 90°. The arc-shaped light emitting units 11a to 11d are arranged so that their centers coincide and they do not overlap one another. Specifically, one end of each of the light emitting units 11a to 11d is in contact with the other end of the light emitting units 11b to 11d and 11a, respectively. That is, combining the light emitting units 11a to 11d forms one ring-shaped illumination; in other words, the light emitting units 11a to 11d are four arc regions, each with a central angle of 90°, of one ring-shaped illumination. The common center of the arc-shaped light emitting units 11a to 11d is located on the optical axis 25 of the polarizing camera 20.
The linear polarizing filters 12a to 12d are attached to the light emitting surfaces of the light emitting units 11a to 11d, respectively.
FIG. 6 is a diagram showing an example of the polarization directions of the linear polarizing filters 12a to 12d. Taking the polarization direction of the light transmitted through the linear polarizing filter 12a as the reference direction and defining the polarization angle as the angle between the reference direction and a given polarization direction, the polarization angles of the linear polarizing filters 12a to 12d are 0°, 45°, 90°, and 135°, respectively. Note that these polarization angles may include manufacturing errors, mounting errors, and the like. For example, when the maximum total of the manufacturing and mounting errors is β°, a "polarization angle of 45°" covers the range of 45° ± β°. Thus, the angles between the polarization direction of the illumination light of the illumination units 10b to 10d and that of the illumination unit 10a are 45°, 90°, and 135°, respectively.
In the first inspection example, the inspection system 1 includes the polarizing camera 20 with the polarizing elements 22a to 22d shown in FIG. 2. The polarization directions of the polarizing elements 22a to 22d are parallel to those of the linear polarizing filters 12a to 12d, respectively. Therefore, the light emitted from the illumination units 10a to 10d and specularly reflected by the object W passes through the polarizing elements 22a to 22d, respectively. The light emitted from the illumination unit 10a has components parallel to the polarization directions of the polarizing elements 22b and 22d, so part of the light emitted from the illumination unit 10a and specularly reflected by the object W also passes through the polarizing elements 22b and 22d. Specifically, the amount of light emitted from the illumination unit 10a that passes through each of the polarizing elements 22b and 22d is about one half of the amount that passes through the polarizing element 22a. Similarly, part of the light emitted from the illumination unit 10b and specularly reflected by the object W passes through the polarizing elements 22a and 22c; part of the light emitted from the illumination unit 10c passes through the polarizing elements 22b and 22d; and part of the light emitted from the illumination unit 10d passes through the polarizing elements 22a and 22c.
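The factor of "about one half" above can be checked with Malus's law, assuming ideal linear polarizing elements and neglecting any depolarization at the surface (a simplifying assumption):

```latex
I_t = I_0 \cos^2 \psi , \qquad I_t \big|_{\psi = 45^\circ} = \tfrac{1}{2} I_0
```

Here ψ denotes the angle between the polarization direction of the reflected light and that of the polarizing element; the same relation with ψ = 0° and ψ = 90° accounts for the full transmission and the complete blocking described above.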
(Flow of the inspection process)
FIG. 7 is a flowchart showing an example of the flow of the inspection process in the first inspection example. First, the polarizing camera 20 images the object W while the illumination units 10a to 10d are lit simultaneously, and outputs the polarized images 50a to 50d corresponding to the polarizing elements 22a to 22d, respectively (step S1). After step S1, the inspection system 1 starts steps S2, S3, and S10 in parallel.
In step S2, the processor 310 of the inspection device 30 uses the polarized images 50a to 50d to generate an X-direction normal image indicating the magnitude of the component along the X direction of the normal vector of the surface of the object W. Similarly, in step S3, the processor 310 uses the polarized images 50a to 50d to generate a Y-direction normal image indicating the magnitude of the Y-direction component of the normal vector of the surface of the object W. The methods of generating the X-direction normal image and the Y-direction normal image are described in detail later.
After steps S2 and S3, steps S4 and S5 are performed, respectively. In steps S2 and S3, as described later, the X-direction normal image and the Y-direction normal image are generated on the assumption that the glossiness α of the surface of the object W is 1. However, the glossiness α of the surface of the object W is not always 1, and the proportion of the incident light that is specularly reflected varies with the glossiness α. Therefore, in step S4, the processor 310 corrects the X-direction normal image according to the glossiness α of the surface of the object W, and in step S5, the processor 310 corrects the Y-direction normal image according to the glossiness α. The correction methods for the X-direction normal image and the Y-direction normal image are described in detail later.
After steps S4 and S5, the processor 310 starts steps S6 and S8 in parallel.
In step S6, the processor 310 generates a shape image representing the shape of the surface of the object W based on the X-direction normal image and the Y-direction normal image. The method of generating the shape image is described in detail later.
Next, in step S7, the processor 310 inspects the surface of the object W based on the shape image. For example, the processor 310 checks for defects that produce unevenness, such as scratches and dents. In step S7, the processor 310 may generate a binary image by binarizing the shape image and inspect for defects based on the binary image.
In step S8, the processor 310 generates an albedo image indicating the proportion of the incident light that is specularly reflected (the albedo, i.e., the reflectance). The albedo image is generated using the polarized images 50a to 50d and the X-direction normal image and Y-direction normal image corrected in steps S4 and S5, respectively. The method of generating the albedo image is described in detail later.
Next, in step S9, the processor 310 inspects the surface of the object W based on the albedo image. For example, the processor 310 checks for defects that change the albedo, such as stains. In step S9, the processor 310 may generate a binary image by binarizing the albedo image and inspect for defects based on the binary image.
In step S10, the processor 310 generates an average image by averaging the polarized images 50a to 50d. The value of each pixel of the average image is the average of the values of that pixel in the polarized images 50a to 50d. The average image corresponds to an image that would be obtained by removing the linear polarizing filters 12a to 12d, lighting the light emitting units 11a to 11d simultaneously, and imaging with a non-polarizing camera instead of the polarizing camera 20.
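As one possible realization of step S10, assuming the four polarized images are floating-point arrays of equal shape (the function name is illustrative):

```python
import numpy as np

def average_image(ja: np.ndarray, jb: np.ndarray, jc: np.ndarray, jd: np.ndarray) -> np.ndarray:
    # Pixel-wise mean of the four polarized images 50a to 50d (step S10).
    return (ja + jb + jc + jd) / 4.0
```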
Next, in step S11, the processor 310 performs positioning and the like of the object W using the average image.
After steps S7, S9, and S11 are completed, the processor 310 causes the display unit 302 to display the inspection results (step S12). After step S12, the processor 310 ends the inspection process.
(Method of generating the X-direction and Y-direction normal images)
FIG. 8 is a diagram showing the relationship between the illumination units 10a to 10d and the normal vector n at a point P on the surface of the object W. In FIG. 8, the optical axis 25 of the polarizing camera 20 coincides with the Z axis. The normal vector n is represented by (nx, ny, nz), where nx, ny, and nz are the X, Y, and Z components of the normal vector n, respectively.
FIG. 9 is a diagram showing the relationship between the traveling direction of the light emitted from the illumination unit 10c and the normal direction at the point P when both are projected onto the XZ plane. In FIG. 9, the normal direction is represented by the vector (nx, nz). When the angle between the normal direction and the Z axis is denoted by φx, it is expressed as
φx = arctan(nx/nz).
Furthermore, when the incident angle of the light emitted from each of the illumination units 10a to 10d is θ, and nx > 0 and nz > 0, the angle between the Z axis and the projection onto the XZ plane of the traveling direction (specular reflection direction) of the light emitted from the illumination unit 10c and specularly reflected at the point P is θ + 2φx.
FIG. 10 is a diagram showing the relationship between the traveling direction of the light emitted from the illumination unit 10c and the normal direction at the point P when both are projected onto the YZ plane. In FIG. 10, the normal direction is represented by the vector (ny, nz). When the angle between the normal direction and the Z axis is denoted by φy, it is expressed as
φy = arctan(ny/nz).
Furthermore, when the incident angle of the light emitted from the illumination unit 10c is θ, and ny > 0 and nz > 0, the angle between the Z axis and the projection onto the YZ plane of the traveling direction (specular reflection direction) of the light emitted from the illumination unit 10c and specularly reflected at the point P is 2φy.
Similarly, the angles between the Z axis and the projections onto the XZ plane of the traveling directions (specular reflection directions) of the light emitted from the illumination units 10a, 10b, and 10d and specularly reflected at the point P are θ − 2φx, 2φx, and 2φx, respectively. The angles between the Z axis and the projections onto the YZ plane of the traveling directions (specular reflection directions) of the light emitted from the illumination units 10a, 10b, and 10d and specularly reflected at the point P are 2φy, θ − 2φy, and θ + 2φy, respectively.
Of the light emitted from the illumination units 10a to 10d and specularly reflected at the point P, the intensities Ia to Id of the light incident on the polarizing camera 20 are approximated by the following expressions using the Phong reflection model:
Ia = μ[cos^α(θ − 2φx) cos^α(2φy)]
Ib = μ[cos^α(2φx) cos^α(θ − 2φy)]
Ic = μ[cos^α(θ + 2φx) cos^α(2φy)]
Id = μ[cos^α(2φx) cos^α(θ + 2φy)]
In the above expressions, α represents the glossiness, and μ represents the albedo (reflectance) of the surface of the object W.
As described above, the polarization direction of the polarizing element 22a of the polarizing camera 20 is parallel to the polarization direction of the illumination unit 10a and orthogonal to that of the illumination unit 10c. Furthermore, the angle between the polarization direction of the polarizing element 22a and the polarization directions of the illumination units 10b and 10d is 45°. Therefore, when the pixel value of the polarized image 50a corresponding to the polarizing element 22a is denoted by Ja, it is expressed as
Ja = Ia + (Ib + Id)/2.
Similarly, the pixel values Jb to Jd of the polarized images 50b to 50d are expressed as
Jb = Ib + (Ia + Ic)/2
Jc = Ic + (Id + Ib)/2
Jd = Id + (Ic + Ia)/2.
When the glossiness α = 1, the following equations (1) to (3) hold:
Ja + Jb + Jc + Jd = 2(Ia + Ib + Ic + Id) = 8μ cos(θ) cos(2φx) cos(2φy)   ... (1)
Ja − Jc = Ia − Ic = 2μ sin(θ) sin(2φx) cos(2φy)   ... (2)
Jb − Jd = Ib − Id = 2μ sin(θ) cos(2φx) sin(2φy)   ... (3)
From equations (1) and (2), the following equation (4) is derived. Similarly, from equations (1) and (3), the following equation (5) is derived:
tan(2φx) = {4(Ja − Jc)} / {(Ja + Jb + Jc + Jd) tan(θ)}   ... (4)
tan(2φy) = {4(Jb − Jd)} / {(Ja + Jb + Jc + Jd) tan(θ)}   ... (5)
In equations (4) and (5), the incident angle θ is determined in advance according to the positions of the illumination units 10a to 10d. The processor 310 therefore computes the right-hand side of equation (4) using the pixel values of the polarized images 50a to 50d and the incident angle θ, and generates an X-direction normal image whose pixel values are the results Nx. Similarly, the processor 310 computes the right-hand side of equation (5) using the pixel values of the polarized images 50a to 50d and the incident angle θ, and generates a Y-direction normal image whose pixel values are the results Ny. The value tan(2φx) (≡ Nx) given by equation (4) depends on nx, the X-direction component of the normal vector, and the value tan(2φy) (≡ Ny) given by equation (5) depends on ny, the Y-direction component of the normal vector. Therefore, the X-direction normal image and the Y-direction normal image represent the normal direction of the surface of the object W.
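Equations (4) and (5) can be evaluated pixel-wise. The following NumPy sketch assumes that ja to jd are the polarized images 50a to 50d as floating-point arrays, that theta is the incident angle in radians, and that a small constant eps (an illustrative choice) guards against division by zero in dark regions:

```python
import numpy as np

def normal_images(ja, jb, jc, jd, theta, eps=1e-12):
    """Return (Nx, Ny) = (tan(2*phi_x), tan(2*phi_y)) per equations (4) and (5)."""
    denom = (ja + jb + jc + jd) * np.tan(theta) + eps
    nx = 4.0 * (ja - jc) / denom   # X-direction normal image, equation (4)
    ny = 4.0 * (jb - jd) / denom   # Y-direction normal image, equation (5)
    return nx, ny
```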
(Method of correcting the X-direction and Y-direction normal images)
The X-direction normal image and the Y-direction normal image are generated according to equations (4) and (5), respectively, on the assumption that the glossiness α of the surface of the object W is 1. Therefore, the X-direction normal image and the Y-direction normal image are corrected according to the actual glossiness α of the object W. As a result, normal images that reflect the glossiness α are generated with high accuracy.
The glossiness α of the object W is determined in advance according to the following pre-inspection procedure. In the pre-inspection procedure, a defect-free object W is placed on a two-axis goniometer stage. The two-axis goniometer stage is a stage that can be tilted in the X direction and the Y direction.
When the tilt angles of the two-axis goniometer stage in the X and Y directions are set to 0°, the object W is placed on the two-axis goniometer stage such that its upper surface has a horizontal and flat region (hereinafter referred to as the "target region"). Therefore, the angle φx between the Z axis and the X-direction component of the normal vector of the target region coincides with the tilt angle of the two-axis goniometer stage in the X direction. Similarly, the angle φy between the Z axis and the Y-direction component of the normal vector of the target region coincides with the tilt angle of the two-axis goniometer stage in the Y direction.
Images are captured M times using the inspection system 1 while the tilt angles of the two-axis goniometer stage in the X and Y directions are varied. For each of the M imaging operations, an X-direction normal image and a Y-direction normal image are generated. For an arbitrary single pixel within the target region of the X-direction and Y-direction normal images, the processor 310 stores a data set [φxm, φym, Nxm, Nym] in the RAM 312, where m is the imaging number and is an integer from 1 to M, φxm is the tilt angle of the two-axis goniometer stage in the X direction at the m-th imaging, φym is the tilt angle of the stage in the Y direction at the m-th imaging, Nxm is the pixel value of the X-direction normal image, and Nym is the pixel value of the Y-direction normal image.
The processor 310 determines the glossiness α that best fits the data sets [φxm, φym, Nxm, Nym] stored in the RAM 312, using the following theoretical formulas:
Ia = cos^α(θ − 2φxm) cos^α(2φym)
Ib = cos^α(2φxm) cos^α(θ − 2φym)
Ic = cos^α(θ + 2φxm) cos^α(2φym)
Id = cos^α(2φxm) cos^α(θ + 2φym)
Nx = {2(Ia − Ic)} / {(Ia + Ib + Ic + Id) tan(θ)}
Ny = {2(Ib − Id)} / {(Ia + Ib + Ic + Id) tan(θ)}
For example, the processor 310 determines the glossiness α by a nonlinear least-squares method. Through this pre-inspection procedure, the glossiness α of the object W is determined in advance.
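As one possible implementation of this fit, the theoretical formulas can be wrapped in a residual function and passed to a nonlinear least-squares solver. The sketch below uses SciPy and assumes that phix, phiy, nx_meas, and ny_meas are arrays holding the M recorded values (angles in radians), that theta is the incident angle, and that the tilt angles are small enough that all cosines remain positive; the function names and the bounds on α are illustrative choices, not part of the embodiment.

```python
import numpy as np
from scipy.optimize import least_squares

def predict_nx_ny(alpha, phix, phiy, theta):
    # Theoretical Nx, Ny for tilt angles (phix, phiy) and incident angle theta.
    ia = np.cos(theta - 2 * phix) ** alpha * np.cos(2 * phiy) ** alpha
    ib = np.cos(2 * phix) ** alpha * np.cos(theta - 2 * phiy) ** alpha
    ic = np.cos(theta + 2 * phix) ** alpha * np.cos(2 * phiy) ** alpha
    id_ = np.cos(2 * phix) ** alpha * np.cos(theta + 2 * phiy) ** alpha
    s = (ia + ib + ic + id_) * np.tan(theta)
    return 2 * (ia - ic) / s, 2 * (ib - id_) / s

def fit_glossiness(phix, phiy, nx_meas, ny_meas, theta, alpha0=1.0):
    # Nonlinear least-squares fit of the glossiness alpha to the M data sets.
    def residuals(p):
        nx_pred, ny_pred = predict_nx_ny(p[0], phix, phiy, theta)
        return np.concatenate([nx_pred - nx_meas, ny_pred - ny_meas])
    return least_squares(residuals, x0=[alpha0], bounds=(0.1, 100.0)).x[0]
```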
FIG. 11 is a diagram showing the relationship between Ny, φx, and φy when the glossiness α = 5. FIG. 12 is a diagram showing the relationship between Ny, φx, and φy when the glossiness α = 10. FIG. 13 is a diagram showing the relationship between Ny, φx, and φy when the glossiness α = 20.
As shown in FIGS. 11 to 13, the relationship between Ny, the result of computing the right-hand side of equation (5), and the angles φx and φy between the Z axis and the X- and Y-direction components of the normal vector changes monotonically. That is, Ny increases as φy increases. In addition, when φy > 0, Ny decreases as φx increases, and when φy < 0, Ny increases as φx increases. However, while the relationship between Ny and φy is linear when the glossiness α is close to 1, it becomes nonlinear as the glossiness α increases. The processor 310 therefore corrects the normal images according to the glossiness α.
Specifically, the processor 310 back-calculates (φx, φy) from (Nx, Ny) using the above theoretical formulas with the previously determined glossiness α substituted. The processor 310 corrects the X-direction normal image by replacing each pixel value Nx of the X-direction normal image with φx, and similarly corrects the Y-direction normal image by replacing each pixel value Ny with φy.
Alternatively, the processor 310 may select, from a group of lookup tables created in advance, the lookup table corresponding to the glossiness α, and may convert (Nx, Ny) into (φx, φy) using the selected lookup table. Each lookup table is created in advance using the above theoretical formulas with the corresponding glossiness α substituted. Using a lookup table shortens the time required for the normal image correction process.
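The lookup-table variant can be sketched as follows, reusing predict_nx_ny from the previous sketch. A table of (Nx, Ny) values is computed on a regular (φx, φy) grid for the fitted glossiness α, and each measured (Nx, Ny) pair is replaced by the nearest tabulated entry; the grid range, the grid step, and the nearest-neighbor search via a k-d tree are illustrative choices, not part of the embodiment.

```python
import numpy as np
from scipy.spatial import cKDTree

def build_lut(alpha, theta, max_deg=20.0, step_deg=0.25):
    # Tabulate (Nx, Ny) on a regular (phi_x, phi_y) grid for one glossiness value.
    angles = np.deg2rad(np.arange(-max_deg, max_deg + step_deg, step_deg))
    phix, phiy = np.meshgrid(angles, angles, indexing="ij")
    nx, ny = predict_nx_ny(alpha, phix, phiy, theta)
    tree = cKDTree(np.column_stack([nx.ravel(), ny.ravel()]))
    return tree, phix.ravel(), phiy.ravel()

def correct_normal_images(nx_img, ny_img, tree, phix_tab, phiy_tab):
    # Replace each (Nx, Ny) pixel pair with the nearest tabulated (phi_x, phi_y).
    _, idx = tree.query(np.column_stack([nx_img.ravel(), ny_img.ravel()]))
    return phix_tab[idx].reshape(nx_img.shape), phiy_tab[idx].reshape(ny_img.shape)
```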
(Method of generating the albedo image)
When the glossiness α of the surface of the object W is 1, the following equation (6) holds:
μ = {(Ja + Jb + Jc + Jd)/2} [{1 + tan^2(2φx)}{1 + tan^2(2φy)}]^(1/2) / {4 cos(θ)}   ... (6)
FIG. 14 is a diagram showing the relationship between the albedo μ, φx, and φy when the glossiness α = 5. FIG. 15 is a diagram showing the relationship between the albedo μ, φx, and φy when the glossiness α = 10. FIG. 16 is a diagram showing the relationship between the albedo μ, φx, and φy when the glossiness α = 20.
As shown in FIGS. 14 to 16, the relationship between the albedo μ and the angles φx and φy changes monotonically. However, while the relationship between the albedo μ and φx and φy is linear when the glossiness α is close to 1, it becomes nonlinear as the glossiness α increases. The processor 310 therefore generates the albedo image by substituting into the above equation the pixel values φx and φy of the X-direction and Y-direction normal images corrected according to the glossiness α, together with the pixel values Ja to Jd of the polarized images 50a to 50d. Each pixel value of the albedo image indicates the albedo μ.
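A pixel-wise evaluation of equation (6) can be sketched as follows, assuming that ja to jd are the polarized images as floating-point arrays, that phix and phiy are the corrected normal images in radians, and that theta is the incident angle; the function name is illustrative.

```python
import numpy as np

def albedo_image(ja, jb, jc, jd, phix, phiy, theta):
    """Albedo image per equation (6), using the corrected phi_x and phi_y."""
    total = (ja + jb + jc + jd) / 2.0
    factor = np.sqrt((1.0 + np.tan(2 * phix) ** 2) * (1.0 + np.tan(2 * phiy) ** 2))
    return total * factor / (4.0 * np.cos(theta))
```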
(Method of generating the shape image)
The shape image is generated based on the X-direction normal image and the Y-direction normal image corrected according to the glossiness α.
FIG. 17 is a diagram showing the processing performed on the X-direction normal image to generate the shape image. The pixel values of the X-direction normal image indicate the magnitude of the X-direction component of the normal vector of the surface of the object W.
The processor 310 selects, as a point of interest Q, a point surrounded by four pixels in the X-direction normal image. The horizontal direction of the X-direction normal image corresponds to the X direction. Therefore, as shown in FIG. 17, the processor 310 sets rectangular regions R1 and R2 on the left and right of the point of interest Q, respectively. The height of the rectangular regions R1 and R2 is a predetermined length L, and their width is L/2.
The processor 310 calculates the sum Sx1 of the values of the pixels contained in the rectangular region R1 and, similarly, the sum Sx2 of the values of the pixels contained in the rectangular region R2, and then calculates the difference Sx between the sums Sx1 and Sx2. The processor 310 calculates the difference Sx for every point surrounded by four pixels in the X-direction normal image.
FIG. 18 is a diagram showing the processing performed on the Y-direction normal image to generate the shape image. The pixel values of the Y-direction normal image indicate the magnitude of the Y-direction component of the normal vector of the surface of the object W.
The processor 310 selects, as a point of interest Q, a point surrounded by four pixels in the Y-direction normal image. The vertical direction of the Y-direction normal image corresponds to the Y direction. Therefore, as shown in FIG. 18, the processor 310 sets rectangular regions R3 and R4 above and below the point of interest Q, respectively. The width of the rectangular regions R3 and R4 is L, and their height is L/2.
The processor 310 calculates the sum Sy1 of the values of the pixels contained in the rectangular region R3 and, similarly, the sum Sy2 of the values of the pixels contained in the rectangular region R4, and then calculates the difference Sy between the sums Sy1 and Sy2. The processor 310 calculates the difference Sy for every point surrounded by four pixels in the Y-direction normal image.
When a defect such as a scratch or a dent exists on the surface of the object W, the normal vector changes greatly at the defect. The difference Sx becomes larger as the X-direction component of the normal vector changes more sharply, and the difference Sy becomes larger as the Y-direction component changes more sharply. That is, the differences Sx and Sy take large values at locations where a defect producing unevenness, such as a scratch or a dent, exists. The processor 310 therefore generates a shape image whose pixel values S are given by
S = A(Sx + Sy) + B
where A is a parameter that determines the contrast of the shape image and B is a parameter that determines the overall pixel-value level of the shape image. The processor 310 sets the values of the parameters A and B so that the maximum and minimum pixel values of the shape image and the difference between them fall within specified ranges. The specified ranges are predetermined according to the contrast and level suitable for defect inspection. As a result, a shape image suitable for inspecting defects that produce unevenness is generated.
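One way to compute the shape image is to express the rectangular-region sums as correlations with a signed box kernel. The sketch below uses SciPy and NumPy; the boundary mode and the neglect of the half-pixel offset of the point of interest Q are simplifications, L is assumed to be an even number of pixels, and A and B are taken as given (the embodiment chooses them so that the pixel values fall within the specified ranges). The function name is illustrative.

```python
import numpy as np
from scipy.ndimage import correlate

def shape_image(nx, ny, L, A=1.0, B=0.0):
    """Shape image S = A * (Sx + Sy) + B from the corrected normal images.

    Sx approximates the sum over the L x (L/2) rectangle R1 left of each point
    minus the sum over R2 to its right; Sy approximates the sum over R3 above
    minus the sum over R4 below.
    """
    h = L // 2
    kx = np.zeros((L, 2 * h))
    kx[:, :h] = 1.0            # left rectangle R1 (+1)
    kx[:, h:] = -1.0           # right rectangle R2 (-1)
    ky = kx.T                  # upper rectangle R3 (+1), lower rectangle R4 (-1)
    sx = correlate(nx, kx, mode="nearest")
    sy = correlate(ny, ky, mode="nearest")
    return A * (sx + sy) + B
```

The result can then be binarized for step S7, for example by thresholding how far S deviates from the level B.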
(Examples of polarized images, normal images, shape images, albedo images, and average images)
FIG. 19 shows polarized images obtained by imaging a button battery and various images generated from the polarized images. The processor 310 generates an X-direction normal image 51 and a Y-direction normal image 52 from the polarized images 50a to 50d. The processor 310 generates a shape image 53 based on the X-direction normal image 51 and the Y-direction normal image 52, and generates a binary image 54 by binarizing the shape image 53. Further, the processor 310 generates an albedo image 55 and an average image 56 from the polarized images 50a to 50d.
In FIG. 19, a dent is present in the region surrounded by the frame line F1 on the surface of the button battery, and dirt is present in the region surrounded by the frame line F2. As shown in FIG. 19, in the shape image 53, a change in pixel value corresponding to the dented portion is observed inside the frame line F1. In the binary image 54, the dented portion appears in white. Accordingly, by using the shape image 53 or the binary image 54, defects accompanied by unevenness, such as scratches or dents, can be inspected with high accuracy. In the X-direction normal image 51 and the Y-direction normal image 52 as well, the pixel values differ between the dented portion and its surroundings, so these normal images can also be used to inspect defects accompanied by unevenness.
In the X-direction normal image 51, the Y-direction normal image 52, the shape image 53, and the binary image 54, no change in pixel value is observed between the dirty portion and its surroundings, because the dirt does not change the surface unevenness. In the albedo image 55, by contrast, a change in pixel value corresponding to the dirt is observed inside the frame line F2, because the dirt changes the albedo (reflectance). Thus, by using the albedo image 55, defects that change the albedo (reflectance), such as dirt, can be inspected with high accuracy.
FIG. 20 shows polarized images obtained by imaging the side surface of a dry cell battery and various images generated from them: the polarized images 50a to 50d, the X-direction normal image 51, the Y-direction normal image 52, and the average image 56 generated from the polarized images 50a to 50d, and the shape image 53 and the binary image 54 generated from the X-direction normal image 51 and the Y-direction normal image 52. In FIG. 20, a dent is present in the region surrounded by the frame line F1 on the side surface of the battery. In the shape image 53, a change in pixel value corresponding to the dented portion is observed inside the frame line F1, and in the binary image 54 the dented portion appears in white. Accordingly, by using the shape image 53 or the binary image 54, defects accompanied by unevenness, such as scratches or dents, can be inspected with high accuracy.
FIG. 21 shows polarized images obtained by imaging the bottom surface of a dry cell battery and various images generated from them: the polarized images 50a to 50d, the X-direction normal image 51, the Y-direction normal image 52, and the average image 56 generated from the polarized images 50a to 50d, and the shape image 53 generated from the X-direction normal image 51 and the Y-direction normal image 52. In FIG. 21, a dent is present in the region surrounded by the frame line F1 on the bottom surface of the battery. In the shape image 53, a change in pixel value corresponding to the dented portion is observed inside the frame line F1. Accordingly, by using the shape image 53, defects accompanied by unevenness, such as scratches or dents, can be inspected with high accuracy.
FIG. 22 shows polarized images obtained by imaging a textured (grained) resin surface and various images generated from them: the polarized images 50a to 50d, the X-direction normal image 51, the Y-direction normal image 52, and the average image 56 generated from the polarized images 50a to 50d, and the shape image 53 and the binary image 54 generated from the X-direction normal image 51 and the Y-direction normal image 52. In FIG. 22, a dent is present in the region surrounded by the frame line F1 on the resin surface, and dirt is present in the region surrounded by the frame line F2.
As shown in FIG. 22, in the shape image 53, a change in pixel value corresponding to the dented portion is observed inside the frame line F1, and in the binary image 54 the dented portion appears in white. Accordingly, by using the shape image 53 or the binary image 54, defects accompanied by unevenness, such as scratches or dents, can be inspected with high accuracy.
In the X-direction normal image 51, the Y-direction normal image 52, the shape image 53, and the binary image 54, no change in pixel value is observed between the dirty portion and its surroundings, because the dirt does not change the surface unevenness. In the average image 56, by contrast, a change in pixel value corresponding to the dirty portion is observed inside the frame line F2, because the dirt increases the albedo (reflectance). Thus, the average image 56 can also be used to inspect, with high accuracy, defects that change the albedo (reflectance), such as dirt.
(B-2. Second inspection example)
The second inspection example uses two images obtained under bright-field illumination and dark-field illumination, respectively. Bright-field illumination is an illumination method in which light specularly reflected by the surface of the object W enters the polarizing camera 20. Dark-field illumination is an illumination method in which light specularly reflected by the surface of the object W does not enter the polarizing camera 20 and only diffusely reflected light enters the polarizing camera 20. Light is less likely to be specularly reflected at a scratched portion, so in an image obtained under bright-field illumination the brightness of the scratched portion decreases; scratches can therefore be inspected with high accuracy using an image obtained under bright-field illumination. On the other hand, diffuse reflection is more likely to occur where dirt or a foreign substance is present, so in an image obtained under dark-field illumination the brightness of such a portion increases; dirt and foreign substances can therefore be inspected with high accuracy using an image obtained under dark-field illumination.
(Arrangement of the illumination units)
FIG. 23 shows the arrangement of the plurality of illumination units according to the second inspection example. As shown in FIG. 23, the plurality of illumination units 10 include illumination units 10e and 10f. The illumination units 10e and 10f are arranged so that their elevation angles with respect to the object W differ from each other.
The illumination unit 10f is a ring-shaped illumination unit centered on the optical axis 25 of the polarizing camera 20. The illumination unit 10f has a ring-shaped light emitting unit 11f centered on the optical axis 25 of the polarizing camera 20 and a linear polarizing filter 12f attached to the light emitting surface of the light emitting unit 11f. The elevation angle of the illumination unit 10f with respect to the object W is set so that light specularly reflected by the object W does not enter the polarizing camera 20.
FIG. 24 shows the configuration of the illumination unit 10e. The illumination unit 10e is a coaxial illumination unit that irradiates the object W with illumination light along the optical axis 25 of the polarizing camera 20. As shown in FIG. 24, the illumination unit 10e includes a light emitting unit 11e, a linear polarizing filter 12e, and a half mirror 13e. The linear polarizing filter 12e is arranged between the light emitting surface of the light emitting unit 11e and the half mirror 13e, and transmits, out of the unpolarized light emitted from the light emitting unit 11e, linearly polarized light along a predetermined direction. The half mirror 13e reflects the linearly polarized light transmitted through the linear polarizing filter 12e so that it falls on the object W along the optical axis 25 of the polarizing camera 20.
The linear polarizing filters 12e and 12f are arranged so that the polarization directions of the linearly polarized light irradiated onto the object W from the illumination units 10e and 10f are orthogonal to each other.
(Inspection method)
In the second inspection example, the inspection system 1 includes the polarizing camera 20 including the polarizing elements 22a to 22d shown in FIG. 2. The polarization direction of the polarizing element 22a coincides with (is parallel to) the polarization direction of the linearly polarized light irradiated onto the object W from the illumination unit 10e. The polarization direction of the polarizing element 22c coincides with (is parallel to) the polarization direction of the linearly polarized light emitted from the illumination unit 10f. Therefore, light emitted from the illumination unit 10e and specularly reflected by the object W passes through the polarizing element 22a and does not pass through the polarizing element 22c. Part of the light emitted from the illumination unit 10f and diffusely reflected by the object W passes through the polarizing element 22c. Light emitted from the illumination unit 10f and specularly reflected by the object W does not enter the polarizing camera 20.
FIG. 25 shows an example of the plurality of polarized images obtained in the second inspection example. The polarized images 50a to 50d correspond to the polarizing elements 22a to 22d, respectively.
The polarized image 50a shows the brightness of the light emitted from the illumination unit 10e and specularly reflected by the object W. That is, the polarized image 50a corresponds to an image captured under bright-field illumination. Scratches are therefore easily confirmed inside the frame line F1 of the polarized image 50a, and can be inspected with high accuracy using the polarized image 50a.
The polarized image 50c shows the brightness of the portion of the light emitted from the illumination unit 10f and diffusely reflected by the object W that passes through the polarizing element 22c. That is, the polarized image 50c corresponds to an image captured under dark-field illumination. Dirt is therefore easily confirmed inside the frame line F2 of the polarized image 50c, and can be inspected with high accuracy using the polarized image 50c.
The polarized images 50b and 50d show brightness intermediate between those of the polarized images 50a and 50c. The polarized images 50b and 50d therefore need not be used for inspecting the object W.
(B-3. Third inspection example)
The third inspection example uses a phase shift method. The phase shift method measures the three-dimensional shape of the object W from a plurality of images captured while a fringe pattern whose irradiation intensity is modulated sinusoidally is projected with its phase shifted.
(Arrangement of the illumination units)
FIG. 26 shows an example of the arrangement of the plurality of illumination units according to the third inspection example. As shown in FIG. 26, the plurality of illumination units 10 include illumination units 10e and 10g to 10i. As described in the second inspection example, the illumination unit 10e irradiates the object W with illumination light coaxial with the optical axis 25 of the polarizing camera 20 (see FIG. 24).
Each of the illumination units 10g to 10i is a ring-shaped illumination unit. The illumination units 10g to 10i respectively have ring-shaped light emitting units 11g to 11i centered on the optical axis 25 of the polarizing camera 20 and linear polarizing filters 12g to 12i attached to the light emitting surfaces of the light emitting units.
When the polarization direction of the light emitted from the illumination unit 10e is taken as the reference direction, the linear polarizing filters 12e and 12g to 12i are arranged so that the angles between the reference direction and the polarization directions of the light emitted from the illumination units 10e and 10g to 10i are 0°, 45°, 90°, and 135°, respectively.
The illumination units 10e and 10g to 10i are arranged so that their elevation angles with respect to the object W differ from one another. Specifically, the illumination units 10e, 10g, 10h, and 10i are arranged so that the elevation angle with respect to the object W decreases in this order. In other words, the incident angle of the illumination light from the illumination units 10e, 10g, 10h, and 10i onto the object W increases in this order. The elevation angle of the illumination unit 10e with respect to the object W is 90°, and the incident angle of the illumination light from the illumination unit 10e onto the object W is 0°.
In the third inspection example, the inspection system 1 includes the polarizing camera 20 including the polarizing elements 22a to 22d shown in FIG. 2. The polarization directions of the polarizing elements 22a to 22d are parallel to the polarization directions of the light emitted from the illumination units 10e and 10g to 10i, respectively. Therefore, light emitted from the illumination units 10e and 10g to 10i and specularly reflected by the object W passes through the polarizing elements 22a to 22d, respectively. The light emitted from the illumination unit 10e has components parallel to the polarization directions of the polarizing elements 22b and 22d, so part of the light emitted from the illumination unit 10e and specularly reflected by the object W passes through the polarizing elements 22b and 22d. Specifically, the amount of light that is emitted from the illumination unit 10e and passes through each of the polarizing elements 22b and 22d is about half the amount of light that is emitted from the illumination unit 10e and passes through the polarizing element 22a. Similarly, part of the light emitted from the illumination unit 10g and specularly reflected by the object W passes through the polarizing elements 22a and 22c, part of the light emitted from the illumination unit 10h and specularly reflected by the object W passes through the polarizing elements 22b and 22d, and part of the light emitted from the illumination unit 10i and specularly reflected by the object W passes through the polarizing elements 22a and 22c. Light emitted from the illumination units 10e, 10g, 10h, and 10i and specularly reflected by the object W does not pass through the polarizing elements 22c, 22d, 22a, and 22b, respectively.
FIG. 27 shows, for the light emitted from the illumination units 10e and 10g to 10i and specularly reflected by the object W, the amount that passes through each polarizing element. As shown in FIG. 27, the amount of light passing through each of the polarizing elements 22a to 22d follows a sine wave. That is, the polarized images 50a to 50d corresponding to the polarizing elements 22a to 22d correspond to images captured under the condition that the object W is irradiated with light of concentric fringe patterns. Further, as shown in FIG. 27, the phases of the fringe patterns corresponding to the polarizing elements 22a to 22d are shifted from one another by π/2.
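The sinusoidal transmission amounts in FIG. 27 are consistent with Malus's law for ideal linear polarizers: an illumination component polarized at angle θi passes a polarizing element oriented at angle θp with relative intensity cos²(θi − θp). The following short Python/NumPy sketch is only a numerical consistency check under the illustrative assumptions of ideal polarizers and equal intensities for all illumination units; it is not part of the disclosure.

import numpy as np

# Polarization angles of the illumination units 10e, 10g, 10h, 10i (degrees)
illum_angles = np.array([0.0, 45.0, 90.0, 135.0])
# Polarization angles of the polarizing elements 22a, 22b, 22c, 22d (degrees)
polarizer_angles = np.array([0.0, 45.0, 90.0, 135.0])

theta_i = np.deg2rad(illum_angles)[:, None]
theta_p = np.deg2rad(polarizer_angles)[None, :]
# Relative transmitted intensity of each illumination unit through each polarizing element
transmission = np.cos(theta_i - theta_p) ** 2

print(np.round(transmission, 2))
# Each row is 1, 0.5, 0, 0.5 in a cyclically shifted order, so the fringe patterns
# seen through 22a to 22d are offset in phase by pi/2, matching FIG. 27.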
FIG. 28 shows another example of the arrangement of the plurality of illumination units according to the third inspection example. The plurality of illumination units 10 shown in FIG. 28 differ from those shown in FIG. 26 in that they further include illumination units 10j to 10l.
Each of the illumination units 10j to 10l is a ring-shaped illumination unit. The illumination units 10j to 10l respectively have ring-shaped light emitting units 11j to 11l centered on the optical axis 25 of the polarizing camera 20 and linear polarizing filters 12j to 12l attached to the light emitting surfaces of the light emitting units 11j to 11l.
When the polarization direction of the light emitted from the illumination unit 10e is taken as the reference direction, the linear polarizing filters 12j to 12l are arranged so that the angles between the reference direction and the polarization directions of the light emitted from the illumination units 10j to 10l are 22.5°, 67.5°, and 110.5°, respectively.
The illumination units 10e, 10j, 10g, 10k, 10h, 10l, and 10i are arranged so that the elevation angle with respect to the object W decreases in this order. In other words, the incident angle of the illumination light from the illumination units 10e, 10j, 10g, 10k, 10h, 10l, and 10i onto the object W increases in this order.
FIG. 29 shows, for the light emitted from the illumination units 10e and 10g to 10l and specularly reflected by the object W, the amount that passes through each polarizing element. As shown in FIG. 29, the amount of light passing through each of the polarizing elements 22a to 22d follows a sine wave. That is, the polarized images 50a to 50d corresponding to the polarizing elements 22a to 22d correspond to images captured under the condition that the object W is irradiated with light of concentric fringe patterns. Further, as shown in FIG. 29, the phases of the fringe patterns corresponding to the polarizing elements 22a to 22d are shifted from one another by π/2.
(Inspection method)
The processor 310 inspects the object W based on the polarized images 50a to 50d. Using the phase shift method, the processor 310 generates a phase image and a specular reflection image from the polarized images 50a to 50d.
When the two-dimensional coordinates of an image are expressed as (x, y) and the values of the pixel (x, y) in the polarized images 50a to 50d are expressed as Ja(x, y), Jb(x, y), Jc(x, y), and Jd(x, y), respectively, the phase of the pixel is expressed as
Φ(x, y) = arctan[{Jb(x, y) − Jd(x, y)} / {Ja(x, y) − Jc(x, y)}]
The range of normal angles of the object W over which the fringe pattern can be imaged is 0° to 45°. The phase value Φ(x, y) is proportional to the angle between the normal direction of the object W and the optical axis 25 of the polarizing camera 20. The processor 310 generates a phase image in which the value of the pixel (x, y) is Ip(x, y) = {2π − Φ(x, y)} × 128/π. In this phase image, a pixel is white (256) when the angle between the normal direction of the object W and the optical axis 25 is 0°, and black (0) when that angle is 45°. The unevenness of the surface of the object W can therefore be inspected using the phase image.
Further, the processor 310 generates a specular reflection image in which the value of the pixel (x, y) is
A(x, y) = [{Ja(x, y) − Jc(x, y)}^2 + {Jb(x, y) − Jd(x, y)}^2]^(1/2)
The specular reflection image represents the intensity of the light specularly reflected by the surface of the object W. When the object W has a scratch, the fine shape of the scratch may scatter the light diffusely, so the specular reflection component becomes relatively weak at the scratched portion. The unevenness of the surface of the object W can therefore also be inspected using the specular reflection image.
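As a reference, here is a minimal Python/NumPy sketch of the four-step phase-shift computation above. The use of arctan2 instead of a plain arctangent of the ratio (to avoid division by zero) and the wrapping of the phase into [0, 2π) so that Ip falls in (0, 256] are illustrative assumptions, not part of the disclosure.

import numpy as np

def phase_and_specular(Ja, Jb, Jc, Jd):
    # Ja..Jd are the polarized images 50a-50d as arrays of the same shape.
    Ja, Jb, Jc, Jd = (np.asarray(v, dtype=float) for v in (Ja, Jb, Jc, Jd))
    # Phase of each pixel: Phi = arctan[(Jb - Jd) / (Ja - Jc)]
    phi = np.mod(np.arctan2(Jb - Jd, Ja - Jc), 2.0 * np.pi)
    # Phase image: Ip = (2*pi - Phi) * 128 / pi
    Ip = (2.0 * np.pi - phi) * 128.0 / np.pi
    # Specular reflection image: A = sqrt((Ja - Jc)^2 + (Jb - Jd)^2)
    A = np.sqrt((Ja - Jc) ** 2 + (Jb - Jd) ** 2)
    return Ip, A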
FIG. 30 shows an example of the polarized images 50a to 50d obtained using the plurality of illumination units 10 shown in FIG. 26, and FIG. 31 shows an example of the polarized images 50a to 50d obtained using the plurality of illumination units 10 shown in FIG. 28. As shown in FIGS. 30 and 31, fringe patterns specularly reflected by the surface of the object W are observed, and the phases of the fringe patterns in the polarized images 50a to 50d are shifted by π/2. Increasing the number of illumination units 10 makes the fringe pattern smoother, which improves the resolution in the normal direction of the surface of the object W.
It is also possible to provide only the ring-shaped illumination units 10g to 10i (or 10g to 10l) without providing the coaxial illumination unit 10e. In that case, however, the fringe pattern is missing around the center of the concentric circles. Providing the illumination unit 10e reduces this loss of the fringe pattern at the center of the concentric circles and enables a more accurate inspection.
(B-4. Fourth inspection example)
The fourth inspection example uses two images obtained under coaxial illumination and backlighting, respectively. An image obtained under illumination coaxial with the optical axis 25 of the polarizing camera 20 shows the brightness of the light specularly reflected by the surface of the object W. As described above, light is less likely to be specularly reflected at a scratched portion, so in an image obtained under coaxial illumination the brightness of the scratched portion decreases; scratches can therefore be inspected with high accuracy using an image obtained under coaxial illumination. On the other hand, when the object W blocks light, the brightness of the region corresponding to the object W is 0 in an image obtained under backlighting, so the outer periphery of the object W is easily confirmed. Burrs or chips on the outer periphery of the object W can therefore be inspected with high accuracy using an image obtained under backlighting.
(Arrangement of the illumination units)
FIG. 32 shows the arrangement of the plurality of illumination units according to the fourth inspection example. As shown in FIG. 32, the plurality of illumination units 10 include illumination units 10e and 10m. As described in the second inspection example, the illumination unit 10e irradiates the object W with illumination light along the optical axis 25 of the polarizing camera 20 (see FIG. 24).
The illumination unit 10m is arranged as a backlight on the side of the object W opposite to the polarizing camera 20. That is, the illumination unit 10m irradiates illumination light from the back side of the object W. The illumination unit 10m has a flat-plate-shaped light emitting unit 11m and a linear polarizing filter 12m attached to the light emitting surface of the light emitting unit 11m.
The linear polarizing filter 12e of the illumination unit 10e (see FIG. 24) and the linear polarizing filter 12m are arranged so that the polarization directions of the linearly polarized light irradiated onto the object W from the illumination units 10e and 10m are orthogonal to each other.
(Inspection method)
In the fourth inspection example, the inspection system 1 includes the polarizing camera 20 including the polarizing elements 22a to 22d shown in FIG. 2. The polarization direction of the polarizing element 22a coincides with (is parallel to) the polarization direction of the linearly polarized light irradiated onto the object W from the illumination unit 10e. The polarization direction of the polarizing element 22c coincides with (is parallel to) the polarization direction of the linearly polarized light emitted from the illumination unit 10m. Therefore, light emitted from the illumination unit 10e and specularly reflected by the object W passes through the polarizing element 22a and does not pass through the polarizing element 22c. Light emitted from the illumination unit 10m and traveling around the object W passes through the polarizing element 22c and does not pass through the polarizing element 22a.
FIG. 33 shows an example of the plurality of polarized images obtained in the fourth inspection example. The polarized images 50a to 50d correspond to the polarizing elements 22a to 22d, respectively.
The polarized image 50a shows the brightness of the light emitted from the illumination unit 10e and specularly reflected by the object W. It is therefore easily confirmed inside the frame line F1 of the polarized image 50a that the brightness of the scratched portion is reduced, and the processor 310 can inspect scratches with high accuracy using the polarized image 50a.
The polarized image 50c shows the brightness of the light emitted from the illumination unit 10m and traveling around the object W. That is, the polarized image 50c corresponds to an image obtained under backlighting. Burrs on the outer periphery of the object W are therefore easily confirmed inside the frame line F3 of the polarized image 50c, and the processor 310 can inspect burrs or chips on the outer periphery of the object W with high accuracy using the polarized image 50c.
The polarized images 50b and 50d show brightness intermediate between those of the polarized images 50a and 50c. The polarized images 50b and 50d therefore need not be used for inspecting the object W.
(B-5. Fifth inspection example)
The fifth inspection example uses a high-dynamic-range technique. The high-dynamic-range technique combines a plurality of images captured under different illumination intensities to generate an image with a wide dynamic range (a high-dynamic-range image) that has little blown-out highlight or blocked-up shadow.
(Arrangement of the illumination units)
FIG. 34 shows the arrangement of the plurality of illumination units according to the fifth inspection example. As shown in FIG. 34, the plurality of illumination units 10 include illumination units 10n and 10o.
The illumination unit 10n has a light emitting unit 11n that emits unpolarized light and a linear polarizing filter 12n arranged on the object W side of the light emitting unit 11n. The illumination unit 10n therefore irradiates the object W with linearly polarized light.
The illumination unit 10o has a light emitting unit 11o that emits unpolarized light and has no linear polarizing filter. The illumination unit 10o therefore irradiates the object W with unpolarized light.
The illumination units 10n and 10o are preferably arranged close to each other. The illumination directions from the illumination units 10n and 10o toward the object W are then substantially the same, so that, as described below, a plurality of polarized images 50 that differ in illumination intensity but have substantially the same illumination direction are obtained in a single imaging operation.
(Inspection method)
In the fifth inspection example, the inspection system 1 includes the polarizing camera 20 including the polarizing elements 22a to 22d shown in FIG. 2. The polarization direction of the linearly polarized light emitted from the illumination unit 10n is parallel to the polarization direction of the polarizing element 22a and orthogonal to the polarization direction of the polarizing element 22c. Therefore, light emitted from the illumination unit 10n and specularly reflected by the object W passes through the polarizing element 22a and does not pass through the polarizing element 22c. This light has components parallel to the polarization directions of the polarizing elements 22b and 22d, so part of it passes through the polarizing elements 22b and 22d. Specifically, the amount of light that is emitted from the illumination unit 10n and passes through each of the polarizing elements 22b and 22d is about half the amount of light that is emitted from the illumination unit 10n and passes through the polarizing element 22a.
The unpolarized light emitted from the illumination unit 10o has equal components parallel to the polarization directions of the polarizing elements 22a to 22d. The amounts of light that are emitted from the illumination unit 10o and pass through the polarizing elements 22a to 22d are therefore the same.
Let In and Io be the intensities of the light that is emitted from the illumination units 10n and 10o, respectively, specularly reflected by the object W, and transmitted through the polarizing element 22a. The pixel values Ja to Jd of the polarized images 50a to 50d corresponding to the polarizing elements 22a to 22d are then expressed as follows.
Pixel value Ja of the polarized image 50a: In + Io
Pixel value Jb of the polarized image 50b: In/2 + Io
Pixel value Jc of the polarized image 50c: Io
Pixel value Jd of the polarized image 50d: In/2 + Io
From the above, the polarized images 50a to 50c correspond to a plurality of images captured under mutually different illumination intensities. The processor 310 therefore combines the polarized images 50a to 50c by high-dynamic-range synthesis to generate a composite image. The processor 310 may generate the composite image using a known high-dynamic-range synthesis technique.
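The disclosure only states that a known high-dynamic-range synthesis technique may be used. As a reference, here is one possible fusion written as a minimal Python/NumPy sketch: Ja, Jb, and Jc are treated as the bright, middle, and dark exposures and blended with a simple hat-shaped weight that favors well-exposed pixels. The weighting scheme and the 8-bit input assumption are illustrative, not the method of the disclosure.

import numpy as np

def hdr_fuse(Ja, Jb, Jc):
    # Fuse the polarized images 50a-50c (bright, middle, dark) into one composite image.
    stack = np.stack([Ja, Jb, Jc]).astype(float) / 255.0   # assume 8-bit inputs
    weights = 1.0 - 2.0 * np.abs(stack - 0.5)              # peak weight at mid-gray
    weights = np.clip(weights, 1e-3, None)                 # avoid division by zero
    fused = (weights * stack).sum(axis=0) / weights.sum(axis=0)
    return (fused * 255.0).astype(np.uint8)                # e.g. composite image 57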
FIG. 35 shows an example of the plurality of polarized images obtained in the fifth inspection example. As shown in FIG. 35, the brightness of the polarized image 50a is the highest, the brightness of the polarized image 50c is the lowest, and the brightness of the polarized images 50b and 50d is intermediate. The processor 310 combines the polarized images 50a to 50c by high-dynamic-range synthesis to generate a composite image 57. The composite image 57 has little blown-out highlight or blocked-up shadow, so the processor 310 can inspect the object W with high accuracy using the composite image 57.
<C. Modification example>
The number of polarizing elements 22 included in the unit region 21 of the polarizing camera 20 is not limited to four; it only needs to be two or more. As described above, the polarized images 50b and 50d need not be used in the second and fourth inspection examples, so the unit region 21 may include only the polarizing elements 22a and 22c and need not include the polarizing elements 22b and 22d. Similarly, the polarized image 50d is not used in the fifth inspection example, so the unit region 21 may include only the polarizing elements 22a to 22c and need not include the polarizing element 22d.
In the first inspection example, the plurality of illumination units 10 may include only the illumination units 10a and 10c and need not include the illumination units 10b and 10d. In this case, the unit region 21 of the polarizing camera 20 may include only the polarizing elements 22a to 22c and need not include the polarizing element 22d. The pixel values Ja and Jc of the polarized images 50a and 50c are expressed by the following equations.
Ja = Ia = μ[cos^α(θ − 2φx)cos^α(2φy)]
Jc = Ic = μ[cos^α(θ + 2φx)cos^α(2φy)]
When the glossiness of the surface of the object W is α = 1, the following equations (7) and (8) hold.
Ja + Jc = 2μcos(θ)cos(2φx)cos(2φy) ... Equation (7)
Ja − Jc = 2μsin(θ)sin(2φx)cos(2φy) ... Equation (8)
From equations (7) and (8), the following equation (9) is derived.
tan(2φx) = (Ja − Jc) / {(Ja + Jc)tan(θ)} ... Equation (9)
In equation (9), the incident angle θ is determined in advance according to the positions of the illumination units 10a and 10c. The processor 310 therefore computes the right-hand side of equation (9) using the pixel values Ja and Jc of the polarized images 50a and 50c and the incident angle θ, and generates an X-direction normal image whose pixel value is the result Nx. The quantity tan(2φx) (≡ Nx) given by equation (9) depends on nx, the x-direction component of the normal vector, so the X-direction normal image represents the normal direction of the surface of the object W.
Further, as shown in FIG. 17, the processor 310 sets rectangular regions R1 and R2 on the left and right of the point of interest Q, respectively. The processor 310 then calculates the difference Sx between the sum Sx1 of the values of the pixels included in the rectangular region R1 and the sum Sx2 of the values of the pixels included in the rectangular region R2. The processor 310 calculates the difference Sx for every point surrounded by four pixels in the X-direction normal image, and generates a shape image whose pixel value is the difference Sx. As described above, the difference Sx takes a large value at locations where a defect accompanied by unevenness, such as a scratch or a dent, is present, so the processor 310 can inspect such defects with high accuracy using this shape image.
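As a reference, here is a minimal Python/NumPy sketch of this two-illumination variant: equation (9) gives the pixel values Nx of the X-direction normal image, and the left/right box-sum difference gives Sx. The function names, the guard against a zero denominator, and the border handling are illustrative assumptions, not part of the disclosure.

import numpy as np

def x_normal_image(Ja, Jc, theta):
    # Equation (9): Nx = tan(2*phi_x) = (Ja - Jc) / ((Ja + Jc) * tan(theta)),
    # where theta (radians) is the known incident angle of illumination units 10a, 10c.
    Ja = np.asarray(Ja, dtype=float)
    Jc = np.asarray(Jc, dtype=float)
    denom = (Ja + Jc) * np.tan(theta)
    denom = np.where(np.abs(denom) < 1e-9, 1e-9, denom)   # guard against division by zero
    return (Ja - Jc) / denom

def sx_at(normal_x, qy, qx, L):
    # Difference Sx at the point of interest Q = (qy, qx): sum over the rectangle R1
    # just left of Q minus the sum over R2 just right of Q (height L, width L/2, FIG. 17).
    h = L // 2
    r1 = normal_x[qy - h:qy + h, qx - h:qx]   # region to the left of Q
    r2 = normal_x[qy - h:qy + h, qx:qx + h]   # region to the right of Q
    return float(r1.sum() - r2.sum())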
In the first inspection example, the X-direction normal image and the Y-direction normal image are generated using equations (4) and (5) above, respectively. However, the method of generating the normal images is not limited to this. For example, a learning model that estimates a normal image from a plurality of polarized images may be constructed using a deep-learning image generation technique, and the processor 310 may generate the normal images from the plurality of polarized images using such a learning model.
§3 Addendum
As described above, the present embodiment includes the following disclosures.
(Configuration 1)
An inspection system (1) comprising:
a plurality of illumination units (10, 10a to 10o) that illuminate an object (W);
a polarizing camera (20) in which unit regions (21) each including a plurality of polarizing elements (22, 22a to 22d) are repeatedly arranged; and
an inspection device (30), wherein
the plurality of illumination units (10, 10a to 10o) emit illumination light of mutually different polarization states,
the plurality of polarizing elements (22, 22a to 22d) transmit light of mutually different polarization directions,
the polarizing camera (20) outputs a plurality of polarized images (50, 50a to 50d) respectively corresponding to the plurality of polarizing elements (22, 22a to 22d) by capturing an image while the plurality of illumination units (10, 10a to 10o) are lit simultaneously, and
the inspection device (30) inspects the object (W) using the plurality of polarized images (50, 50a to 50d).
(Configuration 2)
The inspection system (1) according to Configuration 1, wherein
the plurality of illumination units are arranged so that their azimuth angles around an optical axis (25) of the polarizing camera (20) differ from one another,
the plurality of illumination units include first to N-th illumination units (10a to 10d),
the plurality of polarizing elements include first to N-th polarizing elements (22a to 22d),
N is an integer of 2 or more,
the polarization directions of the illumination light of the first to N-th illumination units (10a to 10d) are respectively parallel to the polarization directions of the light transmitted through the first to N-th polarizing elements (22a to 22d), and
the inspection device (30) generates, from the plurality of polarized images (50a to 50d), a normal image (51, 52) indicating a normal direction of a surface of the object (W), and inspects the object (W) based on the normal image (51, 52).
(Configuration 3)
The inspection system (1) according to Configuration 2, wherein
N is 4,
the first illumination unit (10a) and the third illumination unit (10c) are arranged at positions symmetric with respect to the optical axis (25) of the polarizing camera (20),
the second illumination unit (10b) and the fourth illumination unit (10d) are arranged at positions symmetric with respect to the optical axis (25) of the polarizing camera (20),
around the optical axis (25) of the polarizing camera (20), the difference between a first azimuth angle at which the first illumination unit (10a) is arranged and a second azimuth angle at which the second illumination unit (10b) is arranged is 90°,
the plurality of polarized images include first to fourth polarized images (50a to 50d) respectively corresponding to the first to fourth polarizing elements (22a to 22d), and
the inspection device (30)
generates, based on the first to fourth polarized images (50a to 50d), a first normal image (51) indicating the magnitude of a component of the normal vector of the surface of the object (W) along the direction of the first azimuth angle,
generates, based on the first to fourth polarized images, a second normal image (52) indicating the magnitude of a component of the normal vector of the surface of the object (W) along the direction of the second azimuth angle,
generates, based on the first normal image (51) and the second normal image (52), a shape image (53) indicating the shape of the surface of the object (W), and
inspects the object (W) based on the shape image (53).
(Configuration 4)
The inspection system (1) according to Configuration 3, wherein the angles between the polarization direction of the illumination light of the first illumination unit (10a) and the polarization directions of the illumination light of the second illumination unit (10b), the third illumination unit (10c), and the fourth illumination unit (10d) are 45°, 90°, and 135°, respectively.
(Configuration 5)
The inspection system (1) according to Configuration 1, wherein the plurality of illumination units (10e to 10l) are arranged so that their elevation angles with respect to the object differ from one another.
(Configuration 6)
The inspection system (1) according to Configuration 5, wherein
the plurality of illumination units include
a first illumination unit (10e) that emits illumination light along the optical axis of the polarizing camera, and
a ring-shaped second illumination unit (10f) centered on the optical axis of the polarizing camera,
the plurality of polarizing elements include a first polarizing element (22a) and a second polarizing element (22c), and
the polarization directions of the illumination light emitted from the first illumination unit (10e) and the second illumination unit (10f) respectively coincide with the polarization directions of the light transmitted through the first polarizing element (22a) and the second polarizing element (22c).
(Configuration 7)
The inspection system (1) according to Configuration 5, wherein
the plurality of illumination units include a plurality of concentric ring-shaped illumination units (10g to 10l) centered on the optical axis of the polarizing camera, and
the inspection device (30)
generates, based on the plurality of polarized images (50a to 50d), a phase image indicating the angle between the normal direction of the surface of the object (W) and the optical axis direction of the polarizing camera, and
inspects the object (W) based on the phase image.
(Configuration 8)
The inspection system (1) according to Configuration 7, wherein the plurality of illumination units further include an illumination unit (10e) that emits illumination light along the optical axis (25) of the polarizing camera (20).
(Configuration 9)
The inspection system (1) according to Configuration 1, wherein
the plurality of illumination units include
a first illumination unit (10e) that emits illumination light along the optical axis of the polarizing camera, and
a second illumination unit (10m) that emits illumination light from the back side of the object,
the plurality of polarizing elements include a first polarizing element (22a) and a second polarizing element (22c), and
the polarization directions of the illumination light emitted from the first illumination unit (10e) and the second illumination unit (10m) respectively coincide with the polarization directions of the light transmitted through the first polarizing element (22a) and the second polarizing element (22c).
(Configuration 10)
The inspection system (1) according to Configuration 1, wherein
the plurality of illumination units include a first illumination unit (10n) that emits linearly polarized illumination light and a second illumination unit (10o) that emits unpolarized light,
the plurality of polarizing elements include
a first polarizing element (22a) that transmits light of the same polarization direction as the polarization direction of the illumination light of the first illumination unit (10n),
a second polarizing element (22b) that transmits light of a polarization direction at an angle of 45° to the polarization direction of the light transmitted through the first polarizing element (22a), and
a third polarizing element (22c) that transmits light of a polarization direction at an angle of 90° to the polarization direction of the light transmitted through the first polarizing element (22a),
the plurality of polarized images include first to third polarized images (55a to 55c) respectively corresponding to the first to third polarizing elements (22a to 22c), and
the inspection device (30)
combines the first to third polarized images (55a to 55c) by high-dynamic-range synthesis to generate a composite image, and
inspects the object (W) based on the composite image.
(Configuration 11)
An inspection method using a plurality of illumination units (10, 10a to 10o) that illuminate an object (W) and a polarizing camera (20) in which unit regions (21) each including a plurality of polarizing elements (22, 22a to 22d) are repeatedly arranged, wherein
the plurality of illumination units (10, 10a to 10o) emit illumination light of mutually different polarization states, and
the plurality of polarizing elements (22, 22a to 22d) transmit light of mutually different polarization directions,
the inspection method comprising:
acquiring a plurality of polarized images (50, 50a to 50d) respectively corresponding to the plurality of polarizing elements (22, 22a to 22d) by imaging the object (W) with the polarizing camera (20) while the plurality of illumination units (10, 10a to 10o) are lit simultaneously; and
inspecting the object (W) using the plurality of polarized images (50, 50a to 50d).
Although embodiments of the present invention have been described, the embodiments disclosed herein should be considered in all respects as illustrative and not restrictive. The scope of the present invention is indicated by the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims.
1 inspection system, 2 conveyor belt, 10, 10a to 10o illumination unit, 11a to 11o light emitting unit, 12a to 12n linear polarizing filter, 13e half mirror, 20 polarizing camera, 21 unit region, 22, 22a to 22d polarizing element, 25, 225 optical axis, 30 inspection device, 50, 50a to 50d polarized image, 51 X-direction normal image, 52 Y-direction normal image, 53 shape image, 54 binary image, 55 albedo image, 56 average image, 57 composite image, 110 illumination device, 110a to 110d arc region, 302 display unit, 304 keyboard, 306 memory card, 310 processor, 312 RAM, 314 display controller, 316 system controller, 318 controller, 320 hard disk, 322 camera interface, 324 input interface, 328 communication interface, 330 memory card interface, 350 inspection program, F1, F2 frame line, P point, Q point of interest, R1 to R4 rectangular region, W object, n normal vector.

Claims (11)

  1.  An inspection system comprising:
     a plurality of lighting units that illuminate an object;
     a polarizing camera in which unit regions each including a plurality of polarizing elements are repeatedly arranged; and
     an inspection device, wherein
     the plurality of lighting units irradiate illumination light having polarization states different from each other,
     the plurality of polarizing elements transmit light having polarization directions different from each other,
     the polarizing camera outputs a plurality of polarized images respectively corresponding to the plurality of polarizing elements by taking an image in a state where the plurality of lighting units are lit at the same time, and
     the inspection device inspects the object using the plurality of polarized images.
  2.  The inspection system according to claim 1, wherein
     the plurality of lighting units are arranged such that their azimuth angles around the optical axis of the polarizing camera are different from each other,
     the plurality of lighting units include first to N-th lighting units,
     the plurality of polarizing elements include first to N-th polarizing elements,
     N is an integer of 2 or more,
     polarization directions of the illumination light of the first to N-th lighting units are respectively parallel to polarization directions of light transmitted through the first to N-th polarizing elements, and
     the inspection device generates, from the plurality of polarized images, a normal image indicating a normal direction of a surface of the object, and inspects the object based on the normal image.
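One possible (not claimed or prescribed) way to realize the normal image of claim 2 is photometric stereo over the N polarized images, treating each polarized image as an exposure lit mainly by the lighting unit whose polarization is parallel to its polarizing element. The sketch below additionally assumes Lambertian reflection and known unit light directions, which are simplifying assumptions of this illustration only.

    import numpy as np

    def normal_image(polarized_images, light_dirs):
        """Estimate per-pixel surface normals from N polarized images.

        polarized_images: list of N 2-D float arrays, one per lighting unit.
        light_dirs: (N, 3) array of unit vectors pointing toward the lighting units.
        Returns an (H, W, 3) array of unit normal vectors.
        """
        H, W = polarized_images[0].shape
        I = np.stack([img.reshape(-1) for img in polarized_images], axis=1)   # (H*W, N)
        L = np.asarray(light_dirs, dtype=np.float64)                          # (N, 3)
        # Lambertian model: I = L @ (albedo * n); solve per pixel by least squares.
        G, *_ = np.linalg.lstsq(L, I.T, rcond=None)                           # (3, H*W)
        G = G.T.reshape(H, W, 3)
        norm = np.linalg.norm(G, axis=2, keepdims=True)
        return G / np.maximum(norm, 1e-9)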
  3.  The inspection system according to claim 2, wherein
     N is 4,
     the first lighting unit and the third lighting unit are arranged at positions symmetrical with respect to the optical axis of the polarizing camera,
     the second lighting unit and the fourth lighting unit are arranged at positions symmetrical with respect to the optical axis of the polarizing camera,
     around the optical axis of the polarizing camera, a difference between a first azimuth angle at which the first lighting unit is arranged and a second azimuth angle at which the second lighting unit is arranged is 90°,
     the plurality of polarized images include first to fourth polarized images respectively corresponding to the first to fourth polarizing elements, and
     the inspection device
      generates, based on the first to fourth polarized images, a first normal image indicating a magnitude of a component of a normal vector of the surface of the object along a direction of the first azimuth angle,
      generates, based on the first to fourth polarized images, a second normal image indicating a magnitude of a component of the normal vector of the surface of the object along a direction of the second azimuth angle,
      generates, based on the first normal image and the second normal image, a shape image indicating a shape of the surface of the object, and
      inspects the object based on the shape image.
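For claim 3, one speculative way to turn the first and second normal images into a shape image is to interpret them as surface gradients and integrate them. The sketch below uses a deliberately naive cumulative integration; a practical system would likely use a more robust integrator, and nothing here is mandated by the claim.

    import numpy as np

    def shape_image(normal_x, normal_y, normal_z=None):
        """Integrate X/Y normal components into a height-like shape image.

        normal_x, normal_y: 2-D arrays holding the normal-vector components
        along the first and second azimuth directions (claim 3's first and
        second normal images). normal_z: optional z component; if omitted,
        it is reconstructed assuming unit-length normals.
        """
        if normal_z is None:
            normal_z = np.sqrt(np.clip(1.0 - normal_x**2 - normal_y**2, 1e-6, None))
        # Surface gradients p = dz/dx, q = dz/dy from the normal (nx, ny, nz).
        p = -normal_x / normal_z
        q = -normal_y / normal_z
        # Naive path integration: accumulate q down the first column, then p along each row.
        height = np.cumsum(q[:, :1], axis=0) + np.cumsum(p, axis=1)
        return height - height.mean()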
  4.  The inspection system according to claim 3, wherein angles formed between the polarization direction of the illumination light of the first lighting unit and the polarization directions of the illumination light of the second lighting unit, the third lighting unit, and the fourth lighting unit are 45°, 90°, and 135°, respectively.
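The 45° spacing recited in claim 4 can be related to Malus's law: for polarization-preserving (e.g. specular) return light, a polarizing element passes a fraction cos²Δθ of light polarized at an offset Δθ, so the lighting unit aligned with a given polarizing element contributes the most to that polarized image. The snippet below simply evaluates that factor; it ignores depolarized diffuse light, which is a simplification of this illustration.

    import numpy as np

    def transmitted_fraction(illum_angle_deg, polarizer_angle_deg):
        """Malus's law: fraction of linearly polarized light passing a polarizer."""
        delta = np.deg2rad(illum_angle_deg - polarizer_angle_deg)
        return np.cos(delta) ** 2

    # Contribution of lighting units at 0/45/90/135 degrees (as in claim 4)
    # to the polarizing element aligned with the first lighting unit (0 degrees):
    for angle in (0, 45, 90, 135):
        print(angle, transmitted_fraction(angle, 0))   # -> 1.0, 0.5, 0.0, 0.5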
  5.  The inspection system according to claim 1, wherein the plurality of lighting units are arranged such that their elevation angles with respect to the object are different from each other.
  6.  The inspection system according to claim 5, wherein
     the plurality of lighting units include
      a first lighting unit that irradiates illumination light along the optical axis of the polarizing camera, and
      a ring-shaped second lighting unit centered on the optical axis of the polarizing camera,
     the plurality of polarizing elements include a first polarizing element and a second polarizing element, and
     polarization directions of the illumination light emitted from the first lighting unit and the second lighting unit respectively coincide with polarization directions of light transmitted through the first polarizing element and the second polarizing element.
  7.  The inspection system according to claim 5, wherein
     the plurality of lighting units include a plurality of concentric ring-shaped lighting units centered on the optical axis of the polarizing camera, and
     the inspection device
      generates, based on the plurality of polarized images, a phase image indicating an angle formed between the normal direction of the surface of the object and the optical axis direction of the polarizing camera, and
      inspects the object based on the phase image.
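Claim 7 leaves open how the phase image is computed from the polarized images of the concentric ring-shaped lighting units. One speculative realization, sketched below, assumes that each ring's polarized image is brightest where the local surface tilt reflects that ring toward the camera, and maps the brightest ring at each pixel to a hypothetical tilt angle; the ring_tilt_deg values are placeholders, not values from this disclosure.

    import numpy as np

    def phase_image(ring_images, ring_tilt_deg):
        """Estimate per-pixel surface tilt relative to the camera optical axis.

        ring_images: list of 2-D arrays, one polarized image per concentric
        ring lighting unit (inner to outer).
        ring_tilt_deg: hypothetical surface-tilt angle (degrees) at which a
        specular surface reflects that ring toward the camera.
        Returns a 2-D array of tilt angles in degrees.
        """
        stack = np.stack(ring_images, axis=0)        # (R, H, W)
        dominant = np.argmax(stack, axis=0)          # which ring is brightest per pixel
        tilt = np.asarray(ring_tilt_deg, dtype=np.float64)
        return tilt[dominant]

    # Hypothetical usage with three rings mapped to 5, 15 and 30 degrees:
    # phase = phase_image([ring1, ring2, ring3], ring_tilt_deg=[5.0, 15.0, 30.0])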
  8.  The inspection system according to claim 7, wherein the plurality of lighting units further include a lighting unit that irradiates illumination light along the optical axis of the polarizing camera.
  9.  The inspection system according to claim 1, wherein
     the plurality of lighting units include
      a first lighting unit that irradiates illumination light along the optical axis of the polarizing camera, and
      a second lighting unit that irradiates illumination light from a back side of the object,
     the plurality of polarizing elements include a first polarizing element and a second polarizing element, and
     polarization directions of the illumination light emitted from the first lighting unit and the second lighting unit respectively coincide with polarization directions of light transmitted through the first polarizing element and the second polarizing element.
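For the back-side lighting unit of claim 9, the matching polarized image is essentially a backlit view of the object, so its outline can be extracted by thresholding into a binary image. The sketch below uses a fixed, hypothetical threshold; how (or whether) the inspection device binarizes that image is not stated in the claim.

    import numpy as np

    def silhouette_from_backlight(backlit_image, threshold=0.5):
        """Binarize the polarized image tied to the back-side lighting unit.

        Pixels where the backlight is blocked by the object appear dark,
        so thresholding yields a silhouette of the object's outline.
        The fixed threshold is an assumption; a real system might derive it
        automatically (e.g. from the image histogram).
        """
        f = backlit_image.astype(np.float64)
        f = f / max(f.max(), 1e-9)
        return (f < threshold).astype(np.uint8)   # 1 = object, 0 = background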
  10.  The inspection system according to claim 1, wherein
     the plurality of lighting units include a first lighting unit that irradiates linearly polarized illumination light and a second lighting unit that irradiates unpolarized illumination light,
     the plurality of polarizing elements include
      a first polarizing element that transmits light having the same polarization direction as the polarization direction of the illumination light of the first lighting unit,
      a second polarizing element that transmits light having a polarization direction forming an angle of 45° with the polarization direction of the light transmitted through the first polarizing element, and
      a third polarizing element that transmits light having a polarization direction forming an angle of 90° with the polarization direction of the light transmitted through the first polarizing element,
     the plurality of polarized images include first to third polarized images respectively corresponding to the first to third polarizing elements, and
     the inspection device
      combines the first to third polarized images by high-dynamic-range synthesis to generate a composite image, and
      inspects the object based on the composite image.
  11.  An inspection method using a plurality of lighting units that illuminate an object and a polarizing camera in which unit regions each including a plurality of polarizing elements are repeatedly arranged, wherein
     the plurality of lighting units irradiate illumination light having polarization states different from each other, and
     the plurality of polarizing elements transmit light having polarization directions different from each other,
     the inspection method comprising:
     a step of acquiring a plurality of polarized images respectively corresponding to the plurality of polarizing elements by imaging the object with the polarizing camera in a state where the plurality of lighting units are lit at the same time; and
     a step of inspecting the object using the plurality of polarized images.
PCT/JP2021/009554 2020-12-24 2021-03-10 Inspection system and inspection method WO2022137579A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-214546 2020-12-24
JP2020214546A JP2022100524A (en) 2020-12-24 2020-12-24 Inspection system and inspection method

Publications (1)

Publication Number Publication Date
WO2022137579A1 true WO2022137579A1 (en) 2022-06-30

Family

ID=82158012

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/009554 WO2022137579A1 (en) 2020-12-24 2021-03-10 Inspection system and inspection method

Country Status (2)

Country Link
JP (1) JP2022100524A (en)
WO (1) WO2022137579A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012237680A (en) * 2011-05-12 2012-12-06 Ihi Corp Device and method for inspecting coated state, and program
JP2013036908A (en) * 2011-08-10 2013-02-21 Ricoh Co Ltd Observation device
US20150363914A1 (en) * 2012-10-17 2015-12-17 Cathx Research Ltd Processing survey data of an underwater scene
JP2018082424A (en) * 2016-11-04 2018-05-24 パナソニックIpマネジメント株式会社 Image forming apparatus
JP2018084572A (en) * 2016-11-15 2018-05-31 パナソニックIpマネジメント株式会社 Image forming apparatus
WO2019012858A1 (en) * 2017-07-12 2019-01-17 ソニー株式会社 Imaging device, image generation method, and imaging system
WO2020235067A1 (en) * 2019-05-22 2020-11-26 オムロン株式会社 Three-dimensional measurement system and three-dimensional measurement method

Also Published As

Publication number Publication date
JP2022100524A (en) 2022-07-06

Similar Documents

Publication Publication Date Title
US9752869B2 (en) Systems and methods for performing machine vision using diffuse structured light
CN101443654B (en) Surface inspection apparatus
JP2011002240A (en) Three-dimensional shape measurement method and device
JP2009192520A (en) Surface inspection device
JP2012127934A (en) Inspection method and inspection device
JP2007322272A (en) Surface inspection device
JP2000146554A (en) Method and apparatus for inspecting surface unevenness of transparent board
JP2009168454A (en) Surface flaw inspection device and surface flaw inspection method
JP2014240766A (en) Surface inspection method and device
US8223328B2 (en) Surface inspecting apparatus and surface inspecting method
JP4462232B2 (en) Surface inspection device
WO2022137579A1 (en) Inspection system and inspection method
JP2009222625A (en) Device and method for inspecting pattern
JP2009103494A (en) Surface inspection apparatus
JP4506723B2 (en) Surface inspection device
JP4696607B2 (en) Surface inspection device
JP4552202B2 (en) Surface inspection device
CN104704346A (en) Method and device for identifying materials in scene
JP2014095617A (en) Device for measuring pattern and method for measuring pattern
JP2009265026A (en) Inspection device
CN112986284A (en) Combined type light source structure and optical detection device
JP2016080517A (en) Surface inspection device
TW201641928A (en) System for object inspection
JP7062798B1 (en) Inspection system and inspection method
JP2009180702A (en) Method of adjusting defect inspection device, evaluating method of adjusting state of defect inspection device, and setting method of azimuthal angle of pattern

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21909741

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21909741

Country of ref document: EP

Kind code of ref document: A1