WO2021215064A1 - Imaging device - Google Patents

Imaging device

Info

Publication number
WO2021215064A1
Authority
WO
WIPO (PCT)
Prior art keywords
light source
light
imaging device
image sensor
signal processing
Prior art date
Application number
PCT/JP2021/003108
Other languages
French (fr)
Japanese (ja)
Inventor
Masayuki Takase (高瀬 雅之)
Nobuyuki Otsuka (大塚 信之)
Original Assignee
Panasonic Intellectual Property Management Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Publication of WO2021215064A1 publication Critical patent/WO2021215064A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B11/00 Filters or other obturators specially adapted for photographic purposes
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/02 Illuminating scene
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof

Definitions

  • The present invention relates to an imaging device, and in particular to an imaging device suitable for imaging a fine object such as dust.
  • Patent Document 1 describes a ranging system that irradiates light from two light sources, captures the reflected light with an imaging unit, and measures the distance to an object. In this distance measuring system, the distance information calculated for each pixel is combined to generate a distance image for one frame.
  • In such a system, the reflected light of the light radiated to the front surface of the object is collected on the light receiving surface of the image sensor.
  • When the object serving as the subject has a fine shape, such as dust, the reflected light from only a minute region on the front surface of the object is collected on the light receiving surface of the image sensor, and the area on the light receiving surface on which the reflected light is incident becomes extremely narrow. It therefore becomes difficult to properly detect the object from the captured image.
  • In view of this problem, an object of the present invention is to provide an imaging device capable of appropriately detecting an object from a captured image even when the object is a minute object such as dust.
  • An imaging device according to one aspect of the present invention includes a light source unit that irradiates a subject area with light, an image sensor that images the subject area, a condenser lens that condenses light from the subject area onto the image sensor, and a reflecting surface that reflects a part of the light emitted from the light source unit toward the subject area.
  • If no reflecting surface were provided, the light collected on the image sensor would be only the reflected light that is radiated directly from the light source unit to the object and reflected at the object surface.
  • In that case, the reflected light captured by the image sensor is the light reflected in a minute region on the object surface, so the region on the light receiving surface of the image sensor in which the reflected light is collected becomes extremely small.
  • When the reflecting surface is provided as described above, the direction of the light radiated directly from the light source unit to the object differs from the direction of the light radiated to the object via the reflecting surface.
  • As a result, in addition to the region of the object surface whose reflected light reaches the image sensor under direct irradiation from the light source unit, reflected light from regions outside that region is also collected on the image sensor. The region on the light receiving surface in which the reflected light from the object is collected can therefore be widened, and even a minute object such as dust can be detected more appropriately from the captured image.
  • As described above, the present invention can provide an imaging device capable of appropriately detecting an object from a captured image even when the object is a minute object such as dust.
  • FIG. 1 is a plan view schematically showing a configuration of an image pickup apparatus when viewed in the negative direction of the Z axis according to the first embodiment.
  • FIG. 2 is a block diagram showing a configuration of an imaging device according to the first embodiment.
  • FIG. 3 is a side view schematically showing the configuration of the image pickup apparatus when viewed in the positive direction of the X-axis according to the second embodiment.
  • FIGS. 4(a) and 4(b) are diagrams showing the dimensions of each part of the configuration of the embodiment set in the verification experiment for object detection.
  • FIG. 5A is a diagram showing a captured image captured using the configuration of the comparative example.
  • FIG. 5B is a diagram showing a captured image captured using the configuration of the embodiment.
  • FIG. 6 is a side view schematically showing the configuration of the image pickup apparatus when viewed in the positive direction of the X-axis according to the modified example of the second embodiment.
  • FIG. 7 is a side view schematically showing the configuration of the image pickup apparatus when viewed in the positive direction of the X-axis according to the third embodiment.
  • FIG. 8 is a plan view schematically showing the configuration of the image pickup apparatus when viewed in the negative direction of the Z axis according to the fourth embodiment.
  • FIG. 9 is a block diagram showing the configuration of the image pickup apparatus according to the fourth embodiment.
  • FIG. 10 is a flowchart showing a captured image generation process by the signal processing unit according to the fourth embodiment.
  • FIGS. 11(a) and 11(b) are diagrams schematically showing a projection region on the light receiving surface of the image sensor according to the fourth embodiment.
  • FIG. 12A is a flowchart showing a captured image generation process by the signal processing unit according to the fifth embodiment.
  • FIG. 12B is a diagram schematically showing the configuration of a table showing the optimum position of the light source unit corresponding to the distance to the object according to the fifth embodiment.
  • FIG. 13A is a plan view showing a configuration in the vicinity of the light source portion when viewed in the negative direction of the Z axis according to the sixth embodiment.
  • FIG. 13B is a side view showing a configuration in the vicinity of the light source portion when viewed in the positive direction of the Y-axis according to the sixth embodiment.
  • FIG. 14 is a flowchart showing a captured image generation process by the signal processing unit according to the sixth embodiment.
  • FIG. 15 is a plan view schematically showing the configuration of the image pickup apparatus when viewed in the Z-axis direction according to the seventh embodiment.
  • FIG. 16 is a plan view schematically showing the configuration of the image pickup apparatus when viewed in the Z-axis direction according to the eighth embodiment.
  • FIG. 17 is a block diagram showing a configuration of an image pickup apparatus according to the eighth embodiment.
  • FIG. 18 is a flowchart showing a captured image generation process by the signal processing unit according to the eighth embodiment.
  • In the figures, the XY plane is a horizontal plane, and the Z-axis direction is the vertical direction.
  • FIG. 1 is a plan view schematically showing the configuration of the image pickup apparatus 1 when viewed in the negative direction of the Z axis.
  • the image pickup device 1 includes two light source units 10, two reflection members 20, a condenser lens 30, and an image pickup element 40.
  • the light source unit 10 and the condenser lens 30 are fixed to the housing of the image pickup apparatus 1 via a support unit (not shown).
  • the reflection member 20 and the image pickup device 40 are fixed to the housing of the image pickup apparatus 1 directly or via another member.
  • the configuration of the two light source units 10 is the same.
  • the two light source units 10 are arranged on the Y-axis positive side and the Y-axis negative side of the condenser lens 30, respectively.
  • the two light source units 10 are arranged symmetrically in the Y-axis direction with respect to the optical axis 31 of the condenser lens 30.
  • the light source unit 10 includes a light source 11 and a diffuser plate 12.
  • the light source 11 is composed of, for example, an LED.
  • The light source 11 emits light having a predetermined spread angle in the positive direction of the X-axis, and irradiates the subject area A10 with the light.
  • The emission wavelength of the light source 11 is, for example, in the infrared band.
  • The diffuser plate 12 is composed of, for example, a microlens array, and increases the spread angle of the light emitted from the light source 11. Of the light emitted from the light source 11 and transmitted through the diffuser plate 12, a part directly irradiates the subject area A10 without passing via the reflecting member 20, and the other part is reflected by the reflecting member 20 and then irradiates the subject area A10.
  • the configurations of the two reflective members 20 are the same.
  • The two reflecting members 20 are arranged outside the two light source units 10: one on the Y-axis positive side of the light source unit 10 arranged on the Y-axis positive side of the condenser lens 30, and the other on the Y-axis negative side of the light source unit 10 arranged on the Y-axis negative side of the condenser lens 30.
  • The reflecting member 20 has a flat plate shape and includes a reflecting surface 21 parallel to the XZ plane.
  • the reflecting surface 21 is composed of, for example, a mirror.
  • the reflecting member 20 is arranged so that the reflecting surface 21 is symmetrical with respect to the optical axis 31 of the condenser lens 30.
  • the reflecting surface 21 reflects a part of the light emitted from the light source 11 toward the subject area A10.
  • The subject area A10 is a spatial region located on the X-axis positive side of the two light source units 10, near the middle between the two reflecting members 20.
  • the light from the subject area A10 is taken into the condenser lens 30 and condensed on the light receiving surface 41 of the image sensor 40. That is, an image of an object existing in the subject area A10 is formed on the light receiving surface 41.
  • the object to be imaged existing in the subject area A10 is imaged.
  • the object to be imaged is, for example, fine particles, so-called particles.
  • the light from the light source unit 10 irradiates the object in the subject area A10. The light radiated to the object is reflected on the surface of the object.
  • the condensing lens 30 condenses the light from the subject area A10 on the image sensor 40.
  • the condenser lens 30 is arranged near the middle of the two light source units 10.
  • the optical axis 31 of the condenser lens 30 is parallel to the X-axis direction, and the optical axis 31 passes through the center of the light receiving surface 41 of the image sensor 40.
  • The condenser lens 30 guides the reflected light from the subject area A10 to the light receiving surface 41 of the image sensor 40, and forms an image of the light-irradiated region of the object on the light receiving surface 41.
  • the condenser lens 30 does not have to be one lens, and may be configured by combining a plurality of lenses.
  • The image sensor 40 images the subject area A10.
  • the image sensor 40 is composed of a CMOS image sensor or a CCD image sensor.
  • a plurality of pixels 41a are arranged vertically and horizontally on the light receiving surface 41 of the image sensor 40.
  • the image pickup device 40 is configured to be able to receive light in the same wavelength band as the light emitted from the light source 11.
  • a filter that transmits the wavelength band of light emitted from the light source 11 may be arranged in front of the image sensor 40.
  • The image sensor 40 captures the reflected light from the object that is incident on the light receiving surface 41, and outputs the resulting image pickup signal to the signal processing circuit 103 (see FIG. 2) in the subsequent stage.
  • The object in the subject area A10 is irradiated with both the light emitted directly from the light source 11 and the light that is emitted from the light source 11 and reflected by the reflecting surface 21 of the reflecting member 20. Therefore, on the object, there is a region irradiated directly by the light source 11 and a region irradiated via the reflecting surface 21.
  • The region irradiated via the reflecting surface 21 extends outside the region irradiated directly by the light source 11. The reflected light from the directly irradiated region on the object reaches the projection region A21 on the light receiving surface 41.
  • The reflected light from the region irradiated via the reflecting surface 21 reaches the projection region A22, which extends outward from the vicinity of the outer edge of the projection region A21 on the light receiving surface 41. When the reflecting surface 21 is provided, the overall projection region is therefore the region A20 obtained by combining the projection regions A21 and A22.
  • FIG. 2 is a block diagram showing the configuration of the image pickup apparatus 1.
  • the image pickup apparatus 1 includes a signal processing unit 101, a light source drive circuit 102, and a signal processing circuit 103.
  • FIG. 2 shows a light source drive circuit 102 provided for one light source 11 for convenience, but a light source drive circuit 102 is similarly provided for the other light source 11.
  • the signal processing unit 101 is composed of a microprocessor or the like, and includes a control unit and a storage unit.
  • the light source drive circuit 102 drives the light source 11 according to an instruction signal from the signal processing unit 101.
  • the signal processing circuit 103 performs signal processing on the image pickup signal output from the image pickup element 40, and outputs the processed image pickup signal to the signal processing unit 101.
  • The signal processing unit 101 processes the imaging signal from the signal processing circuit 103 to generate a captured image.
  • the generated captured image is output to, for example, a display unit or a control device (not shown).
  • The signal processing unit 101 may also process the captured image to detect the object to be imaged and output the detection result to an external control device.
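The document does not specify how the signal processing unit 101 detects the object in the captured image. A minimal sketch of one plausible approach, assuming an 8-bit grayscale frame held in a NumPy array: threshold the frame and label connected bright regions, so that each detected spot's pixel count and centroid can be reported. All names here are illustrative, not from the patent.

```python
import numpy as np

def detect_bright_spots(frame: np.ndarray, threshold: int = 128):
    """Label connected bright regions in an 8-bit grayscale frame.

    Returns a list of (pixel_count, (row, col) centroid) tuples,
    one per detected spot. A simple 4-connected flood fill is used.
    """
    mask = frame >= threshold
    visited = np.zeros_like(mask, dtype=bool)
    spots = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                # Flood-fill one connected component of bright pixels.
                stack = [(r, c)]
                visited[r, c] = True
                pixels = []
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*pixels)
                spots.append((len(pixels),
                              (sum(ys) / len(pixels), sum(xs) / len(pixels))))
    return spots
```

With the reflecting surfaces 21 in place, each object produces a larger bright region on the light receiving surface 41, so the pixel count of each spot increases and the thresholding step becomes more robust.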
  • If the reflecting surfaces 21 were absent, the light collected on the image sensor 40 would be only the reflected light that is radiated directly from the light source unit 10 to the object and reflected at the object surface.
  • In that case, the reflected light taken in by the image sensor 40 is the light reflected in a minute region on the object surface, so the region on the light receiving surface 41 in which the reflected light is collected becomes extremely small.
  • When the reflecting surface 21 is provided as described above, the direction of the light radiated directly from the light source unit 10 to the object differs from the direction of the light radiated to the object via the reflecting surface 21. As a result, in addition to the region of the object surface whose reflected light reaches the image sensor 40 under direct irradiation from the light source unit 10, reflected light from regions outside that region is also collected on the image sensor 40. The region on the light receiving surface 41 in which the reflected light from the object is collected can therefore be widened, and even a fine object such as dust can be detected more appropriately from the captured image.
  • Two light source units 10 are arranged around the optical axis 31 of the condenser lens 30. With this configuration, the light from each light source unit 10 can be applied to the object from different directions via the reflecting surfaces 21. Compared with the case of a single light source unit 10, the area of the object surface from which reflected light is taken into the image sensor 40 can thus be expanded, and the region on the light receiving surface 41 in which the reflected light from the object is collected can be widened. The object can therefore be detected more appropriately from the captured image.
  • In the first embodiment, the two light source units 10 are arranged on the Y-axis positive side and the Y-axis negative side of the condenser lens 30, respectively, and the two reflecting members 20, parallel to the XZ plane, are arranged on the Y-axis positive side and the Y-axis negative side of the condenser lens 30.
  • In the second embodiment, four light source units 10 and four reflecting members 20 are arranged.
  • FIG. 3 is a side view schematically showing the configuration of the image pickup apparatus 1 when viewed in the positive direction of the X-axis according to the second embodiment.
  • the position of the image sensor 40 is indicated by a broken line.
  • The light source units 10 are arranged around the optical axis 31 of the condenser lens 30. That is, in the second embodiment, in addition to the arrangement of the first embodiment, light source units 10 are also arranged on the Z-axis positive side and the Z-axis negative side of the condenser lens 30. Further, in the second embodiment, the four reflecting members 20 are arranged outside the four light source units 10, and a light source drive circuit 102 (see FIG. 2) is provided for each light source 11.
  • each reflecting member 20 is arranged so that the reflecting surface 21 faces the light source unit 10.
  • With this configuration, the range of the object surface region from which reflected light is taken into the image sensor 40 can be further expanded compared with the first embodiment, and the region on the light receiving surface 41 of the image sensor 40 in which the reflected light from the object is collected can be widened. The object can therefore be detected more appropriately from the captured image.
  • the reflecting surface 21 of the reflecting member 20 is arranged so as to surround the optical axis 31 of the condensing lens 30.
  • With this configuration, the object can be irradiated with the reflected light from the reflecting surfaces 21 over the entire circumference of the optical axis 31 of the condenser lens 30, so that the projection region of the object surface on the light receiving surface 41 of the image sensor 40 can be expanded.
  • the projected region of the object surface on the light receiving surface 41 is expanded in the positive and negative directions of the Y axis by the two reflecting surfaces 21 arranged in the Y axis direction.
  • the projected region of the object surface on the light receiving surface 41 is further expanded in the positive and negative directions of the Z-axis.
  • the projected region on the surface of the object is expanded over the entire circumference with respect to the optical axis 31 of the condenser lens 30. Therefore, as compared with the configuration of FIG. 1, the projection region of the object surface on the light receiving surface 41 of the image sensor 40 can be enlarged.
  • The shape of the reflecting members 20 when viewed in the X-axis direction is not limited to the square shown in FIG. 3, and may be a rectangle or a rhombus.
  • The number of reflecting members 20 arranged around the condenser lens 30 is not limited to four as in the second embodiment, and may be another number. Even in that case, each reflecting member 20 is arranged so that the light reflected by its reflecting surface 21 is applied to a region of the object surface different from that irradiated directly by the light source unit 10.
  • each reflecting member 20 is arranged so that the reflecting surface 21 is positioned outside the light source unit 10 with respect to the condenser lens 30.
  • The reflecting members 20 arranged around the condenser lens 30 need not be connected to their neighbors as shown in FIG. 3 when viewed in the X-axis direction; there may be a gap between adjacent reflecting members 20.
  • the number of the light source units 10 arranged around the condenser lens 30 is not limited to two or four as in the first and second embodiments, and may be another number.
  • FIGS. 4(a) and 4(b) are diagrams showing the dimensions of each part of the configuration of the embodiment set in this verification experiment.
  • In the configuration of the embodiment, the optical system of FIG. 1, excluding the two reflecting members 20, was covered with a rectangular parallelepiped box member open on the Z-axis negative side, so that the optical system was housed inside the box member.
  • The reflecting surfaces 21 were formed by the five inner surfaces of the box member. In the configuration of the embodiment, reflecting surfaces 21 are therefore arranged on the Y-axis positive side, the Y-axis negative side, and the Z-axis positive side of the condenser lens 30, as well as on the X-axis positive side of the subject area A10 in which the object is positioned and on the X-axis negative side of the image sensor 40. On the Z-axis negative side of the condenser lens 30, a lower surface member having no reflecting surface was arranged.
  • the distance d1 between the condenser lens 30 and the light source 11 in the Y-axis direction was set to 3.3 cm.
  • the interval d1 may be 2 cm to 5 cm.
  • The distance d2 between the light source 11 and the reflecting surface 21 located outside the light source 11 in the Y-axis direction was set to about 10 cm.
  • the interval d2 may be 10 cm to 20 cm.
  • the distance d3 between the light source 11 and the reflecting surface 21 on the positive side of the Z axis was set to about 80 cm. That is, the interval d3 is set so that the reflected light reflected by the reflecting surface 21 on the positive side of the Z axis does not substantially affect the imaging of the object.
  • the distance d4 between the condenser lens 30 and the image sensor 40 was set to 3.75 cm.
  • the distance d4 may be 2 cm to 5 cm.
  • the focal length f of the condenser lens 30 was set to about 6 cm.
  • the focal length f may be 5 cm to 7 cm.
  • the distance from the image sensor 40 to the object was set to about 15 cm.
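As a rough cross-check on the dimensions quoted above, the thin-lens relation 1/f = 1/do + 1/di links the focal length to the object and image distances. The helper below is illustrative only; the function name is an assumption, and the document does not state that the system is operated in exact focus.

```python
def thin_lens_image_distance(f_cm: float, object_dist_cm: float) -> float:
    """Return the in-focus image distance for a thin lens (all in cm)."""
    if object_dist_cm <= f_cm:
        raise ValueError("object inside focal length: no real image")
    return 1.0 / (1.0 / f_cm - 1.0 / object_dist_cm)

# Example with the quoted values: f of about 6 cm and an object about
# 15 cm from the image sensor, i.e. about 15 - 3.75 = 11.25 cm from
# the lens. The in-focus image distance works out to about 12.86 cm.
di = thin_lens_image_distance(6.0, 11.25)
```

Since the sensor sits at d4 = 3.75 cm rather than at the in-focus distance, this reading suggests each minute object appears as a defocused blur spot; that interpretation is an inference, not something stated in the document.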
  • In the configuration of the comparative example, the interval d2 was set to 100 cm or more. That is, in the comparative example, the reflecting surfaces 21 on the Y-axis positive and negative sides were far from the light source 11 compared with the embodiment, so that the light reflected by these reflecting surfaces did not substantially affect the imaging of the object. The other conditions were the same as in the configuration of the embodiment.
  • The inventors dropped minute objects of about 100 μm from above into the subject area A10, and imaged the objects using the configurations of the comparative example and the embodiment.
  • FIG. 5A is a diagram showing a captured image captured using the configuration of the comparative example.
  • As shown in FIG. 5A, in the experiment with the configuration of the comparative example, four minute objects were captured at the timing of imaging.
  • the four bright spots included in the captured image of FIG. 5A correspond to the light receiving regions of the reflected light in which the light from the light source 11 is reflected on the surfaces of minute objects different from each other.
  • In the comparative example, the region of each object in the captured image was extremely small. This is considered to be because mainly the light radiated directly from the light source 11 to the object is reflected at the object surface and projected onto the image sensor 40, so that the projected region of the object surface becomes extremely small.
  • In the comparative example, the reflected light from the reflecting surfaces 21 on the Y-axis positive side, the Y-axis negative side, and the Z-axis positive side is considered to have almost no effect on the imaging. Mainly the reflected light from the minute surface region produced by direct irradiation from the light source 11 is therefore collected by the image sensor 40, which is presumably why the light receiving region of the reflected light from each object became extremely small.
  • FIG. 5B is a diagram showing a captured image captured using the configuration of the embodiment.
  • In the experiment with the configuration of the embodiment, one minute object was captured at the timing of imaging.
  • One bright spot included in the captured image of FIG. 5B corresponds to a light receiving region of the reflected light in which the light from the light source 11 is reflected on the surface of one minute object.
  • As shown in FIG. 5B, the area of the object in the captured image is markedly larger than in the comparative example.
  • In the configuration of the embodiment, the reflecting surfaces 21 on the Y-axis positive and negative sides are arranged sufficiently close to the condenser lens 30, so the light from the light source unit 10 also irradiates, via the reflecting surfaces 21, the region outside the region irradiated directly by the light source unit 10.
  • As a result, the light receiving region of the reflected light from the object is markedly larger than in the case of the comparative example.
  • In the second embodiment, the four flat-plate-shaped reflecting members 20 are arranged so as to form a square when viewed in the X-axis direction; in this modification, a cylindrical reflecting member 20 is arranged instead.
  • FIG. 6 is a side view schematically showing the configuration of the image pickup apparatus 1 when viewed in the positive direction of the X-axis according to this modified example.
  • a reflective member 20 having a cylindrical side surface shape is arranged instead of the four reflective members 20.
  • the reflective surface 21 arranged inside the reflective member 20 also has a cylindrical side surface shape.
  • eight light source units 10 are arranged around the condenser lens 30.
  • the reflective surface 21 is formed on the inner surface of the cylindrical reflective member 20.
  • the object can be uniformly irradiated with the reflected light from the reflecting surface 21 over the entire circumference of the optical axis 31 of the condenser lens 30, so that the projection region on the light receiving surface 41 of the image sensor 40 is all around. It can be expanded uniformly over.
  • the shape of the reflective member 20 when viewed in the X-axis direction is not limited to the perfect circle shown in FIG. 6, and may be an ellipse or a shape in which protrusions are formed on the circumference of the circle.
  • In the second embodiment, the four light source units 10 are configured in the same way, and the four light sources 11 emit light of the same wavelength.
  • In the third embodiment, light sources 11a and light sources 11b that emit light of mutually different wavelengths are arranged in place of the four light sources 11.
  • FIG. 7 is a side view schematically showing the configuration of the image pickup apparatus 1 when viewed in the positive direction of the X-axis according to the third embodiment.
  • the light source unit 10 on the positive side of the Y-axis and the negative side of the Y-axis of the condenser lens 30 includes a light source 11a instead of the light source 11 as compared with the second embodiment shown in FIG.
  • the light source unit 10 on the Z-axis positive side and the Z-axis negative side of the condenser lens 30 includes a light source 11b instead of the light source 11 as compared with the second embodiment shown in FIG.
  • The light source 11a is configured to emit light having a wavelength λ1, and the light source 11b is configured to emit light having a wavelength λ2 different from the wavelength λ1.
  • the light source drive circuit 102 (see FIG. 2) is arranged for each of the light sources 11a and 11b.
  • the image sensor 40 is configured to be capable of receiving light in the same wavelength band as the light emitted from the light sources 11a and 11b.
  • the signal processing unit 101 drives the light sources 11a and 11b so that, for example, the two light sources 11a and the two light sources 11b all emit light at the same time. By doing so, even when the object to be imaged absorbs the light of one wavelength, the object reflects the light of the other wavelength, so that the object can be imaged. Therefore, the object can be detected more reliably from the captured image.
  • the signal processing unit 101 drives the light sources 11a and 11b so that, for example, the light emission timings of the two light sources 11a and the light emission timings of the two light sources 11b are different. Then, when the absorption wavelengths of the objects to be imaged are different, the amount of received light of the image pickup element 40 corresponding to the light emission timing of the light source 11a and the amount of light received by the image pickup element 40 corresponding to the light emission timing of the light source 11b are different. As a result, there may be a case where the object can be imaged at one wavelength, but the object cannot be imaged at the other wavelength.
  • In this case, the absorption wavelength and the reflection wavelength of the object can be grasped from the emission wavelength at which the object can be imaged and the emission wavelength at which it cannot, and material information indicating what kind of material the object is made of can thereby be obtained.
  • Similarly, the absorption wavelength and the reflection wavelength of the object can be grasped from the difference in the amount of light received at each wavelength, and the material information of the object can thus be acquired.
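The material discrimination described above amounts to comparing the received-light amounts measured at the emission timings of the two wavelengths. A minimal sketch of such a comparison follows; the ratio threshold and the returned category strings are hypothetical placeholders, not values or terms from the document.

```python
def classify_material(i_lambda1: float, i_lambda2: float,
                      ratio_threshold: float = 2.0) -> str:
    """Coarse material hint from received light at two wavelengths.

    i_lambda1 / i_lambda2: received-light amounts measured at the
    emission timings of light sources 11a (λ1) and 11b (λ2).
    """
    eps = 1e-9
    if i_lambda1 < eps and i_lambda2 < eps:
        return "no object detected"
    # A strongly asymmetric ratio indicates absorption at one wavelength.
    if i_lambda2 < eps or i_lambda1 / (i_lambda2 + eps) > ratio_threshold:
        return "absorbs λ2, reflects λ1"
    if i_lambda1 < eps or i_lambda2 / (i_lambda1 + eps) > ratio_threshold:
        return "absorbs λ1, reflects λ2"
    return "reflects both wavelengths"
```

In a real system, the mapping from these ratios to specific materials would come from a calibration table of known samples; that table is outside the scope of this sketch.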
  • A dichroic mirror that transmits the reflected light of wavelength λ1 and reflects the reflected light of wavelength λ2 may be arranged in front of the image sensor 40.
  • the light sources 11a and 11b simultaneously emit light
  • the reflected light of wavelength λ1 transmitted through the dichroic mirror is received by the image sensor 40
  • the reflected light of wavelength λ2 reflected by the dichroic mirror is imaged by a further image sensor.
  • the reflected light of wavelengths λ1 and λ2 may be separated by a spectroscopic element other than the dichroic mirror, and the reflected light of each wavelength may be received by the corresponding image sensor.
  • a plurality of light sources 11 that emit two different wavelengths are arranged, but the present invention is not limited to this, and a plurality of light sources 11 that emit three or more different wavelengths may be arranged. In this way, the information of the object can be acquired in more detail.
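The two-wavelength scheme described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, units, and threshold are assumptions, and the material lookup table is omitted.

```python
def classify_material(received_l1, received_l2, threshold):
    """Infer which emission wavelengths the object reflects or absorbs.

    received_l1 / received_l2: light amounts received by the image sensor
    at the emission timings of the lambda-1 and lambda-2 light sources
    (arbitrary units). A reading above the threshold means the object
    reflected that wavelength; below it, the object absorbed it.
    """
    reflects, absorbs = [], []
    for name, amount in (("lambda1", received_l1), ("lambda2", received_l2)):
        (reflects if amount > threshold else absorbs).append(name)
    # The reflected/absorbed wavelength sets would then index a material
    # table (not shown) to yield the object's material information.
    return {"reflects": reflects, "absorbs": absorbs}
```

With more than two wavelengths, the same comparison simply extends to more (name, amount) pairs, giving finer material information.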
  • the two light source units 10 are fixed in the image pickup apparatus 1.
  • the two light source units 10 are installed in the image pickup apparatus 1 so as to be movable in a direction (Y-axis direction) perpendicular to the optical axis 31 of the condenser lens 30.
  • FIG. 8 is a plan view schematically showing the configuration of the image pickup apparatus 1 when viewed in the negative direction of the Z axis according to the fourth embodiment.
  • the image pickup apparatus 1 of the fourth embodiment further includes two drive units 50 as compared with the first embodiment.
  • the configurations of the two drive units 50 are the same.
  • the drive unit 50 is arranged on the negative side of the Z axis of the light source unit 10.
  • the drive unit 50 includes a support plate 51, a guide rail 52, a gear shaft 53, and a motor 54.
  • the support plate 51 is a flat plate-shaped plate member, and the light source 11 and the diffusion plate 12 are installed on a frame portion formed on the surface on the positive side of the Z axis.
  • the support plate 51 is supported by the guide rail 52 and is slidable along the guide rail 52.
  • the guide rail 52 is fixed in the image pickup apparatus 1.
  • the gear shaft 53 is a shaft-shaped member having a screw groove formed therein, and is passed through a screw hole (not shown) provided in the support plate 51.
  • the motor 54 is, for example, a stepping motor.
  • the rotation shaft 54a of the motor 54 is connected to the gear shaft 53.
  • the gear shaft 53 rotates according to the rotation of the rotating shaft 54a.
  • the support plate 51 moves in the Y-axis direction along the guide rail 52, and the light source 11 and the diffuser plate 12 move in the Y-axis direction.
  • FIG. 9 is a block diagram showing the configuration of the image pickup apparatus 1 according to the fourth embodiment.
  • the image pickup apparatus 1 of the fourth embodiment further includes a motor 54 and a motor drive circuit 104 as compared with the first embodiment shown in FIG.
  • FIG. 9 shows, for convenience, the motor 54 and the motor drive circuit 104 provided for one light source unit 10, but a motor 54 and a motor drive circuit 104 are similarly provided for the other light source unit 10.
  • FIG. 10 is a flowchart showing a captured image generation process by the signal processing unit 101 according to the fourth embodiment.
  • the signal processing unit 101 moves the light source unit 10 in the Y-axis direction by driving the drive unit 50, that is, by driving the motor 54 via the motor drive circuit 104.
  • the signal processing unit 101 drives the light source 11 to emit light from the light source 11 (S11). Subsequently, the signal processing unit 101 determines whether or not the projection region on the light receiving surface 41 of the image sensor 40 is a predetermined size or larger (S12). Specifically, based on the image pickup signal output from the image sensor 40, the signal processing unit 101 acquires the number of pixels 41a whose signal level is larger than the noise-removal threshold value, in other words, the number of pixels 41a included in a range classified as the same substance.
  • FIGS. 11(a) and 11(b) are diagrams schematically showing a projection region on the light receiving surface 41 of the image sensor 40.
  • the number of pixels 41a whose signal level is larger than the threshold value is 13
  • the projection area is a range (size) corresponding to the 13 pixels 41a.
  • the number of pixels 41a whose signal level is larger than the threshold value is 28, and the projection area is a range (size) corresponding to the 28 pixels 41a.
  • the determination result in step S12 is NO in the case of FIG. 11 (a), and NO in the case of FIG. 11 (b).
  • the determination result in step S12 is YES.
  • the threshold value of the size of the projection region used in the determination in step S12 is determined according to the number of pixels 41a on the light receiving surface 41, and is stored in advance in the storage unit of the signal processing unit 101.
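The pixel count compared against the size threshold in step S12 reduces to a simple count over the sensor frame. A minimal sketch, with illustrative names and values:

```python
def projection_region_size(frame, noise_threshold):
    """Count the pixels 41a whose signal level exceeds the noise-removal
    threshold; this count is the projection-region size that step S12
    compares against the predetermined size."""
    return sum(1 for row in frame for level in row if level > noise_threshold)
```

For example, a frame in which 13 pixels exceed the threshold yields 13, matching the case of FIG. 11(a).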
  • the signal processing unit 101 moves the two light source units 10 inward (S13). That is, the signal processing unit 101 moves the light source unit 10 on the positive side of the Y-axis by a predetermined distance in the negative direction of the Y-axis, and moves the light source unit 10 on the negative side of the Y-axis by a predetermined distance in the positive direction of the Y-axis.
  • the signal processing unit 101 determines whether or not the projection region on the light receiving surface 41 has become larger as a result of the processing in step S13, based on the image pickup signal of the image pickup element 40.
  • when it is determined that the projection region has become larger, the signal processing unit 101 determines the moving direction of the two light source units 10 to be inward (S15).
  • otherwise, the signal processing unit 101 determines the moving direction of the two light source units 10 to be outward (S16).
  • the signal processing unit 101 determines whether or not the projection region on the light receiving surface 41 of the image sensor 40 is a predetermined size or larger, as in step S12 (S17).
  • the signal processing unit 101 moves the two light source units 10 by a predetermined distance in the direction determined in step S15 or S16 (S18). After that, the process is returned to step S17, and the determination in step S17 is performed again.
  • when it is determined in step S12 or S17 that the projection region is equal to or larger than the predetermined size (S12: YES or S17: YES), the signal processing unit 101 generates the captured image based on the image pickup signal of the image sensor 40 (S19). In this way, the process of generating the captured image is completed.
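The search of steps S12 through S18 amounts to a small hill-climb on the projection-region size. The sketch below is an illustration under assumptions: the callables stand in for the motor drive and sensor readout, and the step size and iteration cap are not from the patent.

```python
def enlarge_projection_region(move_inward, region_size, min_size,
                              step=1.0, max_steps=50):
    """Move the two light sources until the projection region on the
    light receiving surface reaches min_size pixels (steps S12-S18).

    move_inward(d): move both sources inward by d (negative d = outward).
    region_size(): current projection-region pixel count from the sensor.
    """
    if region_size() >= min_size:                         # S12
        return True
    before = region_size()
    move_inward(step)                                     # S13: trial move
    direction = step if region_size() > before else -step # S14-S16
    for _ in range(max_steps):
        if region_size() >= min_size:                     # S17
            return True
        move_inward(direction)                            # S18
    return False
```

The trial move decides the direction once (S14-S16), after which the sources keep stepping in that direction until the region is large enough (S17-S18).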
  • the drive unit 50 moves the light source unit 10 in the direction perpendicular to the optical axis 31 of the condenser lens 30 (Y-axis direction).
  • the direction of the light emitted toward the subject area A10, either directly from the light source unit 10 or via the reflecting surface 21, can be changed, and thus the irradiation state of the light applied to the object can be changed. Therefore, by moving the light source unit 10, for example, the size of the projection region of the object on the light receiving surface 41 of the image sensor 40 can be expanded to an appropriate size.
  • the signal processing unit 101 specifies a projection region of an object on the light receiving surface 41 of the image sensor 40, and controls the position of the light source unit 10 based on the specified projection region.
  • by controlling the position of the light source unit 10, for example, the size of the projection region of the object on the light receiving surface 41 of the image sensor 40 can be expanded to an appropriate size.
  • the drive unit 50 only needs to move the light source unit 10 at least in the direction perpendicular to the optical axis 31 of the condenser lens 30 (Y-axis direction); the direction in which the light source unit 10 is moved may also include components in the X-axis direction and the Z-axis direction.
  • the number of the light source units 10 is not limited to the number shown in FIG. 8; for example, as shown in FIG. 3, four light source units 10 may be arranged, and each light source unit 10 may be driven in directions approaching and separating from the optical axis 31. In this case, for example, the control of FIG. 10 may be performed on the four light source units 10, or each light source unit 10 may be individually driven and controlled so that the projection region has an appropriate size.
  • the position of the light source unit 10 is controlled based on the projection region of the object on the light receiving surface 41 of the image sensor 40.
  • the position of the light source unit 10 is controlled based on the distance to the object.
  • FIG. 12A is a flowchart showing the generation processing of the captured image by the signal processing unit 101 according to the fifth embodiment.
  • the signal processing unit 101 drives the light source 11 to emit light from the light source 11 (S21). Subsequently, the signal processing unit 101 calculates the distance to the object based on the image pickup signal output from the image sensor 40 (S22). Specifically, the signal processing unit 101 calculates, for each pixel 41a (see FIGS. 11(a) and 11(b)), the distance to the object based on the time difference between the light emission timing of the light source 11 and the reception timing of the reflected light at that pixel 41a. Then, the signal processing unit 101 specifies the projection region of the object on the light receiving surface 41 and calculates the distance to the object based on the per-pixel distances and the projection region. For example, the average of the distances obtained for the pixels 41a in the projection region is calculated as the distance to the object.
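The per-pixel distance in step S22 is standard time-of-flight geometry: the round-trip delay times the speed of light, halved. A minimal sketch with illustrative names:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def pixel_distance(emission_time_s, reception_time_s):
    """Distance for one pixel 41a from the time difference between the
    light emission timing and the reflected-light reception timing.
    The factor of 2 accounts for the round trip to the object and back."""
    return (reception_time_s - emission_time_s) * SPEED_OF_LIGHT / 2.0

def object_distance(per_pixel_distances):
    """Average of the per-pixel distances over the projection region,
    taken as the distance to the object (the example in the text)."""
    return sum(per_pixel_distances) / len(per_pixel_distances)
```

A 20 ns round-trip delay, for instance, corresponds to an object roughly 3 m away.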
  • the signal processing unit 101 acquires the optimum position of the light source unit 10 using the table shown in FIG. 12B based on the distance to the object calculated in step S22 (S23).
  • the signal processing unit 101 stores a table as shown in FIG. 12B in a storage unit in the signal processing unit 101.
  • the optimum position Pn of the light source unit 10, that is, the position of the light source unit 10 at which the projection region on the light receiving surface 41 of the image sensor 40 is maximized, is stored in correspondence with the distance Dn to the object.
  • when the distance to the object is short, the optimum position is close to the condenser lens 30, and when the distance to the object is long, the optimum position is far from the condenser lens 30.
  • the signal processing unit 101 drives the drive unit 50 to move the light source unit 10 to the optimum position acquired in step S23 (S24). After that, the signal processing unit 101 generates the captured image based on the image pickup signal of the image sensor 40 (S25). In this way, the process of generating the captured image is completed.
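The lookup of step S23 might look like the following sketch. The patent only gives the Dn → Pn structure of FIG. 12B; the table values, units, and function name below are illustrative assumptions.

```python
def optimum_position(distance, table):
    """Return the stored optimum light-source position Pn for a measured
    distance, using rows of (upper_distance_bound, position) sorted by
    distance (a stand-in for the Dn -> Pn table of FIG. 12B)."""
    for upper_bound, position in table:
        if distance <= upper_bound:
            return position
    return table[-1][1]  # beyond the last row: use the farthest position

# Illustrative table (metres): short distances map to positions near the
# condenser lens, long distances to positions farther from it.
TABLE = [(0.5, 0.01), (1.0, 0.03), (2.0, 0.06), (5.0, 0.10)]
```

The monotone ordering of the table encodes the stated behaviour: near objects call for a light-source position close to the lens, far objects for one away from it.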
  • the signal processing unit 101 measures the distance to the object in the captured image based on the image pickup signal from the image sensor 40, and controls the position of the light source unit 10 based on the measurement result of the distance. By controlling the position of the light source unit 10 to a position suited to the distance, for example, the size of the projected region of the object on the captured image can be enlarged to an appropriate size.
  • the two light source units 10 are arranged so as to be movable in the direction perpendicular to the optical axis 31 of the condenser lens 30.
  • the two light source units 10 are rotatably arranged with the Z-axis direction as the rotation axis.
  • FIG. 13A is a plan view showing a configuration in the vicinity of the light source unit 10 when viewed in the negative direction of the Z axis according to the sixth embodiment.
  • FIG. 13B is a side view showing the configuration of the vicinity of the light source unit 10 when viewed in the positive direction of the Y-axis according to the sixth embodiment.
  • the image pickup apparatus 1 of the sixth embodiment includes two drive units 60 instead of the two drive units 50 as compared with the fourth embodiment.
  • the configurations of the two drive units 60 are the same.
  • the two drive units 60 are arranged on the negative side of the Z axis of the two light source units 10, respectively.
  • the drive unit 60 includes a support plate 61, a shaft member 62, and a motor 63.
  • the support plate 61 is a flat plate-shaped plate member, and a light source portion 10 (light source 11 and diffusion plate 12) is installed on the surface on the positive side of the Z axis.
  • the shaft member 62 is a shaft-shaped member extending in the Z-axis direction, and is installed on the lower surface (the surface on the negative side of the Z-axis) of the support plate 61.
  • the motor 63 is, for example, a stepping motor.
  • the main body of the motor 63 is fixed in the image pickup apparatus 1.
  • the rotating shaft 63a of the motor 63 is connected to the shaft member 62.
  • the shaft member 62 rotates according to the rotation of the rotating shaft 63a.
  • the support plate 61 rotates about the Z-axis direction, and the light source unit 10 (light source 11 and diffusion plate 12) rotates.
  • a motor drive circuit (not shown) is provided to rotate each of the two motors 63.
  • FIG. 14 is a flowchart showing a captured image generation process by the signal processing unit 101 according to the sixth embodiment.
  • the signal processing unit 101 rotates the light source unit 10 about the Z-axis direction by driving the drive unit 60, that is, by driving the two motors 63 via a motor drive circuit (not shown).
  • steps S31 to S34 replace steps S13, S15, S16, and S18, respectively, as compared with the process of FIG. 10.
  • steps S31 to S34 will be described.
  • when the signal processing unit 101 determines in step S12 that the projection region is smaller than the predetermined size (S12: NO), it rotates the two light source units 10 inward (S31). That is, viewed in the negative direction of the Z axis, the signal processing unit 101 rotates the light source unit 10 on the positive side of the Y axis clockwise by a predetermined angle and rotates the light source unit 10 on the negative side of the Y axis counterclockwise by a predetermined angle.
  • when it is determined in step S14 that the projection region has become larger (S14: YES), the signal processing unit 101 determines the rotation direction of the light source units 10 to be inward (S32).
  • otherwise (S14: NO), the signal processing unit 101 determines the rotation direction of the light source units 10 to be outward (S33).
  • when the signal processing unit 101 determines in step S17 that the projection region is smaller than the predetermined size (S17: NO), it rotates the two light source units 10 by a predetermined angle in the rotation direction determined in step S32 or S33 (S34).
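The direction decision of steps S31 through S33 can be sketched as follows; the trial angle and the callables are illustrative stand-ins for the motor 63 drive and the sensor readout, not the patent's implementation.

```python
def choose_rotation_direction(region_size, rotate_inward, trial_angle=1.0):
    """Rotate both light source units inward by a trial angle (S31); if the
    projection region grew, keep rotating inward (S32), otherwise reverse
    to outward (S33). Returns the signed angle to apply in step S34."""
    before = region_size()
    rotate_inward(trial_angle)
    return trial_angle if region_size() > before else -trial_angle
```

Step S34 then repeatedly applies the returned signed angle until the size check of step S17 passes, mirroring the linear-motion loop of FIG. 10.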
  • the drive unit 60 rotates the light source unit 10 so that the direction of the light emitted from the light source unit 10 changes.
  • the direction of the light emitted from the light source unit 10, directly or via the reflecting surface 21, can be changed, and thus the irradiation state of the light applied to the object can be changed. Therefore, by moving the light source unit 10, for example, the size of the projection region of the object on the light receiving surface 41 of the image sensor 40 can be expanded to an appropriate size.
  • the signal processing unit 101 specifies a projection region of an object on the light receiving surface 41 of the image sensor 40, and controls the rotation position of the light source unit 10 based on the specified projection region.
  • by controlling the rotation position of the light source unit 10, for example, the size of the projection region of the object on the light receiving surface 41 of the image sensor 40 can be expanded to an appropriate size.
  • the direction of the light emitted from the light source unit 10 is changed by rotating the light source unit 10, but the direction of the light may be changed by another method.
  • the direction of the light emitted from the light source unit 10 may be changed by swinging the light source unit 10.
  • the signal processing unit 101 may measure the distance to the object in the captured image based on the signal from the image sensor 40 and control the rotation position of the light source unit 10 based on the measurement result of the distance.
  • the optimum rotation position is stored in the table shown in FIG. 12B corresponding to each distance, and the rotation position of the light source unit 10 is controlled by using this table.
  • when the distance to the object is short, the optimum rotation position is the rotation position at which the front direction of the light source 11 approaches the condenser lens 30, and when the distance to the object is long, the optimum rotation position is the rotation position at which the front direction of the light source 11 moves away from the condenser lens 30. Even when the rotation position is controlled in this way, the size of the projected region of the object on the captured image can be enlarged to an appropriate size.
  • the configuration in which the light source unit 10 of the fourth embodiment is linearly moved and the configuration in which the light source unit 10 of the sixth embodiment is rotated may be combined.
  • the light source 11 is composed of LEDs.
  • a light source 13 composed of a semiconductor laser element is arranged instead of the light source 11, and the object in the subject area A10 is irradiated with laser light having a predetermined polarization direction.
  • FIG. 15 is a plan view schematically showing the configuration of the image pickup apparatus 1 when viewed in the Z-axis direction according to the seventh embodiment.
  • the light source unit 10 of the seventh embodiment includes a light source 13 instead of the light source 11, and a collimator lens 14 and a concave lens 15 instead of the diffuser plate 12.
  • the light source 13 is composed of a semiconductor laser element and emits linearly polarized laser light.
  • the two light sources 13 are arranged so as to irradiate the subject area A10 with light whose polarization directions are aligned in one direction.
  • the light emitted from the light source 13 is converted into parallel light by the collimator lens 14 and converted into diffused light by the concave lens 15.
  • the image pickup apparatus 1 of the seventh embodiment includes a polarizing filter 70 between the condenser lens 30 and the subject area A10 as compared with the first embodiment.
  • the polarizing filter 70 is arranged so that the transmitted polarization direction coincides with the polarization direction of the light projected on the subject area A10 by the light source 13.
  • the polarizing filter 70 is fixed to the housing of the image pickup apparatus 1 via a support portion (not shown).
  • the light source unit 10 irradiates the subject area A10 with light by aligning the polarization directions in one direction.
  • the polarization state of the light reflected on the surface of the object can change depending on the state of the surface of the object.
  • the amount of light transmitted through the polarizing filter 70 changes according to the polarization state of the light reflected by the object. Therefore, the state of the object surface can be detected by referring to the amount of light received on the light receiving surface 41 of the image sensor 40.
  • when the surface of the object is smooth, the polarization direction of the reflected light from the object surface is substantially the same as the polarization direction of the light emitted from the light source 13. In this case, the amount of reflected light transmitted through the polarizing filter 70 is large, so the amount of light received on the light receiving surface 41 of the image sensor 40 is large.
  • when the surface of an object has minute irregularities or the like, the polarization state of the light changes upon reflection, and the reflected light contains a mixture of various polarization states.
  • the signal processing unit 101 can determine the state of the object surface by comparing the received light amount with a predetermined threshold value based on the image pickup signal output from the image pickup element 40.
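The surface-state determination reduces to a threshold comparison on the received light amount. A minimal sketch; the labels and threshold value are assumptions, since the patent only specifies a predetermined threshold value.

```python
def surface_state(received_amount, threshold):
    """A surface that preserves the emitted polarization passes most of its
    reflection through the polarizing filter 70, giving a high received
    amount; minute irregularities scramble the polarization, so less light
    passes the filter and the received amount drops."""
    return "smooth" if received_amount >= threshold else "irregular"
```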
  • the polarizing filter 70 is not limited to being arranged between the subject area A10 and the condenser lens 30.
  • for example, the polarizing filter 70 may be arranged between the condenser lens 30 and the image sensor 40.
  • the image sensor 40 is fixed in the image pickup apparatus 1.
  • the image sensor 40 is movably arranged in a direction parallel to the optical axis 31 of the condenser lens 30 (X-axis direction).
  • FIG. 16 is a plan view schematically showing the configuration of the image pickup apparatus 1 when viewed in the Z-axis direction according to the eighth embodiment.
  • the imaging device 1 of the eighth embodiment further includes a distance changing unit 80 as compared with the first embodiment.
  • the distance changing unit 80 is arranged on the negative side of the Z axis of the image sensor 40.
  • the distance changing unit 80 has the same configuration as the driving unit 50 of the fourth embodiment shown in FIG.
  • the distance changing unit 80 includes a support plate 81, a guide rail 82, a gear shaft 83, and a motor 84.
  • the support plate 81 is a flat plate-shaped plate member, and the image sensor 40 is installed on a frame portion formed on the surface on the positive side of the Z axis.
  • the support plate 81 is supported by the guide rail 82 and is slidable along the guide rail 82.
  • the guide rail 82 is fixed in the image pickup apparatus 1.
  • the gear shaft 83 is a shaft-shaped member having a screw groove formed therein, and is passed through a screw hole (not shown) provided in the support plate 81.
  • the motor 84 is, for example, a stepping motor.
  • the rotation shaft 84a of the motor 84 is connected to the gear shaft 83.
  • the gear shaft 83 rotates according to the rotation of the rotating shaft 84a.
  • the support plate 81 moves in the X-axis direction along the guide rail 82, and the image sensor 40 moves in the X-axis direction.
  • FIG. 17 is a block diagram showing the configuration of the image pickup apparatus 1 according to the eighth embodiment.
  • the image pickup apparatus 1 of the eighth embodiment further includes a motor 84 and a motor drive circuit 105 as compared with the first embodiment shown in FIG.
  • FIG. 18 is a flowchart showing a captured image generation process by the signal processing unit 101 according to the eighth embodiment.
  • the signal processing unit 101 moves the image sensor 40 in the X-axis direction by driving the distance changing unit 80, that is, by driving the motor 84 via the motor drive circuit 105.
  • steps S41 to S44 replace steps S13, S15, S16, and S18, respectively, as compared with the process of FIG. 10.
  • steps S41 to S44 will be described.
  • when the signal processing unit 101 determines in step S12 that the projection region is smaller than the predetermined size (S12: NO), it moves the image sensor 40 in the positive direction of the X-axis by a predetermined distance (S41).
  • when the signal processing unit 101 determines in step S14 that the projection region has become larger (S14: YES), it determines the moving direction of the image sensor 40 to be the positive X-axis direction (S42).
  • when the signal processing unit 101 determines in step S14 that the projection region has become smaller or has not changed in size (S14: NO), it determines the moving direction of the image sensor 40 to be the negative X-axis direction (S43).
  • when the signal processing unit 101 determines in step S17 that the projection region is smaller than the predetermined size (S17: NO), it moves the image sensor 40 by a predetermined distance in the moving direction determined in step S42 or S43 (S44).
  • the distance changing unit 80 changes the distance between the condenser lens 30 and the image sensor 40.
  • the projection region of the object on the light receiving surface 41 of the image sensor 40 can be defocused and the size of the projection region can be expanded. As a result, the object can be detected more appropriately from the captured image.
  • the signal processing unit 101 identifies a projection region of an object on the light receiving surface 41 of the image sensor 40, and based on the specified projection region, between the condenser lens 30 and the image sensor 40. Control the distance. By feeding back the size of the projection area of the object to the distance adjustment in this way, for example, the defocus amount can be appropriately adjusted so that the projection area of the object is maximized.
  • when the distance between the condenser lens 30 and the image sensor 40 is changed and the defocus amount is adjusted so as to maximize the projection region, the amount of light received by one pixel 41a (see FIGS. 11(a) and 11(b)) decreases. When this decrease in the amount of received light becomes a problem, the output of the light source 11 may be increased. In this case, if the emission wavelength of the light source 11 is in the infrared wavelength band, safety can be ensured even if the output of the light source 11 is increased.
  • the distance between the condenser lens 30 and the image sensor 40 is not limited to being changed by moving the image sensor 40 in the X-axis direction, and may instead be changed by moving the condenser lens 30 in the X-axis direction.
  • the configuration of the image pickup apparatus 1 can be changed in various ways in addition to the configurations shown in the above-described embodiment and modification examples.
  • the light source unit 10 is moved in the Y-axis direction to change the light irradiation state of the object, but the present invention is not limited to this; any configuration that changes the light irradiation state of the object may be used.
  • for example, the two reflecting members 20 may each be moved in the Y-axis direction.
  • each of the three or more light source units 10 may be provided with a drive unit for driving the light source unit 10.
  • two reflecting members 20 are arranged around the condenser lens 30, but the number of reflecting members 20 (reflecting surfaces 21) is not limited to this; three or more reflecting members 20 may be arranged so that three or more reflecting surfaces 21 surround the optical axis 31 of the condenser lens 30. Further, in the above-described first and third to eighth embodiments as well, a cylindrical reflecting member 20 (reflecting surface 21) may be arranged as in the modified example of the second embodiment.
  • 1 Image pickup device, 10 Light source unit, 21 Reflecting surface, 30 Condenser lens, 31 Optical axis, 40 Image sensor, 41 Light receiving surface, 50 Drive unit, 60 Drive unit, 70 Polarizing filter, 80 Distance changing unit, 101 Signal processing unit, A10 Subject area

Abstract

An imaging device (1) is provided with a light source unit (10) for irradiating a subject region (A10) with light, and an imaging element (40) for imaging the subject region (A10). The imaging device (1) is provided with a focusing lens (30) for focusing light from the subject region (A10) at the imaging element (40). The imaging device (1) is provided with a reflective surface (21) for reflecting some of the light emitted from the light source unit (10) toward the subject region (A10).

Description

Imaging device
 The present invention relates to an imaging device, and is particularly suitable for imaging a fine object such as dust.
 Conventionally, an imaging device that irradiates an irradiation range with light to perform imaging is known. For example, Patent Document 1 below describes a ranging system that irradiates light from two light sources, captures the reflected light with an imaging unit, and measures the distance to an object. In this ranging system, the distance information calculated for each pixel is combined to generate a distance image for one frame.
International Publication No. 2017/213052
 In a configuration in which an image sensor receives reflected light from an object to perform imaging, the reflected light of the light irradiated onto the front surface of the object is condensed onto the light receiving surface of the image sensor. In this case, if the object serving as the subject has a fine shape such as dust, the reflected light from a minute region on the front surface of the object is condensed onto the light receiving surface of the image sensor, so the incident region of the reflected light on the light receiving surface becomes extremely narrow. This makes it difficult to properly detect the object from the captured image.
 In view of this problem, an object of the present invention is to provide an imaging device capable of properly detecting an object from a captured image even when the object is a minute object such as dust.
 The imaging device according to the main aspect of the present invention includes a light source unit that irradiates a subject area with light, an image sensor that images the subject area, a condenser lens that condenses light from the subject area onto the image sensor, and a reflecting surface that reflects a part of the light emitted from the light source unit toward the subject area.
 When no reflecting surface is provided, the only light condensed onto the image sensor is the reflected light that is irradiated onto the object directly from the light source unit and reflected on the object surface. In this case, if the object to be imaged in the subject area is a minute object such as dust, the reflected light captured by the image sensor is light reflected from a minute region on the object surface, so the region on the light receiving surface of the image sensor where this reflected light is condensed is extremely small. In contrast, when a reflecting surface is provided as described above, the direction of the light irradiated onto the object directly from the light source unit differs from the direction of the light irradiated onto the object via the reflecting surface. As a result, in addition to the region on the object surface whose reflected light is captured by the image sensor under direct irradiation, reflected light from regions outside this region is also condensed onto the image sensor. The region on the light receiving surface of the image sensor where reflected light from the object is condensed can therefore be widened. Thus, according to the imaging device of this aspect, even a minute object such as dust can be detected more properly from the captured image.
 As described above, the present invention can provide an imaging device capable of properly detecting even a minute object such as dust from a captured image.
 The effects and significance of the present invention will become clearer from the description of the embodiments below. However, the embodiments below are merely examples of implementing the present invention, and the present invention is in no way limited to what is described in them.
FIG. 1 is a plan view schematically showing the configuration of the imaging device according to Embodiment 1, viewed in the negative Z-axis direction.
FIG. 2 is a block diagram showing the configuration of the imaging device according to Embodiment 1.
FIG. 3 is a side view schematically showing the configuration of the imaging device according to Embodiment 2, viewed in the positive X-axis direction.
FIGS. 4(a) and 4(b) are diagrams showing the dimensions of each part of the embodiment configuration used in the verification experiment on object detection.
FIG. 5(a) is a diagram showing an image captured using the configuration of the comparative example, and FIG. 5(b) is a diagram showing an image captured using the configuration of the embodiment.
FIG. 6 is a side view schematically showing the configuration of the imaging device according to a modification of Embodiment 2, viewed in the positive X-axis direction.
FIG. 7 is a side view schematically showing the configuration of the imaging device according to Embodiment 3, viewed in the positive X-axis direction.
FIG. 8 is a plan view schematically showing the configuration of the imaging device according to Embodiment 4, viewed in the negative Z-axis direction.
FIG. 9 is a block diagram showing the configuration of the imaging device according to Embodiment 4.
FIG. 10 is a flowchart showing the captured-image generation process by the signal processing unit according to Embodiment 4.
FIGS. 11(a) and 11(b) are diagrams schematically showing projection regions on the light receiving surface of the image sensor according to Embodiment 4.
FIG. 12(a) is a flowchart showing the captured-image generation process by the signal processing unit according to Embodiment 5, and FIG. 12(b) is a diagram schematically showing the configuration of a table of optimum light source unit positions corresponding to the distance to the object according to Embodiment 5.
FIG. 13(a) is a plan view showing the configuration near the light source unit according to Embodiment 6, viewed in the negative Z-axis direction, and FIG. 13(b) is a side view showing the configuration near the light source unit according to Embodiment 6, viewed in the positive Y-axis direction.
FIG. 14 is a flowchart showing the captured-image generation process by the signal processing unit according to Embodiment 6.
FIG. 15 is a plan view schematically showing the configuration of the imaging device according to Embodiment 7, viewed in the Z-axis direction.
FIG. 16 is a plan view schematically showing the configuration of the imaging device according to Embodiment 8, viewed in the Z-axis direction.
FIG. 17 is a block diagram showing the configuration of the imaging device according to Embodiment 8.
FIG. 18 is a flowchart showing the captured-image generation process by the signal processing unit according to Embodiment 8.
 However, the drawings are solely for illustration and do not limit the scope of the present invention.
 Embodiments of the present invention will be described below with reference to the drawings. For convenience, mutually orthogonal X, Y, and Z axes are indicated in each figure. The X-Y plane is horizontal, and the Z-axis direction is vertical.
 FIG. 1 is a plan view schematically showing the configuration of the imaging device 1 viewed in the negative Z-axis direction.
 The imaging device 1 includes two light source units 10, two reflecting members 20, a condenser lens 30, and an image sensor 40. The light source units 10 and the condenser lens 30 are fixed to the housing of the imaging device 1 via support members (not shown). The reflecting members 20 and the image sensor 40 are fixed to the housing of the imaging device 1 either directly or via other members.
 The two light source units 10 have the same configuration. They are arranged on the Y-axis positive side and the Y-axis negative side of the condenser lens 30, respectively, symmetrically in the Y-axis direction about the optical axis 31 of the condenser lens 30. Each light source unit 10 includes a light source 11 and a diffuser plate 12.
 The light source 11 is, for example, an LED. It emits light having a predetermined divergence angle in the positive X-axis direction, irradiating the subject area A10. The emission wavelength of the light source 11 is, for example, in the infrared band. The diffuser plate 12 is, for example, a microlens array and widens the divergence angle of the light emitted from the light source 11. Of the light emitted from the light source 11 and transmitted through the diffuser plate 12, part reaches the subject area A10 directly without passing via the reflecting member 20, and the rest is reflected by the reflecting member 20 onto the subject area A10.
 The two reflecting members 20 have the same configuration. One is arranged on the Y-axis positive side of the light source unit 10 located on the Y-axis positive side of the condenser lens 30, and the other on the Y-axis negative side of the light source unit 10 located on the Y-axis negative side of the condenser lens 30. Each reflecting member 20 is a flat plate with a reflecting surface 21 parallel to the X-Z plane; the reflecting surface 21 is, for example, a mirror. The reflecting members 20 are arranged so that the reflecting surfaces 21 are symmetric about the optical axis 31 of the condenser lens 30. Each reflecting surface 21 reflects part of the light emitted from the light source 11 toward the subject area A10.
 The subject area A10 is a spatial region located on the X-axis positive side of the two light source units 10 and roughly midway between the two reflecting members 20. Light from the subject area A10 is taken in by the condenser lens 30 and condensed onto the light receiving surface 41 of the image sensor 40; that is, an image of an object in the subject area A10 is formed on the light receiving surface 41, so that the object to be imaged in the subject area A10 is captured. The object to be imaged is, for example, a fine particle. Light from the light source units 10 irradiates the object in the subject area A10 and is reflected at the object's surface.
 The condenser lens 30 condenses light from the subject area A10 onto the image sensor 40. It is arranged roughly midway between the two light source units 10, with its optical axis 31 parallel to the X-axis direction and passing through the center of the light receiving surface 41 of the image sensor 40. The condenser lens 30 guides the reflected light from the subject area A10 onto the light receiving surface 41 and forms, on the light receiving surface 41, an image of the illuminated region of the object. The condenser lens 30 need not be a single lens and may be a combination of multiple lenses.
 The image sensor 40 images the subject area A10. It is a CMOS or CCD image sensor. A plurality of pixels 41a (see FIGS. 11(a) and 11(b) of Embodiment 4) are arranged in rows and columns on the light receiving surface 41 of the image sensor 40. The image sensor 40 is configured to receive light in the same wavelength band as the light emitted from the light source 11; a filter that passes this wavelength band may be placed in front of the image sensor 40. The image sensor 40 captures the reflected light from the object incident on the light receiving surface 41 and outputs an imaging signal to the downstream signal processing circuit 103 (see FIG. 2).
 Here, as described above, an object in the subject area A10 is irradiated both with light emitted directly from the light source 11 and with light from the light source 11 reflected by the reflecting surface 21 of the reflecting member 20. The object surface therefore contains a region illuminated directly by the light source 11 and a region illuminated via the reflecting surface 21, with the latter extending outside the former. Accordingly, reflected light from the directly illuminated region of the object strikes a projection region A21 on the light receiving surface 41, while reflected light from the region illuminated via the reflecting surface 21 strikes a projection region A22 that spreads outward from around the outer edge of the projection region A21. When the reflecting surfaces 21 are arranged, the projection region is therefore the combined region A20 formed by the projection regions A21 and A22 together.
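The relationship among the projection regions A21, A22, and A20 can be illustrated numerically. The following is an illustration only, not part of the disclosure: it models the light receiving surface 41 as a pixel grid, the directly illuminated projection A21 as a small disc, and the reflected-light projection A22 as a surrounding annulus, with arbitrary assumed radii, and counts the pixels covered by each.

```python
# Illustrative model of the projection regions on the light receiving
# surface 41: A21 = disc from direct illumination, A22 = annulus from
# light arriving via the reflecting surface 21, A20 = their union.
# The grid size and radii are arbitrary assumptions for illustration.
def covered_pixels(size, inner, outer):
    """Count grid pixels whose centre lies at a radius in [inner, outer)."""
    cx = cy = size / 2.0
    count = 0
    for y in range(size):
        for x in range(size):
            r = ((x + 0.5 - cx) ** 2 + (y + 0.5 - cy) ** 2) ** 0.5
            if inner <= r < outer:
                count += 1
    return count

size = 64
a21 = covered_pixels(size, 0.0, 4.0)   # direct-illumination projection A21
a22 = covered_pixels(size, 4.0, 10.0)  # reflected-light projection A22
a20 = a21 + a22                        # combined projection region A20
print(a21, a22, a20)
```

The combined region A20 covers several times as many pixels as A21 alone, which is the mechanism by which a minute object occupies a detectably larger area of the captured image when the reflecting surfaces 21 are present.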
 FIG. 2 is a block diagram showing the configuration of the imaging device 1.
 In addition to the components shown in FIG. 1, the imaging device 1 includes a signal processing unit 101, a light source drive circuit 102, and a signal processing circuit 103. For convenience, FIG. 2 shows the light source drive circuit 102 provided for only one light source 11; a light source drive circuit 102 is likewise provided for the other light source 11.
 The signal processing unit 101 is composed of a microprocessor or the like and includes a control section and a storage section. The light source drive circuit 102 drives the light source 11 according to instruction signals from the signal processing unit 101. The signal processing circuit 103 performs signal processing on the imaging signal output from the image sensor 40 and outputs the processed imaging signal to the signal processing unit 101. The signal processing unit 101 processes the imaging signal from the signal processing circuit 103 to generate a captured image, which is output to, for example, a display or a control device (not shown). Alternatively, the signal processing unit 101 may process the captured image to detect the object to be imaged and output the detection result to an external control device.
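The division of labor among the blocks of FIG. 2 can be sketched as control flow. This is a minimal sketch only; the class and method names are assumptions made for illustration and do not correspond to any API in the disclosure.

```python
# Minimal sketch of the FIG. 2 signal chain: the signal processing unit
# 101 instructs the light source drive circuits 102, the image sensor 40
# produces an imaging signal, and the signal processing circuit 103
# conditions that signal before the captured image is generated.
class LightSourceDriveCircuit:           # corresponds to 102
    def __init__(self):
        self.lit = False
    def drive(self, on):
        self.lit = on                    # drive the light source 11

class SignalProcessingCircuit:           # corresponds to 103
    def condition(self, raw_signal):
        # e.g. noise reduction / gain; here just a pass-through copy
        return list(raw_signal)

class SignalProcessingUnit:              # corresponds to 101
    def __init__(self, drivers, circuit):
        self.drivers = drivers           # one driver per light source 11
        self.circuit = circuit
    def capture(self, sensor_readout):
        for d in self.drivers:           # light both light sources 11
            d.drive(True)
        raw = sensor_readout()           # imaging signal from sensor 40
        conditioned = self.circuit.condition(raw)
        for d in self.drivers:
            d.drive(False)
        return conditioned               # basis of the captured image

unit = SignalProcessingUnit([LightSourceDriveCircuit() for _ in range(2)],
                            SignalProcessingCircuit())
image = unit.capture(lambda: [0, 5, 9, 5, 0])   # stand-in readout
print(image)
```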
 <Effects of Embodiment 1>
 According to Embodiment 1, the following effects are obtained.
 When the reflecting surface 21 is not provided, the only light condensed onto the image sensor 40 is light that travels directly from the light source unit 10 to the object and is reflected at the object surface. In this case, if the object to be imaged in the subject area A10 is a minute object such as a dust particle, the reflected light captured by the image sensor 40 comes from only a minute region of the object surface. The region of the light receiving surface 41 of the image sensor 40 where this reflected light is condensed is therefore extremely small.
 In contrast, when the reflecting surface 21 is provided as described above, the direction of the light reaching the object directly from the light source unit 10 differs from the direction of the light reaching the object via the reflecting surface 21. Consequently, in addition to the region of the object surface whose reflections reach the image sensor 40 under direct illumination, reflections from regions outside that region are also condensed onto the image sensor 40. The region of the light receiving surface 41 over which reflected light from the object is condensed can therefore be widened, and even a minute object such as dust can be detected more reliably from the captured image.
 Two light source units 10 are arranged around the optical axis 31 of the condenser lens 30. With this configuration, light from each light source unit 10 can be applied to the object from different directions via the reflecting surfaces 21. The region of the object surface whose reflections reach the image sensor 40 can thus be widened compared with a single light source unit 10, widening the region of the light receiving surface 41 over which reflected light from the object is condensed. The object can therefore be detected more reliably from the captured image.
 <Embodiment 2>
 In Embodiment 1, the two light source units 10 were arranged on the Y-axis positive and negative sides of the condenser lens 30, and the two reflecting members 20, parallel to the X-Z plane, were arranged on the Y-axis positive and negative sides of the condenser lens 30. In Embodiment 2, in contrast, four light source units 10 and four reflecting members 20 are arranged.
 FIG. 3 is a side view schematically showing the configuration of the imaging device 1 according to Embodiment 2, viewed in the positive X-axis direction. For convenience, the position of the image sensor 40 is indicated by a broken line in FIG. 3.
 As shown in FIG. 3, in Embodiment 2 four light source units 10 are arranged around the optical axis 31 of the condenser lens 30; that is, compared with Embodiment 1, additional light source units 10 are arranged on the Z-axis positive and negative sides of the condenser lens 30. Four reflecting members 20 are arranged outside the four light source units 10. In Embodiment 2, too, a light source drive circuit 102 (see FIG. 2) is provided for each light source 11.
 Also, in Embodiment 2, four reflecting members 20 are arranged around the optical axis 31 of the condenser lens 30; that is, compared with Embodiment 1, reflecting members 20 (reflecting surfaces 21) parallel to the X-Y plane are additionally arranged on the Z-axis positive and negative sides of the condenser lens 30. Each reflecting member 20 is arranged so that its reflecting surface 21 faces a light source unit 10.
 According to Embodiment 2, since four light source units 10 are arranged around the optical axis 31 of the condenser lens 30, the region of the object surface whose reflections reach the image sensor 40 can be widened further than in Embodiment 1, widening the region of the light receiving surface 41 over which reflected light from the object is condensed. The object can therefore be detected more reliably from the captured image.
 The reflecting surfaces 21 of the reflecting members 20 are arranged so as to surround the optical axis 31 of the condenser lens 30. The object can thus be irradiated with light reflected from the reflecting surfaces 21 around the entire circumference of the optical axis 31, enlarging the projection region of the object surface on the light receiving surface 41 of the image sensor 40.
 That is, in the configuration of FIG. 1 of Embodiment 1, the projection region of the object surface on the light receiving surface 41 was extended in the positive and negative Y-axis directions by the two reflecting surfaces 21 aligned in the Y-axis direction. In the configuration of FIG. 3 of Embodiment 2, two reflecting surfaces 21 aligned in the Z-axis direction are additionally arranged, so the projection region of the object surface on the light receiving surface 41 is further extended in the positive and negative Z-axis directions. As a result, the projection region of the object surface is extended around the entire circumference of the optical axis 31 of the condenser lens 30, and the projection region on the light receiving surface 41 of the image sensor 40 can be enlarged compared with the configuration of FIG. 1.
 When four reflecting members 20 are arranged so as to surround the condenser lens 30 as in Embodiment 2, their outline viewed in the X-axis direction is not limited to the square shown in FIG. 3 and may be a rectangle or a rhombus. The number of reflecting members 20 arranged around the condenser lens 30 is not limited to four as in Embodiment 2 and may be any other number. Whatever the number, each reflecting member 20 is arranged so that the light reflected by its reflecting surface 21 strikes a region of the object surface different from the region illuminated directly by the light source unit 10; that is, each reflecting member 20 is arranged so that its reflecting surface 21 is positioned outside the light source unit 10 with respect to the condenser lens 30. The reflecting members 20 arranged around the condenser lens 30 as viewed in the X-axis direction need not be connected to their neighbors as shown in FIG. 3; gaps may be left between adjacent reflecting members 20. Likewise, the number of light source units 10 arranged around the condenser lens 30 is not limited to two or four as in Embodiments 1 and 2 and may be any other number.
 <Verification experiment on object detection>
 Next, a verification experiment on object detection actually conducted by the inventors will be described.
 FIGS. 4(a) and 4(b) show the dimensions of each part of the embodiment configuration used in this verification experiment.
 In the embodiment configuration of this verification experiment, a rectangular-parallelepiped box member open on the Z-axis negative side was placed over the optical system of FIG. 1 excluding the two reflecting members 20, housing the optical system inside the box member. Each of the five inner surfaces of the box member served as a reflecting surface 21. Thus, in the embodiment configuration, reflecting surfaces 21 were located on the Y-axis positive, Y-axis negative, and Z-axis positive sides of the condenser lens 30, as well as on the X-axis positive side of the subject area A10 where the object is positioned and on the X-axis negative side of the image sensor 40. A bottom member with no reflecting surface was placed on the Z-axis negative side of the condenser lens 30.
 The Y-axis spacing d1 between the condenser lens 30 and the light source 11 was 3.3 cm (d1 may be 2 cm to 5 cm). The spacing d2 between the light source 11 and the reflecting surface 21 located outside it in the Y-axis direction was about 10 cm in the embodiment configuration (in the embodiment configuration, d2 may be 10 cm to 20 cm). The spacing d3 between the light source 11 and the reflecting surface 21 on the Z-axis positive side was about 80 cm; that is, d3 was set so that light reflected by the Z-axis positive reflecting surface 21 has substantially no effect on imaging of the object. The distance d4 between the condenser lens 30 and the image sensor 40 was 3.75 cm (d4 may be 2 cm to 5 cm). The focal length f of the condenser lens 30 was about 6 cm (f may be 5 cm to 7 cm). The distance from the image sensor 40 to the object was about 15 cm.
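One way to see why a nearby reflecting surface widens the illuminated area is the mirror-image construction: a planar reflecting surface acts like a virtual light source mirrored across its plane. The sketch below applies this to the Y-axis dimensions above (d1 = 3.3 cm, d2 ≈ 10 cm); the one-dimensional coordinate layout is an assumption made purely for illustration.

```python
# Mirror-image construction for the Y-axis positive side: the light
# source 11 sits d1 from the optical axis, and the reflecting surface
# 21 lies a further d2 outside it. Reflecting the source across the
# mirror plane gives the equivalent virtual source position.
d1 = 3.3                              # cm, optical axis to light source 11
d2 = 10.0                             # cm, light source 11 to surface 21

y_source = d1                         # real light source position
y_mirror = d1 + d2                    # plane of the reflecting surface 21
y_virtual = 2 * y_mirror - y_source   # mirrored (virtual) source position

print(y_virtual)
```

Light from the virtual source strikes the object from a noticeably different direction than the direct light, consistent with regions outside the directly illuminated area also being lit. In the comparative configuration, with d2 set to 100 cm or more, the virtual source recedes past roughly 200 cm from the axis, matching the observation that its contribution becomes negligible.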
 In the configuration of the comparative example, the spacing d2 was set to 100 cm or more; that is, compared with the embodiment, the reflecting surfaces 21 on the Y-axis positive and negative sides were moved far away from the light source 11, so that light reflected by these surfaces has substantially no effect on imaging of the object. The other conditions were the same as in the embodiment configuration.
 Under these conditions, the inventors dropped minute objects of about 100 μm into the subject area A10 from above and imaged them using the comparative-example and embodiment configurations.
 FIG. 5(a) shows an image captured using the configuration of the comparative example.
 As shown in FIG. 5(a), in the experiment with the comparative-example configuration, four minute objects were imaged at the capture timing. The four bright spots in the captured image of FIG. 5(a) each correspond to the light receiving region of light from the light source 11 reflected at the surface of a different minute object. As FIG. 5(a) shows, the region of each object in the captured image was extremely small. This can be attributed to the fact that, in the comparative-example configuration, mainly light shining directly from the light source 11 onto the object is reflected at the object surface and projected onto the image sensor 40, so the projected region of the object surface is extremely small.
 That is, in the comparative-example configuration, the spacings d2 and d3 of FIGS. 4(a) and 4(b) are large, so the light reflected from the reflecting surfaces 21 on the Y-axis positive, Y-axis negative, and Z-axis positive sides is considered to have substantially no effect on imaging. Consequently, mainly the reflected light from the minute surface region illuminated directly by the light source 11 is condensed onto the image sensor 40, which is thought to be why the light receiving region of the reflected light from each object became extremely small.
 FIG. 5(b) shows an image captured using the configuration of the embodiment.
 As shown in FIG. 5(b), in the experiment with the embodiment configuration, one minute object was imaged at the capture timing. The single bright spot in the captured image of FIG. 5(b) corresponds to the light receiving region of light from the light source 11 reflected at the surface of one minute object. As FIG. 5(b) shows, the region of the object in the captured image was markedly larger than in the comparative example. This is considered to be because, in the embodiment configuration, the reflecting surfaces 21 on the Y-axis positive and negative sides are positioned sufficiently close to the condenser lens 30, so light from the light source units 10 also reaches, via the reflecting surfaces 21, regions outside the directly illuminated region. As a result, not only the reflected light from the region of the object surface illuminated directly by the light source units 10 but also the reflected light from regions outside it is condensed onto the image sensor 40, which is thought to be why the light receiving region of the reflected light from the object became markedly larger than in the comparative example.
 From the results of the above verification experiment, it was found that, as in the configuration of the embodiment, appropriately arranging the reflecting surfaces 21 close to the condenser lens 30 expands the region on the light receiving surface 41 of the image sensor 40 on which reflected light from the object is collected. Therefore, with the configuration of the embodiment, even a minute object such as dust can be detected more properly from the captured image.
 <Modification of Embodiment 2>
 In Embodiment 2, the four flat-plate-shaped reflecting members 20 are arranged so as to form a square when viewed in the X-axis direction. In this modification, a cylindrical reflecting member 20 is arranged instead.
 FIG. 6 is a side view schematically showing the configuration of the imaging device 1 according to this modification as viewed in the X-axis positive direction.
 In this modification, in contrast to Embodiment 2 shown in FIG. 3, a reflecting member 20 having a cylindrical side surface shape is arranged in place of the four reflecting members 20. The reflecting surface 21 arranged on the inner side of the reflecting member 20 also has a cylindrical side surface shape. Furthermore, in this modification, eight light source units 10 are arranged around the condenser lens 30.
 According to this modification, the reflecting surface 21 is formed on the inner side surface of the cylindrical reflecting member 20. Accordingly, the object can be uniformly irradiated with the reflected light from the reflecting surface 21 over the entire circumference around the optical axis 31 of the condenser lens 30, so that the projection region on the light receiving surface 41 of the image sensor 40 can be uniformly expanded over the entire circumference.
 Furthermore, since more light source units 10 are arranged around the condenser lens 30 than in Embodiment 2, light can be applied to the surface of an object over a wider range. Therefore, the object can be detected more properly from the captured image.
 Note that the shape of the reflecting member 20 as viewed in the X-axis direction is not limited to the perfect circle shown in FIG. 6, and may be an ellipse or a circle with protrusions formed on its circumference.
 <Embodiment 3>
 In Embodiment 2, the four light source units 10 are configured identically, and the four light sources 11 emit light of the same wavelength. In Embodiment 3, by contrast, light sources 11a and 11b that emit light of mutually different wavelengths are arranged in place of the four light sources 11.
 FIG. 7 is a side view schematically showing the configuration of the imaging device 1 according to Embodiment 3 as viewed in the X-axis positive direction.
 In Embodiment 3, the light source units 10 on the Y-axis positive side and the Y-axis negative side of the condenser lens 30 each include a light source 11a in place of the light source 11 of Embodiment 2 shown in FIG. 3, and the light source units 10 on the Z-axis positive side and the Z-axis negative side of the condenser lens 30 each include a light source 11b in place of the light source 11. The light source 11a is configured to emit light of wavelength λ1, and the light source 11b is configured to emit light of wavelength λ2 different from wavelength λ1. Also in Embodiment 3, a light source drive circuit 102 (see FIG. 2) is provided for each of the light sources 11a and 11b. The image sensor 40 is configured to be capable of receiving light in the same wavelength bands as the light emitted from the light sources 11a and 11b.
 In Embodiment 3, the signal processing unit 101 drives the light sources 11a and 11b so that, for example, the two light sources 11a and the two light sources 11b all emit light simultaneously. In this way, even when the object to be imaged absorbs light of one wavelength, the object reflects light of the other wavelength, so that the object can still be imaged. Therefore, the object can be detected more reliably from the captured image.
 Alternatively, in Embodiment 3, the signal processing unit 101 drives the light sources 11a and 11b so that, for example, the emission timing of the two light sources 11a differs from that of the two light sources 11b. In this case, depending on the absorption wavelength of the object to be imaged, a difference arises between the amount of light received by the image sensor 40 at the emission timing of the light sources 11a and the amount received at the emission timing of the light sources 11b. Consequently, the object may be imaged at one wavelength but not at the other. In such a case, the absorption wavelength and the reflection wavelength of the object can be determined from the emission wavelength at which the object could be imaged and the emission wavelength at which it could not, so that information indicating what material the object is made of (material information) can be acquired. Alternatively, the absorption and reflection wavelengths of the object can be determined from the difference in the amount of received light between the two wavelengths, and the material information of the object can thereby be acquired.
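 The two-wavelength material-information logic described above can be sketched as follows; the function name, the normalization of the received amounts, and the detection threshold are illustrative assumptions rather than part of the embodiment.

```python
def material_info(amount_l1, amount_l2, detect_threshold=0.2):
    """Classify which wavelengths an object reflects, given the amounts of
    light received at the emission timings of the lambda-1 and lambda-2
    light sources (here normalized to 0..1).

    Returns (reflects_l1, reflects_l2); a wavelength whose received amount
    stays below the threshold is treated as absorbed by the object.
    The threshold value is an illustrative assumption.
    """
    reflects_l1 = amount_l1 >= detect_threshold
    reflects_l2 = amount_l2 >= detect_threshold
    return reflects_l1, reflects_l2

# An object that reflects lambda-1 strongly but absorbs lambda-2:
print(material_info(0.8, 0.05))  # (True, False)
```

 The returned pair identifies the reflection/absorption wavelengths, from which the material information can be looked up for known materials.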
 Note that a dichroic mirror that transmits the reflected light of wavelength λ1 and reflects the reflected light of wavelength λ2 may be arranged in front of the image sensor 40. In this case, the light sources 11a and 11b emit light simultaneously; the reflected light of wavelength λ1 transmitted through the dichroic mirror is received by the image sensor 40, and the reflected light of wavelength λ2 reflected by the dichroic mirror is received by a further image sensor. The reflected light of wavelengths λ1 and λ2 may instead be separated by a spectroscopic element other than a dichroic mirror, with the reflected light of each wavelength received by its corresponding image sensor.
 In Embodiment 3, a plurality of light sources 11 emitting two different wavelengths are arranged; however, the configuration is not limited to this, and a plurality of light sources 11 emitting three or more different wavelengths may be arranged. In this way, object information can be acquired in even more detail.
 <Embodiment 4>
 In Embodiment 1, the two light source units 10 are fixed in the imaging device 1. In Embodiment 4, by contrast, the two light source units 10 are installed in the imaging device 1 so as to be movable in a direction (the Y-axis direction) perpendicular to the optical axis 31 of the condenser lens 30.
 FIG. 8 is a plan view schematically showing the configuration of the imaging device 1 according to Embodiment 4 as viewed in the Z-axis negative direction.
 The imaging device 1 of Embodiment 4 further includes two drive units 50 in addition to the configuration of Embodiment 1. The two drive units 50 have the same configuration. Each drive unit 50 is arranged on the Z-axis negative side of the corresponding light source unit 10.
 The drive unit 50 includes a support plate 51, a guide rail 52, a gear shaft 53, and a motor 54. The support plate 51 is a flat plate member, and the light source 11 and the diffuser plate 12 are installed in a frame portion formed on its Z-axis positive surface. The support plate 51 is supported by the guide rail 52 and is slidable along the guide rail 52. The guide rail 52 is fixed in the imaging device 1. The gear shaft 53 is a shaft-shaped member in which a thread groove is formed, and passes through a screw hole (not shown) provided in the support plate 51. The motor 54 is, for example, a stepping motor. The rotating shaft 54a of the motor 54 is connected to the gear shaft 53. When the motor 54 is driven, the gear shaft 53 rotates according to the rotation of the rotating shaft 54a. As a result, the support plate 51 moves in the Y-axis direction along the guide rail 52, and the light source 11 and the diffuser plate 12 move in the Y-axis direction.
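 The threaded gear shaft 53 acts as a lead screw, converting rotation of the motor 54 into linear Y-axis travel of the support plate 51. The travel per step command can be sketched as below; the steps-per-revolution and thread-lead values are illustrative assumptions, not parameters of the embodiment.

```python
STEPS_PER_REV = 200  # full steps per motor 54 revolution (illustrative)
LEAD_MM = 1.0        # support plate travel per gear shaft revolution (illustrative)

def y_travel_mm(steps):
    """Linear Y-axis travel of the support plate 51 for a given number of
    stepping-motor steps, via the threaded gear shaft 53."""
    return steps / STEPS_PER_REV * LEAD_MM

print(y_travel_mm(50))  # a quarter revolution -> 0.25 (mm)
```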
 FIG. 9 is a block diagram showing the configuration of the imaging device 1 according to Embodiment 4.
 The imaging device 1 of Embodiment 4 further includes the motors 54 and motor drive circuits 104 in addition to the configuration of Embodiment 1 shown in FIG. 2. For convenience, FIG. 9 shows only the motor 54 and the motor drive circuit 104 provided for one of the light source units 10, but a motor 54 and a motor drive circuit 104 are likewise provided for the other light source unit 10.
 FIG. 10 is a flowchart showing the captured-image generation process performed by the signal processing unit 101 according to Embodiment 4. In the following generation process, the signal processing unit 101 moves the light source units 10 in the Y-axis direction by driving the drive units 50, that is, by driving the motors 54 via the motor drive circuits 104.
 The signal processing unit 101 drives the light sources 11 to emit light (S11). Subsequently, the signal processing unit 101 determines whether the projection region on the light receiving surface 41 of the image sensor 40 is equal to or larger than a predetermined size (S12). Specifically, based on the image pickup signal output from the image sensor 40, the signal processing unit 101 acquires the number of pixels 41a whose signal level exceeds a threshold for removing noise, in other words, the number of pixels 41a included in a range classified as belonging to the same substance.
 FIGS. 11A and 11B schematically show projection regions on the light receiving surface 41 of the image sensor 40. In FIG. 11A, the number of pixels 41a whose signal level exceeds the threshold is 13, and the projection region is the range (size) corresponding to those 13 pixels 41a. In FIG. 11B, the number of pixels 41a whose signal level exceeds the threshold is 28, and the projection region is the range (size) corresponding to those 28 pixels 41a.
 Here, for example, if the threshold for the size of the projection region used in the determination in step S12 is 25 pixels, the determination result in step S12 is NO in the case of FIG. 11A and YES in the case of FIG. 11B. The threshold for the size of the projection region used in the determination in step S12 is determined according to the number of pixels 41a on the light receiving surface 41, and is stored in advance in the storage unit of the signal processing unit 101.
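 The determination of step S12 thus reduces to counting the pixels 41a whose signal level exceeds the noise threshold and comparing the count with the stored projection-region threshold. A minimal sketch, with illustrative signal values (the noise threshold of 10 is an assumption; the region threshold of 25 pixels follows the example above):

```python
NOISE_THRESHOLD = 10   # signal level below which a pixel is treated as noise (illustrative)
REGION_THRESHOLD = 25  # minimum projection-region size in pixels (example value above)

def projection_region_size(signal_levels):
    """Count the pixels 41a whose signal level exceeds the noise threshold."""
    return sum(1 for level in signal_levels if level > NOISE_THRESHOLD)

def region_large_enough(signal_levels):
    """Determination of steps S12/S17: is the projection region >= the threshold?"""
    return projection_region_size(signal_levels) >= REGION_THRESHOLD

# 13 bright pixels (FIG. 11A) -> NO; 28 bright pixels (FIG. 11B) -> YES
fig_11a = [50] * 13 + [0] * 87
fig_11b = [50] * 28 + [0] * 72
print(region_large_enough(fig_11a), region_large_enough(fig_11b))  # False True
```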
 Returning to FIG. 10, when the projection region is smaller than the predetermined size (S12: NO), the signal processing unit 101 moves the two light source units 10 inward (S13). That is, the signal processing unit 101 moves the light source unit 10 on the Y-axis positive side a predetermined distance in the Y-axis negative direction, and moves the light source unit 10 on the Y-axis negative side a predetermined distance in the Y-axis positive direction.
 Subsequently, based on the image pickup signal from the image sensor 40, the signal processing unit 101 determines whether the projection region on the light receiving surface 41 has become larger as a result of the processing in step S13 (S14). When the projection region has become larger (S14: YES), the signal processing unit 101 sets the movement direction of the two light source units 10 to the inward direction (S15). On the other hand, when the projection region has become smaller or its size has not changed (S14: NO), the signal processing unit 101 sets the movement direction of the two light source units 10 to the outward direction (S16).
 Subsequently, as in step S12, the signal processing unit 101 determines whether the projection region on the light receiving surface 41 of the image sensor 40 is equal to or larger than the predetermined size (S17). When the projection region is smaller than the predetermined size (S17: NO), the signal processing unit 101 moves the two light source units 10 a predetermined distance in the direction determined in step S15 or S16 (S18). The process then returns to step S17, and the determination in step S17 is performed again.
 When it is determined in step S12 or S17 that the projection region is equal to or larger than the predetermined size (S12: YES or S17: YES), the signal processing unit 101 generates a captured image based on the image pickup signal from the image sensor 40 (S19). The captured-image generation process is thus completed.
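 The loop of steps S12 to S18 can be sketched as the following feedback procedure. `measure_region` and `move_sources` are hypothetical stand-ins for the image sensor readout and the motor drive circuits 104, and the threshold, step granularity, and `max_steps` guard are illustrative assumptions rather than part of the embodiment.

```python
def adjust_light_sources(measure_region, move_sources,
                         region_threshold=25, max_steps=100):
    """Sketch of the FIG. 10 flow: move the two light source units until
    the projection region on the light receiving surface 41 reaches the
    threshold size.

    measure_region() returns the current projection-region size in pixels;
    move_sources(direction) moves both units one predetermined step,
    with -1 = inward (toward the optical axis) and +1 = outward.
    """
    if measure_region() >= region_threshold:              # S12
        return True
    before = measure_region()
    move_sources(-1)                                      # S13: probe inward
    direction = -1 if measure_region() > before else +1   # S14 -> S15 / S16
    for _ in range(max_steps):
        if measure_region() >= region_threshold:          # S17
            return True                                   # -> S19: generate image
        move_sources(direction)                           # S18
    return False                                          # safety guard (not in FIG. 10)

# Toy stand-in for the sensor/motor hardware: the region grows as the
# sources approach the optical axis (purely illustrative behavior).
class Sim:
    def __init__(self):
        self.pos = 10                        # arbitrary units from the axis
    def measure(self):
        return max(0, 40 - 2 * self.pos)     # pixels above the noise threshold
    def move(self, direction):
        self.pos += direction

sim = Sim()
print(adjust_light_sources(sim.measure, sim.move))  # True
```

 The single probe step (S13/S14) fixes the movement direction once; thereafter the loop only moves and re-checks, mirroring the S17/S18 cycle of the flowchart.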
 According to Embodiment 4, the drive units 50 move the light source units 10 in the direction (the Y-axis direction) perpendicular to the optical axis 31 of the condenser lens 30. By moving the light source units 10 in this way, the direction of the light applied to the subject area A10, either directly from the light source units 10 or via the reflecting surfaces 21, can be changed, and the state of light irradiation of the object can thereby be changed. Therefore, by moving the light source units 10, the size of the projection region of the object on the light receiving surface 41 of the image sensor 40 can be expanded to an appropriate size, for example.
 As shown in FIG. 10, the signal processing unit 101 identifies the projection region of the object on the light receiving surface 41 of the image sensor 40 and controls the positions of the light source units 10 based on the identified projection region. By controlling the positions of the light source units 10, the size of the projection region of the object on the light receiving surface 41 of the image sensor 40 can be expanded to an appropriate size, for example.
 Note that in Embodiment 4, the drive units 50 need only move the light source units 10 at least in the direction (the Y-axis direction) perpendicular to the optical axis 31 of the condenser lens 30; the movement direction of the light source units 10 may additionally include X-axis or Z-axis components. The number of light source units 10 is also not limited to that shown in FIG. 8; for example, as shown in FIG. 3, four light source units 10 may be arranged, with each light source unit 10 driven toward and away from the optical axis 31. In this case, for example, the control of FIG. 10 may be performed on the four light source units 10, or the driving of each light source unit 10 may be controlled individually so that the projection region reaches an appropriate size.
 <Embodiment 5>
 In Embodiment 4, the positions of the light source units 10 are controlled based on the projection region of the object on the light receiving surface 41 of the image sensor 40. In Embodiment 5, by contrast, the positions of the light source units 10 are controlled based on the distance to the object.
 FIG. 12A is a flowchart showing the captured-image generation process performed by the signal processing unit 101 according to Embodiment 5.
 The signal processing unit 101 drives the light sources 11 to emit light (S21). Subsequently, the signal processing unit 101 calculates the distance to the object based on the image pickup signal output from the image sensor 40 (S22). Specifically, the signal processing unit 101 calculates the distance to the object for each pixel 41a based on the time difference between the emission timing of the light source 11 and the reception timing of the reflected light at the pixel 41a (see FIGS. 11A and 11B). The signal processing unit 101 then identifies the projection region of the object on the light receiving surface 41 and calculates the distance to the object based on the per-pixel distances and the projection region. For example, the average of the distances obtained for the pixels 41a in the projection region is calculated as the distance to the object.
 Subsequently, based on the distance to the object calculated in step S22, the signal processing unit 101 acquires the optimum position of the light source units 10 using the table shown in FIG. 12B (S23). The signal processing unit 101 stores a table such as that shown in FIG. 12B in its internal storage unit. In the table of FIG. 12B, the optimum position Pn of the light source units 10, that is, the position at which the projection region on the light receiving surface 41 of the image sensor 40 is maximized, is stored for each distance Dn to the object. When the distance to the object is short, the optimum position is close to the condenser lens 30; when the distance to the object is long, the optimum position is far from the condenser lens 30.
 Subsequently, the signal processing unit 101 drives the drive units 50 to move the light source units 10 to the optimum position acquired in step S23 (S24). The signal processing unit 101 then generates a captured image based on the image pickup signal from the image sensor 40 (S25). The captured-image generation process is thus completed.
 According to Embodiment 5, as shown in FIG. 12A, the signal processing unit 101 measures the distance to the object in the captured image based on the image pickup signal from the image sensor 40 and controls the positions of the light source units 10 based on the measurement result. By setting the light source units 10 to positions suited to the distance, the size of the projection region of the object in the captured image can be expanded to an appropriate size, for example.
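 Reading step S22 as a time-of-flight measurement, the per-pixel distance follows from d = c·Δt/2 (the factor 1/2 accounting for the round trip), and step S23 reduces to a nearest-entry lookup in a table of the form of FIG. 12B. The table values below are illustrative assumptions, not values from the embodiment.

```python
C = 299_792_458.0  # speed of light, m/s

def distance_from_delay(delta_t_s):
    """Step S22: distance from the time difference between the emission
    timing and the reception timing of the reflected light."""
    return C * delta_t_s / 2.0

# Illustrative FIG. 12B table: distance Dn (m) -> optimum source position Pn (mm).
# Shorter distances map to positions closer to the condenser lens 30.
OPTIMUM_POSITION = [(0.5, 5.0), (1.0, 8.0), (2.0, 12.0), (4.0, 18.0)]

def optimum_position(distance_m):
    """Step S23: pick the position Pn of the table entry whose Dn is nearest
    to the measured distance."""
    return min(OPTIMUM_POSITION, key=lambda dp: abs(dp[0] - distance_m))[1]

# A 6.67 ns round-trip delay corresponds to roughly 1 m:
d = distance_from_delay(6.67e-9)
print(round(d, 2), optimum_position(d))  # 1.0 8.0
```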
 <Embodiment 6>
 In Embodiment 4, the two light source units 10 are arranged so as to be movable in the direction perpendicular to the optical axis 31 of the condenser lens 30. In Embodiment 6, by contrast, the two light source units 10 are arranged so as to be rotatable about rotation axes extending in the Z-axis direction.
 FIG. 13A is a plan view showing the configuration near the light source unit 10 according to Embodiment 6 as viewed in the Z-axis negative direction. FIG. 13B is a side view showing the configuration near the light source unit 10 according to Embodiment 6 as viewed in the Y-axis positive direction.
 The imaging device 1 of Embodiment 6 includes two drive units 60 in place of the two drive units 50 of Embodiment 4. The two drive units 60 have the same configuration and are arranged on the Z-axis negative sides of the two light source units 10, respectively.
 The drive unit 60 includes a support plate 61, a shaft member 62, and a motor 63. The support plate 61 is a flat plate member, and the light source unit 10 (the light source 11 and the diffuser plate 12) is installed on its Z-axis positive surface. The shaft member 62 is a shaft-shaped member extending in the Z-axis direction and is attached to the lower surface (the Z-axis negative surface) of the support plate 61. The motor 63 is, for example, a stepping motor. The main body of the motor 63 is fixed in the imaging device 1, and the rotating shaft 63a of the motor 63 is connected to the shaft member 62. When the motor 63 is driven, the shaft member 62 rotates according to the rotation of the rotating shaft 63a. As a result, the support plate 61 rotates about an axis extending in the Z-axis direction, and the light source unit 10 (the light source 11 and the diffuser plate 12) rotates.
 Also in Embodiment 6, as in Embodiment 4 shown in FIG. 9, motor drive circuits (not shown) are provided to rotate the two motors 63.
 FIG. 14 is a flowchart showing the captured-image generation process performed by the signal processing unit 101 according to Embodiment 6. In the following generation process, the signal processing unit 101 rotates the light source units 10 about axes extending in the Z-axis direction by driving the drive units 60, that is, by driving the two motors 63 via the motor drive circuits (not shown).
 In the process of FIG. 14, compared with the process of FIG. 10, steps S31 to S34 are performed in place of steps S13, S15, S16, and S18, respectively. The processing of steps S31 to S34 is described below.
 When the signal processing unit 101 determines in step S12 that the projection region is smaller than the predetermined size (S12: NO), it rotates the two light source units 10 inward (S31). That is, as viewed in the Z-axis negative direction, the signal processing unit 101 rotates the light source unit 10 on the Y-axis positive side clockwise by a predetermined angle and rotates the light source unit 10 on the Y-axis negative side counterclockwise by a predetermined angle.
 When the signal processing unit 101 determines in step S14 that the projection region has become larger (S14: YES), it sets the rotation direction of the light source units 10 to the inward direction (S32). On the other hand, when it determines in step S14 that the projection region has become smaller or its size has not changed (S14: NO), it sets the rotation direction of the light source units 10 to the outward direction (S33).
 When the signal processing unit 101 determines in step S17 that the projection region is smaller than the predetermined size (S17: NO), it rotates the two light source units 10 by a predetermined angle in the rotation direction determined in step S32 or S33 (S34).
 According to Embodiment 6, the drive units 60 rotate the light source units 10 so that the direction of the light emitted from the light source units 10 changes. By rotating the light source units 10 in this way, the direction of the light applied to the subject area A10, either directly from the light source units 10 or via the reflecting surfaces 21, can be changed, and the state of light irradiation of the object can thereby be changed. Therefore, by rotating the light source units 10, the size of the projection region of the object on the light receiving surface 41 of the image sensor 40 can be expanded to an appropriate size, for example.
 As shown in FIG. 14, the signal processing unit 101 identifies the projection region of the object on the light receiving surface 41 of the image sensor 40 and controls the rotational positions of the light source units 10 based on the identified projection region. By controlling the rotational positions of the light source units 10, the size of the projection region of the object on the light receiving surface 41 of the image sensor 40 can be expanded to an appropriate size, for example.
 Note that in the configurations of FIGS. 13A and 13B, the direction of the light emitted from the light source unit 10 is changed by rotating the light source unit 10, but the direction of the light may be changed by another method. For example, the direction of the light emitted from the light source unit 10 may be changed by swinging the light source unit 10.
 Also in Embodiment 6, as in Embodiment 5, the signal processing unit 101 may measure the distance to the object in the captured image based on the signal from the image sensor 40 and control the rotational positions of the light source units 10 based on the measurement result. In this case, the optimum rotational position for each distance is stored in a table such as that shown in FIG. 12B, and the rotational positions of the light source units 10 are controlled using this table. When the distance to the object is short, the optimum rotational position is one in which the frontal direction of the light source 11 is turned toward the condenser lens 30; when the distance to the object is long, the optimum rotational position is one in which the frontal direction of the light source 11 is turned away from the condenser lens 30. Even when the rotational positions are controlled in this way, the size of the projection region of the object in the captured image can be expanded to an appropriate size.
 Further, the configuration of the fourth embodiment in which the light source unit 10 is moved linearly may be combined with the configuration of the sixth embodiment in which the light source unit 10 is rotated.
 <Embodiment 7>
 In the first embodiment, the light source 11 is an LED. In the seventh embodiment, by contrast, a light source 13 composed of a semiconductor laser element is arranged in place of the light source 11, and the object in the subject area A10 is irradiated with laser light of a predetermined polarization direction.
 FIG. 15 is a plan view schematically showing the configuration of the imaging device 1 according to the seventh embodiment as viewed in the Z-axis direction.
 Compared with the first embodiment, the light source unit 10 of the seventh embodiment includes a light source 13 in place of the light source 11, and a collimator lens 14 and a concave lens 15 in place of the diffuser plate 12. The light source 13 is composed of a semiconductor laser element and emits linearly polarized laser light. The two light sources 13 are arranged so as to irradiate the subject area A10 with light whose polarization directions are aligned in one direction. The light emitted from the light source 13 is converted into parallel light by the collimator lens 14 and then into diffused light by the concave lens 15.
 Compared with the first embodiment, the imaging device 1 of the seventh embodiment further includes a polarizing filter 70 between the condenser lens 30 and the subject area A10. The polarizing filter 70 is arranged so that its transmission polarization direction coincides with the polarization direction of the light projected onto the subject area A10 by the light sources 13. The polarizing filter 70 is fixed to the housing of the imaging device 1 via a support portion (not shown).
 According to the seventh embodiment, the light source unit 10 irradiates the subject area A10 with light whose polarization direction is aligned in one direction. The polarization state of the light reflected at the object surface can therefore change depending on the state of that surface. With the above configuration, the amount of light transmitted through the polarizing filter 70 changes according to the polarization state of the light reflected by the object. The state of the object surface can thus be detected by referring to the amount of light received on the light receiving surface 41 of the image sensor 40.
 For example, when the surface of the object is a metal surface or the like with few irregularities, the polarization direction of the light reflected from the object surface is substantially the same as that of the light emitted from the light source 13. In this case, the amount of reflected light transmitted through the polarizing filter 70 is large, and thus the amount of light received on the light receiving surface 41 of the image sensor 40 is large. On the other hand, when minute irregularities or the like are formed on the object surface, the polarization state of the light changes upon irradiation, and the reflected light contains a mixture of various polarization states. In this case, the amount of reflected light transmitted through the polarizing filter 70 decreases, and thus the amount of light received on the light receiving surface 41 of the image sensor 40 is small. The signal processing unit 101 can therefore judge the state of the object surface by comparing the received light amount, obtained from the imaging signal output from the image sensor 40, with a predetermined threshold value.
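The threshold comparison performed by the signal processing unit 101 can be sketched minimally as follows; the function names, pixel values, and threshold are hypothetical and serve only to illustrate the judgment described above:

```python
def mean_received_light(pixels) -> float:
    """Mean signal level over the pixels of the object's projection region."""
    return sum(pixels) / len(pixels)

def judge_surface(received_light: float, threshold: float) -> str:
    """Classify the object surface from the received light amount.

    A polarization-preserving (smooth, e.g. metal) surface passes more light
    through the polarizing filter 70 than a depolarizing (rough) surface,
    so a received amount at or above the threshold indicates a smooth surface.
    """
    return "smooth" if received_light >= threshold else "rough"
```

For example, `judge_surface(mean_received_light([200, 210, 190]), threshold=128)` would classify the surface as smooth, while low pixel values would classify it as rough.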
 The polarizing filter 70 need not be arranged between the subject area A10 and the condenser lens 30; it suffices that it is arranged between the subject area A10 and the image sensor 40. For example, the polarizing filter 70 may be arranged between the condenser lens 30 and the image sensor 40.
 <Embodiment 8>
 In the first embodiment, the image sensor 40 is fixed in the imaging device 1. In the eighth embodiment, by contrast, the image sensor 40 is arranged so as to be movable in the direction parallel to the optical axis 31 of the condenser lens 30 (the X-axis direction).
 FIG. 16 is a plan view schematically showing the configuration of the imaging device 1 according to the eighth embodiment as viewed in the Z-axis direction.
 Compared with the first embodiment, the imaging device 1 of the eighth embodiment further includes a distance changing unit 80. The distance changing unit 80 is arranged on the Z-axis negative side of the image sensor 40.
 The distance changing unit 80 has a configuration similar to that of the drive unit 50 of the fourth embodiment shown in FIG. 8. The distance changing unit 80 includes a support plate 81, a guide rail 82, a gear shaft 83, and a motor 84. The support plate 81 is a flat plate member, and the image sensor 40 is installed in a frame portion formed on its Z-axis positive surface. The support plate 81 is supported by the guide rail 82 and is slidable along the guide rail 82. The guide rail 82 is fixed in the imaging device 1. The gear shaft 83 is a shaft-shaped member on which a thread groove is formed, and is passed through a screw hole (not shown) provided in the support plate 81. The motor 84 is, for example, a stepping motor. The rotation shaft 84a of the motor 84 is connected to the gear shaft 83. When the motor 84 is driven, the gear shaft 83 rotates according to the rotation of the rotation shaft 84a. As a result, the support plate 81 moves along the guide rail 82 in the X-axis direction, and the image sensor 40 moves in the X-axis direction.
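Because the motor 84 may be a stepping motor driving a lead screw, the commanded step count maps directly to the linear travel of the support plate 81. The patent does not specify a step angle or thread lead, so the parameters below are purely illustrative:

```python
def sensor_travel_mm(steps: int, step_angle_deg: float = 1.8,
                     lead_mm: float = 0.5) -> float:
    """Linear travel of the support plate 81 for a given number of motor steps.

    One full revolution of the gear shaft 83 (360 / step_angle_deg steps)
    advances the plate along the X axis by one thread lead.
    Both step_angle_deg and lead_mm are hypothetical example values.
    """
    steps_per_rev = 360.0 / step_angle_deg
    return steps / steps_per_rev * lead_mm
```

With a typical 1.8-degree step angle, 200 steps correspond to one revolution, i.e. one lead of travel.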
 FIG. 17 is a block diagram showing the configuration of the imaging device 1 according to the eighth embodiment.
 Compared with the first embodiment shown in FIG. 2, the imaging device 1 of the eighth embodiment further includes the motor 84 and a motor drive circuit 105.
 FIG. 18 is a flowchart showing the captured-image generation processing by the signal processing unit 101 according to the eighth embodiment. In the following generation processing, the signal processing unit 101 moves the image sensor 40 in the X-axis direction by driving the distance changing unit 80, that is, by driving the motor 84 via the motor drive circuit 105.
 Compared with the processing of FIG. 10, in the processing of FIG. 18, steps S41 to S44 are provided in place of steps S13, S15, S16, and S18, respectively. The processing of steps S41 to S44 is described below.
 When the signal processing unit 101 determines in step S12 that the projection region is smaller than the predetermined size (S12: NO), it moves the image sensor 40 by a predetermined distance in the X-axis positive direction (S41). When the signal processing unit 101 determines in step S14 that the projection region has become larger (S14: YES), it sets the moving direction of the image sensor 40 to the X-axis positive direction (S42). On the other hand, when the signal processing unit 101 determines in step S14 that the projection region has become smaller or has not changed in size (S14: NO), it sets the moving direction of the image sensor 40 to the X-axis negative direction (S43).
 When the signal processing unit 101 determines in step S17 that the projection region is smaller than the predetermined size (S17: NO), it moves the image sensor 40 by the predetermined distance in the moving direction determined in step S42 or S43 (S44).
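The search of steps S41 to S44 in FIG. 18 amounts to a simple hill climb on the projection-region size: one trial move fixes the direction, and further moves continue until the region is large enough. A schematic sketch follows; the measurement and motion callbacks are hypothetical stand-ins for the image sensor readout and the distance changing unit 80:

```python
def enlarge_projection(measure_area, move_sensor, target_area: float,
                       step: float, max_iter: int = 100) -> float:
    """Move the image sensor along X until the projection region reaches
    the target size (steps S41-S44 of FIG. 18, schematically).

    measure_area(): current projection-region size on the light receiving surface
    move_sensor(dx): shift the sensor by dx along the X axis
    """
    area = measure_area()
    if area >= target_area:               # S12: already large enough
        return area
    move_sensor(+step)                    # S41: trial move, X-axis positive
    new_area = measure_area()
    direction = +1 if new_area > area else -1   # S42 / S43: fix direction
    area = new_area
    for _ in range(max_iter):
        if area >= target_area:           # S17: large enough, stop
            break
        move_sensor(direction * step)     # S44: keep moving the same way
        area = measure_area()
    return area
```

A usage sketch with a toy optics model, where defocus grows with distance from the in-focus position: starting at the focus, the loop walks the sensor outward until the (blurred) projection region reaches the target size.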
 According to the eighth embodiment, the distance changing unit 80 changes the distance between the condenser lens 30 and the image sensor 40. By changing this distance, the projection region of the object on the light receiving surface 41 of the image sensor 40 can be defocused and its size can be enlarged. As a result, the object can be detected more appropriately from the captured image.
 As shown in FIG. 18, the signal processing unit 101 identifies the projection region of the object on the light receiving surface 41 of the image sensor 40, and controls the distance between the condenser lens 30 and the image sensor 40 based on the identified projection region. By feeding the size of the projection region of the object back into the distance adjustment in this way, the defocus amount can be adjusted appropriately, for example so that the projection region of the object is maximized.
 When the distance between the condenser lens 30 and the image sensor 40 is changed and the defocus amount is adjusted so as to maximize the projection region, the amount of light received by one pixel 41a (see FIGS. 11(a) and 11(b)) decreases. When this decrease in the received light amount of one pixel 41a poses a problem, the output of the light source 11 may be increased. In this case, if the emission wavelength of the light source 11 is in the infrared wavelength band, safety can be ensured even when the output of the light source 11 is increased.
 Further, the distance between the condenser lens 30 and the image sensor 40 need not be changed by moving the image sensor 40 in the X-axis direction; it may instead be changed by moving the condenser lens 30 in the X-axis direction.
 <Other Modifications>
 The configuration of the imaging device 1 can be modified in various ways other than the configurations shown in the above embodiments and modifications.
 For example, in the fourth embodiment, the light irradiation state of the object is changed by moving the light source unit 10 in the Y-axis direction; however, the configuration is not limited to this. As a configuration for changing the light irradiation state of the object, the two reflecting members 20 may each be moved in the Y-axis direction.
 In the first and fourth to eighth embodiments, two light source units 10 are arranged around the condenser lens 30; however, the number of light source units 10 is not limited to this, and three or more light source units 10 may be arranged. In the fourth to sixth embodiments, each of the three or more light source units 10 may be provided with its own drive unit for driving that light source unit 10.
 In the first and fourth to eighth embodiments, two reflecting members 20 (reflecting surfaces 21) are arranged around the condenser lens 30; however, the number of reflecting members 20 (reflecting surfaces 21) is not limited to this, and three or more reflecting members 20 may be arranged, with three or more reflecting surfaces 21 surrounding the optical axis 31 of the condenser lens 30. Further, in the first and third to eighth embodiments as well, a cylindrical reflecting member 20 (reflecting surface 21) may be arranged, as in the modification of the second embodiment.
 In addition, various modifications can be made to the embodiments of the present invention as appropriate within the scope of the technical idea set forth in the claims.
 1 Imaging device
 10 Light source unit
 21 Reflecting surface
 30 Condenser lens
 31 Optical axis
 40 Image sensor
 41 Light receiving surface
 50 Drive unit
 60 Drive unit
 70 Polarizing filter
 80 Distance changing unit
 101 Signal processing unit
 A10 Subject area

Claims (12)

  1.  An imaging device comprising:
     a light source unit that irradiates a subject area with light;
     an image sensor that images the subject area;
     a condenser lens that condenses light from the subject area onto the image sensor; and
     a reflecting surface that reflects a part of the light emitted from the light source unit toward the subject area.
     
  2.  The imaging device according to claim 1, wherein
     a plurality of the light source units are arranged around an optical axis of the condenser lens.
     
  3.  The imaging device according to claim 2, wherein
     the plurality of light source units comprise a plurality of types of light source units having different emission wavelengths.
     
  4.  The imaging device according to any one of claims 1 to 3, further comprising
     a drive unit that moves the light source unit at least in a direction perpendicular to the optical axis of the condenser lens.
     
  5.  The imaging device according to any one of claims 1 to 4, further comprising
     a drive unit that drives the light source unit so that the direction of the light emitted from the light source unit changes.
     
  6.  The imaging device according to claim 4 or 5, further comprising
     a signal processing unit that processes a signal from the image sensor, wherein
     the signal processing unit identifies a projection region of an object on a light receiving surface of the image sensor and controls the position of the light source unit based on the identified projection region.
     
  7.  The imaging device according to any one of claims 4 to 6, further comprising
     a signal processing unit that processes a signal from the image sensor, wherein
     the signal processing unit measures a distance to an object in a captured image based on the signal from the image sensor and controls the position of the light source unit based on the measurement result of the distance.
     
  8.  The imaging device according to any one of claims 1 to 7, further comprising
     a polarizing filter between the subject area and the image sensor, wherein
     the light source unit irradiates the subject area with the light whose polarization direction is aligned in one direction.
     
  9.  The imaging device according to any one of claims 1 to 8, further comprising
     a distance changing unit that changes a distance between the condenser lens and the image sensor.
     
  10.  The imaging device according to claim 9, further comprising
     a signal processing unit that processes a signal from the image sensor, wherein
     the signal processing unit identifies a projection region of an object on a light receiving surface of the image sensor and controls the distance between the condenser lens and the image sensor based on the identified projection region.
     
  11.  The imaging device according to any one of claims 1 to 10, wherein
     the reflecting surface surrounds the optical axis of the condenser lens.
     
  12.  The imaging device according to claim 11, wherein
     the reflecting surface is formed on an inner side surface of a cylinder.
PCT/JP2021/003108 2020-04-24 2021-01-28 Imaging device WO2021215064A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020077638 2020-04-24
JP2020-077638 2020-04-24

Publications (1)

Publication Number Publication Date
WO2021215064A1 true WO2021215064A1 (en) 2021-10-28

Family

ID=78270437

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/003108 WO2021215064A1 (en) 2020-04-24 2021-01-28 Imaging device

Country Status (1)

Country Link
WO (1) WO2021215064A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001311986A (en) * 2000-04-27 2001-11-09 Canon Inc Camera, flashing device and camera with built-in flashing device
JP2005158490A (en) * 2003-11-26 2005-06-16 Kyoto Denkiki Kk Ring-shape lighting device for image processing inspection
JP2005326451A (en) * 2004-05-12 2005-11-24 Konica Minolta Photo Imaging Inc Imaging apparatus
JP2008003626A (en) * 2000-07-25 2008-01-10 Fujifilm Corp Light source and photographing apparatus
WO2009157129A1 (en) * 2008-06-26 2009-12-30 パナソニック株式会社 Image processing apparatus, image division program and image synthesising method
JP2011123019A (en) * 2009-12-14 2011-06-23 Olympus Corp Image inspection apparatus
JP2016015017A (en) * 2014-07-02 2016-01-28 ソニー株式会社 Imaging system, light projector and image processing method, beam light control method and program


Similar Documents

Publication Publication Date Title
JP7105639B2 (en) Laser processing equipment
TWI464362B (en) Apparatus for measuring a height and obtaining a focused image of and object and method thereof
US9606069B2 (en) Method, apparatus and system for generating multiple spatially separated inspection regions on a substrate
WO2006038439A1 (en) Observation apparatus with focus position control mechanism
JP4640174B2 (en) Laser dicing equipment
TW201341097A (en) Laser machining apparatus
JP6387381B2 (en) Autofocus system, method and image inspection apparatus
JP4890039B2 (en) Confocal imaging device
TWI771474B (en) Laser beam profiler unit and laser processing device
US20200088649A1 (en) Method of detecting a defect on a substrate, apparatus for performing the same and method of manufacturing semiconductor device using the same
WO2021215064A1 (en) Imaging device
TWI632971B (en) Laser processing apparatus and laser processing method
JP5213761B2 (en) Illumination optical system and image projection apparatus having the same
JP7222906B2 (en) LASER PROCESSING METHOD AND LASER PROCESSING APPARATUS
JP4550488B2 (en) Detection optical device and defect inspection device
KR102107998B1 (en) Automatic Focus Control Module and Surface Defects Detection Device Having The Same
JPH10302587A (en) Optical sensor
JPH0616483B2 (en) Projection optics
JP2021085815A (en) Light irradiation device, inspection system, and light irradiation method
JP2002277729A (en) Device and method for automatic focusing of microscope
TW202327774A (en) Laser processing apparatus
JPH11121351A (en) Method for adjusting beam in focal position detector
WO2020090891A1 (en) Laser machining device
TW202331205A (en) Laser processing apparatus
JP2001133232A (en) Measuring apparatus for inclination of object to be inspected

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21793779

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21793779

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP