WO2021215064A1 - Imaging device - Google Patents

Imaging device

Info

Publication number
WO2021215064A1
Authority
WO
WIPO (PCT)
Prior art keywords
light source
light
imaging device
image sensor
signal processing
Prior art date
Application number
PCT/JP2021/003108
Other languages
English (en)
Japanese (ja)
Inventor
Masayuki Takase (高瀬 雅之)
Nobuyuki Otsuka (大塚 信之)
Original Assignee
Panasonic IP Management Co., Ltd. (パナソニックIPマネジメント株式会社)
Priority date
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Publication of WO2021215064A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 - Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 - Details
    • G01C3/06 - Use of electric means to obtain final indication
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B11/00 - Filters or other obturators specially adapted for photographic purposes
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 - Special procedures for taking photographs; Apparatus therefor
    • G03B15/02 - Illuminating scene
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof

Definitions

  • the present invention relates to an imaging device, and is particularly suitable for imaging a fine object such as dust.
  • Patent Document 1 describes a ranging system that irradiates light from two light sources, captures the reflected light with an imaging unit, and measures the distance to an object. In this distance measuring system, the distance information calculated for each pixel is combined to generate a distance image for one frame.
  • in such an imaging device, the reflected light of the light radiated to the front surface of the object is collected on the light receiving surface of the image sensor.
  • however, when the object serving as the subject is a minute object such as dust, only the reflected light from a minute region on the front surface of the object is collected on the light receiving surface of the image sensor, and the area of the light receiving surface on which the reflected light is incident becomes extremely narrow. Therefore, it becomes difficult to properly detect the object from the captured image.
  • in view of this problem, an object of the present invention is to provide an imaging device capable of appropriately detecting an object from a captured image even when the object is a minute object such as dust.
  • the image pickup apparatus according to one aspect includes a light source unit that irradiates a subject area with light, an image pickup element that images the subject area, a condenser lens that collects light from the subject area onto the image pickup element, and a reflecting surface that reflects a part of the light emitted from the light source unit toward the subject area.
  • if the reflecting surface is not provided, the light collected on the image pickup element is only the reflected light that is emitted directly from the light source unit to the object and reflected on the surface of the object.
  • in this case, when the object is minute, the reflected light captured by the image pickup element is only the reflected light reflected in a minute area on the surface of the object, so that the region where the reflected light is collected on the light receiving surface of the image sensor becomes extremely small.
  • in contrast, when the reflecting surface is provided as described above, the direction of the light radiated directly to the object from the light source unit differs from the direction of the light radiated to the object via the reflecting surface.
  • as a result, in addition to the reflected light from the region on the object surface that reaches the image pickup element when the object is irradiated directly from the light source unit, the reflected light from regions outside this region is also focused on the image pickup element. Therefore, on the light receiving surface of the image sensor, the region where the reflected light from the object is collected can be widened. Accordingly, with the image pickup apparatus according to this aspect, even a minute object such as dust can be detected more appropriately from the captured image.
  • as described above, the present invention can provide an imaging device capable of appropriately detecting an object from a captured image even when the object is a minute object such as dust.
  • FIG. 1 is a plan view schematically showing a configuration of an image pickup apparatus when viewed in the negative direction of the Z axis according to the first embodiment.
  • FIG. 2 is a block diagram showing a configuration of an imaging device according to the first embodiment.
  • FIG. 3 is a side view schematically showing the configuration of the image pickup apparatus when viewed in the positive direction of the X-axis according to the second embodiment.
  • FIGS. 4(a) and 4(b) are diagrams showing the sizes of the parts of the configuration of the embodiment set in the verification experiment of object detection.
  • FIG. 5A is a diagram showing a captured image captured using the configuration of the comparative example.
  • FIG. 5B is a diagram showing a captured image captured using the configuration of the embodiment.
  • FIG. 6 is a side view schematically showing the configuration of the image pickup apparatus when viewed in the positive direction of the X-axis according to the modified example of the second embodiment.
  • FIG. 7 is a side view schematically showing the configuration of the image pickup apparatus when viewed in the positive direction of the X-axis according to the third embodiment.
  • FIG. 8 is a plan view schematically showing the configuration of the image pickup apparatus when viewed in the negative direction of the Z axis according to the fourth embodiment.
  • FIG. 9 is a block diagram showing the configuration of the image pickup apparatus according to the fourth embodiment.
  • FIG. 10 is a flowchart showing a captured image generation process by the signal processing unit according to the fourth embodiment.
  • FIGS. 11(a) and 11(b) are diagrams schematically showing a projection region on the light receiving surface of the image sensor according to the fourth embodiment.
  • FIG. 12A is a flowchart showing a captured image generation process by the signal processing unit according to the fifth embodiment.
  • FIG. 12B is a diagram schematically showing the configuration of a table showing the optimum position of the light source unit corresponding to the distance to the object according to the fifth embodiment.
  • FIG. 13A is a plan view showing a configuration in the vicinity of the light source portion when viewed in the negative direction of the Z axis according to the sixth embodiment.
  • FIG. 13B is a side view showing a configuration in the vicinity of the light source portion when viewed in the positive direction of the Y-axis according to the sixth embodiment.
  • FIG. 14 is a flowchart showing a captured image generation process by the signal processing unit according to the sixth embodiment.
  • FIG. 15 is a plan view schematically showing the configuration of the image pickup apparatus when viewed in the Z-axis direction according to the seventh embodiment.
  • FIG. 16 is a plan view schematically showing the configuration of the image pickup apparatus when viewed in the Z-axis direction according to the eighth embodiment.
  • FIG. 17 is a block diagram showing a configuration of an image pickup apparatus according to the eighth embodiment.
  • FIG. 18 is a flowchart showing a captured image generation process by the signal processing unit according to the eighth embodiment.
  • the XY plane is a horizontal plane
  • the Z-axis direction is a vertical direction.
  • FIG. 1 is a plan view schematically showing the configuration of the image pickup apparatus 1 when viewed in the negative direction of the Z axis.
  • the image pickup device 1 includes two light source units 10, two reflection members 20, a condenser lens 30, and an image pickup element 40.
  • the light source unit 10 and the condenser lens 30 are fixed to the housing of the image pickup apparatus 1 via a support unit (not shown).
  • the reflection member 20 and the image pickup device 40 are fixed to the housing of the image pickup apparatus 1 directly or via another member.
  • the configuration of the two light source units 10 is the same.
  • the two light source units 10 are arranged on the Y-axis positive side and the Y-axis negative side of the condenser lens 30, respectively.
  • the two light source units 10 are arranged symmetrically in the Y-axis direction with respect to the optical axis 31 of the condenser lens 30.
  • the light source unit 10 includes a light source 11 and a diffuser plate 12.
  • the light source 11 is composed of, for example, an LED.
  • the light source 11 emits light having a predetermined spread angle in the positive direction of the X-axis, and irradiates the subject area A10 with the light.
  • the emission wavelength of the light source 11 is, for example, in the infrared wavelength band.
  • the diffuser plate 12 is composed of, for example, a microlens array, and increases the spread angle of the light emitted from the light source 11. Of the light emitted from the light source 11 and transmitted through the diffuser plate 12, a part is irradiated directly to the subject area A10 without passing through the reflection member 20, and the other part is reflected by the reflection member 20 and then irradiates the subject area A10.
  • the configurations of the two reflective members 20 are the same.
  • the two reflecting members 20 are arranged, respectively, on the Y-axis positive side of the light source unit 10 located on the Y-axis positive side of the condenser lens 30, and on the Y-axis negative side of the light source unit 10 located on the Y-axis negative side of the condenser lens 30.
  • the reflective member 20 has a flat plate shape and includes a reflective surface 21 parallel to the XZ plane.
  • the reflecting surface 21 is composed of, for example, a mirror.
  • the reflecting member 20 is arranged so that the reflecting surface 21 is symmetrical with respect to the optical axis 31 of the condenser lens 30.
  • the reflecting surface 21 reflects a part of the light emitted from the light source 11 toward the subject area A10.
  • the subject area A10 is a spatial region located on the X-axis positive side of the two light source units 10, near the middle between the two reflecting members 20.
  • the light from the subject area A10 is taken into the condenser lens 30 and condensed on the light receiving surface 41 of the image sensor 40. That is, an image of an object existing in the subject area A10 is formed on the light receiving surface 41.
  • the object to be imaged existing in the subject area A10 is imaged.
  • the object to be imaged is, for example, a minute particle, a so-called particle.
  • the light from the light source unit 10 irradiates the object in the subject area A10. The light radiated to the object is reflected on the surface of the object.
  • the condensing lens 30 condenses the light from the subject area A10 on the image sensor 40.
  • the condenser lens 30 is arranged near the middle of the two light source units 10.
  • the optical axis 31 of the condenser lens 30 is parallel to the X-axis direction, and the optical axis 31 passes through the center of the light receiving surface 41 of the image sensor 40.
  • the condenser lens 30 guides the reflected light from the subject area A10 to the light receiving surface 41 of the image sensor 40, and forms, on the light receiving surface 41, an image of the region of the object irradiated with the light.
  • the condenser lens 30 does not have to be one lens, and may be configured by combining a plurality of lenses.
  • the image sensor 40 images the subject area A10.
  • the image sensor 40 is composed of a CMOS image sensor or a CCD image sensor.
  • a plurality of pixels 41a are arranged vertically and horizontally on the light receiving surface 41 of the image sensor 40.
  • the image pickup device 40 is configured to be able to receive light in the same wavelength band as the light emitted from the light source 11.
  • a filter that transmits the wavelength band of light emitted from the light source 11 may be arranged in front of the image sensor 40.
  • the image sensor 40 receives, on the light receiving surface 41, the reflected light from the object, and outputs the image pickup signal to the signal processing circuit 103 (see FIG. 2) in the subsequent stage.
  • the object in the subject area A10 is irradiated with both the light emitted directly from the light source 11 and the light emitted from the light source 11 and then reflected by the reflecting surface 21 of the reflecting member 20. Therefore, on the object, a region where the light is directly irradiated from the light source 11 and a region where the light is irradiated through the reflecting surface 21 are generated.
  • the region where the light is irradiated through the reflecting surface 21 extends to the outside of the region where the light is directly irradiated from the light source 11. Therefore, the reflected light reflected in the region on the object directly irradiated with the light from the light source 11 is irradiated to the projection region A21 on the light receiving surface 41.
  • the reflected light reflected in the region on the object irradiated with the light through the reflecting surface 21 is irradiated to the projection region A22 extending outward from the vicinity of the outer edge of the projection region A21 on the light receiving surface 41. Therefore, when the reflecting surface 21 is arranged, the projection area is the projection area A20 in which the projection areas A21 and A22 are combined.
  • FIG. 2 is a block diagram showing the configuration of the image pickup apparatus 1.
  • the image pickup apparatus 1 includes a signal processing unit 101, a light source drive circuit 102, and a signal processing circuit 103.
  • FIG. 2 shows a light source drive circuit 102 provided for one light source 11 for convenience, but a light source drive circuit 102 is similarly provided for the other light source 11.
  • the signal processing unit 101 is composed of a microprocessor or the like, and includes a control unit and a storage unit.
  • the light source drive circuit 102 drives the light source 11 according to an instruction signal from the signal processing unit 101.
  • the signal processing circuit 103 performs signal processing on the image pickup signal output from the image pickup element 40, and outputs the processed image pickup signal to the signal processing unit 101.
  • the signal processing unit 101 processes the imaging signal from the signal processing circuit 103 to generate a captured image.
  • the generated captured image is output to, for example, a display unit or a control device (not shown).
  • the signal processing unit 101 may process the captured image, detect the object to be imaged, and output the detection result to an external control device.
  • if the reflecting surface 21 is not provided, the light collected by the image pickup element 40 is only the reflected light that is emitted directly from the light source unit 10 to the object and reflected on the surface of the object.
  • in this case, when the object is minute, the reflected light taken in by the image pickup element 40 is only the reflected light reflected in a minute area on the surface of the object. Therefore, on the light receiving surface 41 of the image sensor 40, the region where the reflected light is collected becomes extremely small.
  • in contrast, when the reflecting surface 21 is provided as described above, the direction of the light radiated directly to the object from the light source unit 10 differs from the direction of the light radiated to the object via the reflecting surface 21. As a result, in addition to the reflected light from the region on the surface of the object that is taken into the image pickup element 40 when the light is directly applied to the object from the light source unit 10, the reflected light reflected in regions outside this region is also focused on the image pickup element 40. Therefore, on the light receiving surface 41 of the image sensor 40, the region where the reflected light from the object is collected can be widened, and even a fine object such as dust can be detected more appropriately from the captured image.
  • Two light source units 10 are arranged around the optical axis 31 of the condenser lens 30. According to this configuration, the light from each light source unit 10 can be applied to the object from different directions via the reflecting surface 21. As a result, the area on the surface of the object on which the reflected light is taken into the image sensor 40 can be expanded as compared with the case where the light source unit 10 is one, and the reflected light from the object is collected on the light receiving surface 41 of the image sensor 40. The area to be used can be expanded. Therefore, the object can be detected more appropriately from the captured image.
  • in the first embodiment, the two light source units 10 were arranged on the Y-axis positive side and the Y-axis negative side of the condenser lens 30, respectively, and the two reflecting members 20 parallel to the XZ plane were arranged on the Y-axis positive side and the Y-axis negative side of the condenser lens 30.
  • in the second embodiment, four light source units 10 and four reflecting members 20 are arranged.
  • FIG. 3 is a side view schematically showing the configuration of the image pickup apparatus 1 when viewed in the positive direction of the X-axis according to the second embodiment.
  • the position of the image sensor 40 is indicated by a broken line.
  • in the second embodiment, the four light source units 10 are arranged around the optical axis 31 of the condenser lens 30. That is, compared with the first embodiment, light source units 10 are additionally arranged on the Z-axis positive side and the Z-axis negative side of the condenser lens 30. Further, in the second embodiment, four reflecting members 20 are arranged outside the four light source units 10. Also in the second embodiment, a light source drive circuit 102 (see FIG. 2) is provided for each light source 11.
  • each reflecting member 20 is arranged so that the reflecting surface 21 faces the light source unit 10.
  • as a result, compared with the first embodiment, the range of the region on the surface of the object whose reflected light is taken into the image pickup element 40 can be further expanded, and the region in which the reflected light from the object is collected on the light receiving surface 41 of the image pickup element 40 can be expanded. Therefore, the object can be detected more appropriately from the captured image.
  • the reflecting surface 21 of the reflecting member 20 is arranged so as to surround the optical axis 31 of the condensing lens 30.
  • the object can be irradiated with the reflected light from the reflecting surfaces 21 over the entire circumference of the optical axis 31 of the condenser lens 30, so that the projected region of the object surface on the light receiving surface 41 of the image sensor 40 can be expanded.
  • the projected region of the object surface on the light receiving surface 41 is expanded in the positive and negative directions of the Y axis by the two reflecting surfaces 21 arranged in the Y axis direction.
  • the projected region of the object surface on the light receiving surface 41 is further expanded in the positive and negative directions of the Z-axis.
  • the projected region on the surface of the object is expanded over the entire circumference with respect to the optical axis 31 of the condenser lens 30. Therefore, as compared with the configuration of FIG. 1, the projection region of the object surface on the light receiving surface 41 of the image sensor 40 can be enlarged.
  • the shape of the reflecting members 20 when viewed in the X-axis direction is not limited to the square shown in FIG. 3, and may be, for example, a rectangle or a rhombus.
  • the number of the reflecting members 20 arranged around the condenser lens 30 is not limited to four as in the second embodiment, and may be another number. Even when another number of reflecting members 20 are arranged, each reflecting member 20 is arranged so that the light reflected by its reflecting surface 21 is applied to a region on the surface of the object different from the region directly irradiated with light from the light source unit 10.
  • each reflecting member 20 is arranged so that the reflecting surface 21 is positioned outside the light source unit 10 with respect to the condenser lens 30.
  • the reflective members 20 arranged around the condenser lens 30 are not limited to the configuration in which adjacent reflective members 20 are connected to each other as shown in FIG. 3 when viewed in the X-axis direction; there may be a gap between adjacent reflective members 20.
  • the number of the light source units 10 arranged around the condenser lens 30 is not limited to two or four as in the first and second embodiments, and may be another number.
  • FIGS. 4(a) and 4(b) are diagrams showing the sizes of the parts of the configuration of the embodiment set in this verification experiment.
  • in the configuration of the embodiment, the optical system of FIG. 1 excluding the two reflective members 20 was covered with a rectangular parallelepiped box member whose Z-axis negative side is open, and this optical system was housed inside the box member.
  • the reflective surface 21 was formed by each of the five inner surfaces of the box member. Therefore, in the configuration of the embodiment, the reflecting surfaces 21 are arranged on the Y-axis positive side, the Y-axis negative side, and the Z-axis positive side of the condenser lens 30, respectively. Further, the reflecting surfaces 21 are also arranged on the X-axis positive side of the subject area A10 where the object is positioned and on the X-axis negative side of the image sensor 40, respectively. On the negative side of the Z-axis of the condenser lens 30, a lower surface member having no reflecting surface was arranged.
  • the distance d1 between the condenser lens 30 and the light source 11 in the Y-axis direction was set to 3.3 cm.
  • the interval d1 may be 2 cm to 5 cm.
  • the distance d2 between the light source 11 and the reflecting surface 21 located outside the light source 11 in the Y-axis direction is set to about 10 cm.
  • the interval d2 may be 10 cm to 20 cm.
  • the distance d3 between the light source 11 and the reflecting surface 21 on the positive side of the Z axis was set to about 80 cm. That is, the interval d3 is set so that the reflected light reflected by the reflecting surface 21 on the positive side of the Z axis does not substantially affect the imaging of the object.
  • the distance d4 between the condenser lens 30 and the image sensor 40 was set to 3.75 cm.
  • the distance d4 may be 2 cm to 5 cm.
  • the focal length f of the condenser lens 30 was set to about 6 cm.
  • the focal length f may be 5 cm to 7 cm.
  • the distance from the image sensor 40 to the object was set to about 15 cm.
  • in the configuration of the comparative example, the interval d2 was set to 100 cm or more. That is, in the comparative example, the reflecting surfaces 21 on the Y-axis positive and negative sides were far from the light source 11 compared with the embodiment, and were adjusted so that the light reflected by these reflecting surfaces did not substantially affect the imaging of the object. The other conditions were the same as in the configuration of the embodiment.
  • the inventors dropped minute objects of about 100 μm from above into the subject area A10, and imaged the objects using the configurations of the comparative example and the embodiment.
  • FIG. 5A is a diagram showing a captured image captured using the configuration of the comparative example.
  • as shown in FIG. 5A, in the experiment with the configuration of the comparative example, four minute objects were captured at the timing of imaging.
  • the four bright spots included in the captured image of FIG. 5A correspond to the light receiving regions of the reflected light in which the light from the light source 11 is reflected on the surfaces of minute objects different from each other.
  • in the comparative example, the region of each object in the captured image was extremely small. This is considered to be because, in the configuration of the comparative example, mainly the light radiated directly to the object from the light source 11 is reflected on the object surface and projected onto the image sensor 40, so that the projected region of the object surface becomes extremely small.
  • that is, in the configuration of the comparative example, the reflected light from the reflecting surfaces 21 on the Y-axis positive side, the Y-axis negative side, and the Z-axis positive side is considered to have almost no effect on imaging. Therefore, in the configuration of the comparative example, mainly the reflected light from the minute surface region produced by direct irradiation of the object from the light source 11 is collected by the image sensor 40, and it is probable that the light receiving region of the reflected light from each object therefore became extremely small.
  • FIG. 5B is a diagram showing a captured image captured using the configuration of the embodiment.
  • in the experiment with the configuration of the embodiment, one minute object was captured at the timing of imaging.
  • One bright spot included in the captured image of FIG. 5B corresponds to a light receiving region of the reflected light in which the light from the light source 11 is reflected on the surface of one minute object.
  • in the configuration of the embodiment, the region of the object in the captured image is significantly larger than in the comparative example.
  • this is because, in the configuration of the embodiment, the reflecting surfaces 21 on the Y-axis positive side and negative side are arranged at positions sufficiently close to the condenser lens 30, so that the light from the light source unit 10 is irradiated, via the reflecting surfaces 21, also to regions outside the region directly irradiated by the light source unit 10.
  • as a result, the light receiving region of the reflected light from the object is considered to have become significantly larger than in the case of the comparative example.
  • in the second embodiment, the four flat plate-shaped reflecting members 20 are arranged so as to form a square when viewed in the X-axis direction; in this modified example, a cylindrical reflecting member 20 is arranged instead.
  • FIG. 6 is a side view schematically showing the configuration of the image pickup apparatus 1 when viewed in the positive direction of the X-axis according to this modified example.
  • a reflective member 20 having a cylindrical side surface shape is arranged instead of the four reflective members 20.
  • the reflective surface 21 arranged inside the reflective member 20 also has a cylindrical side surface shape.
  • eight light source units 10 are arranged around the condenser lens 30.
  • the reflective surface 21 is formed on the inner surface of the cylindrical reflective member 20.
  • according to this modified example, the object can be uniformly irradiated with the reflected light from the reflecting surface 21 over the entire circumference of the optical axis 31 of the condenser lens 30, so that the projection region on the light receiving surface 41 of the image sensor 40 can be expanded uniformly over the entire circumference.
  • the shape of the reflective member 20 when viewed in the X-axis direction is not limited to the perfect circle shown in FIG. 6, and may be an ellipse or a shape in which protrusions are formed on the circumference of the circle.
  • in the second embodiment, the four light source units 10 are similarly configured, and the four light sources 11 are configured to emit light of the same wavelength.
  • in the third embodiment, instead of the four light sources 11, light sources 11a and light sources 11b that emit light of mutually different wavelengths are arranged.
  • FIG. 7 is a side view schematically showing the configuration of the image pickup apparatus 1 when viewed in the positive direction of the X-axis according to the third embodiment.
  • the light source unit 10 on the positive side of the Y-axis and the negative side of the Y-axis of the condenser lens 30 includes a light source 11a instead of the light source 11 as compared with the second embodiment shown in FIG.
  • the light source unit 10 on the Z-axis positive side and the Z-axis negative side of the condenser lens 30 includes a light source 11b instead of the light source 11 as compared with the second embodiment shown in FIG.
  • the light source 11a is configured to emit light having a wavelength λ1, and the light source 11b is configured to emit light having a wavelength λ2 different from the wavelength λ1.
  • the light source drive circuit 102 (see FIG. 2) is arranged for each of the light sources 11a and 11b.
  • the image sensor 40 is configured to be capable of receiving light in the same wavelength band as the light emitted from the light sources 11a and 11b.
  • the signal processing unit 101 drives the light sources 11a and 11b so that, for example, the two light sources 11a and the two light sources 11b all emit light at the same time. By doing so, even when the object to be imaged absorbs the light of one wavelength, the object reflects the light of the other wavelength, so that the object can be imaged. Therefore, the object can be detected more reliably from the captured image.
  • the signal processing unit 101 drives the light sources 11a and 11b so that, for example, the light emission timings of the two light sources 11a and the light emission timings of the two light sources 11b are different. Then, when the absorption wavelengths of the objects to be imaged are different, the amount of received light of the image pickup element 40 corresponding to the light emission timing of the light source 11a and the amount of light received by the image pickup element 40 corresponding to the light emission timing of the light source 11b are different. As a result, there may be a case where the object can be imaged at one wavelength, but the object cannot be imaged at the other wavelength.
  • in this case, the absorption wavelength and the reflection wavelength of the object can be grasped based on the emission wavelength at which the object can be imaged and the emission wavelength at which the object cannot be imaged, so that material information indicating what kind of material the object is made of can be obtained.
  • the absorption wavelength and the reflection wavelength of the object can be grasped from the difference in the amount of received light when each wavelength is used, and thus the material information of the object can be acquired.
  • a dichroic mirror that transmits the reflected light of the wavelength λ1 and reflects the reflected light of the wavelength λ2 may be arranged in front of the image sensor 40.
  • in this case, the light sources 11a and 11b are caused to emit light simultaneously, the reflected light of wavelength λ1 transmitted through the dichroic mirror is received by the image pickup element 40, and the reflected light of wavelength λ2 reflected by the dichroic mirror is received by a further image sensor.
  • the reflected light of wavelengths λ1 and λ2 may be separated by a spectroscopic element other than the dichroic mirror, and the reflected light of each wavelength may be received by the corresponding image sensor.
  • a plurality of light sources 11 that emit two different wavelengths are arranged, but the present invention is not limited to this, and a plurality of light sources 11 that emit three or more different wavelengths may be arranged. In this way, the information of the object can be acquired in more detail.
  • the two light source units 10 are fixed in the image pickup apparatus 1.
  • the two light source units 10 are installed in the image pickup apparatus 1 so as to be movable in a direction (Y-axis direction) perpendicular to the optical axis 31 of the condenser lens 30.
  • FIG. 8 is a plan view schematically showing the configuration of the image pickup apparatus 1 when viewed in the negative direction of the Z axis according to the fourth embodiment.
  • the image pickup apparatus 1 of the fourth embodiment further includes two drive units 50 as compared with the first embodiment.
  • the configurations of the two drive units 50 are the same.
  • the drive unit 50 is arranged on the negative side of the Z axis of the light source unit 10.
  • the drive unit 50 includes a support plate 51, a guide rail 52, a gear shaft 53, and a motor 54.
  • the support plate 51 is a flat plate-shaped plate member, and the light source 11 and the diffusion plate 12 are installed on a frame portion formed on the surface on the positive side of the Z axis.
  • the support plate 51 is supported by the guide rail 52 and is slidable along the guide rail 52.
  • the guide rail 52 is fixed in the image pickup apparatus 1.
  • the gear shaft 53 is a shaft-shaped member having a screw groove formed therein, and is passed through a screw hole (not shown) provided in the support plate 51.
  • the motor 54 is, for example, a stepping motor.
  • the rotation shaft 54a of the motor 54 is connected to the gear shaft 53.
  • the gear shaft 53 rotates according to the rotation of the rotating shaft 54a.
  • the support plate 51 moves in the Y-axis direction along the guide rail 52, and the light source 11 and the diffuser plate 12 move in the Y-axis direction.
  • FIG. 9 is a block diagram showing the configuration of the image pickup apparatus 1 according to the fourth embodiment.
  • the image pickup apparatus 1 of the fourth embodiment further includes a motor 54 and a motor drive circuit 104 as compared with the first embodiment shown in FIG.
  • FIG. 9 shows the motor 54 and the motor drive circuit 104 provided for one light source unit 10 for convenience, but a motor 54 and a motor drive circuit 104 are similarly provided for the other light source unit 10.
  • FIG. 10 is a flowchart showing a captured image generation process by the signal processing unit 101 according to the fourth embodiment.
  • the signal processing unit 101 moves the light source unit 10 in the Y-axis direction by driving the drive unit 50, that is, by driving the motor 54 via the motor drive circuit 104.
  • the signal processing unit 101 drives the light source 11 to emit light from the light source 11 (S11). Subsequently, the signal processing unit 101 determines whether or not the projection region on the light receiving surface 41 of the image sensor 40 is a predetermined size or larger (S12). Specifically, based on the image pickup signal output from the image pickup element 40, the signal processing unit 101 acquires the number of pixels 41a whose signal level is larger than the threshold value for removing noise, in other words, the number of pixels 41a included in the range classified as the same object.
  • FIGS. 11(a) and 11(b) are diagrams schematically showing a projection region on the light receiving surface 41 of the image sensor 40.
  • in the case of FIG. 11(a), the number of pixels 41a whose signal level is larger than the threshold value is 13, and the projection region is the range (size) corresponding to these 13 pixels 41a.
  • in the case of FIG. 11(b), the number of pixels 41a whose signal level is larger than the threshold value is 28, and the projection region is the range (size) corresponding to these 28 pixels 41a.
  • the determination result in step S12 is NO in the case of FIG. 11 (a), and NO in the case of FIG. 11 (b).
  • the determination result in step S12 is YES.
  • the threshold value of the size of the projection region used in the determination in step S12 is determined according to the number of pixels 41a on the light receiving surface 41, and is stored in advance in the storage unit of the signal processing unit 101.
  • when it is determined in step S12 that the projection region is smaller than the predetermined size (S12: NO), the signal processing unit 101 moves the two light source units 10 inward, respectively (S13). That is, the signal processing unit 101 moves the light source unit 10 on the Y-axis positive side by a predetermined distance in the Y-axis negative direction, and moves the light source unit 10 on the Y-axis negative side by a predetermined distance in the Y-axis positive direction.
  • the signal processing unit 101 then determines, based on the image pickup signal of the image pickup element 40, whether or not the projection region on the light receiving surface 41 has become larger as a result of the processing in step S13 (S14).
  • when it is determined that the projection region has become larger (S14: YES), the signal processing unit 101 determines the moving direction of the two light source units 10 as inward (S15).
  • when it is determined that the projection region has not become larger (S14: NO), the signal processing unit 101 determines the moving direction of the two light source units 10 as outward (S16).
  • the signal processing unit 101 then determines whether or not the projection region on the light receiving surface 41 of the image sensor 40 is a predetermined size or larger, as in step S12 (S17).
  • when it is determined in step S17 that the projection region is smaller than the predetermined size (S17: NO), the signal processing unit 101 moves the two light source units 10 by a predetermined distance in the direction determined in step S15 or S16 (S18). After that, the process returns to step S17, and the determination in step S17 is performed again.
  • when it is determined in step S12 or S17 that the projection region is equal to or larger than the predetermined size (S12: YES or S17: YES), the signal processing unit 101 generates the captured image based on the image pickup signal of the image pickup element 40 (S19). In this way, the process of generating the captured image is completed.
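  • a minimal code sketch of the flow of FIG. 10 is given below; the driver objects, the noise threshold, the predetermined region size, the step distance, and the iteration cap are assumptions for illustration and are not defined in the patent. The projection region size is estimated by counting pixels whose signal level exceeds a noise threshold (steps S12 and S17), and the two light source units are stepped inward or outward until the region reaches the predetermined size:

      # Illustrative sketch of FIG. 10 only; driver objects and constants are assumed.
      import numpy as np

      NOISE_THRESHOLD = 50       # signal level treated as noise (assumed value)
      MIN_REGION_PIXELS = 20     # "predetermined size" of the projection region (assumed value)
      STEP_MM = 1.0              # predetermined moving distance per step (assumed value)
      MAX_STEPS = 50             # safety cap on adjustment steps (assumed, not in the patent)

      def projection_region_size(frame: np.ndarray) -> int:
          """Count pixels 41a whose signal level exceeds the noise threshold (S12, S17)."""
          return int(np.count_nonzero(frame > NOISE_THRESHOLD))

      def generate_captured_image(light_driver, motor_driver, image_sensor) -> np.ndarray:
          light_driver.emit()                                           # S11
          size = projection_region_size(image_sensor.read())            # S12
          if size < MIN_REGION_PIXELS:
              motor_driver.move_inward(STEP_MM)                         # S13: trial step inward
              grew = projection_region_size(image_sensor.read()) > size # S14
              direction = "inward" if grew else "outward"               # S15 / S16
              for _ in range(MAX_STEPS):
                  if projection_region_size(image_sensor.read()) >= MIN_REGION_PIXELS:  # S17
                      break
                  motor_driver.move(direction, STEP_MM)                 # S18
          return image_sensor.read()                                    # S19: generate captured image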
  • the drive unit 50 moves the light source unit 10 in the direction perpendicular to the optical axis 31 of the condenser lens 30 (Y-axis direction).
  • according to this configuration, the direction of the light irradiating the subject area A10, whether directly from the light source unit 10 or via the reflecting surface 21, can be changed, and the irradiation state of the light on the object can be changed. Therefore, by moving the light source unit 10, for example, the size of the projection region of the object on the light receiving surface 41 of the image sensor 40 can be expanded to an appropriate size.
  • the signal processing unit 101 specifies a projection region of an object on the light receiving surface 41 of the image sensor 40, and controls the position of the light source unit 10 based on the specified projection region.
  • by feeding back the specified projection region to the control of the position of the light source unit 10 in this way, for example, the size of the projection region of the object on the light receiving surface 41 of the image sensor 40 can be expanded to an appropriate size.
  • the drive unit 50 only needs to move the light source unit 10 at least in the direction perpendicular to the optical axis 31 of the condenser lens 30 (Y-axis direction); the direction in which the light source unit 10 is moved may also include components in the X-axis direction and the Z-axis direction.
  • the number of the light source units 10 is not limited to the number shown in FIG. 8; for example, as shown in FIG. 3, four light source units 10 may be arranged, and each light source unit 10 may be driven in the direction of approaching and separating from the optical axis 31. In this case, for example, the control of FIG. 10 may be performed on the four light source units 10, or each light source unit 10 may be individually driven and controlled so that the size of the projection region becomes an appropriate size.
  • the position of the light source unit 10 is controlled based on the projection region of the object on the light receiving surface 41 of the image sensor 40.
  • the position of the light source unit 10 is controlled based on the distance to the object.
  • FIG. 12A is a flowchart showing the generation processing of the captured image by the signal processing unit 101 according to the fifth embodiment.
  • the signal processing unit 101 drives the light source 11 to emit light from the light source 11 (S21). Subsequently, the signal processing unit 101 calculates the distance to the object based on the image pickup signal output from the image pickup device 40 (S22). Specifically, the signal processing unit 101 calculates the distance to the object for each pixel 41a based on the time difference between the light emission timing of the light source 11 and the light reception timing of the reflected light at each pixel 41a (see FIGS. 11(a) and 11(b)). Then, the signal processing unit 101 specifies the projection region of the object on the light receiving surface 41, and calculates the distance to the object based on the distance for each pixel 41a and the projection region. For example, the average value of the distances obtained for the pixels 41a in the projection region is calculated as the distance to the object.
  • the signal processing unit 101 acquires the optimum position of the light source unit 10 using the table shown in FIG. 12B based on the distance to the object calculated in step S22 (S23).
  • the signal processing unit 101 stores a table as shown in FIG. 12B in a storage unit in the signal processing unit 101.
  • in this table, the optimum position Pn of the light source unit 10, that is, the position of the light source unit 10 at which the projection region on the light receiving surface 41 of the image sensor 40 is maximized, is stored corresponding to the distance Dn to the object.
  • when the distance to the object is short, the optimum position is close to the condenser lens 30; when the distance to the object is long, the optimum position is far from the condenser lens 30.
  • the signal processing unit 101 drives the drive unit 50 to move the light source unit 10 to the optimum position acquired in step S23 (S24). After that, the signal processing unit 101 generates the captured image based on the image pickup signal of the image pickup device 40 (S25). In this way, the process of generating the captured image is completed.
  • as described above, in the fifth embodiment, the signal processing unit 101 measures the distance to the object in the captured image based on the image pickup signal from the image pickup element 40, and controls the position of the light source unit 10 based on the measurement result of the distance. By controlling the position of the light source unit 10 to a position suitable for the distance, for example, the size of the projected region of the object on the captured image can be enlarged to an appropriate size.
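  • the flow of FIGS. 12A and 12B can be sketched as follows; the time-of-flight relation, the table values, and the function names are assumptions for illustration. The distance is computed per pixel from the emission-to-reception time difference, averaged over the projection region, and used to look up the stored optimum light source position:

      # Illustrative sketch of the fifth-embodiment flow; table values are placeholders.
      SPEED_OF_LIGHT = 299_792_458.0  # m/s

      # FIG. 12B-style table: (distance to object Dn [m], optimum light source position Pn [mm])
      OPTIMUM_POSITION_TABLE = [(0.10, 20.0), (0.15, 28.0), (0.20, 35.0)]  # assumed values

      def pixel_distance(time_difference_s: float) -> float:
          """Distance for one pixel from the emission/reception time difference (S22)."""
          return SPEED_OF_LIGHT * time_difference_s / 2.0

      def object_distance(time_differences_in_region: list[float]) -> float:
          """Average of the per-pixel distances over the projection region (S22)."""
          distances = [pixel_distance(dt) for dt in time_differences_in_region]
          return sum(distances) / len(distances)

      def optimum_light_source_position(distance_m: float) -> float:
          """Look up the stored optimum position Pn for the nearest tabulated distance Dn (S23)."""
          nearest = min(OPTIMUM_POSITION_TABLE, key=lambda row: abs(row[0] - distance_m))
          return nearest[1]

      # Example: time differences of about 1 ns correspond to roughly 0.15 m.
      print(optimum_light_source_position(object_distance([1.0e-9, 1.1e-9, 0.9e-9])))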
  • the two light source units 10 are arranged so as to be movable in the direction perpendicular to the optical axis 31 of the condenser lens 30.
  • the two light source units 10 are rotatably arranged with the Z-axis direction as the rotation axis.
  • FIG. 13A is a plan view showing a configuration in the vicinity of the light source unit 10 when viewed in the negative direction of the Z axis according to the sixth embodiment.
  • FIG. 13B is a side view showing the configuration of the vicinity of the light source unit 10 when viewed in the positive direction of the Y-axis according to the sixth embodiment.
  • the image pickup apparatus 1 of the sixth embodiment includes two drive units 60 instead of the two drive units 50 as compared with the fourth embodiment.
  • the configurations of the two drive units 60 are the same.
  • the two drive units 60 are arranged on the negative side of the Z axis of the two light source units 10, respectively.
  • the drive unit 60 includes a support plate 61, a shaft member 62, and a motor 63.
  • the support plate 61 is a flat plate-shaped plate member, and a light source portion 10 (light source 11 and diffusion plate 12) is installed on the surface on the positive side of the Z axis.
  • the shaft member 62 is a shaft-shaped member extending in the Z-axis direction, and is installed on the lower surface (the surface on the negative side of the Z-axis) of the support plate 61.
  • the motor 63 is, for example, a stepping motor.
  • the main body of the motor 63 is fixed in the image pickup apparatus 1.
  • the rotating shaft 63a of the motor 63 is connected to the shaft member 62.
  • the shaft member 62 rotates according to the rotation of the rotating shaft 63a.
  • the support plate 61 rotates about the Z-axis direction, and the light source unit 10 (light source 11 and diffusion plate 12) rotates.
  • a motor drive circuit (not shown) is provided to rotate each of the two motors 63.
  • FIG. 14 is a flowchart showing a captured image generation process by the signal processing unit 101 according to the sixth embodiment.
  • in the process of FIG. 14, the signal processing unit 101 rotates the light source units 10 about the Z-axis direction by driving the drive units 60, that is, by driving the two motors 63 via a motor drive circuit (not shown).
  • compared with the process of FIG. 10, steps S31 to S34 are provided in place of steps S13, S15, S16, and S18, respectively.
  • hereinafter, steps S31 to S34 will be described.
  • when the signal processing unit 101 determines in step S12 that the projection region is smaller than the predetermined size (S12: NO), the signal processing unit 101 rotates the two light source units 10 inward, respectively (S31). That is, as viewed in the Z-axis negative direction, the signal processing unit 101 rotates the light source unit 10 on the Y-axis positive side clockwise by a predetermined angle, and rotates the light source unit 10 on the Y-axis negative side counterclockwise by a predetermined angle.
  • when it is determined in step S14 that the projection region has become larger (S14: YES), the signal processing unit 101 determines the rotation direction of the light source units 10 as inward (S32).
  • when it is determined in step S14 that the projection region has not become larger (S14: NO), the signal processing unit 101 determines the rotation direction of the light source units 10 as outward (S33).
  • when the signal processing unit 101 determines in step S17 that the projection region is smaller than the predetermined size (S17: NO), the signal processing unit 101 rotates the two light source units 10 by a predetermined angle in the rotation direction determined in step S32 or S33 (S34).
  • the drive unit 60 rotates the light source unit 10 so that the direction of the light emitted from the light source unit 10 changes.
  • as a result, the direction of the light emitted from the light source unit 10, directly or via the reflecting surface 21, can be changed, and the irradiation state of the light on the object can be changed. Therefore, by rotating the light source unit 10, for example, the size of the projection region of the object on the light receiving surface 41 of the image sensor 40 can be expanded to an appropriate size.
  • the signal processing unit 101 specifies a projection region of an object on the light receiving surface 41 of the image sensor 40, and controls the rotation position of the light source unit 10 based on the specified projection region.
  • by feeding back the specified projection region to the control of the rotation position of the light source unit 10 in this way, for example, the size of the projection region of the object on the light receiving surface 41 of the image sensor 40 can be expanded to an appropriate size.
  • in the sixth embodiment, the direction of the light emitted from the light source unit 10 is changed by rotating the light source unit 10, but the direction of the light may be changed by another method. For example, the direction of the light emitted from the light source unit 10 may be changed by swinging the light source unit 10.
  • further, as in the fifth embodiment, the signal processing unit 101 may measure the distance to the object in the captured image based on the signal from the image sensor 40, and may control the rotation position of the light source unit 10 based on the measurement result of the distance.
  • the optimum rotation position is stored in the table shown in FIG. 12B corresponding to each distance, and the rotation position of the light source unit 10 is controlled by using this table.
  • in this case, when the distance to the object is short, the optimum rotation position is a rotation position in which the front direction of the light source 11 approaches the condenser lens 30; when the distance to the object is long, the optimum rotation position is a rotation position in which the front direction of the light source 11 is away from the condenser lens 30. Even when the rotation position is controlled in this way, the size of the projected region of the object on the captured image can be enlarged to an appropriate size.
  • the configuration in which the light source unit 10 of the fourth embodiment is linearly moved and the configuration in which the light source unit 10 of the sixth embodiment is rotated may be combined.
  • in the first embodiment, the light source 11 is composed of an LED.
  • in the seventh embodiment, a light source 13 composed of a semiconductor laser element is arranged instead of the light source 11, and the object in the subject area A10 is irradiated with laser light having a predetermined polarization direction.
  • FIG. 15 is a plan view schematically showing the configuration of the image pickup apparatus 1 when viewed in the Z-axis direction according to the seventh embodiment.
  • the light source unit 10 of the seventh embodiment includes a light source 13 instead of the light source 11, and a collimator lens 14 and a concave lens 15 instead of the diffuser plate 12.
  • the light source 13 is composed of a semiconductor laser element and emits linearly polarized laser light.
  • the two light sources 13 are arranged so as to irradiate the subject area A10 with light whose polarization directions are aligned in one direction.
  • the light emitted from the light source 13 is converted into parallel light by the collimator lens 14 and converted into diffused light by the concave lens 15.
  • the image pickup apparatus 1 of the seventh embodiment includes a polarizing filter 70 between the condenser lens 30 and the subject area A10 as compared with the first embodiment.
  • the polarizing filter 70 is arranged so that the transmitted polarization direction coincides with the polarization direction of the light projected on the subject area A10 by the light source 13.
  • the polarizing filter 70 is fixed to the housing of the image pickup apparatus 1 via a support portion (not shown).
  • the light source unit 10 irradiates the subject area A10 with light by aligning the polarization directions in one direction.
  • the polarization state of the light reflected on the surface of the object can change depending on the state of the surface of the object.
  • the amount of light transmitted through the polarizing filter 70 changes according to the polarization state of the light reflected by the object. Therefore, the state of the object surface can be detected by referring to the amount of light received on the light receiving surface 41 of the image sensor 40.
  • when the surface of the object is smooth, the polarization direction of the reflected light from the object surface is substantially the same as the polarization direction of the light emitted from the light source 13.
  • in this case, the amount of reflected light transmitted through the polarizing filter 70 is large, and the amount of light received on the light receiving surface 41 of the image sensor 40 is large.
  • on the other hand, when the surface of the object has minute irregularities or the like, the polarization state of the light changes when the surface of the object is irradiated with light, and the reflected light contains light of various polarization states.
  • the signal processing unit 101 can determine the state of the object surface by comparing the received light amount with a predetermined threshold value based on the image pickup signal output from the image pickup element 40.
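  • a minimal sketch of this determination is given below; the threshold value and labels are assumptions for illustration, not values from the patent. A smooth surface largely preserves the polarization of the emitted laser light, so a large amount of reflected light passes the polarizing filter 70, while a rough surface scrambles the polarization and reduces the received amount:

      # Illustrative sketch only; threshold and labels are assumed.
      def surface_state(received_light_amount: float, smoothness_threshold: float) -> str:
          """Compare the received light amount with a predetermined threshold (seventh embodiment)."""
          if received_light_amount >= smoothness_threshold:
              return "smooth surface (polarization largely preserved)"
          return "rough surface (polarization scrambled, little light passes the filter)"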
  • the polarizing filter 70 is not limited to being arranged between the subject area A10 and the condenser lens 30, and may be arranged anywhere between the subject area A10 and the image sensor 40.
  • the polarizing filter 70 may be arranged between the condenser lens 30 and the image sensor 40.
  • in the first embodiment, the image sensor 40 is fixed in the image pickup apparatus 1.
  • the image sensor 40 is movably arranged in a direction parallel to the optical axis 31 of the condenser lens 30 (X-axis direction).
  • FIG. 16 is a plan view schematically showing the configuration of the image pickup apparatus 1 when viewed in the Z-axis direction according to the eighth embodiment.
  • the imaging device 1 of the eighth embodiment further includes a distance changing unit 80 as compared with the first embodiment.
  • the distance changing unit 80 is arranged on the negative side of the Z axis of the image sensor 40.
  • the distance changing unit 80 has the same configuration as the driving unit 50 of the fourth embodiment shown in FIG.
  • the distance changing unit 80 includes a support plate 81, a guide rail 82, a gear shaft 83, and a motor 84.
  • the support plate 81 is a flat plate-shaped plate member, and the image sensor 40 is installed on a frame portion formed on the surface on the positive side of the Z axis.
  • the support plate 81 is supported by the guide rail 82 and is slidable along the guide rail 82.
  • the guide rail 82 is fixed in the image pickup apparatus 1.
  • the gear shaft 83 is a shaft-shaped member having a screw groove formed therein, and is passed through a screw hole (not shown) provided in the support plate 81.
  • the motor 84 is, for example, a stepping motor.
  • the rotation shaft 84a of the motor 84 is connected to the gear shaft 83.
  • the gear shaft 83 rotates according to the rotation of the rotating shaft 84a.
  • the support plate 81 moves in the X-axis direction along the guide rail 82, and the image sensor 40 moves in the X-axis direction.
  • FIG. 17 is a block diagram showing the configuration of the image pickup apparatus 1 according to the eighth embodiment.
  • the image pickup apparatus 1 of the eighth embodiment further includes a motor 84 and a motor drive circuit 105 as compared with the first embodiment shown in FIG.
  • FIG. 18 is a flowchart showing a captured image generation process by the signal processing unit 101 according to the eighth embodiment.
  • the signal processing unit 101 moves the image sensor 40 in the X-axis direction by driving the distance changing unit 80, that is, by driving the motor 84 via the motor drive circuit 105.
  • compared with the process of FIG. 10, steps S41 to S44 are provided in place of steps S13, S15, S16, and S18, respectively.
  • hereinafter, steps S41 to S44 will be described.
  • when the signal processing unit 101 determines in step S12 that the projection region is smaller than the predetermined size (S12: NO), the signal processing unit 101 moves the image sensor 40 in the X-axis positive direction by a predetermined distance (S41).
  • when it is determined in step S14 that the projection region has become larger (S14: YES), the signal processing unit 101 determines the moving direction of the image sensor 40 as the X-axis positive direction (S42).
  • when it is determined in step S14 that the projection region has become smaller or that the size of the projection region has not changed (S14: NO), the signal processing unit 101 determines the moving direction of the image sensor 40 as the X-axis negative direction (S43).
  • when the signal processing unit 101 determines in step S17 that the projection region is smaller than the predetermined size (S17: NO), the signal processing unit 101 moves the image sensor 40 by a predetermined distance in the moving direction determined in step S42 or S43 (S44).
  • As described above, the distance changing unit 80 changes the distance between the condenser lens 30 and the image sensor 40.
  • This allows the projection region of the object on the light receiving surface 41 of the image sensor 40 to be defocused and enlarged, so that the object can be detected more appropriately from the captured image.
  • Further, the signal processing unit 101 identifies the projection region of the object on the light receiving surface 41 of the image sensor 40 and, based on the identified projection region, controls the distance between the condenser lens 30 and the image sensor 40. By feeding back the size of the object's projection region to the distance adjustment in this way, the defocus amount can be adjusted appropriately, for example so that the projection region of the object is maximized (a minimal code sketch of this feedback loop, under stated assumptions, is given after the list of reference signs below).
  • When the distance between the condenser lens 30 and the image sensor 40 is changed and the defocus amount is adjusted so as to maximize the projection region, the amount of light received by one pixel 41a (see FIGS. 11A and 11B) decreases. If this decrease in the amount of light received by one pixel 41a becomes a problem, the output of the light source 11 may be increased. In this case, if the emission wavelength of the light source 11 is in the infrared wavelength band, safety can be ensured even when the output of the light source 11 is increased.
  • The distance between the condenser lens 30 and the image sensor 40 is not limited to being changed by moving the image sensor 40 in the X-axis direction; it may instead be changed by moving the condenser lens 30 in the X-axis direction.
  • The configuration of the imaging device 1 can be modified in various ways other than the configurations shown in the above-described embodiments and modification examples.
  • In the above-described embodiments, the light source unit 10 is moved in the Y-axis direction to change the state of light irradiation of the object; however, the present invention is not limited to this, and any configuration that changes the state of light irradiation of the object may be used.
  • For example, the two reflecting members 20 may each be moved in the Y-axis direction.
  • Further, each of three or more light source units 10 may be provided with a drive unit for driving that light source unit 10.
  • In the above description, two reflecting members 20 are arranged around the condenser lens 30; however, the number of reflecting members 20 (reflecting surfaces 21) is not limited to this, and three or more reflecting members 20 may be arranged so that three or more reflecting surfaces 21 surround the optical axis 31 of the condenser lens 30. Further, also in the above-described first and third to eighth embodiments, a cylindrical reflecting member 20 (reflecting surface 21) may be arranged as in the modified example of the second embodiment.
  • Reference signs: 1 Imaging device, 10 Light source unit, 21 Reflecting surface, 30 Condenser lens, 31 Optical axis, 40 Image sensor, 41 Light receiving surface, 50 Drive unit, 60 Drive unit, 70 Polarizing filter, 80 Distance changing unit, 101 Signal processing unit, A10 Imaging area
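The captured image generation process of FIG. 18 thus amounts to a simple feedback loop: estimate the size of the object's projection region from a captured frame, then step the image sensor along the X axis in whichever direction enlarges that region, until the region exceeds a predetermined size. The following Python sketch is an illustration only and is not part of the publication; the interfaces capture_frame() and move_sensor(), the threshold-based region estimate, and all numeric parameters are hypothetical assumptions.

```python
# Minimal sketch of the defocus feedback loop of the eighth embodiment
# (steps S12, S41, S14, S42/S43, S17, S44). Hardware interfaces and
# numeric values are illustrative assumptions, not taken from the source.

import numpy as np


def projection_region_size(frame: np.ndarray, intensity_threshold: int = 32) -> int:
    """Estimate the projection region as the number of pixels whose
    received-light amount exceeds a threshold (assumed measure)."""
    return int(np.count_nonzero(frame > intensity_threshold))


def adjust_defocus(capture_frame, move_sensor,
                   min_region_px: int = 200,
                   step_um: float = 10.0,
                   max_steps: int = 50) -> bool:
    """Step the image sensor along the X axis until the object's projection
    region on the light receiving surface reaches a predetermined size.

    capture_frame() -> np.ndarray : one frame read from the image sensor.
    move_sensor(distance_um)      : drives the stepping motor; positive values
                                    move the sensor in the X-axis positive direction.
    Returns True once the target region size is reached.
    """
    region = projection_region_size(capture_frame())
    if region >= min_region_px:            # S12: region already large enough
        return True

    move_sensor(+step_um)                  # S41: trial move in the X-axis positive direction
    new_region = projection_region_size(capture_frame())

    # S14 / S42 / S43: keep the direction that enlarged the region, otherwise reverse it.
    direction = 1.0 if new_region > region else -1.0
    region = new_region

    for _ in range(max_steps):
        if region >= min_region_px:        # S17: predetermined size reached
            return True
        move_sensor(direction * step_um)   # S44: move by a predetermined distance
        region = projection_region_size(capture_frame())
    return False
```

In practice, capture_frame would wrap the read-out of the image sensor 40 and move_sensor would convert a distance into step counts for the stepping motor 84 driven through the motor drive circuit 105; as noted above, any resulting drop in per-pixel received light could be compensated by raising the output of the light source 11.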

Abstract

According to the invention, an imaging device (1) includes a light source unit (10) for irradiating a subject region (A10) with light, and an image sensor (40) for imaging the subject region (A10). The imaging device (1) is provided with a condenser lens (30) for condensing light from the subject region (A10) onto the image sensor (40). The imaging device (1) is provided with a reflecting surface (21) for reflecting part of the light emitted from the light source unit (10) toward the subject region (A10).
PCT/JP2021/003108 2020-04-24 2021-01-28 Dispositif d'imagerie WO2021215064A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020077638 2020-04-24
JP2020-077638 2020-04-24

Publications (1)

Publication Number Publication Date
WO2021215064A1 (fr)

Family

ID=78270437

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/003108 WO2021215064A1 (fr) 2020-04-24 2021-01-28 Dispositif d'imagerie

Country Status (1)

Country Link
WO (1) WO2021215064A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001311986A (ja) * 2000-04-27 2001-11-09 Canon Inc カメラ、閃光発光装置および閃光装置内蔵カメラ
JP2005158490A (ja) * 2003-11-26 2005-06-16 Kyoto Denkiki Kk 画像処理検査用リング状照明装置
JP2005326451A (ja) * 2004-05-12 2005-11-24 Konica Minolta Photo Imaging Inc 撮像装置
JP2008003626A (ja) * 2000-07-25 2008-01-10 Fujifilm Corp 光源装置及び撮影装置
WO2009157129A1 (fr) * 2008-06-26 2009-12-30 パナソニック株式会社 Appareil de traitement d'image, programme de division d'image et procédé de synthèse d'image
JP2011123019A (ja) * 2009-12-14 2011-06-23 Olympus Corp 画像検査装置
JP2016015017A (ja) * 2014-07-02 2016-01-28 ソニー株式会社 撮像装置、投光装置、および画像処理方法、ビームライト制御方法、並びにプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21793779

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21793779

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP