WO2013038595A1 - Image capture device - Google Patents


Info

Publication number
WO2013038595A1
Authority
WO
WIPO (PCT)
Prior art keywords
region
light
diffraction
wavelength band
pixels
Prior art date
Application number
PCT/JP2012/005095
Other languages
English (en)
Japanese (ja)
Inventor
貴真 安藤 (Takamasa Ando)
今村 典広 (Norihiro Imamura)
是永 継博 (Tsuguhiro Korenaga)
Original Assignee
パナソニック株式会社 (Panasonic Corporation)
Priority date
Filing date
Publication date
Application filed by Panasonic Corporation (パナソニック株式会社)
Publication of WO2013038595A1


Classifications

    • A: HUMAN NECESSITIES
      • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
            • A61B 1/00163: Optical arrangements
              • A61B 1/00186: Optical arrangements with imaging filters
              • A61B 1/00188: Optical arrangements with focusing or zooming features
            • A61B 1/04: Instruments combined with photographic or television appliances
              • A61B 1/041: Capsule endoscopes for imaging
              • A61B 1/05: Instruments characterised by the image sensor, e.g. camera, being in the distal end portion
                • A61B 1/051: Details of CCD assembly
    • G: PHYSICS
      • G01: MEASURING; TESTING
        • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
          • G01J 3/00: Spectrometry; spectrophotometry; monochromators; measuring colours
            • G01J 3/02: Details
              • G01J 3/0205: Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
                • G01J 3/0208: Using focussing or collimating elements, e.g. lenses or mirrors; performing aberration correction
                • G01J 3/0213: Using attenuators
            • G01J 3/12: Generating the spectrum; monochromators
              • G01J 3/18: Using diffraction elements, e.g. grating
              • G01J 3/26: Using multiple reflection, e.g. Fabry-Perot interferometer, variable interference filters
            • G01J 3/28: Investigating the spectrum
              • G01J 3/2823: Imaging spectrometer
            • G01J 3/46: Measurement of colour; colour measuring devices, e.g. colorimeters
              • G01J 3/50: Using electric radiation detectors
                • G01J 3/51: Using colour filters
      • G02: OPTICS
        • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
          • G02B 3/00: Simple or compound lenses
            • G02B 3/10: Bifocal lenses; multifocal lenses
          • G02B 5/00: Optical elements other than lenses
            • G02B 5/18: Diffraction gratings
              • G02B 5/1814: Diffraction gratings structurally combined with one or more further optical elements, e.g. lenses, mirrors, prisms or other diffraction gratings
                • G02B 5/1819: Plural gratings positioned on the same surface, e.g. array of gratings
              • G02B 5/1842: Gratings for image generation
            • G02B 5/20: Filters
              • G02B 5/203: Filters having holographic or diffractive elements
    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 23/00: Cameras or camera modules comprising electronic image sensors; control thereof
            • H04N 23/50: Constructional details
              • H04N 23/55: Optical parts specially adapted for electronic image sensors; mounting thereof

Definitions

  • This application relates to an imaging device such as a camera.
  • Conventionally, a color filter made of an organic material such as a pigment or a dye is formed on each pixel of a solid-state imaging device for color imaging. Since such color filters transmit infrared light, an infrared cut filter is generally placed in the optical path in front of the solid-state imaging device to obtain a good color image. It is therefore difficult for an imaging apparatus using a single imaging device to acquire both visible-light and infrared-light image information simultaneously.
  • In addition, color filters made of organic materials have wide transmission bands: the blue, green, and red wavelength bands overlap over relatively wide ranges, which degrades color reproducibility.
  • Patent Documents 1 and 2 disclose techniques relating to solid-state imaging devices on which color filters using dielectric multilayer films are formed.
  • Color filters made of organic materials also have difficulty realizing narrow-band spectral characteristics, making it difficult to capture images that extract color information in a narrow wavelength band.
  • Patent Document 3 discloses a technique for acquiring images by sequentially illuminating the subject with white light and predetermined narrow-band light.
  • However, such a solid-state imaging device is expensive, or is difficult to fabricate with a very small pixel size. Further, when a moving subject is imaged, a color shift occurs due to the time difference between exposures.
  • One non-limiting exemplary embodiment of the present application provides an imaging apparatus capable of acquiring an image having spectral information for each pixel with a simpler configuration than the conventional one.
  • An imaging device according to one aspect includes: an optical member having, in a predetermined plane, at least a first region that transmits light of a first wavelength band and a second region that transmits light of a second wavelength band different from the first wavelength band; a diffraction grating having a first diffraction step provided in a region on which light transmitted through the first region is incident, and a second diffraction step, of a depth different from that of the first diffraction step, provided in a region on which light transmitted through the second region is incident; an imaging element having at least a plurality of first pixels and a plurality of second pixels; and an arrayed optical element, disposed between the diffraction grating and the imaging element, that causes the light transmitted through the first region to be incident on the plurality of first pixels and the light transmitted through the second region to be incident on the plurality of second pixels.
  • With this configuration, a multispectral image can be acquired in a single exposure using a single imaging system. According to one embodiment of the present invention, it is not necessary to provide a dielectric multilayer film for each pixel. In addition, when a moving image is shot using the imaging apparatus of the present invention, there is no image shift between the plurality of images even if the subject position changes over time. Further, providing a diffraction grating makes it possible to correct chromatic aberration of the optical system and reduce field curvature.
  • FIG. 1 is a schematic diagram showing Embodiment 1 of an imaging apparatus A.
  • FIG. 2(a) is a front view of the optical element L1 in Embodiment 1 as seen from the subject side; FIG. 2(b) shows the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L; FIG. 2(c) shows the cross-sectional shape, in a plane parallel to the optical axis V, of the lens L2 on which the diffraction grating G is provided.
  • FIG. 3 is a perspective view of the arrayed optical element K in Embodiment 1.
  • FIG. 4(a) is an enlarged view of the arrayed optical element K and the imaging element N shown in FIG. 1; FIG. 4(b) shows the positional relationship between the arrayed optical element K and the pixels of the imaging element N.
  • (a) and (b) show the first-order diffraction efficiency of the light beams B1 and B2 that pass through the optical surface regions D1 and D2, respectively.
  • A schematic diagram showing Embodiment 2 of the imaging apparatus A.
  • (a) is a front view of the optical element L1 in Embodiment 2 as seen from the subject side; (b) shows the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L.
  • (a) is an enlarged view of the arrayed optical element K and the imaging element N; (b) shows the positional relationship between the arrayed optical element K and the pixels of the imaging element N.
  • (a), (b), and (c) show the first-order diffraction efficiency of the light beams B1, B2, and B3 that pass through the corresponding optical surface regions.
  • (a) is a front view of the optical element L1 in Embodiment 3 as seen from the subject side; (b) shows the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L.
  • A perspective view of the arrayed optical element K in Embodiment 3.
  • A diagram showing the positional relationship between the arrayed optical element K and the pixels of the imaging element N in Embodiment 3.
  • (a) is a front view of the optical element L1 in Embodiment 4 as seen from the subject side; (b) shows the positional relationship between the arrayed optical element K and the pixels of the imaging element N.
  • (a1), (b1), and (c1) are ray diagrams for different subject distances in Embodiment 1; (a2), (a3), (b2), (b3), (c2), and (c3) illustrate the change of the point image and of its center of gravity.
  • (a1), (b1), (c1), (a2), (b2), and (c2) illustrate the point image and its center of gravity for different subject distances in Embodiment 4.
  • A schematic diagram showing Embodiment 5 of the imaging apparatus A.
  • A schematic diagram showing Embodiment 6 of the imaging apparatus A.
  • A schematic diagram showing Embodiment 7 of the imaging apparatus A.
  • A schematic diagram showing Embodiment 8 of the imaging apparatus A.
  • (a), (b), and (c) show the first-order diffraction efficiency of the light beams B1, B2, and B3 that pass through the corresponding optical surface regions.
  • (a) and (b) are enlarged views of the arrayed optical element K and the imaging element N in Embodiment 9.
  • (a) and (b) are enlarged views of the arrayed optical element K and the imaging element N in another embodiment.
  • A schematic diagram showing an embodiment of an imaging apparatus A.
  • (a1) and (b1) are perspective views of the arrayed optical element K in other embodiments; (a2) and (b2) show the contour lines of each optical element; (a3) and (b3) show the results of a ray-tracing simulation.
  • (a) shows a mold for forming the lens L2; (b) illustrates mold fabrication of the lens L2 in another embodiment.
  • An imaging device according to an aspect of the present invention includes: an optical member having, in a predetermined plane, at least a first region that transmits light of a first wavelength band and a second region that transmits light of a second wavelength band different from the first wavelength band; a diffraction grating having a first diffraction step provided in a region on which light transmitted through the first region is incident, and a second diffraction step, of a depth different from that of the first diffraction step, provided in a region on which light transmitted through the second region is incident; an imaging element having a plurality of first pixels and a plurality of second pixels; and an arrayed optical element, disposed between the diffraction grating and the imaging element, that causes the light transmitted through the first region to be incident on the plurality of first pixels and the light transmitted through the second region to be incident on the plurality of second pixels.
  • The optical member may further include at least a third region other than the first and second regions; the diffraction grating may have a third diffraction step, provided in a region on which light transmitted through the third region is incident, whose depth differs from those of the first and second diffraction steps; the imaging element may further include a plurality of third pixels; the third region may transmit light of a third wavelength band different from the first and second wavelength bands; and the arrayed optical element may cause the light transmitted through the third region to be incident on the plurality of third pixels.
  • The optical member may further include a fourth region other than the first, second, and third regions; the diffraction grating may have a fourth diffraction step, provided in a region on which light transmitted through the fourth region is incident, whose depth differs from those of the first, second, and third diffraction steps; the imaging element may further include a plurality of fourth pixels; the fourth region may transmit light of a fourth wavelength band different from the first, second, and third wavelength bands; and the arrayed optical element may cause the light transmitted through the fourth region to be incident on the plurality of fourth pixels.
  • The imaging apparatus may further include a first lens that is provided between the optical member and the arrayed optical element and on which the light that has passed through the optical member is incident.
  • The diffraction grating may be provided on the surface of the first lens that faces the optical member.
  • The diffraction grating may be provided on the surface of the optical member that faces the first lens.
  • The first and second regions of the optical member may each be composed of a plurality of regions separated from each other across the optical axis of the first lens.
  • At least one of the first region and the second region may have a spectral transmittance characteristic that transmits light in the near-infrared wavelength band and blocks light in the visible region.
  • At least one of the first region and the second region may have a spectral transmittance characteristic that transmits light of a wavelength bandwidth relatively narrower than that of the other region.
  • The first region and the second region may be regions divided with the optical axis of the first lens at the center of the boundary.
  • The imaging apparatus may further include a second lens; the optical member may be disposed between the second lens and the first lens, and the diffraction grating may be provided on a surface of the second lens.
  • The imaging apparatus may further include an optical adjustment layer formed on the surface on which the diffraction grating is provided.
  • The imaging apparatus may further include a light-shielding region provided at a position corresponding to the boundary between the first region and the second region.
  • The imaging apparatus may further include a diaphragm (stop), and the first region and the second region may be disposed in the vicinity of the stop.
  • The diffraction grating may be disposed in the vicinity of the stop.
  • When the central value of the wavelength band of the light reaching the first diffraction step is λ1 and the refractive index of the diffraction grating material at wavelength λ is n(λ), the depth d1 of the first diffraction step may satisfy 0.9 × λ1 / {n(λ1) − 1} ≤ d1 ≤ 1.1 × λ1 / {n(λ1) − 1}.
  • Similarly, when the central value of the wavelength band of the light reaching the second diffraction step is λ2, the depth d2 of the second diffraction step may satisfy 0.9 × λ2 / {n(λ2) − 1} ≤ d2 ≤ 1.1 × λ2 / {n(λ2) − 1}.
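  • The depth condition above is straightforward to evaluate numerically. A minimal sketch, assuming an illustrative center wavelength of 550 nm and a grating material with refractive index 1.5 (neither value is from this document):

```python
# Sketch (illustrative values, not from the patent): compute the
# permitted range of the diffraction-step depth d for the condition
# 0.9 * lambda / (n(lambda) - 1) <= d <= 1.1 * lambda / (n(lambda) - 1).

def step_depth_range(wavelength_nm: float, n: float) -> tuple[float, float]:
    """Return (min, max) step depth in nm for first-order blazing."""
    d_ideal = wavelength_nm / (n - 1.0)  # depth giving peak first-order efficiency
    return 0.9 * d_ideal, 1.1 * d_ideal

# Example: a band centred at 550 nm and a material with n = 1.5,
# so the ideal depth is 550 / 0.5 = 1100 nm.
lo, hi = step_depth_range(550.0, 1.5)
print(f"d1 in [{lo:.0f} nm, {hi:.0f} nm]")
```

A deeper step blazed for a longer center wavelength would follow from the same formula, which is why the two diffraction grating portions need different depths d1 and d2.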
  • The diffraction efficiency for light of the first wavelength band at the first diffraction step may be higher than that for light of the second wavelength band, and the diffraction efficiency for light of the second wavelength band at the second diffraction step may be higher than that for light of the first wavelength band.
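  • This wavelength selectivity can be illustrated with standard scalar diffraction theory (not text from the patent): for a blazed grating of step depth d, the first-order diffraction efficiency is approximately sinc²(1 − d·(n(λ) − 1)/λ), with sinc(x) = sin(πx)/(πx). The refractive index is assumed constant at 1.5 here, an illustrative simplification that ignores dispersion.

```python
import math

# Scalar-theory first-order efficiency of a blazed grating step
# (assumed model; n = 1.5 held constant for illustration).
def first_order_efficiency(d_nm: float, wavelength_nm: float, n: float = 1.5) -> float:
    x = 1.0 - d_nm * (n - 1.0) / wavelength_nm
    if x == 0.0:
        return 1.0  # limit of sinc^2 at x = 0
    s = math.sin(math.pi * x) / (math.pi * x)
    return s * s

# A step blazed for 550 nm (d = 1100 nm) is efficient there but
# noticeably less efficient at 850 nm, matching the wavelength-selective
# behaviour the text describes for the two diffraction steps.
print(first_order_efficiency(1100.0, 550.0))  # ~1.0
print(first_order_efficiency(1100.0, 850.0))  # about 0.65 under these assumptions
```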
  • The imaging apparatus may further include a signal processing unit that generates first image information corresponding to the first wavelength band from the pixel values obtained at the plurality of first pixels, and second image information corresponding to the second wavelength band from the pixel values obtained at the plurality of second pixels.
  • An imaging system according to an aspect of the present invention includes any of the imaging devices above and a signal processing device that generates first image information corresponding to the first wavelength band from the pixel values obtained at the plurality of first pixels of the imaging device, and second image information corresponding to the second wavelength band from the pixel values obtained at the plurality of second pixels.
  • FIG. 1 is a schematic diagram illustrating an imaging apparatus A according to the first embodiment.
  • An imaging apparatus A according to the present embodiment includes a lens optical system L having an optical axis V, an arrayed optical element K disposed near the focal point of the lens optical system L, an imaging element N, and a first signal processing unit C.
  • The lens optical system L includes a diaphragm (stop) S on which light from a subject (not shown) is incident, an optical element L1 that is disposed in the vicinity of the stop S and has, for example in a plane perpendicular to the optical axis V of the lens optical system L, a first optical surface region D1 and a second optical surface region D2, and a lens L2. A diffraction grating G is formed on the stop-S side of the lens L2 (the surface facing the optical element L1).
  • The diffraction grating G includes a first diffraction grating portion G1 provided in a region on which light that has passed through the first optical surface region D1 is incident, and a second diffraction grating portion G2 provided in a region on which light that has passed through the second optical surface region D2 is incident.
  • Each of the first diffraction grating portion G1 and the second diffraction grating portion G2 includes a plurality of annular zones and a plurality of diffraction steps formed concentrically around the optical axis V.
  • The depth d1 of the diffraction steps in the first diffraction grating portion G1 differs from the depth d2 of the diffraction steps in the second diffraction grating portion G2. Details of the diffraction grating G are described later.
  • In the present embodiment the lens L2 is a single lens, but it may be composed of a plurality of lenses.
  • Here, the "wavelength band" in the "first wavelength band" and the "second wavelength band" is a continuous band that accounts for 50% or more of the total amount of light transmitted through the region; a wavelength attenuated by 95% or more in passing through the region is not included in the "wavelength band". That is, the first optical surface region D1 selectively transmits light of the first wavelength band, and the second optical surface region D2 selectively transmits light of the second wavelength band.
  • Two wavelength bands being different from each other means that at least one of them includes a wavelength band not included in the other; the two bands may therefore partially overlap.
  • Configurations for making the transmitted wavelength bands differ from region to region include forming a filter made of an organic material or a dielectric multilayer film on the stop-S side surface of the optical element L1, or dyeing the optical element L1 region by region with a dye-based filter. The color filters may be formed on a single flat plate, or on a plurality of flat plates divided region by region.
  • Light that has passed through the two optical surface regions D1 and D2 passes through the lens L2 provided with the diffraction grating G and then enters the arrayed optical element K. The arrayed optical element K causes the light that has passed through the optical surface region D1 to be incident on the plurality of pixels P1 of the imaging element N, and the light that has passed through the optical surface region D2 to be incident on the plurality of pixels P2. The signal processing unit C generates image information corresponding to the first wavelength band from the pixel values obtained at the pixels P1 and image information corresponding to the second wavelength band from the pixel values obtained at the pixels P2, and outputs them.
  • The light beam B1 is a beam that passes through the optical surface region D1 of the optical element L1, and the light beam B2 is a beam that passes through the optical surface region D2. The light beams B1 and B2 pass through the stop S, the optical element L1, the lens L2, and the arrayed optical element K in this order, and reach the pixels P1 and P2 (shown in FIG. 4) on the imaging surface Ni of the imaging element N, respectively.
  • FIG. 2A is a front view of the optical element L1 as viewed from the subject side.
  • The optical surface regions D1 and D2 of the optical element L1 are two regions divided vertically, with the optical axis V at the center of the boundary. The broken line s indicates the position of the diaphragm S.
  • FIG. 2B shows the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L, and FIG. 2C shows the cross-sectional shape, in a plane parallel to the optical axis V, of the lens L2 provided with the diffraction grating G.
  • The first and second diffraction grating portions G1 and G2 are provided in the regions corresponding to the optical surface regions D1 and D2 of the optical element L1. The first and second diffraction grating portions G1 and G2 include a plurality of annular zones R1 and R2, respectively, and diffraction steps A1 and A2 are provided between adjacent annular zones R1 and between adjacent annular zones R2.
  • A plurality of diffraction steps A1 are provided, and typically each of them has the same depth d1. Likewise, a plurality of diffraction steps A2 are provided, and typically each of them has the same depth d2.
  • The depth d1 of the diffraction steps A1 in the first diffraction grating portion G1 differs from the depth d2 of the diffraction steps A2 in the second diffraction grating portion G2. The widths of the annular zones R1 and R2 are set according to the target optical power and other design requirements.
  • FIG. 3 is a perspective view of the arrayed optical element K. On the surface of the arrayed optical element K facing the imaging element N, a plurality of optical elements M1, elongated in the horizontal direction, are arranged in the vertical direction.
  • The cross section of each optical element M1 has a curved shape protruding toward the imaging element N, so the arrayed optical element K has a lenticular-lens configuration. The arrayed optical element K is disposed in the vicinity of the focal point of the lens optical system L, at a position separated from the imaging surface Ni by a predetermined distance.
  • FIG. 4A is an enlarged view of the arrayed optical element K and the imaging element N shown in FIG. 1, and FIG. 4B shows the positional relationship between the arrayed optical element K and the pixels of the imaging element N.
  • The arrayed optical element K is arranged so that the surface on which the optical elements M1 are formed faces the imaging surface Ni. Pixels P are arranged in a matrix on the imaging surface Ni, and can be classified into pixels P1 and pixels P2. No color filters corresponding to the pixels P1 and P2 are provided on the imaging surface Ni of the imaging element N.
  • The pixels P1 and the pixels P2 are each arranged in rows in the horizontal direction (row direction); in the vertical direction (column direction), rows of pixels P1 and rows of pixels P2 alternate. The arrayed optical element K is arranged so that each optical element M1 corresponds to two rows of pixels on the imaging surface Ni, one row of pixels P1 and one row of pixels P2. Microlenses Ms are provided so as to cover the surfaces of the pixels P1 and P2.
  • The above configuration is realized by appropriately setting parameters such as the refractive index of the arrayed optical element K, its distance from the imaging surface Ni, and the radius of curvature of the surface of the optical elements M1.
  • The angle of a ray at the focal point is determined by the position at which the ray passes through the stop and by its angle with respect to the optical axis, and the arrayed optical element has the function of distributing the emission direction according to the incidence angle of the ray. Therefore, by arranging the optical surface regions D1 and D2 in the vicinity of the stop and the arrayed optical element K in the vicinity of the focal point as described above, the light beams B1 and B2 that have passed through the respective optical surface regions can be separated onto the pixels P1 and P2. When the imaging optical system is an image-side telecentric optical system, the rays passing through the stop are parallel, and therefore the angle of a ray at the focal point is uniquely determined by the position at which the ray passes through the stop.
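  • The separation principle can be sketched numerically. In a paraxial, image-side telecentric model, a ray crossing the stop at height h arrives at the on-axis focal point at roughly θ = atan(h/f); the focal length and stop heights below are illustrative assumptions, not values from this document:

```python
import math

# Sketch of the light-separation principle (illustrative numbers):
# rays through the upper half of the stop (region D1) and the lower
# half (region D2) arrive at the focal point with opposite signs of
# theta, which is what lets the lenticular array steer them onto
# the P1 and P2 pixel rows respectively.

def ray_angle_deg(h_mm: float, f_mm: float) -> float:
    """Arrival angle at the focal point of a ray crossing the stop at height h."""
    return math.degrees(math.atan2(h_mm, f_mm))

f = 10.0  # assumed focal length in mm
for h in (+2.0, -2.0):  # representative stop heights inside D1 and D2
    row = "P1" if ray_angle_deg(h, f) > 0 else "P2"
    print(f"stop height {h:+.1f} mm -> {ray_angle_deg(h, f):+.2f} deg -> pixels {row}")
```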
  • The pixels P1 and the pixels P2 generate image information corresponding to light of different wavelength bands. Since the light beams B1 and B2 are obtained from the subject at the same time, there is no time difference between the image information detected at the pixels P1 and at the pixels P2. That is, the imaging apparatus A can acquire a plurality of pieces of image information formed by light of different wavelength bands with a single imaging optical system and a single exposure. Therefore, even when a moving subject is photographed, an image having spectral information for each pixel can be obtained without the color shift caused by a time difference.
  • For example, the first optical surface region D1 may be an optical filter that transmits visible light as the light of the first wavelength band and substantially blocks near-infrared light, while the second optical surface region D2 may be an optical filter that substantially blocks visible light and transmits near-infrared light as the light of the second wavelength band.
  • the first optical surface region D1 is an optical filter that transmits light of a predetermined wavelength bandwidth
  • the second optical surface region D2 is an optical filter that transmits light of a bandwidth narrower than the predetermined wavelength bandwidth.
  • the second wavelength band may or may not be included in the first wavelength band.
  • one type of light source having spectral radiation characteristics including the first and second wavelength bands may be provided, or a plurality of types of light sources having spectral radiation characteristics corresponding to the first and second wavelength bands, respectively, may be provided. In such applications, lesions can be easily distinguished by displaying images acquired in the wide band and images acquired in the narrow band in different colors.
  • the optical filter disposed in the first optical surface region D1 and the second optical surface region D2 may be a color filter using an organic material or a color filter using a dielectric multilayer film.
  • when a dielectric multilayer film is used, narrow-band spectral characteristics can be realized, and photographing that extracts color information in a narrow wavelength band becomes possible.
  • a color filter made of the dielectric multilayer film can be provided in the first optical surface region D1 and the second optical surface region D2 relatively easily and inexpensively. For this reason, it is relatively easy to adjust and change the spectral characteristics in the first optical surface region D1 and the second optical surface region D2.
  • the pixel value of a missing pixel may be generated by interpolation using the pixel values of pixels adjacent in the y direction, or the pixel values in the x direction may be added two pixels at a time.
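As a concrete illustration of these two reconstruction options, the sketch below assumes a raw frame stored as a list of rows in which rows belonging to the two wavelength bands alternate; the function names and the exact layout are illustrative assumptions, not taken from the patent:

```python
def split_bands(raw):
    """raw: list of rows; even-index rows carry band 1, odd-index rows band 2
    (a hypothetical layout matching the lenticular pitch described above)."""
    return raw[0::2], raw[1::2]

def interpolate_rows(half):
    """Rebuild full height by averaging vertically adjacent rows of one band."""
    out = []
    for i, row in enumerate(half):
        out.append(row)
        if i + 1 < len(half):
            nxt = half[i + 1]
            out.append([(a + b) / 2 for a, b in zip(row, nxt)])
        else:
            out.append(row[:])  # edge case: duplicate the last row
    return out

def bin_columns(half):
    """Alternatively, add pixel values two at a time along x."""
    return [[row[i] + row[i + 1] for i in range(0, len(row) - 1, 2)]
            for row in half]
```

Either path restores a 1:1 aspect ratio for each band image: interpolation keeps the horizontal resolution, while binning trades it for signal level.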
  • the purpose of forming the diffraction grating G is to correct the chromatic aberration of the optical system and reduce the curvature of field.
  • the Abbe number νd of the diffraction grating is given by (Equation 1): νd = λd / (λF − λC)
  • λd, λF, and λC are the wavelengths of the d-line, F-line, and C-line, respectively.
  • Equation 1 yields a negative value, showing that the diffraction grating has inverse dispersion and anomalous dispersion. Since the axial ordering of the chromatic aberration generated by the diffraction grating is therefore opposite to that of the chromatic aberration generated by an aspherical surface, the two can cancel each other out. That is, chromatic aberration generated in the refractive lens can be corrected by combining the diffraction grating with the aspherical lens.
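The sign and magnitude of the diffractive Abbe number can be checked directly from Equation 1 using the standard Fraunhofer line wavelengths (a well-known calculation, not specific to this patent):

```python
# Standard Fraunhofer line wavelengths in nm
LAMBDA_D = 587.6  # d-line (helium)
LAMBDA_F = 486.1  # F-line (hydrogen)
LAMBDA_C = 656.3  # C-line (hydrogen)

def diffraction_abbe_number():
    # (Equation 1): nu_d = lambda_d / (lambda_F - lambda_C)
    return LAMBDA_D / (LAMBDA_F - LAMBDA_C)

nu_d = diffraction_abbe_number()  # about -3.45
```

The negative result (about −3.45, versus roughly +20 to +80 for typical optical glasses) is why a relatively weak diffractive contribution can offset the chromatic aberration of a much stronger refractive surface.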
  • the diffraction grating lens also has a field curvature correction capability.
  • the smaller the ring pitch of the diffraction grating the stronger the diffraction power.
  • the refractive power can be kept relatively small by reducing the ring pitch so that diffraction supplies more of the overall power. Since the Petzval sum contributed by the diffraction grating is almost zero and the refractive power is small, the Petzval sum of the entire optical system is reduced, and field curvature can be reduced. Therefore, a diffraction grating enables not only chromatic aberration correction but also field curvature correction.
  • the diffraction efficiency is 100% only at a specific wavelength, and unnecessary orders of diffracted light are generated at other wavelengths, thereby degrading image quality.
  • when the configuration of the present embodiment is used, an image with reduced generation of unnecessary diffracted light over the entire visible range can be obtained. The reason for this will be described below.
  • the diffraction grating G is formed on the lens surface on the stop S side.
  • the region on the diffraction grating surface through which the light beam B1 passing through the first optical surface region D1 travels and the region through which the light beam B2 passing through the second optical surface region D2 travels can be separated.
  • the depth d1 of the diffraction step on the diffraction grating G through which the light beam B1 passes is preferably set so that the diffraction efficiency of light in the first wavelength band transmitted through the optical surface region D1 is higher than the diffraction efficiency of light in the second wavelength band.
  • the depth d2 of the diffraction step on the diffraction grating G through which the light beam B2 passes is preferably set so that the diffraction efficiency of light in the second wavelength band transmitted through the optical surface region D2 is higher than the diffraction efficiency of light in the first wavelength band.
  • the condition under which the diffraction efficiency is theoretically 100% for a light ray incident at an incident angle of 0° is expressed, using the wavelength λ, the depth d of the diffraction step, the diffraction order m, and the refractive index n, by the following equation (Equation 2): d = mλ / {n(λ) − 1}
  • the diffraction grating G changes the wavefront by diffracting incident light rays. For example, under the condition that (Formula 2) is satisfied, in the diffraction grating G, all of the incident light becomes m-order diffracted light, and the direction of the light changes.
  • the bandwidth is wide, as the value of the wavelength ⁇ , a value near the center of the bandwidth may be used, or a wavelength that can reduce the loss due to unnecessary diffraction order light in a balanced manner over the entire bandwidth may be selected.
  • the second diffraction step depth d2 preferably satisfies the relationship of the following formula: 0.9 × λ2/{n(λ2) − 1} ≤ d2 ≤ 1.1 × λ2/{n(λ2) − 1}
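A small numerical check of the 100%-efficiency depth and the ±10% tolerance above; the refractive index n = 1.5 and the 0.475 µm design wavelength below are assumed example values, not taken from the patent:

```python
def step_depth(wavelength_um, n, m=1):
    # (Equation 2): step depth giving 100% theoretical efficiency in order m
    return m * wavelength_um / (n - 1.0)

def within_tolerance(d, wavelength_um, n):
    # The +/-10% window around the ideal first-order depth
    ideal = wavelength_um / (n - 1.0)
    return 0.9 * ideal <= d <= 1.1 * ideal

# e.g. near the centre of a 400-550 nm band with an assumed index of 1.5
d1 = step_depth(0.475, 1.5)  # 0.95 um
```

The tolerance window reflects that the design wavelength is a compromise over the whole band, so a depth within about 10% of the ideal still keeps the band-average efficiency high.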
  • with the configuration of the present embodiment, by combining diffraction gratings having diffraction step depths d1 and d2 appropriate to the respective wavelength bandwidths of the light beams B1 and B2, an image whose diffraction efficiency is almost 100% over the entire visible light region, for example, can be obtained.
  • the wavelength bands of light transmitted through the optical surface regions D1 and D2 are 400 nm to 550 nm and 550 nm to 700 nm, respectively.
  • the first-order diffraction efficiencies of the light beams B1 and B2 are as shown in FIGS. 5A and 5B, respectively; each is 90% or more within its wavelength band, and the diffraction efficiency of the optical system as a whole can be almost 100% over the entire visible light region.
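The band-edge behaviour can be estimated with the scalar-theory efficiency formula η_m = sinc²(m − d·(n(λ) − 1)/λ), a standard approximation that is consistent with, but not stated in, the text above; the index value n = 1.5 is an assumed example:

```python
import math

def diffraction_efficiency(wavelength_um, depth_um, n, m=1):
    """Scalar-theory efficiency of diffraction order m for a blazed step."""
    phase = depth_um * (n - 1.0) / wavelength_um  # optical path in waves
    x = m - phase
    if x == 0:
        return 1.0  # step depth exactly matches the order: 100% efficiency
    return (math.sin(math.pi * x) / (math.pi * x)) ** 2

# Depth tuned for 475 nm (0.95 um at n = 1.5): efficiency at the band edges
eta_design = diffraction_efficiency(0.475, 0.95, 1.5)
eta_blue_edge = diffraction_efficiency(0.40, 0.95, 1.5)
eta_red_edge = diffraction_efficiency(0.55, 0.95, 1.5)
```

At the design wavelength the phase term equals the order and the efficiency is 100%; toward the band edges it falls only gradually, which is why matching each region's step depth to its own band keeps every band near peak efficiency.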
  • alternatively, the lens L2 in FIG. 1 may be configured with aspheric surfaces on both sides, with the surface shape designed asymmetrically in each of the two regions divided with the optical axis V as the boundary center.
  • the imaging positions in each region can be made substantially coincident.
  • with an aspherical surface alone, however, the chromatic aberration of wavelengths within the bandwidth cannot be corrected sufficiently.
  • the diffraction grating can correct chromatic aberration better than an optical system with only aspherical surfaces because, by controlling the number of ring zones, the imaging position can be made uniform even over a band having a certain width.
  • the diffraction ring zone pitch of the diffraction grating G may be appropriately adjusted in each region through which the light beams B1 and B2 pass.
  • by adjusting the diffraction ring zone pitch, it is possible to adjust the distribution of diffractive power in accordance with the respective bandwidths and to align the imaging positions in each region.
  • with the configuration of the first embodiment, it is possible to correct chromatic aberration and increase resolution, and to obtain an image in which the diffraction efficiency is almost 100% and flare caused by unnecessary orders of diffracted light is reduced.
  • the second embodiment differs from the first embodiment in that the optical element L1 is divided into three regions, the plurality of diffraction steps of the diffraction grating G have three different heights corresponding to the regions, and the imaging device includes three types of pixels.
  • the optical element L1 further includes an optical region D3 that selectively transmits light in a third wavelength band different from the first wavelength band and the second wavelength band.
  • the diffraction grating G further has a third diffraction step G3.
  • the imaging device further includes a plurality of pixels P3.
  • a detailed description of the same contents as in the first embodiment is omitted.
  • FIG. 6 is a schematic diagram illustrating the imaging apparatus A according to the second embodiment.
  • a light beam B1 is a light beam that passes through the optical surface region D1 on the optical element L1
  • a light beam B2 is a light beam that passes through the optical surface region D2 on the optical element L1
  • a light beam B3 is a light beam that passes through the optical surface region D3 on the optical element L1.
  • the light beams B1, B2, and B3 pass through the stop S, the optical element L1, the lens L2, and the arrayed optical element K in this order, and reach the imaging surface Ni (shown in FIG. 8 and the like) on the imaging element N.
  • FIG. 7A is a front view of the optical element L1 as viewed from the subject side.
  • the optical surface regions D1, D2, and D3 are regions in which the optical element L1 is divided into three in the vertical direction within a plane perpendicular to the optical axis V. Further, the wavelength bands of the light transmitted through each optical surface region are different from each other.
  • FIG. 7B is a diagram showing the shape of the diffraction grating G on a plane perpendicular to the optical axis V of the lens optical system L. As shown in FIG. 7B, in the diffraction grating G, first to third diffraction grating portions G1 to G3 are provided in regions corresponding to the optical surface regions D1 to D3 of the optical element L1.
  • a plurality of annular zones R1 to R3 are provided, respectively.
  • Diffraction steps A1, A2, and A3 are provided between the adjacent ring zones R1, between the adjacent ring zones R2, and between the adjacent ring zones R3, respectively.
  • a plurality of diffraction steps A1 to A3 are provided.
  • the plurality of diffraction steps A1, A2, and A3 have the same depths d1, d2, and d3, respectively.
  • FIG. 8A is an enlarged view of the arrayed optical element K and the image sensor N shown in FIG. 6, and FIG. 8B is a diagram showing the positional relationship between the arrayed optical element K and the pixels on the image sensor N.
  • the arrayed optical element K is arranged so that the surface on which the optical element M1 is formed faces the imaging surface Ni side.
  • Pixels P are arranged in a matrix on the imaging surface Ni.
  • the pixel P can be distinguished into a pixel P1, a pixel P2, and a pixel P3.
  • the color filters corresponding to the pixels P1, P2, and P3 are not provided on the imaging surface Ni of the imaging element N.
  • Each of the pixel P1, the pixel P2, and the pixel P3 is arranged in a row in the horizontal direction (row direction). In the vertical direction (column direction), the pixels P1, P2, and P3 are repeatedly arranged.
  • the arrayed optical element K is arranged so that one of the optical elements M1 corresponds to three rows of pixels on the imaging surface Ni, consisting of one row of pixels P1, one row of pixels P2, and one row of pixels P3.
  • a microlens Ms is provided so as to cover the surfaces of the pixels P1, P2, and P3.
  • the above-described configuration is realized by appropriately setting parameters such as the refractive index of the arrayed optical element K, the distance from the imaging surface Ni, and the radius of curvature of the surface of the optical element M1.
  • the pixel P1, the pixel P2, and the pixel P3 each generate image information corresponding to light in different wavelength bands. That is, the imaging apparatus A can acquire a plurality of pieces of image information formed by light having different wavelength bands with a single imaging optical system and with one imaging.
  • in Embodiment 1, images of two types of wavelength bands are acquired simultaneously, whereas in Embodiment 2 images of three types of wavelength bands can be acquired simultaneously.
  • “simultaneous” means that images of light of three types of wavelength bands obtained from the subject at the same time can be acquired, and signal generation of images of three types of wavelength bands may not be simultaneous.
  • the first optical surface region D1 is a blue color filter that transmits light in the blue band and substantially blocks colors in bands other than blue.
  • the second optical surface region D2 is a green color filter that transmits light in the green band and substantially blocks colors in bands other than green.
  • the third optical surface region D3 is a red color filter that transmits light in the red band and substantially blocks colors in bands other than red.
  • pixel values in the y direction are missing in two out of every three pixels
  • pixel values of missing pixels may be generated by interpolation with pixel values of pixels adjacent in the y direction.
  • the pixel values in the x direction may be generated by adding three pixels at a time.
  • a configuration in which the aspect ratio of each pixel of the image sensor in the x direction and the y direction is 3: 1 may be employed. With such a configuration, the above-described interpolation processing and addition processing are not necessary.
  • the diffraction grating G is formed on the lens surface on the stop S side.
  • the region on the diffraction grating surface through which the light beam B1 passing through the first optical surface region D1 travels, the region through which the light beam B2 passing through the second optical surface region D2 travels, and the region through which the light beam B3 passing through the third optical surface region D3 travels can be separated.
  • the depth d1 of the diffraction step on the diffraction grating G through which the light beam B1 passes is preferably set so that the diffraction efficiency of the light in the first wavelength band transmitted through the optical surface region D1 is high.
  • the depth d2 of the diffraction step on the diffraction grating G through which the light beam B2 passes is preferably set so that the diffraction efficiency of light in the second wavelength band transmitted through the optical surface region D2 is high.
  • the depth d3 of the diffraction step on the diffraction grating G through which the light beam B3 passes may be set so that the diffraction efficiency of the light in the third wavelength band transmitted through the optical surface region D3 is high.
  • with the configuration of the present embodiment, by combining diffraction gratings having diffraction step depths d1, d2, and d3 appropriate to the respective wavelength bandwidths of the light beams B1, B2, and B3, an image having a diffraction efficiency of almost 100% can be obtained over the entire visible light region, for example.
  • the wavelength bands of light transmitted through the optical surface regions D1, D2, and D3 are 400 nm to 500 nm, 500 nm to 600 nm, and 600 nm to 700 nm, respectively.
  • the first-order diffraction efficiencies of the light beams B1, B2, and B3 are 90% or more in the respective wavelength bands, as shown in the figure, and the diffraction efficiency can be almost 100% in the entire visible light region as an optical system.
  • the third embodiment differs from the first embodiment in that the optical element L1 in FIG. 1 is divided into four regions, the plurality of diffraction steps of the diffraction grating G have four different heights corresponding to the regions, and the imaging element includes four types of pixels.
  • the optical element L1 includes an optical region D3 that selectively transmits light in a third wavelength band different from the first and second wavelength bands, and further has an optical region D4 that selectively transmits light in a fourth wavelength band different from the first, second, and third wavelength bands.
  • the diffraction grating G further includes third and fourth diffraction steps G3 and G4.
  • the imaging device further includes a plurality of pixels P3 and a plurality of pixels P4.
  • a detailed description of the same contents as in the first embodiment is omitted.
  • FIG. 10A is a front view of the optical element L1 viewed from the subject side.
  • Optical surface regions D1, D2, D3, and D4 are regions in which the optical element L1 is divided into four parts in the vertical and horizontal directions within a plane perpendicular to the optical axis V with the optical axis V as the boundary center. Further, the wavelength bands of the light transmitted through each optical surface region are different from each other.
  • FIG. 10B is a diagram showing the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L. As shown in FIG.
  • first to fourth diffraction grating portions G1 to G4 are provided in regions corresponding to the optical surface regions D1 to D4 of the optical element L1.
  • a plurality of annular zones R1 to R4 are provided, respectively.
  • Diffraction steps A1, A2, A3, and A4 are provided between the adjacent ring zones R1, between the adjacent ring zones R2, between the adjacent ring zones R3, and between the adjacent ring zones R4, respectively. .
  • a plurality of diffraction steps A1 to A4 are provided.
  • the plurality of diffraction steps A1, A2, A3, and A4 have the same depths d1, d2, d3, and d4, respectively.
  • the depth d1, the depth d2, the depth d3, and the depth d4 are different from each other.
  • the widths of the annular zones R1, R2, R3, and R4 are set according to the target optical power and the like.
  • FIG. 11 is a perspective view of the arrayed optical element K.
  • On the surface of the arrayed optical element K on the image sensor N side, optical elements M2 are arranged in a grid pattern. Each optical element M2 has curved vertical and horizontal cross sections, and each optical element M2 protrudes toward the image sensor N.
  • the optical element M2 is a microlens
  • the arrayed optical element K is a microlens array.
  • FIG. 12A is an enlarged view showing the arrayed optical element K and the image sensor N
  • FIG. 12B shows the positional relationship between the arrayed optical element K and the pixels on the image sensor N.
  • The arrayed optical element K is arranged so that the surface on which the optical element M2 is formed faces the imaging surface Ni side.
  • Pixels P are arranged in a matrix on the imaging surface Ni.
  • the pixel P can be distinguished into a pixel P1, a pixel P2, a pixel P3, and a pixel P4.
  • the color filters corresponding to the pixels P1, P2, P3, and P4 are not provided on the imaging surface Ni of the imaging element N.
  • the arrayed optical element K is disposed in the vicinity of the focal point of the lens optical system L, and is disposed at a position away from the imaging surface Ni by a predetermined distance.
  • a microlens Ms is provided on the imaging surface Ni so as to cover the surfaces of the pixels P1, P2, P3, and P4.
  • the arrayed optical element K is arranged so that one of the optical elements M2 corresponds to four pixels of pixels P1 to P4 in 2 rows and 2 columns on the imaging surface Ni.
  • the above-described configuration is realized by appropriately setting parameters such as the refractive index of the arrayed optical element K, the distance from the imaging surface Ni, and the radius of curvature of the surface of the optical element M2.
  • the pixel P1, the pixel P2, the pixel P3, and the pixel P4 each generate image information corresponding to light in different wavelength bands. That is, the imaging apparatus A can acquire a plurality of pieces of image information formed by light of different wavelength bands with a single imaging optical system and one imaging.
  • in Embodiments 1 and 2, images of two and three types of wavelength bands, respectively, are acquired simultaneously, whereas in Embodiment 3 images of four types of wavelength bands can be acquired simultaneously.
  • a near-infrared filter that substantially blocks visible light including blue, green, and red and transmits near-infrared light may be included in the configuration.
  • alternatively, the filter may be configured to transmit only a narrow wavelength band.
  • the narrow band described above may or may not be included in any of the blue, green, and red color filters.
  • a plurality of types of light sources having spectral radiation characteristics corresponding to the respective wavelength bands may be provided.
  • a white light source may also be combined with a light source having spectral radiation characteristics including the narrow band.
  • the pixel values in the x direction and the y direction are missing every other pixel. Therefore, the pixel values of the missing pixels may be generated by interpolating with the pixel values of the pixels adjacent in the x direction and the y direction, respectively.
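The x-then-y interpolation just described can be sketched as follows for one band that was sampled only at every second row and column; `None` marks missing pixels, and the two-pass averaging scheme is an illustrative assumption rather than the patent's exact method:

```python
def fill_quarter_sampled(img, step=2):
    """img: 2-D list in which one band was sampled only where both the row
    and column indices are multiples of `step`; other entries are None.
    Missing pixels are filled first along x, then along y, by averaging
    the nearest available neighbours."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    # pass 1: interpolate along x on the sampled rows
    for r in range(0, h, step):
        for c in range(w):
            if out[r][c] is None:
                left = out[r][c - c % step]
                rc = min(c - c % step + step, w - 1)
                right = out[r][rc] if out[r][rc] is not None else left
                out[r][c] = (left + right) / 2
    # pass 2: interpolate along y everywhere else
    for c in range(w):
        for r in range(h):
            if out[r][c] is None:
                up = out[r - r % step][c]
                dr = min(r - r % step + step, h - 1)
                down = out[dr][c] if out[dr][c] is not None else up
                out[r][c] = (up + down) / 2
    return out
```

Edge pixels simply reuse the nearest available sample, which keeps the sketch short; a production demosaicer would use a larger support.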
  • two regions facing each other across the optical axis among the four divided regions may be the same green color filter.
  • the diffraction grating G is formed on the lens surface on the stop S side.
  • the region on the diffraction grating surface through which the light beam B1 passing through the first optical surface region D1 travels, the region through which the light beam B2 passing through the second optical surface region D2 travels, the region through which the light beam B3 passing through the third optical surface region D3 travels, and the region through which the light beam B4 passing through the fourth optical surface region D4 travels can be separated.
  • the depth d1 of the diffraction step on the diffraction grating G through which the light beam B1 passes is preferably set so that the diffraction efficiency of the light in the first wavelength band transmitted through the optical surface region D1 is high.
  • the depth d2 of the diffraction step on the diffraction grating G through which the light beam B2 passes is preferably set so that the diffraction efficiency of light in the second wavelength band transmitted through the optical surface region D2 is high.
  • the depth d3 of the diffraction step on the diffraction grating G through which the light beam B3 passes may be set so that the diffraction efficiency of the light in the third wavelength band transmitted through the optical surface region D3 is high.
  • the depth d4 of the diffraction step on the diffraction grating G through which the light beam B4 passes may be set so that the diffraction efficiency of the light in the fourth wavelength band transmitted through the optical surface region D4 is high.
  • by combining diffraction gratings having diffraction step depths d1, d2, d3, and d4 appropriate to the respective wavelength bandwidths of the light beams B1, B2, B3, and B4, an image having a diffraction efficiency of almost 100% can be obtained over the entire visible light region and infrared region.
  • the fourth embodiment differs from the first embodiment in that the first and second regions are each divided into parts separated from the optical axis, and in that the arrayed optical element is changed from a lenticular lens to a microlens array.
  • a detailed description of the same contents as in the first embodiment is omitted.
  • FIG. 13A is a front view of the optical element L1 viewed from the subject side; the optical surface regions D1 and D2 are each divided and arranged symmetrically about the optical axis V.
  • FIG. 13B is a diagram showing the positional relationship between the arrayed optical element K and the pixels on the image sensor N.
  • light rays that have passed through the optical surface region D1 reach the pixels in odd rows and odd columns and in even rows and even columns, so these pixel values are added to generate one image; light rays that have passed through the optical surface region D2 reach the pixels in even rows and odd columns and in odd rows and even columns, so these pixel values are added to generate the other image.
  • in the first embodiment, the first optical surface region D1 and the second optical surface region D2 are each one of two semicircular regions. For this reason, the center of gravity of the spot formed on the image plane by light passing through each optical surface region may change depending on the subject distance, and parallax may occur.
  • FIG. 14 is a diagram explaining, for the first embodiment, the ray diagrams for each subject distance and the corresponding point images and changes in their centers of gravity.
  • (a1), (b1), and (c1) are ray diagrams for each subject distance, and O is an object point.
  • other symbols are the same as in the earlier figures. In FIG. 14, (a2) and (a3), (b2) and (b3), and (c2) and (c3) are point images (shown as semicircles) and centers of gravity (shown as black dots) captured through the lenticular lens, corresponding to the subject distances of (a1), (b1), and (c1), respectively.
  • each point image is shown schematically with the image information (a2, b2, c2) extracted for each odd column and the image information (a3, b3, c3) extracted for each even column doubled in the Y direction by interpolation processing. As shown in the figure, the spot diameter increases as the object point O approaches, and each point image has a semicircular shape. Therefore, when the acquired image is separated into an odd-column image and an even-column image, the distance d between the centroids of the point images in the two images increases as the object point approaches. This center-to-center distance d appears as parallax and is therefore undesirable.
  • in the present embodiment, since the optical surface regions D1 and D2 are each divided and arranged symmetrically about the optical axis, the distance d between the centers of gravity of the point images does not change even if the subject distance changes.
  • FIG. 15 is a diagram for explaining a point image and its center of gravity for each subject distance.
  • (a1) and (a2), (b1) and (b2), and (c1) and (c2) are point images (shown as semicircles) imaged through a microlens and their center of gravity (black dots). These correspond to the subject distances of (a1), (b1), and (c1) in FIG.
  • Each point image includes image information (a1, b1, c1) obtained by adding odd rows and odd columns and even rows and even columns, and image information (a2, b2, c2) obtained by adding even rows and odd columns and odd rows and even columns.
  • in the present embodiment, the first and second regions are each divided and arranged symmetrically across the optical axis, so that the occurrence of parallax between the acquired images can be avoided even when the subject distance changes.
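The parallax mechanism above can be made concrete with a small calculation: the centroid of a uniform semicircular spot of radius R lies 4R/(3π) from its straight edge, so two complementary half-spots are separated by 8R/(3π), which grows linearly with spot radius and hence as the object point approaches. A sketch in Python, with all names illustrative:

```python
import math

def semicircle_centroid_offset(radius):
    # Centroid of a uniform half-disc sits 4R/(3*pi) from its straight edge
    return 4.0 * radius / (3.0 * math.pi)

def centroid_separation(radius):
    # Two complementary half-discs: centroid separation d = 8R/(3*pi)
    return 2.0 * semicircle_centroid_offset(radius)
```

With the symmetric region layout of this embodiment, each band's point image is itself symmetric about the optical axis, so this separation collapses to zero regardless of spot size.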
  • the fifth embodiment differs from the first, second, and third embodiments in that the diffraction grating G is provided on the surface of the optical element L1 facing the lens L2 instead of on the lens L2.
  • a detailed description of the same contents as in the first, second, and third embodiments is omitted.
  • FIG. 16 is a schematic diagram illustrating the imaging apparatus A according to the fifth embodiment.
  • manufacturing can be facilitated as compared with the case of providing the diffraction grating on an aspherical surface.
  • the diffraction grating G may be formed by processing the surface of the optical element L by a semiconductor process such as photolithography or etching.
  • the diffraction grating G can be processed on the surface of the optical element L by an electron beam drawing apparatus (EB drawing or the like).
  • the diffraction grating G is provided on the surface of the optical element L1 facing the lens L2, but the diffraction grating G may be provided on the surface of the optical element L1 on the subject side.
  • the sixth embodiment is different from the first, second, and third embodiments in that a lens L3 is added in addition to the lens L2.
  • FIG. 17 is a schematic diagram illustrating the imaging apparatus A according to the sixth embodiment.
  • by adding the lens L3, the aberration generated in the optical system can be further reduced, and a higher-resolution optical system can be realized.
  • the seventh embodiment is different from the first, second, and third embodiments in that a lens L3 having a diffraction grating GA is added in addition to the lens L2.
  • FIG. 18 is a schematic diagram illustrating the imaging apparatus A according to the seventh embodiment.
  • a lens L3 on which the diffraction grating GA is formed is added in the vicinity of the stop S.
  • An optical element L1 is disposed between the lens L3 and the lens L2.
  • the surface of the diffraction grating GA is preferably installed on the lens surface on the stop S side.
  • the power distribution of the diffraction grating can be finely adjusted, and chromatic aberration can be further reduced.
  • only the diffraction grating GA may be provided without providing the diffraction grating G.
  • the eighth embodiment is different from the first, second, and third embodiments in that an optical adjustment layer F1 is added so as to cover the surface of the diffraction grating G.
  • FIG. 19 is a schematic diagram illustrating an imaging apparatus A according to the eighth embodiment.
  • by setting the depth d of the diffraction step so as to satisfy (Equation 3), it is possible to further reduce unnecessary orders of diffracted light compared to the first, second, and third embodiments.
  • the wavelength bands of light transmitted through the optical surface regions D1, D2, and D3 are 400 nm to 500 nm, 500 nm to 600 nm, and 600 nm to 700 nm, respectively
  • d2 = 15.0 μm
  • d3 16.
  • the first-order diffraction efficiencies of the light beams B1, B2, and B3 are almost 100% in the respective wavelength bands as shown in FIGS. 20A, 20B, and 20C, and nearly 100% diffraction efficiency can be secured over the entire visible light region as an optical system.
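Although (Equation 3) is not reproduced above, for a grating covered by an optical adjustment layer the step works against the index *difference* between the grating material and the layer, so the 100%-efficiency depth is commonly written d = mλ/{n1(λ) − n2(λ)}; with a small index difference this naturally yields depths of the order of the 15–16 µm values quoted. The index values below are assumed examples, not taken from the patent:

```python
def step_depth_with_cover(wavelength_um, n_grating, n_cover, m=1):
    """Step depth for 100% theoretical efficiency in order m when the grating
    is covered by an optical adjustment layer (assumed form of Equation 3)."""
    return m * wavelength_um / (n_grating - n_cover)

# Assumed indices: n1 = 1.52 (grating), n2 = 1.49 (adjustment layer)
d = step_depth_with_cover(0.5, 1.52, 1.49)  # about 16.7 um
```

Because d varies as 1/(n1 − n2), choosing materials whose index difference is nearly constant over the band also flattens the wavelength dependence of the efficiency, which is how the adjustment layer suppresses unnecessary-order light.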
  • the ninth embodiment is different from the first to eighth embodiments in that a lenticular lens and a microlens array are formed on the imaging surface.
  • FIG. 21 (a) and 21 (b) are diagrams showing the arrayed optical element K and the imaging element N in an enlarged manner.
  • a lenticular lens (or microlens) Md is formed on the imaging surface Ni of the imaging element N. Pixels P are arranged in a matrix on the imaging surface Ni, as in the first embodiment.
  • One lenticular lens optical element or one microlens corresponds to the plurality of pixels P.
  • the light beams that have passed through different regions on the optical element L1 can be guided to different pixels.
  • FIG. 21B is a diagram showing a modification of the present embodiment. In the configuration shown in FIG. 21B, a microlens Ms is formed on the imaging surface Ni so as to cover the pixels P, and an arrayed optical element is stacked on the surface of the microlens Ms.
  • the light collection efficiency can be increased as compared with the configuration in FIG.
  • when the arrayed optical element is separated from the image sensor as in the first to eighth embodiments, alignment between the arrayed optical element and the image sensor becomes difficult.
  • FIG. 22A is an enlarged view showing the vicinity of the imaging unit outside the optical axis. In FIG. 22A, only the light beam that passes through one optical surface region out of the light that passes through the arrayed optical element K is shown. As shown in FIG. 22A, when the lens optical system L is an image-side non-telecentric optical system, light leaks to adjacent pixels and crosstalk tends to occur.
  • To suppress this crosstalk, the offset amount Δ of the arrayed optical element may be set according to the incident angle of the light beam on the imaging surface.
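One way to read this offset rule is simple ray geometry: shift each element toward the optical axis by Δ ≈ t·tan θ, where θ is the chief-ray incident angle at that image height and t is the gap between the arrayed optical element and the pixels. Both the linear model and the numbers below are assumptions for illustration, not values from the patent.

```python
import math

def microlens_offset_um(chief_ray_angle_deg, gap_um):
    """Offset (toward the optical axis) that re-centers an obliquely incident
    ray bundle on its pixel group: delta = t * tan(theta)."""
    return gap_um * math.tan(math.radians(chief_ray_angle_deg))

# Hypothetical 5 um gap: the required offset grows with the incident angle,
# i.e. with image height in a non-telecentric system.
offsets = [microlens_offset_um(angle, 5.0) for angle in (0.0, 10.0, 20.0)]
```

On the optical axis the chief ray is normal and no offset is needed; toward the edge of the field the offset increases, which is why the text says Δ should follow the incident angle at each image height.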
  • An image-side telecentric optical system may also be applied to the lens optical system L of the present embodiment.
  • In an image-side telecentric optical system, even if the angle of view changes, the chief ray is incident on the arrayed optical element K at an angle close to 0 degrees, so that crosstalk can be reduced over the entire imaging region.
  • FIG. 23 is a schematic diagram illustrating the imaging apparatus A when the image side telecentric optical system is applied.
  • The lens L3 adjusts the chief ray so that it is incident on the arrayed optical element K at an angle close to 0 degrees even if the angle of view changes.
  • In the present embodiment, the arrayed optical element K is a microlens array, and each optical element of the microlens array may be rotationally symmetric with respect to its optical axis.
  • FIG. 24 (a3) shows the result of ray tracing simulation when the microlens shown in FIGS. 24 (a1) and (a2) is applied to the arrayed optical element of the present embodiment. In FIG. 24 (a3), only the light beam that passes through one optical surface region of the light that passes through the arrayed optical element K is shown.
  • FIG. 24 (b3) shows the result of ray tracing simulation when the microlens shown in FIGS. 24 (b1) and (b2) is applied to the arrayed optical element of the present embodiment.
  • The diaphragm S has a configuration in which a light-shielding region is provided at positions corresponding to the boundary portions between the optical regions.
  • As a method of manufacturing the lens L2, as shown in FIG. 25A, it is preferable to form the irregularities 12 for forming the diffraction grating G on the mold 11 using an electron beam drawing apparatus 10. With the electron beam drawing apparatus 10, a non-rotationally symmetric structure can be formed easily. After the mold is formed, as shown in FIG. 25B, lenses L2 can be mass-produced by injection molding for a resin material or by press molding for a glass material. As another processing method, as shown in FIG. 26, molds 13A to 13D may be formed separately for the respective regions and connected to form a single mold.
  • Although Embodiments 1 to 9 include the signal processing unit C, the imaging apparatus need not include it. In that case, the processing performed by the signal processing unit C may be performed using a PC or the like outside the imaging apparatus. That is, according to one aspect of the present invention, it is possible to realize a system including an imaging apparatus that comprises the lens optical system L, the arrayed optical element K, and the imaging element N, together with an external signal processing device.
  • the imaging device disclosed in the present application is useful as an imaging device such as a digital still camera or a digital video camera.
  • The present invention can also be applied to in-vehicle cameras, security cameras, medical cameras such as endoscopes and capsule endoscopes, biometric authentication cameras, microscopes, and astronomical telescopes.

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Diffracting Gratings Or Hologram Optical Elements (AREA)

Abstract

The present invention relates to an image capture device provided with: an optical element (L1) having, in a predetermined plane, at least one first optical surface region (D1) for admitting light of a first wavelength band, and at least one second optical surface region (D2) for admitting light of a second wavelength band different from the first wavelength band; a diffraction grating (G) having a first diffraction step formed in a region struck by the light passing through the first optical surface region (D1), and a second diffraction step whose depth differs from that of the first diffraction step; an image capture element (N) having at least a plurality of first pixels and a plurality of second pixels; and a grid-shaped optical element (K) arranged between the diffraction grating (G) and the image capture element (N) and designed to cause the light that has passed through the first optical surface region (D1) to strike the plurality of first pixels and the light that has passed through the second optical surface region (D2) to strike the plurality of second pixels.
PCT/JP2012/005095 2011-09-16 2012-08-10 Image capture device WO2013038595A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-203809 2011-09-16
JP2011203809 2011-09-16

Publications (1)

Publication Number Publication Date
WO2013038595A1 true WO2013038595A1 (fr) 2013-03-21

Family

ID=47882848

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/005095 WO2013038595A1 (fr) 2011-09-16 2012-08-10 Image capture device

Country Status (1)

Country Link
WO (1) WO2013038595A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019504325A (ja) * 2016-02-02 2019-02-14 KLA-Tencor Corporation Hyperspectral imaging metrology system and method
WO2022264488A1 (fr) * 2021-06-15 2022-12-22 Sony Group Corporation Light-condensing element

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08220482A (ja) * 1994-12-13 1996-08-30 Olympus Optical Co., Ltd. Optical system including a diffractive optical element
JP2000098225A (ja) * 1999-08-18 2000-04-07 Matsushita Electric Industrial Co., Ltd. Method for designing a lens integrated with diffraction means
JP2004157059A (ja) * 2002-11-08 2004-06-03 Minolta Co., Ltd. Imaging apparatus and lens optical system
US7433042B1 (en) * 2003-12-05 2008-10-07 Surface Optics Corporation Spatially corrected full-cubed hyperspectral imager
JP2010134042A (ja) * 2008-12-02 2010-06-17 Fujifilm Corp Method for producing a color filter, and solid-state imaging device
WO2010087208A1 (fr) * 2009-02-02 2010-08-05 Panasonic Corporation Diffractive optical element and method for manufacturing same
JP2011075562A (ja) * 2009-09-30 2011-04-14 Ricoh Co., Ltd. Adjustable multi-mode light-field imaging system


Similar Documents

Publication Publication Date Title
US8854525B2 (en) Imaging device, imaging system, and imaging method
US8717483B2 (en) Imaging device, imaging system, and imaging method
US20210082988A1 (en) Image-capture element and image capture device
US8836825B2 (en) Imaging apparatus
JP4077510B2 (ja) Diffractive imaging lens, diffractive imaging lens optical system, and imaging apparatus using the same
EP2083447B1 (fr) Image capture apparatus
JP5144841B1 (ja) Imaging apparatus
TWI443366B (zh) Imaging lens and imaging module
WO2016098640A1 (fr) Solid-state imaging element and electronic device
US9099369B2 (en) Solid-state image sensor
JP6008300B2 (ja) Imaging apparatus
KR20100059896A (ko) Image sensor
US11930256B2 (en) Imaging device, imaging optical system, and imaging method
US8902339B2 (en) Solid-state imaging element and dispersing element array for improved color imaging
US9179114B2 (en) Solid-state image sensor
WO2013038595A1 (fr) Image capture device
JP2005341301A (ja) Compound-eye imaging apparatus
US20220360759A1 Image Sensor and Image Apparatus
JP2012169673A (ja) Solid-state imaging device and electronic apparatus
JP6911353B2 (ja) Method for manufacturing a solid-state imaging element
JP2009182550A (ja) Camera module
JP6563243B2 (ja) Imaging apparatus and camera system
JP2005338505A (ja) Compound-eye imaging apparatus
Radl Optimum pixel design for dispersive filtering

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12832583

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12832583

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP