WO2013038595A1 - Image-capturing device - Google Patents

Image-capturing device

Info

Publication number: WO2013038595A1
Authority: WO (WIPO, PCT)
Prior art keywords: region, light, diffraction, wavelength band, pixels
Application number: PCT/JP2012/005095
Other languages: French (fr), Japanese (ja)
Inventors: 貴真 安藤, 今村 典広, 是永 継博
Original Assignee: パナソニック株式会社 (Panasonic Corporation)
Application filed by パナソニック株式会社 (Panasonic Corporation)
Publication of WO2013038595A1

Classifications

    • A61B 1/00186: Endoscope optical arrangements with imaging filters
    • A61B 1/00188: Endoscope optical arrangements with focusing or zooming features
    • A61B 1/051: Endoscopes with the image sensor in the distal end portion; details of CCD assembly
    • A61B 1/041: Capsule endoscopes for imaging
    • G01J 3/0208: Spectrometry optical elements using focusing or collimating elements (e.g. lenses or mirrors); performing aberration correction
    • G01J 3/0213: Spectrometry optical elements using attenuators
    • G01J 3/18: Generating the spectrum; monochromators using diffraction elements, e.g. gratings
    • G01J 3/26: Generating the spectrum using multiple reflection, e.g. Fabry-Perot interferometer, variable interference filters
    • G01J 3/2823: Imaging spectrometer
    • G01J 3/51: Measurement of colour using electric radiation detectors and colour filters
    • G02B 3/10: Bifocal lenses; multifocal lenses
    • G02B 5/1819: Diffraction gratings; plural gratings positioned on the same surface, e.g. array of gratings
    • G02B 5/1842: Diffraction gratings for image generation
    • G02B 5/203: Filters having holographic or diffractive elements
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; mounting thereof

Definitions

  • This application relates to an imaging device such as a camera.
  • Conventionally, a color filter using an organic material such as a pigment or a dye is formed on each pixel of a solid-state image sensor for color imaging. Since such a color filter transmits infrared light, an infrared cut filter is generally placed in the optical path in front of the image sensor in order to obtain a good color image. It is therefore difficult for an imaging apparatus using a single image sensor to acquire visible-light and infrared-light image information simultaneously.
  • In addition, a color filter using an organic material has a wide pass band; for example, the blue, green, and red bands overlap one another over relatively wide ranges, which degrades color reproducibility.
  • Patent Documents 1 and 2 disclose techniques relating to solid-state image sensors on which color filters using dielectric multilayer films are formed.
  • A color filter using an organic material cannot easily realize narrow-band spectral characteristics, so it is difficult to capture an image that extracts color information in a narrow wavelength band.
  • Patent Document 3 discloses a technique for acquiring images by sequentially illuminating the subject with white light and predetermined narrow-band light.
  • With these approaches, however, the solid-state image sensor is expensive or difficult to fabricate with a very small pixel size, and when a moving object is imaged, a color shift due to the time difference between exposures occurs.
  • One non-limiting exemplary embodiment of the present application provides an imaging apparatus capable of acquiring an image having spectral information for each pixel with a simpler configuration than the conventional one.
  • An imaging apparatus according to one embodiment of the present application comprises: an optical member having, in a predetermined plane, at least a first region that transmits light of a first wavelength band and a second region that transmits light of a second wavelength band different from the first wavelength band; a diffraction grating having a first diffraction step provided in a region on which light transmitted through the first region is incident and a second diffraction step, having a depth different from that of the first diffraction step, provided in a region on which light transmitted through the second region is incident; an image sensor having at least a plurality of first pixels and a plurality of second pixels; and an arrayed optical element, disposed between the diffraction grating and the image sensor, that causes the light transmitted through the first region to enter the plurality of first pixels and the light transmitted through the second region to enter the plurality of second pixels.
  • With this configuration, a multispectral image can be acquired in a single exposure using a single imaging system, and it is not necessary to provide a dielectric multilayer film for each pixel. In addition, when a moving image is shot with this imaging apparatus, there is no image shift between the plurality of images even if the subject position changes over time. Further, the diffraction grating makes it possible to correct chromatic aberration of the optical system and to reduce curvature of field.
  • FIG. 1 is a schematic diagram illustrating Embodiment 1 of the imaging apparatus A.
  • (a) is a front view of the optical element L1 in Embodiment 1 as viewed from the subject side.
  • (b) is a diagram showing the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L.
  • (c) is a diagram showing the cross-sectional shape, in a plane parallel to the optical axis V, of the lens L2 on which the diffraction grating G is provided.
  • A perspective view of the arrayed optical element K in Embodiment 1.
  • (a) is an enlarged view of the arrayed optical element K and the image sensor N shown in FIG. 1.
  • (b) is a diagram showing the positional relationship between the arrayed optical element K and the pixels of the image sensor N.
  • (a) is a diagram showing the first-order diffraction efficiency of the light beam B1 that passes through the optical surface region D1.
  • (b) is a diagram showing the first-order diffraction efficiency of the light beam B2 that passes through the optical surface region D2.
  • FIG. 6 is a schematic diagram illustrating Embodiment 2 of the imaging apparatus A.
  • (a) is a front view of the optical element L1 in Embodiment 2 as viewed from the subject side.
  • (b) is a diagram showing the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L.
  • (a) is an enlarged view of the arrayed optical element K and the image sensor N shown in FIG. 6.
  • (b) is a diagram showing the positional relationship between the arrayed optical element K and the pixels of the image sensor N.
  • (a) is a diagram showing the first-order diffraction efficiency of the light beam B1 that passes through the optical surface region D1.
  • (b) is a diagram showing the first-order diffraction efficiency of the light beam B2 that passes through the optical surface region D2.
  • (c) is a diagram showing the first-order diffraction efficiency of the light beam B3 that passes through the optical surface region D3.
  • (a) is a front view of the optical element L1 in Embodiment 3 as viewed from the subject side.
  • (b) is a diagram showing the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L.
  • A perspective view of the arrayed optical element K in Embodiment 3.
  • (b) is a diagram showing the positional relationship between the arrayed optical element K and the pixels of the image sensor N.
  • (a) is a front view of the optical element L1 in Embodiment 4 as viewed from the subject side.
  • (b) is a diagram showing the positional relationship between the arrayed optical element K and the pixels of the image sensor N.
  • (a1), (b1), and (c1) are ray diagrams for each subject distance in Embodiment 1; (a2), (a3), (b2), (b3), (c2), and (c3) are diagrams explaining the point image and its center of gravity.
  • (a1), (b1), (c1), (a2), (b2), and (c2) are diagrams explaining the point image and its center of gravity for each subject distance in Embodiment 4.
  • A schematic diagram illustrating Embodiment 5 of the imaging apparatus A.
  • A schematic diagram illustrating Embodiment 6 of the imaging apparatus A.
  • A schematic diagram illustrating Embodiment 7 of the imaging apparatus A.
  • A schematic diagram illustrating Embodiment 8 of the imaging apparatus A.
  • (a) is a diagram showing the first-order diffraction efficiency of the light beam B1 that passes through the optical surface region D1.
  • (b) is a diagram showing the first-order diffraction efficiency of the light beam B2 that passes through the optical surface region D2.
  • (c) is a diagram showing the first-order diffraction efficiency of the light beam B3 that passes through the optical surface region D3.
  • (a) and (b) are enlarged views of the arrayed optical element K and the image sensor N in Embodiment 9.
  • (a) and (b) are enlarged views of the arrayed optical element K and the image sensor N in another form.
  • A schematic diagram illustrating an embodiment of the imaging apparatus A.
  • (a1) and (b1) are perspective views of the arrayed optical element K in other forms.
  • (a2) and (b2) are diagrams showing the contour lines of each optical element.
  • (a3) and (b3) are diagrams showing the results of a ray-tracing simulation.
  • (a) is a diagram showing a mold for the lens L2 in another form.
  • (b) is a diagram explaining fabrication of a mold for the lens L2 in another form.
  • An imaging apparatus according to one aspect of the present invention includes: an optical member having, in a predetermined plane, at least a first region that transmits light of a first wavelength band and a second region that transmits light of a second wavelength band different from the first wavelength band; a diffraction grating having a first diffraction step provided in a region on which light transmitted through the first region is incident and a second diffraction step, of a depth different from that of the first diffraction step, provided in a region on which light transmitted through the second region is incident; an image sensor having at least a plurality of first pixels and a plurality of second pixels; and an arrayed optical element, disposed between the diffraction grating and the image sensor, that causes the light transmitted through the first region to enter the plurality of first pixels and the light transmitted through the second region to enter the plurality of second pixels.
  • The optical member may further include at least a third region other than the first and second regions; the diffraction grating may have, in a region on which light transmitted through the third region is incident, a third diffraction step having a depth different from those of the first and second diffraction steps; the image sensor may further include a plurality of third pixels; the third region transmits light of a third wavelength band different from the first wavelength band and the second wavelength band; and the arrayed optical element may cause the light transmitted through the third region to enter the plurality of third pixels.
  • The optical member may further include a fourth region other than the first, second, and third regions; the diffraction grating may have, in a region on which light transmitted through the fourth region is incident, a fourth diffraction step having a depth different from those of the first, second, and third diffraction steps; the image sensor may further include a plurality of fourth pixels; the fourth region transmits light of a fourth wavelength band different from the first, second, and third wavelength bands; and the arrayed optical element may cause the light transmitted through the fourth region to enter the plurality of fourth pixels.
  • The imaging apparatus may further include a first lens that is provided between the optical member and the arrayed optical element and on which light that has passed through the optical member is incident.
  • The diffraction grating may be provided on the surface of the first lens facing the optical member.
  • Alternatively, the diffraction grating may be provided on the surface of the optical member facing the first lens.
  • Each of the first and second regions of the optical member may be composed of a plurality of regions separated from each other across the optical axis of the first lens.
  • At least one of the first region and the second region may have a spectral transmittance characteristic that transmits light in the near-infrared wavelength band and blocks light in the visible light region.
  • At least one of the first region and the second region may have a spectral transmittance characteristic that transmits light of a wavelength bandwidth relatively narrower than that of the other region.
  • The first region and the second region may be regions divided with the optical axis of the first lens as the center of the boundary.
  • The imaging apparatus may further include a second lens; in that case, the optical member may be disposed between the second lens and the first lens, and a diffraction grating may be provided on a surface of the second lens.
  • The imaging apparatus may further include an optical adjustment layer formed on the surface provided with the diffraction grating.
  • The imaging apparatus may further include a light-shielding region provided at a position corresponding to the boundary between the first region and the second region.
  • The imaging apparatus may further include a diaphragm (stop), and the first region and the second region may be disposed in the vicinity of the stop.
  • The diffraction grating may be arranged in the vicinity of the stop.
  • When the central value of the wavelength band of the light reaching the first diffraction step is λ1 and the refractive index of the diffraction grating at that wavelength is n(λ1), the depth d1 of the first diffraction step may satisfy 0.9 × λ1 / {n(λ1) − 1} ≤ d1 ≤ 1.1 × λ1 / {n(λ1) − 1}.
  • Likewise, when the central value of the wavelength band of the light reaching the second diffraction step is λ2, the depth d2 of the second diffraction step may satisfy 0.9 × λ2 / {n(λ2) − 1} ≤ d2 ≤ 1.1 × λ2 / {n(λ2) − 1}.
  • At the first diffraction step, the diffraction efficiency for light of the first wavelength band may be higher than the diffraction efficiency for light of the second wavelength band, and at the second diffraction step, the diffraction efficiency for light of the second wavelength band may be higher than the diffraction efficiency for light of the first wavelength band.
  • The imaging apparatus may further include a signal processing unit that generates first image information corresponding to the first wavelength band from the pixel values obtained in the plurality of first pixels and second image information corresponding to the second wavelength band from the pixel values obtained in the plurality of second pixels.
  • An imaging system according to one aspect includes any one of the imaging apparatuses described above and a signal processing device that generates first image information corresponding to the first wavelength band from the pixel values obtained in the plurality of first pixels of the imaging apparatus and second image information corresponding to the second wavelength band from the pixel values obtained in the plurality of second pixels.
  • FIG. 1 is a schematic diagram illustrating an imaging apparatus A according to the first embodiment.
  • The imaging apparatus A according to the present embodiment includes a lens optical system L having an optical axis V, an arrayed optical element K disposed near the focal point of the lens optical system L, an image sensor N, and a first signal processing unit C.
  • The lens optical system L includes a stop S on which light from a subject (not shown) is incident, an optical element L1 disposed in the vicinity of the stop S, and a lens L2. In a plane perpendicular to the optical axis V of the lens optical system L, the optical element L1 has, for example, a first optical surface region D1 and a second optical surface region D2 that transmit light of mutually different wavelength bands.
  • A diffraction grating G is formed on the stop-S-side surface of the lens L2 (the surface facing the optical element L1).
  • The diffraction grating G includes a first diffraction grating portion G1 provided in the region on which light that has passed through the first optical surface region D1 is incident, and a second diffraction grating portion G2 provided in the region on which light that has passed through the second optical surface region D2 is incident.
  • Each of the first diffraction grating portion G1 and the second diffraction grating portion G2 includes a plurality of annular zones and a plurality of diffraction steps formed concentrically around the optical axis V.
  • The depth d1 of the diffraction steps in the first diffraction grating portion G1 differs from the depth d2 of the diffraction steps in the second diffraction grating portion G2. Details of the diffraction grating G are described later.
  • In the present embodiment the lens L2 is a single lens, but it may be composed of a plurality of lenses.
  • Here, the "wavelength band" in the "first wavelength band" and the "second wavelength band" is a continuous band that accounts for 50% or more of the total amount of light transmitted through the region; a wavelength that is attenuated by 95% or more in passing through the region is not included in the "wavelength band". That is, the first optical surface region D1 selectively transmits light of the first wavelength band, and the second optical surface region D2 selectively transmits light of the second wavelength band.
  • That two wavelength bands are different from each other means that at least one of them includes a band not included in the other; the two bands may therefore partially overlap.
  • Regions that transmit mutually different wavelength bands can be realized in various ways, for example by forming filters using organic materials or dielectric multilayer films on the stop-S-side surface of the optical element L1, or by dyeing each region of the optical element L1 with a dye-based filter.
  • The color filters may be formed on a single flat plate or on a plurality of flat plates divided region by region.
  • the light that has passed through the two optical surface regions D1 and D2 passes through the lens L2 provided with the diffraction grating G, and then enters the arrayed optical element K.
  • the arrayed optical element K causes light that has passed through the optical surface region D1 to enter the plurality of pixels P1 in the image sensor N, and light that has passed through the optical surface region D2 to enter the plurality of pixels P2 in the image sensor N.
  • The signal processing unit C generates image information corresponding to the first wavelength band from the pixel values obtained in the pixels P1 and image information corresponding to the second wavelength band from the pixel values obtained in the pixels P2, and outputs them.
  • In FIG. 1, the light beam B1 is a light beam that passes through the optical surface region D1 of the optical element L1, and the light beam B2 is a light beam that passes through the optical surface region D2.
  • The light beams B1 and B2 pass through the stop S, the optical element L1, the lens L2, and the arrayed optical element K in this order and reach the pixels P1 and P2 (shown in FIG. 4) on the imaging surface Ni of the image sensor N, respectively.
  • FIG. 2A is a front view of the optical element L1 as viewed from the subject side.
  • The optical surface regions D1 and D2 divide the optical element L1 vertically into two, with the optical axis V at the center of the boundary.
  • the broken line s indicates the position of the diaphragm S.
  • FIG. 2B is a diagram showing the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L, and FIG. 2C is a diagram showing the cross-sectional shape of the lens L2 provided with the diffraction grating G in a plane parallel to the optical axis V.
  • As shown in FIG. 2B, in the diffraction grating G, the first and second diffraction grating portions G1 and G2 are provided in the regions corresponding to the optical surface regions D1 and D2 of the optical element L1.
  • In the first and second diffraction grating portions G1 and G2, a plurality of annular zones R1 and R2 are provided, respectively.
  • Diffraction steps A1 and A2 are provided between adjacent annular zones R1 and adjacent annular zones R2, respectively.
  • A plurality of diffraction steps A1 are provided, and typically each of them has the same depth d1.
  • Likewise, a plurality of diffraction steps A2 are provided, and typically each of them has the same depth d2.
  • the depth d1 of the diffraction step A1 in the first diffraction grating part G1 is different from the depth d2 of the diffraction step A2 in the second diffraction grating part G2.
  • the widths of the annular zones R1 and R2 are set according to the target optical power and the like.
  • FIG. 3 is a perspective view of the arrayed optical element K.
  • On the surface of the arrayed optical element K on the image sensor N side, a plurality of optical elements M1 elongated in the horizontal direction are arranged in the vertical direction.
  • The cross section of each optical element M1 (the cross section in the vertical direction) has a curved shape protruding toward the image sensor N.
  • Thus, the arrayed optical element K has the configuration of a lenticular lens.
  • the array-like optical element K is disposed in the vicinity of the focal point of the lens optical system L, and is disposed at a position away from the imaging surface Ni by a predetermined distance.
  • FIG. 4A is an enlarged view of the arrayed optical element K and the image sensor N shown in FIG. 1, and FIG. 4B is a diagram showing the positional relationship between the arrayed optical element K and the pixels on the image sensor N.
  • the arrayed optical element K is arranged so that the surface on which the optical element M1 is formed faces the imaging surface Ni side.
  • Pixels P are arranged in a matrix on the imaging surface Ni.
  • the pixel P can be distinguished into a pixel P1 and a pixel P2.
  • Color filters corresponding to the pixels P1 and P2 are not provided on the imaging surface Ni of the image sensor N.
  • The pixels P1 and the pixels P2 are each arranged in rows in the horizontal direction (row direction); in the vertical direction (column direction), rows of pixels P1 and rows of pixels P2 are arranged alternately.
  • the arrayed optical element K is arranged so that one of the optical elements M1 corresponds to two rows of pixels including one row of pixels P1 and one row of pixels P2 on the imaging surface Ni.
  • a microlens Ms is provided so as to cover the surfaces of the pixels P1 and P2.
  • the above configuration is realized by appropriately setting parameters such as the refractive index of the arrayed optical element K, the distance from the imaging surface Ni, and the radius of curvature of the surface of the optical element M1.
  • The angle of a light ray at the focal point is determined by the position at which the ray passes through the stop and by its angle with respect to the optical axis, and the arrayed optical element has the function of distributing the outgoing direction of a ray according to its incident angle. Therefore, by arranging the optical surface regions D1 and D2 in the vicinity of the stop and arranging the arrayed optical element K in the vicinity of the focal point as described above, the light beams B1 and B2 that have passed through the respective optical surface regions can be separated onto the pixels P1 and P2.
  • When the imaging optical system is an image-side telecentric optical system, rays passing through the same position in the stop emerge parallel to one another, so the angle of a light ray at the focal point is determined uniquely by the position at which the ray passes through the stop.
  • The pixels P1 and the pixels P2 generate image information corresponding to light of different wavelength bands. Since the light beams B1 and B2 are obtained from the subject at the same time, there is no time difference between the image information detected by the pixels P1 and that detected by the pixels P2. That is, the imaging apparatus A can acquire a plurality of pieces of image information formed by light of different wavelength bands with a single imaging optical system and a single exposure. For this reason, even when a moving object is photographed, an image having spectral information for each pixel can be obtained without a color shift due to a time difference.
  • For example, the first optical surface region D1 may be an optical filter having the characteristic of transmitting visible light as the light of the first wavelength band and substantially blocking near-infrared light, and the second optical surface region D2 may be an optical filter having the characteristic of substantially blocking visible light and transmitting near-infrared light as the light of the second wavelength band.
  • Alternatively, the first optical surface region D1 may be an optical filter that transmits light of a predetermined wavelength bandwidth, and the second optical surface region D2 may be an optical filter that transmits light of a bandwidth narrower than that predetermined bandwidth.
  • In this case, the second wavelength band may or may not be included in the first wavelength band.
  • A single light source having spectral radiation characteristics covering both the first and second wavelength bands may be provided, or a plurality of light sources having spectral radiation characteristics corresponding to the first and second wavelength bands, respectively, may be provided. In such applications, lesions can be distinguished easily by displaying the image acquired in the wide band and the image acquired in the narrow band in different colors.
  • The optical filters disposed in the first optical surface region D1 and the second optical surface region D2 may be color filters using organic materials or color filters using dielectric multilayer films.
  • If dielectric multilayer films are used, narrow-band spectral characteristics can be realized, and photographing that extracts color information in a narrow wavelength band becomes possible.
  • In the present embodiment, color filters made of dielectric multilayer films can be provided in the first optical surface region D1 and the second optical surface region D2 relatively easily and inexpensively. For this reason, it is relatively easy to adjust and change the spectral characteristics of the first optical surface region D1 and the second optical surface region D2.
  • In each of the images generated from the pixels P1 and from the pixels P2, pixel values are missing every other row in the y direction. The missing pixel values may be generated by interpolation from the pixel values of pixels adjacent in the y direction, or the pixel values in the x direction may be added two at a time; a sketch of this extraction and interpolation is given below.
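  • The following is a minimal sketch, not taken from the patent, of how such per-band images could be extracted in software and the missing rows filled by interpolation; the row layout (one row of P1 followed by one row of P2 per optical element M1) follows the description above, and the Python function and variable names are illustrative assumptions.

    # Sketch: split a raw frame captured behind the lenticular arrayed optical
    # element into per-band images and restore the missing rows by interpolation.
    # rows_per_element = 2 corresponds to Embodiment 1 (pixels P1 and P2).
    import numpy as np

    def extract_band_images(raw: np.ndarray, rows_per_element: int = 2) -> list[np.ndarray]:
        """Return one image per band, taking every rows_per_element-th row."""
        return [raw[k::rows_per_element, :] for k in range(rows_per_element)]

    def interpolate_rows(band: np.ndarray, rows_per_element: int = 2) -> np.ndarray:
        """Stretch a band image back to the full row count by linear interpolation in y."""
        h, w = band.shape
        y_src = np.arange(h)
        y_dst = np.linspace(0, h - 1, h * rows_per_element)
        out = np.empty((h * rows_per_element, w), dtype=float)
        for x in range(w):                      # interpolate each column along y
            out[:, x] = np.interp(y_dst, y_src, band[:, x])
        return out

    raw = np.arange(48, dtype=float).reshape(8, 6)      # synthetic 8x6 sensor frame
    img_p1, img_p2 = extract_band_images(raw)
    full_p1 = interpolate_rows(img_p1)
    print(img_p1.shape, full_p1.shape)                  # (4, 6) (8, 6)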
  • the purpose of forming the diffraction grating G is to correct the chromatic aberration of the optical system and reduce the curvature of field.
  • The Abbe number νd of a diffraction grating is given by (Equation 1): νd = λd / (λF − λC), where λd, λF, and λC are the wavelengths of the d-line, F-line, and C-line, respectively.
  • Equation 1 shows that a diffraction grating has reverse dispersion and anomalous dispersion. Since the order of the wavelengths of the chromatic aberration that the diffraction grating produces along the optical axis is opposite to the order produced by an aspheric refractive surface, the chromatic aberration of the aspheric surface can be cancelled out; that is, chromatic aberration generated in the refractive lens can be corrected by combining the diffraction grating with the aspherical lens.
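  • As a numerical check (not stated in the text, but using the standard Fraunhofer d-, F-, and C-line wavelengths of 587.6 nm, 486.1 nm, and 656.3 nm), Equation 1 evaluates to a small negative constant, which is why the dispersion of a diffraction grating is described as reverse and anomalous:

    \nu_d = \frac{\lambda_d}{\lambda_F - \lambda_C}
          = \frac{587.6\,\mathrm{nm}}{486.1\,\mathrm{nm} - 656.3\,\mathrm{nm}}
          \approx -3.45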
  • the diffraction grating lens also has a field curvature correction capability.
  • The smaller the annular-zone pitch of the diffraction grating, the stronger the diffractive power.
  • Refraction power can be kept relatively small by reducing the ring pitch and gaining the overall power by diffraction. Since the Petzval sum of the diffraction grating is almost zero, the refractive power is small, so that the Petzval sum of the entire optical system is reduced, and field curvature can be reduced. Therefore, by using a diffraction grating, not only chromatic aberration correction but also field curvature correction can be performed.
  • In general, with a diffraction grating the diffraction efficiency is 100% only at a specific wavelength, and at other wavelengths unnecessary orders of diffracted light are generated, degrading the image quality.
  • When the configuration of the present embodiment is used, an image with reduced generation of unnecessary diffracted light over the entire visible range can be obtained. The reason is described below.
  • In the present embodiment, the diffraction grating G is formed on the lens surface on the stop-S side.
  • With this arrangement, the region on the diffraction grating surface through which the light beam B1 that has passed through the first optical surface region D1 travels and the region through which the light beam B2 that has passed through the second optical surface region D2 travels can be separated.
  • The depth d1 of the diffraction steps of the diffraction grating G through which the light beam B1 passes is preferably set so that the diffraction efficiency for light in the first wavelength band transmitted through the optical surface region D1 is higher than the diffraction efficiency for light in the second wavelength band.
  • Likewise, the depth d2 of the diffraction steps of the diffraction grating G through which the light beam B2 passes is preferably set so that the diffraction efficiency for light in the second wavelength band transmitted through the optical surface region D2 is higher than the diffraction efficiency for light in the first wavelength band.
  • The condition under which the diffraction efficiency is theoretically 100% for a light ray incident at an incident angle of 0° is expressed, using the wavelength λ, the diffraction step depth d, and the refractive index n(λ) of the grating material, as (Formula 2): d = mλ / {n(λ) − 1}, where m is the diffraction order.
  • The diffraction grating G changes the wavefront by diffracting incident light rays; under the condition that (Formula 2) is satisfied, all of the incident light becomes m-th-order diffracted light and the direction of the light changes accordingly.
  • If the bandwidth is wide, a value near the center of the bandwidth may be used as the wavelength λ, or a wavelength may be selected that reduces the loss due to unnecessary diffraction-order light in a balanced manner over the entire bandwidth.
  • It is preferable that the second diffraction step depth d2 satisfy the relationship 0.9 × λ2 / {n(λ2) − 1} ≤ d2 ≤ 1.1 × λ2 / {n(λ2) − 1}.
  • With the configuration of the present embodiment, by combining diffraction grating portions whose step depths d1 and d2 are suited to the respective bandwidths of the light beams B1 and B2, an image with a diffraction efficiency of nearly 100% over the entire visible light region can be obtained, for example.
  • For example, suppose the wavelength bands of light transmitted through the optical surface regions D1 and D2 are 400 nm to 550 nm and 550 nm to 700 nm, respectively.
  • In that case the first-order diffraction efficiencies of the light beams B1 and B2 are as shown in FIGS. 5A and 5B, respectively: each is 90% or more within its band, and the diffraction efficiency of the optical system as a whole can be nearly 100% over the entire visible light region.
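  • The following is a minimal sketch, not from the patent, of how the step depths d1 and d2 could be chosen from (Formula 2) with m = 1 and checked with the standard scalar approximation for first-order diffraction efficiency, η(λ) = sinc²(1 − d·{n(λ) − 1}/λ); the constant refractive index n = 1.52 and the band centres are illustrative assumptions.

    import numpy as np

    n = 1.52                                   # assumed refractive index of the grating material

    def step_depth(lam_center_nm: float, m: int = 1) -> float:
        """Blaze depth d (nm) giving ~100% m-th-order efficiency at lam_center_nm (Formula 2)."""
        return m * lam_center_nm / (n - 1.0)

    def first_order_efficiency(lam_nm: np.ndarray, d_nm: float) -> np.ndarray:
        """Scalar first-order diffraction efficiency at wavelength lam_nm."""
        return np.sinc(1.0 - d_nm * (n - 1.0) / lam_nm) ** 2

    bands = {"D1": (400.0, 550.0), "D2": (550.0, 700.0)}   # example bands from the text
    for name, (lo, hi) in bands.items():
        d = step_depth(0.5 * (lo + hi))                    # blaze at the band centre
        effs = first_order_efficiency(np.linspace(lo, hi, 7), d)
        print(f"{name}: depth = {d:.0f} nm, minimum first-order efficiency = {effs.min():.1%}")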
  • It is also possible to configure the lens L2 in FIG. 1 as a double-sided aspheric lens and to design the surface shape asymmetrically in each of the two regions divided with the optical axis V at the center of the boundary.
  • In this case, the imaging positions of the two regions can be made to substantially coincide.
  • However, chromatic aberration within each bandwidth cannot be corrected sufficiently in this way.
  • A diffraction grating, by contrast, can correct chromatic aberration better than an optical system with only aspheric surfaces, because the imaging position can be made uniform even over a band of a certain width by controlling the number of annular zones.
  • The annular-zone pitch of the diffraction grating G may also be adjusted appropriately in each of the regions through which the light beams B1 and B2 pass.
  • By adjusting the zone pitch, the distribution of diffractive power can be matched to the respective bandwidths, and the image-formation positions of the regions can be corrected and aligned.
  • With the configuration of Embodiment 1, it is therefore possible to correct chromatic aberration and increase resolution, and to obtain an image in which the diffraction efficiency is nearly 100% and flare caused by unnecessary orders of diffracted light is reduced.
  • Embodiment 2 differs from Embodiment 1 in that the optical element L1 is divided into three regions, the diffraction steps of the diffraction grating G have three different depths, one for each region, and the image sensor includes three types of pixels.
  • Specifically, the optical element L1 further includes an optical region D3 that selectively transmits light of a third wavelength band different from the first wavelength band and the second wavelength band.
  • The diffraction grating G further includes a third diffraction grating portion G3.
  • The image sensor further includes a plurality of pixels P3.
  • A detailed description of the contents common to Embodiment 1 is omitted here.
  • FIG. 6 is a schematic diagram illustrating the imaging apparatus A according to the second embodiment.
  • In FIG. 6, the light beam B1 is a light beam that passes through the optical surface region D1 of the optical element L1, the light beam B2 is a light beam that passes through the optical surface region D2, and the light beam B3 is a light beam that passes through the optical surface region D3.
  • The light beams B1, B2, and B3 pass through the stop S, the optical element L1, the lens L2, and the arrayed optical element K in this order, and reach the imaging surface Ni (shown in FIG. 8 and elsewhere) of the image sensor N.
  • FIG. 7A is a front view of the optical element L1 as viewed from the subject side.
  • the optical surface regions D1, D2, and D3 are regions in which the optical element L1 is divided into three in the vertical direction within a plane perpendicular to the optical axis V. Further, the wavelength bands of the light transmitted through each optical surface region are different from each other.
  • FIG. 7B is a diagram showing the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L. As shown in FIG. 7B, in the diffraction grating G, the first to third diffraction grating portions G1 to G3 are provided in the regions corresponding to the optical surface regions D1 to D3 of the optical element L1.
  • In the first to third diffraction grating portions G1 to G3, a plurality of annular zones R1 to R3 are provided, respectively.
  • Diffraction steps A1, A2, and A3 are provided between adjacent annular zones R1, between adjacent annular zones R2, and between adjacent annular zones R3, respectively.
  • A plurality of each of the diffraction steps A1 to A3 are provided, and typically the diffraction steps A1, A2, and A3 have uniform depths d1, d2, and d3, respectively, which differ from one another.
  • FIG. 8A is an enlarged view of the arrayed optical element K and the image sensor N shown in FIG. 6, and FIG. 8B is a diagram showing the positional relationship between the arrayed optical element K and the pixels on the image sensor N.
  • the arrayed optical element K is arranged so that the surface on which the optical element M1 is formed faces the imaging surface Ni side.
  • Pixels P are arranged in a matrix on the imaging surface Ni.
  • the pixel P can be distinguished into a pixel P1, a pixel P2, and a pixel P3.
  • the color filters corresponding to the pixels P1, P2, and P3 are not provided on the imaging surface Ni of the imaging element N.
  • The pixels P1, P2, and P3 are each arranged in rows in the horizontal direction (row direction); in the vertical direction (column direction), rows of pixels P1, P2, and P3 are arranged repeatedly.
  • The arrayed optical element K is arranged so that one of its optical elements M1 corresponds to three rows of pixels on the imaging surface Ni, consisting of one row of pixels P1, one row of pixels P2, and one row of pixels P3.
  • a microlens Ms is provided so as to cover the surfaces of the pixels P1, P2, and P3.
  • the above-described configuration is realized by appropriately setting parameters such as the refractive index of the arrayed optical element K, the distance from the imaging surface Ni, and the radius of curvature of the surface of the optical element M1.
  • the pixel P1, the pixel P2, and the pixel P3 each generate image information corresponding to light in different wavelength bands. That is, the imaging apparatus A can acquire a plurality of pieces of image information formed by light having different wavelength bands with a single imaging optical system and with one imaging.
  • In Embodiment 1, images of two wavelength bands are acquired simultaneously; in Embodiment 2, images of three wavelength bands can be acquired simultaneously.
  • Here, "simultaneously" means that images formed by light of the three wavelength bands obtained from the subject at the same time can be acquired; the generation of the image signals for the three wavelength bands need not be simultaneous.
  • the first optical surface region D1 is a blue color filter that transmits light in the blue band and substantially blocks colors in bands other than blue.
  • the second optical surface region D2 is a green color filter that transmits light in the green band and substantially blocks colors in bands other than green.
  • The third optical surface region D3 is a red color filter that transmits light in the red band and substantially blocks colors in bands other than red.
  • In each of the extracted images, pixel values in the y direction are missing two out of every three pixels.
  • The missing pixel values may be generated by interpolation from the pixel values of pixels adjacent in the y direction.
  • Alternatively, the pixel values in the x direction may be added three at a time, as sketched below.
  • A configuration may also be employed in which the aspect ratio of each pixel of the image sensor in the x and y directions is 3:1; with such a configuration, the interpolation and addition described above are unnecessary.
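  • A minimal sketch, not from the patent, of the x-direction addition mentioned above: summing groups of three horizontally adjacent pixel values in each band image (extracted from every third row) restores an approximately square sampling grid without interpolation; the names and array sizes are illustrative assumptions.

    import numpy as np

    def add_x_in_groups(band: np.ndarray, group: int = 3) -> np.ndarray:
        """Sum pixel values in groups of `group` along x for one extracted band image."""
        h, w = band.shape
        w_trim = (w // group) * group                    # drop a partial group at the edge, if any
        return band[:, :w_trim].reshape(h, w_trim // group, group).sum(axis=2)

    raw = np.arange(81, dtype=float).reshape(9, 9)       # synthetic sensor frame
    band_p1 = raw[0::3, :]                               # rows read out by the pixels P1
    square_p1 = add_x_in_groups(band_p1)
    print(band_p1.shape, square_p1.shape)                # (3, 9) (3, 3)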
  • In the present embodiment, the diffraction grating G is formed on the lens surface on the stop-S side.
  • With this arrangement, the region on the diffraction grating surface through which the light beam B1 that has passed through the first optical surface region D1 travels, the region through which the light beam B2 that has passed through the second optical surface region D2 travels, and the region through which the light beam B3 that has passed through the third optical surface region D3 travels can be separated.
  • the depth d1 of the diffraction step on the diffraction grating G through which the light beam B1 passes is preferably set so that the diffraction efficiency of the light in the first wavelength band transmitted through the optical surface region D1 is high.
  • the depth d2 of the diffraction step on the diffraction grating G through which the light beam B2 passes is preferably set so that the diffraction efficiency of light in the second wavelength band transmitted through the optical surface region D2 is high.
  • the depth d3 of the diffraction step on the diffraction grating G through which the light beam B3 passes may be set so that the diffraction efficiency of the light in the third wavelength band transmitted through the optical surface region D3 is high.
  • With the configuration of the present embodiment, by combining diffraction grating portions whose step depths d1, d2, and d3 are suited to the respective bandwidths of the light beams B1, B2, and B3, an image with a diffraction efficiency of nearly 100% over the entire visible light region can be obtained, for example.
  • For example, suppose the wavelength bands of light transmitted through the optical surface regions D1, D2, and D3 are 400 nm to 500 nm, 500 nm to 600 nm, and 600 nm to 700 nm, respectively. The first-order diffraction efficiencies of the light beams B1, B2, and B3 are then as shown in the corresponding figures: each is 90% or more within its wavelength band, and the diffraction efficiency of the optical system as a whole can be nearly 100% over the entire visible light region.
  • Embodiment 3 differs from Embodiment 1 in that the optical element L1 is divided into four regions, the diffraction steps of the diffraction grating G have four different depths, one for each region, and the image sensor includes four types of pixels.
  • Specifically, the optical element L1 further includes an optical region D3 that selectively transmits light of a third wavelength band different from the first and second wavelength bands, and an optical region D4 that selectively transmits light of a fourth wavelength band different from the first, second, and third wavelength bands.
  • The diffraction grating G further includes third and fourth diffraction grating portions G3 and G4.
  • the imaging device further includes a plurality of pixels P3 and a plurality of pixels P4.
  • a detailed description of the same contents as in the first embodiment is omitted.
  • FIG. 10A is a front view of the optical element L1 viewed from the subject side.
  • The optical surface regions D1, D2, D3, and D4 are regions obtained by dividing the optical element L1 into four, vertically and horizontally, within a plane perpendicular to the optical axis V, with the optical axis V at the center of the boundaries. The wavelength bands of the light transmitted through the optical surface regions differ from one another.
  • FIG. 10B is a diagram showing the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L.
  • As shown in FIG. 10B, in the diffraction grating G, the first to fourth diffraction grating portions G1 to G4 are provided in the regions corresponding to the optical surface regions D1 to D4 of the optical element L1.
  • In the first to fourth diffraction grating portions G1 to G4, a plurality of annular zones R1 to R4 are provided, respectively.
  • Diffraction steps A1, A2, A3, and A4 are provided between adjacent annular zones R1, between adjacent annular zones R2, between adjacent annular zones R3, and between adjacent annular zones R4, respectively.
  • A plurality of each of the diffraction steps A1 to A4 are provided, and typically the diffraction steps A1, A2, A3, and A4 have uniform depths d1, d2, d3, and d4, respectively.
  • the depth d1, the depth d2, the depth d3, and the depth d4 are different from each other.
  • the widths of the annular zones R1, R2, R3, and R4 are set according to the target optical power and the like.
  • FIG. 11 is a perspective view of the arrayed optical element K.
  • On the surface of the arrayed optical element K on the image sensor N side, optical elements M2 are arranged in a grid pattern. Each optical element M2 has curved vertical and horizontal cross sections and protrudes toward the image sensor N.
  • That is, each optical element M2 is a microlens, and the arrayed optical element K is a microlens array.
  • FIG. 12A is an enlarged view showing the arrayed optical element K and the image sensor N
  • FIG. 12B shows the positional relationship between the arrayed optical element K and the pixels on the image sensor N.
  • The arrayed optical element K is arranged so that the surface on which the optical elements M2 are formed faces the imaging surface Ni.
  • Pixels P are arranged in a matrix on the imaging surface Ni.
  • the pixel P can be distinguished into a pixel P1, a pixel P2, a pixel P3, and a pixel P4.
  • the color filters corresponding to the pixels P1, P2, P3, and P4 are not provided on the imaging surface Ni of the imaging element N.
  • the arrayed optical element K is disposed in the vicinity of the focal point of the lens optical system L, and is disposed at a position away from the imaging surface Ni by a predetermined distance.
  • a microlens Ms is provided on the imaging surface Ni so as to cover the surfaces of the pixels P1, P2, P3, and P4.
  • the arrayed optical element K is arranged so that the surface on which the optical element M2 is formed faces the imaging surface Ni side.
  • the arrayed optical element K is arranged so that one of the optical elements M2 corresponds to four pixels of pixels P1 to P4 in 2 rows and 2 columns on the imaging surface Ni.
  • The above-described configuration is realized by appropriately setting parameters such as the refractive index of the arrayed optical element K, the distance from the imaging surface Ni, and the radius of curvature of the surface of the optical elements M2.
  • the pixel P1, the pixel P2, the pixel P3, and the pixel P4 each generate image information corresponding to light in different wavelength bands. That is, the imaging apparatus A can acquire a plurality of pieces of image information formed by light of different wavelength bands with a single imaging optical system and one imaging.
  • In Embodiments 1 and 2, images of two and three wavelength bands, respectively, are acquired simultaneously; in Embodiment 3, images of four wavelength bands can be acquired simultaneously.
  • For example, in addition to blue, green, and red color filters, one of the regions may be a near-infrared filter that substantially blocks visible light including blue, green, and red and transmits near-infrared light.
  • Alternatively, that filter may be configured to transmit light of only a predetermined narrow wavelength band.
  • The narrow band may or may not be included in any of the blue, green, and red color filters.
  • In this case, a plurality of types of light sources having different spectral radiation characteristics may be provided; for example, a white light source and a light source having spectral radiation characteristics including the narrow band may be provided.
  • In each of the extracted images, pixel values are missing every other pixel in both the x and y directions. The missing pixel values may therefore be generated by interpolation from the pixel values of pixels adjacent in the x and y directions, as sketched below.
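  • A minimal sketch, not from the patent, of separating the four band images when each optical element M2 covers a 2-by-2 block of pixels P1 to P4; the assignment of P1 to P4 to positions within the block, and the band labels in the comments, are illustrative assumptions.

    import numpy as np

    def extract_four_bands(raw: np.ndarray) -> dict[str, np.ndarray]:
        """Split a raw frame into the four sub-images behind a 2x2 microlens layout."""
        return {
            "P1": raw[0::2, 0::2],   # e.g. blue
            "P2": raw[0::2, 1::2],   # e.g. green
            "P3": raw[1::2, 0::2],   # e.g. red
            "P4": raw[1::2, 1::2],   # e.g. near-infrared or narrow band
        }

    raw = np.arange(64, dtype=float).reshape(8, 8)       # synthetic sensor frame
    bands = extract_four_bands(raw)
    print({name: img.shape for name, img in bands.items()})   # each sub-image is (4, 4)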
  • two regions facing each other across the optical axis among the four divided regions may be the same green color filter.
  • In the present embodiment, the diffraction grating G is formed on the lens surface on the stop-S side.
  • With this arrangement, the region on the diffraction grating surface through which the light beam B1 that has passed through the first optical surface region D1 travels, the region through which the light beam B2 that has passed through the second optical surface region D2 travels, the region through which the light beam B3 that has passed through the third optical surface region D3 travels, and the region through which the light beam B4 that has passed through the fourth optical surface region D4 travels can be separated.
  • the depth d1 of the diffraction step on the diffraction grating G through which the light beam B1 passes is preferably set so that the diffraction efficiency of the light in the first wavelength band transmitted through the optical surface region D1 is high.
  • the depth d2 of the diffraction step on the diffraction grating G through which the light beam B2 passes is preferably set so that the diffraction efficiency of light in the second wavelength band transmitted through the optical surface region D2 is high.
  • the depth d3 of the diffraction step on the diffraction grating G through which the light beam B3 passes may be set so that the diffraction efficiency of the light in the third wavelength band transmitted through the optical surface region D3 is high.
  • the depth d4 of the diffraction step on the diffraction grating G through which the light beam B4 passes may be set so that the diffraction efficiency of the light in the fourth wavelength band transmitted through the optical surface region D4 is high.
  • With the configuration of the present embodiment, by combining diffraction grating portions whose step depths d1, d2, d3, and d4 are suited to the respective bandwidths of the light beams B1, B2, B3, and B4, an image with a diffraction efficiency of nearly 100% over the entire visible light region and the infrared region can be obtained.
  • Embodiment 4 differs from Embodiment 1 in that each of the first and second regions is divided into parts separated from each other across the optical axis, and the arrayed optical element is changed from a lenticular lens to a microlens array.
  • a detailed description of the same contents as in the first embodiment is omitted.
  • FIG. 13A is a front view of the optical element L1 as viewed from the subject side; each of the optical surface regions D1 and D2 is divided into parts arranged symmetrically about the optical axis V.
  • FIG. 13B is a diagram showing the positional relationship between the arrayed optical element K and the pixels on the image sensor N.
  • Light that has passed through the optical surface region D1 reaches the pixels in odd rows and odd columns and the pixels in even rows and even columns, so these pixel values are added to generate one image; light that has passed through the optical surface region D2 reaches the pixels in even rows and odd columns and the pixels in odd rows and even columns, so these pixel values are added to generate the other image.
  • In Embodiment 1, the optical element L1 is divided into the two semicircular regions D1 and D2. For this reason, the center of gravity of the spot that the light passing through each optical surface region forms on the image plane may change with the subject distance, and parallax may occur.
  • FIG. 14 is a diagram for explaining a ray diagram for each subject distance and a point image and a change in its center of gravity in the first embodiment.
  • (a1), (b1), and (c1) are ray diagrams for each subject distance, and O is an object point.
  • the other symbols are the same as in the preceding figures; (a2) and (a3), (b2) and (b3), and (c2) and (c3) in FIG. 14 show the point images (drawn as semicircles) and their centers of gravity (drawn as black dots) captured through the lenticular lens, and correspond to the subject distances of (a1), (b1), and (c1), respectively.
  • each point image is shown schematically after the image information (a2, b2, c2) extracted for each odd column and the image information (a3, b3, c3) extracted for each even column have been doubled in the Y direction by interpolation processing; as shown in the figure, the spot diameter increases as the object point O approaches, and each point image has a semicircular shape, so when the acquired image is separated into an odd-column image and an even-column image, the distance d between the centroids of the corresponding point images increases as the object point approaches; this center-to-center distance d is undesirable because it appears as parallax.
  • in the present embodiment, since each of the optical surface regions D1 and D2 is divided into parts arranged symmetrically about the optical axis and separated from each other, the distance d between the centroids of the point images does not change even if the subject distance changes.
  • FIG. 15 is a diagram for explaining a point image and its center of gravity for each subject distance.
  • (a1) and (a2), (b1) and (b2), and (c1) and (c2) are the point images (drawn as semicircles) captured through the microlens array and their centers of gravity (black dots); they correspond to the subject distances of (a1), (b1), and (c1) in FIG. 14.
  • the image information (a1, b1, c1) is obtained by adding the pixel values in odd rows and odd columns to those in even rows and even columns, and the image information (a2, b2, c2) is obtained by adding the pixel values in even rows and odd columns to those in odd rows and even columns (a minimal indexing sketch of this addition is given after this list).
  • in this way, since the first and second regions are each separated across the optical axis, parallax between the acquired images can be avoided even when the subject distance changes.
  • the fifth embodiment differs from the first, second, and third embodiments in that the diffraction grating G is provided on the surface of the optical element L1 facing the lens L2 instead of on the lens L2.
  • a detailed description of the same contents as in the first, second, and third embodiments is omitted.
  • FIG. 16 is a schematic diagram illustrating the imaging apparatus A according to the fifth embodiment.
  • manufacturing can be easier than when the diffraction grating is provided on a surface having an aspherical shape.
  • the diffraction grating G may be formed by processing the surface of the optical element L1 by a semiconductor process such as photolithography or etching.
  • the diffraction grating G can also be machined into the surface of the optical element L1 with an electron-beam writing apparatus (EB lithography or the like).
  • the diffraction grating G is provided on the surface of the optical element L1 facing the lens L2, but the diffraction grating G may be provided on the surface of the optical element L1 on the subject side.
  • the sixth embodiment is different from the first, second, and third embodiments in that a lens L3 is added in addition to the lens L2.
  • FIG. 17 is a schematic diagram illustrating the imaging apparatus A according to the sixth embodiment.
  • by adding the lens L3, the aberration generated in the optical system can be further reduced, and a higher-resolution optical system can be realized.
  • the seventh embodiment is different from the first, second, and third embodiments in that a lens L3 having a diffraction grating GA is added in addition to the lens L2.
  • FIG. 18 is a schematic diagram illustrating the imaging apparatus A according to the seventh embodiment.
  • a lens L3 on which the diffraction grating GA is formed is added in the vicinity of the stop S.
  • An optical element L1 is disposed between the lens L3 and the lens L2.
  • the diffraction grating GA is preferably formed on the lens surface on the stop S side.
  • the power distribution of the diffraction grating can be finely adjusted, and chromatic aberration can be further reduced.
  • only the diffraction grating GA may be provided without providing the diffraction grating G.
  • the eighth embodiment is different from the first, second, and third embodiments in that an optical adjustment layer F1 is added so as to cover the surface of the diffraction grating G.
  • FIG. 19 is a schematic diagram illustrating an imaging apparatus A according to the eighth embodiment.
  • by setting the depth d of the diffraction step so as to satisfy (Equation 3), it is possible to further reduce unnecessary orders of diffracted light compared with the first, second, and third embodiments.
  • the wavelength bands of light transmitted through D1, D2, and D3 are 400 nm to 500 nm, 500 nm to 600 nm, and 600 nm to 700 nm, respectively
  • d2 = 15.0 μm
  • d3 16.
  • the first-order diffraction efficiencies of the light beams B1, B2, and B3 are almost 100% in the respective wavelength bands, as shown in FIGS. 20A, 20B, and 20C, so that nearly 100% diffraction efficiency can be secured over the entire visible light region of the optical system.
  • the ninth embodiment differs from the first to eighth embodiments in that a lenticular lens (or microlens array) is formed on the imaging surface.
  • FIG. 21 (a) and 21 (b) are diagrams showing the arrayed optical element K and the imaging element N in an enlarged manner.
  • a lenticular lens (or microlens) Md is formed on the imaging surface Ni of the imaging element N. Pixels P are arranged in a matrix on the imaging surface Ni, as in the first embodiment.
  • one optical element of the lenticular lens, or one microlens, corresponds to a plurality of pixels P.
  • the light beams that have passed through different regions on the optical element L1 can be guided to different pixels.
  • FIG. 21B shows a modification of the present embodiment; in this configuration, a microlens Ms is formed on the imaging surface Ni so as to cover the pixels P, and an arrayed optical element is stacked on the surface of the microlens Ms.
  • with this configuration, the light collection efficiency can be increased as compared with the configuration in FIG. 21(a).
  • when the arrayed optical element is separated from the image sensor as in the first to eighth embodiments, alignment between the arrayed optical element and the image sensor becomes difficult; in the present embodiment, since the arrayed optical element is formed on the imaging surface, this alignment becomes easier.
  • FIG. 22A is an enlarged view showing the vicinity of the imaging section off the optical axis; in FIG. 22A, only the light beam that passes through one optical surface region out of the light passing through the arrayed optical element K is shown; as shown in FIG. 22A, when the lens optical system L is an image-side non-telecentric optical system, light tends to leak to adjacent pixels and cause crosstalk, but the crosstalk can be reduced by offsetting the arrayed optical element by Δ with respect to the pixel array as shown in FIG. 22(b).
  • the offset amount Δ may be set according to the incident angle of the light beam on the imaging surface.
  • an image side telecentric optical system may be applied to the lens optical system L of the present embodiment.
  • with an image-side telecentric optical system, even if the angle of view changes, the chief ray is incident on the arrayed optical element K at an angle close to 0 degrees, so that crosstalk can be reduced over the entire imaging region.
  • FIG. 23 is a schematic diagram illustrating the imaging apparatus A when the image side telecentric optical system is applied.
  • the lens L3 is used to adjust the chief ray so that it is incident on the arrayed optical element K at an angle close to 0 degrees even if the angle of view changes.
  • when the arrayed optical element K is a microlens array, each optical element of the microlens array may be rotationally symmetric.
  • FIG. 24 (a3) shows the result of ray tracing simulation when the microlens shown in FIGS. 24 (a1) and (a2) is applied to the arrayed optical element of the present embodiment. In FIG. 24 (a3), only the light beam that passes through one optical surface region of the light that passes through the arrayed optical element K is shown.
  • FIG. 24 (b3) shows the result of ray tracing simulation when the microlens shown in FIGS. 24 (b1) and (b2) is applied to the arrayed optical element of the present embodiment.
  • the diaphragm S has a configuration in which a light-shielding region is provided at a position corresponding to the boundary portion between the regions.
  • as a method of manufacturing the lens L2, it is preferable, as shown in FIG. 25A, to form the irregularities 12 for forming the diffraction grating G on the mold 11 using the electron beam drawing apparatus 10; with the electron beam drawing apparatus 10, a non-rotationally symmetric structure can be formed easily. After the mold is formed, as shown in FIG. 25B, lenses L2 can be mass-produced by injection molding in the case of a resin material or by press molding in the case of a glass material. As another processing method, as shown in FIG. 26, the molds 13A to 13D may be formed separately for each region and then joined to form a single mold.
  • although Embodiments 1 to 9 include the signal processing unit C, the imaging apparatus need not include such a signal processing unit; in that case, the processing performed by the signal processing unit C may be carried out using a PC or the like outside the imaging apparatus. That is, according to one aspect of the present invention, a system can be realized that includes an imaging device comprising the lens optical system L, the arrayed optical element K, and the imaging element N, together with an external signal processing device.
  • the imaging device disclosed in the present application is useful as an imaging device such as a digital still camera or a digital video camera.
  • the present invention can also be applied to in-vehicle cameras, security cameras, medical cameras such as endoscopes and capsule endoscopes, biometric authentication cameras, microscopes, and astronomical telescopes.
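As referenced in the fourth-embodiment item above, the following is a minimal indexing sketch of the checkerboard addition; the array name, the summing of each diagonal pixel pair of a 2x2 block into one output sample, and the assignment of the two pixel sets to the regions D1 and D2 are illustrative assumptions rather than details fixed by the text.

```python
import numpy as np

def checkerboard_images(raw):
    """For each 2x2 block, sum the (odd row, odd column) and (even row, even column) pixels
    into one sample of the first image, and the remaining two pixels into the second image."""
    img1 = raw[0::2, 0::2] + raw[1::2, 1::2]  # assumed to correspond to region D1
    img2 = raw[1::2, 0::2] + raw[0::2, 1::2]  # assumed to correspond to region D2
    return img1, img2

raw = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 sensor frame
img1, img2 = checkerboard_images(raw)
print(img1.shape, img2.shape)  # (2, 2) (2, 2)
```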

Abstract

The image-capturing device disclosed in the present application is provided with: an optical element (L1) having in a predetermined plane at least a first optical surface region (D1) for admitting light of a first wavelength band, and a second optical surface region (D2) for admitting light of a second wavelength band that is different from the first wavelength band; a diffraction grating (G) having a first diffraction step provided in a region on which light passing through the first optical surface region (D1) impinges, and a second diffraction step provided in a region on which light passing through the second optical surface region (D2) impinges, the depth of the second diffraction step being different from that of the first diffraction step; an image-capturing element (N) having at least a plurality of first pixels and a plurality of second pixels; and an array-shaped optical element (K) arranged between the diffraction grating (G) and the image-capturing element (N), and adapted for causing light having passed through the first optical surface region (D1) to impinge on the plurality of first pixels and causing light having passed through the second optical surface region (D2) to impinge on the plurality of second pixels.

Description

Imaging device
 This application relates to an imaging device such as a camera.
 In general, a color filter using an organic material such as a pigment or a dye is formed on each pixel of a solid-state imaging element for color imaging. Since such color filters transmit infrared light, a configuration in which an infrared cut filter is arranged in the optical path in front of the solid-state imaging element is generally used to obtain a good color image. Therefore, it is difficult for an imaging apparatus using a single imaging element to acquire both visible-light and infrared-light image information simultaneously. In addition, color filters using organic materials have wide wavelength bands; for example, the blue, green, and red wavelength bands overlap over relatively wide ranges, so color reproducibility deteriorates.
 To solve these problems, techniques relating to solid-state imaging elements on which color filters made of dielectric multilayer films are formed have been disclosed (Patent Documents 1 and 2).
 It is also difficult to realize narrow-band spectral characteristics with color filters using organic materials, and therefore difficult to capture images that extract color information in a narrow wavelength band.
 To acquire narrow-band color information, a technique has been disclosed in which images are acquired by sequentially turning on white light and predetermined narrow-band light (Patent Document 3).
 JP 2010-212306 A; JP 2006-190958 A; Japanese Patent No. 4253550
 However, according to the prior art, the solid-state imaging element becomes expensive or is difficult to form at very small pixel sizes. Furthermore, when a moving object is imaged, a color shift due to the time difference occurs.
 One non-limiting exemplary embodiment of the present application provides an imaging apparatus capable of acquiring an image having spectral information for each pixel with a simpler configuration than before.
 An imaging device according to one aspect of the present invention includes: an optical member having, in a predetermined plane, at least a first region that transmits light of a first wavelength band and a second region that transmits light of a second wavelength band different from the first wavelength band; a diffraction grating having a first diffraction step provided in a region on which light transmitted through the first region is incident, and a second diffraction step provided in a region on which light transmitted through the second region is incident and having a depth different from that of the first diffraction step; an imaging element having at least a plurality of first pixels and a plurality of second pixels; and an arrayed optical element that is arranged between the diffraction grating and the imaging element and that causes the light transmitted through the first region to be incident on the plurality of first pixels and the light transmitted through the second region to be incident on the plurality of second pixels.
 According to one aspect of the present invention, a multispectral image can be acquired with a single imaging system and a single exposure. According to one aspect of the present invention, it is not necessary to provide a dielectric multilayer film for each pixel. In addition, when a moving image is shot with the imaging apparatus of the present invention, no image shift occurs between the plurality of images even if the position of the subject changes over time. Furthermore, providing the diffraction grating makes it possible to correct the chromatic aberration of the optical system and to reduce field curvature.
FIG. 1 is a schematic diagram illustrating Embodiment 1 of the imaging apparatus A.
FIG. 2(a) is a front view of the optical element L1 in Embodiment 1 as viewed from the subject side; FIG. 2(b) shows the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L; FIG. 2(c) shows the cross-sectional shape, in a plane parallel to the optical axis V, of the lens L2 provided with the diffraction grating G.
FIG. 3 is a perspective view of the arrayed optical element K in Embodiment 1.
FIG. 4(a) is an enlarged view of the arrayed optical element K and the imaging element N shown in FIG. 1; FIG. 4(b) shows the positional relationship between the arrayed optical element K and the pixels of the imaging element N.
FIGS. 5(a) and 5(b) show the first-order diffraction efficiencies of the light beams B1 and B2 passing through the optical surface regions D1 and D2 in Embodiment 1.
FIG. 6 is a schematic diagram illustrating Embodiment 2 of the imaging apparatus A.
FIG. 7(a) is a front view of the optical element L1 in Embodiment 2 as viewed from the subject side; FIG. 7(b) shows the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L.
FIG. 8(a) is an enlarged view of the arrayed optical element K and the imaging element N shown in FIG. 6; FIG. 8(b) shows the positional relationship between the arrayed optical element K and the pixels of the imaging element N.
FIGS. 9(a), 9(b), and 9(c) show the first-order diffraction efficiencies of the light beams B1, B2, and B3 passing through the optical surface regions D1, D2, and D3 in Embodiment 2.
FIG. 10(a) is a front view of the optical element L1 in Embodiment 3 as viewed from the subject side; FIG. 10(b) shows the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L.
FIG. 11 is a perspective view of the arrayed optical element K in Embodiment 3.
FIG. 12(a) is an enlarged view of the arrayed optical element K and the imaging element N in Embodiment 3; FIG. 12(b) shows the positional relationship between the arrayed optical element K and the pixels of the imaging element N.
FIG. 13(a) is a front view of the optical element L1 in Embodiment 4 as viewed from the subject side; FIG. 13(b) shows the positional relationship between the arrayed optical element K and the pixels of the imaging element N.
FIGS. 14(a1), (b1), and (c1) are ray diagrams for each subject distance in Embodiment 1, and FIGS. 14(a2), (a3), (b2), (b3), (c2), and (c3) illustrate the point images and the changes in their centers of gravity.
FIGS. 15(a1), (b1), (c1), (a2), (b2), and (c2) illustrate the point images and their centers of gravity for each subject distance in Embodiment 4.
FIG. 16 is a schematic diagram illustrating Embodiment 5 of the imaging apparatus A.
FIG. 17 is a schematic diagram illustrating Embodiment 6 of the imaging apparatus A.
FIG. 18 is a schematic diagram illustrating Embodiment 7 of the imaging apparatus A.
FIG. 19 is a schematic diagram illustrating Embodiment 8 of the imaging apparatus A.
FIGS. 20(a), 20(b), and 20(c) show the first-order diffraction efficiencies of the light beams B1, B2, and B3 passing through the optical surface regions D1, D2, and D3 in Embodiment 8.
FIGS. 21(a) and 21(b) are enlarged views of the arrayed optical element K and the imaging element N in Embodiment 9.
FIGS. 22(a) and 22(b) are enlarged views of the arrayed optical element K and the imaging element N in other embodiments.
FIG. 23 is a schematic diagram illustrating an embodiment of the imaging apparatus A.
FIGS. 24(a1) and (b1) are perspective views of the arrayed optical element K in other embodiments; FIGS. 24(a2) and (b2) show the contour lines of the individual optical elements; FIGS. 24(a3) and (b3) show the results of ray-tracing simulations.
FIG. 25(a) shows a method of making a mold for the lens L2 in another embodiment; FIG. 25(b) illustrates molding the lens L2 with the mold.
FIG. 26 shows a method of making a mold for the lens L2 in another embodiment.
 The outline of one aspect of the present invention is as follows.
 An imaging device according to one aspect of the present invention includes: an optical member having, in a predetermined plane, at least a first region that transmits light of a first wavelength band and a second region that transmits light of a second wavelength band different from the first wavelength band; a diffraction grating having a first diffraction step provided in a region on which light transmitted through the first region is incident, and a second diffraction step provided in a region on which light transmitted through the second region is incident and having a depth different from that of the first diffraction step; an imaging element having at least a plurality of first pixels and a plurality of second pixels; and an arrayed optical element that is arranged between the diffraction grating and the imaging element and that causes the light transmitted through the first region to be incident on the plurality of first pixels and the light transmitted through the second region to be incident on the plurality of second pixels.
 The optical member may further include at least a third region other than the first and second regions; the diffraction grating may further include a third diffraction step provided in a region on which light transmitted through the third region is incident and having a depth different from those of the first and second diffraction steps; the imaging element may further include a plurality of third pixels; the third region may transmit light of a third wavelength band different from the first wavelength band and the second wavelength band; and the arrayed optical element may cause the light transmitted through the third region to be incident on the plurality of third pixels.
 The optical member may further include a fourth region other than the first, second, and third regions; the diffraction grating may further include a fourth diffraction step provided in a region on which light transmitted through the fourth region is incident and having a depth different from those of the first, second, and third diffraction steps; the imaging element may further include a plurality of fourth pixels; the fourth region may transmit light of a fourth wavelength band different from the first, second, and third wavelength bands; and the arrayed optical element may cause the light transmitted through the fourth region to be incident on the plurality of fourth pixels.
 The imaging apparatus may further include a first lens that is provided between the optical member and the arrayed optical element and on which the light that has passed through the optical member is incident.
 The diffraction grating may be provided on the surface of the first lens that faces the optical member.
 The diffraction grating may be provided on the surface of the optical member that faces the first lens.
 The first and second regions of the optical member are composed of a plurality of regions separated from each other across the optical axis of the first lens.
 At least one of the first region and the second region may have a spectral transmittance characteristic that transmits light in the near-infrared wavelength band and blocks light of wavelengths in the visible light region.
 At least one of the first region and the second region may have a spectral transmittance characteristic that transmits light of a wavelength bandwidth relatively narrower than the wavelength bandwidth of the other region.
 The first region and the second region may be regions divided with the optical axis of the first lens as the boundary center.
 The imaging apparatus may further include a second lens; the optical member may be disposed between the second lens and the first lens, and a diffraction grating may be provided on a surface of the second lens.
 The imaging apparatus may further include an optical adjustment layer formed on the surface on which the diffraction grating is provided.
 The imaging apparatus may further include a light-shielding region provided at a position corresponding to the boundary portion between the first region and the second region.
 The imaging apparatus may further include a stop, and the first region and the second region may be disposed in the vicinity of the stop.
 The diffraction grating may be arranged in the vicinity of the stop.
 When λ1 is the central value of the wavelength bandwidth of the light reaching the first diffraction step, the depth d1 of the first diffraction step may satisfy 0.9λ1/{n(λ1) − 1} ≤ d1 ≤ 1.1λ1/{n(λ1) − 1}, and when λ2 is the central value of the wavelength bandwidth of the light reaching the second diffraction step, the depth d2 of the second diffraction step may satisfy 0.9λ2/{n(λ2) − 1} ≤ d2 ≤ 1.1λ2/{n(λ2) − 1}.
 The diffraction efficiency of light in the first wavelength band at the first diffraction step may be higher than the diffraction efficiency of light in the second wavelength band, and the diffraction efficiency of light in the second wavelength band at the second diffraction step may be higher than the diffraction efficiency of light in the first wavelength band.
 The imaging apparatus may further include a signal processing unit, and the signal processing unit may generate first image information corresponding to the first wavelength band from the pixel values obtained in the plurality of first pixels and second image information corresponding to the second wavelength band from the pixel values obtained in the plurality of second pixels.
 An imaging system according to another aspect of the present invention includes: the imaging apparatus according to any one of the above; and a signal processing device that generates first image information corresponding to the first wavelength band from the pixel values obtained in the plurality of first pixels of the imaging apparatus and second image information corresponding to the second wavelength band from the pixel values obtained in the plurality of second pixels.
 Hereinafter, embodiments of the imaging apparatus will be described with reference to the drawings.
(Embodiment 1)
 FIG. 1 is a schematic diagram illustrating an imaging apparatus A according to Embodiment 1. The imaging apparatus A of this embodiment includes a lens optical system L whose optical axis is V, an arrayed optical element K disposed in the vicinity of the focal point of the lens optical system L, an imaging element N, and a first signal processing unit C.
 The lens optical system L includes: a stop S on which light from a subject (not shown) is incident; an optical element L1 that is disposed in the vicinity of the stop S, that has, in a predetermined plane (for example, a plane perpendicular to the optical axis V of the lens optical system L), a first optical surface region D1 transmitting light of a first wavelength band and a second optical surface region D2 transmitting light of a second wavelength band different from the first wavelength band, and through which the light from the stop S passes; and a lens L2 on which the light that has passed through the optical element L1 is incident.
 A diffraction grating G is formed on the stop S side of the lens L2 (the surface facing the optical element L1). The diffraction grating G has a first diffraction grating portion G1 provided in a region on which light that has passed through the first optical surface region D1 is incident, and a second diffraction grating portion G2 provided in a region on which light that has passed through the second optical surface region D2 is incident. Each of the first diffraction grating portion G1 and the second diffraction grating portion G2 is composed of a plurality of annular zones and a plurality of diffraction steps formed concentrically around the optical axis V. The depth d1 of the diffraction steps in the first diffraction grating portion G1 differs from the depth d2 of the diffraction steps in the second diffraction grating portion G2. Details of the diffraction grating G will be described later.
 In FIG. 1, the lens L2 is shown as a single lens, but it may be composed of a plurality of lenses.
 The "wavelength band" in the "first wavelength band" and the "second wavelength band" is a continuous band that accounts for 50% or more of the total amount of light transmitted through the region. A wavelength that is attenuated by 95% or more on passing through the region is not included in the "wavelength band". That is, the first optical surface region D1 selectively transmits light of the first wavelength band, and the second optical surface region D2 selectively transmits light of the second wavelength band.
 Two wavelength bands being different from each other means that at least one of the wavelength bands includes a portion that is not included in the other wavelength band. Therefore, the two bands may partially overlap.
 The difference in transmitted wavelength bands can be realized, for example, by forming filters using an organic material or a dielectric multilayer film on the stop-S-side surface of the optical element L1, or by dyeing the optical element L1 region by region with dye-based filters. Such color filters may be formed on a single flat plate, or on a plurality of flat plates divided for each region.
 In this embodiment, the light that has passed through the two optical surface regions D1 and D2 passes through the lens L2 provided with the diffraction grating G and then enters the arrayed optical element K. The arrayed optical element K causes the light that has passed through the optical surface region D1 to enter a plurality of pixels P1 of the imaging element N, and the light that has passed through the optical surface region D2 to enter a plurality of pixels P2 of the imaging element N. The signal processing unit C generates image information corresponding to the first wavelength band from the pixel values obtained at the pixels P1, generates image information corresponding to the second wavelength band from the pixel values obtained at the pixels P2, and outputs them.
 In FIG. 1, the light beam B1 is the light beam that passes through the optical surface region D1 on the optical element L1, and the light beam B2 is the light beam that passes through the optical surface region D2 on the optical element L1. The light beams B1 and B2 pass through the stop S, the optical element L1, the lens L2, and the arrayed optical element K in this order, and reach the pixels P1 and P2 (shown in FIG. 4) on the imaging surface Ni of the imaging element N, respectively.
 FIG. 2(a) is a front view of the optical element L1 as viewed from the subject side. The optical surface regions D1 and D2 of the optical element L1 are obtained by dividing the element into upper and lower halves with the optical axis V as the boundary center. In FIG. 2(a), the broken line s indicates the position of the stop S. FIG. 2(b) shows the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L, and FIG. 2(c) shows the cross-sectional shape of the lens L2 provided with the diffraction grating G in a plane parallel to the optical axis V. As shown in FIGS. 2(b) and 2(c), the first and second diffraction grating portions G1 and G2 of the diffraction grating G are provided in the regions corresponding to the optical surface regions D1 and D2 of the optical element L1. The first and second diffraction grating portions G1 and G2 are provided with pluralities of annular zones R1 and R2, respectively. Diffraction steps A1 are provided between adjacent annular zones R1, and diffraction steps A2 between adjacent annular zones R2. A plurality of diffraction steps A1 are provided, and typically each of them has the same depth d1; likewise, a plurality of diffraction steps A2 are provided in the second diffraction grating portion G2, and typically each of them has the same depth d2. The depth d1 of the diffraction steps A1 in the first diffraction grating portion G1 differs from the depth d2 of the diffraction steps A2 in the second diffraction grating portion G2. The widths of the annular zones R1 and R2 are set according to the desired optical power and the like.
 FIG. 3 is a perspective view of the arrayed optical element K. On the surface of the arrayed optical element K facing the imaging element N, a plurality of optical elements M1 that are elongated in the lateral direction are arranged in the longitudinal direction. The cross section (in the longitudinal direction) of each optical element M1 has a curved shape protruding toward the imaging element N. Thus, the arrayed optical element K has the configuration of a lenticular lens.
 As shown in FIG. 1, the arrayed optical element K is disposed in the vicinity of the focal point of the lens optical system L, at a position separated from the imaging surface Ni by a predetermined distance.
 FIG. 4(a) is an enlarged view of the arrayed optical element K and the imaging element N shown in FIG. 1, and FIG. 4(b) shows the positional relationship between the arrayed optical element K and the pixels on the imaging element N. The arrayed optical element K is arranged so that the surface on which the optical elements M1 are formed faces the imaging surface Ni. Pixels P are arranged in a matrix on the imaging surface Ni. The pixels P can be classified into pixels P1 and pixels P2. As shown in FIG. 4(a), in this embodiment no color filters corresponding to the pixels P1 and P2 are provided on the imaging surface Ni of the imaging element N.
 The pixels P1 and the pixels P2 are each arranged in rows in the lateral direction (row direction). In the longitudinal direction (column direction), the pixels P1 and P2 are arranged alternately. The arrayed optical element K is arranged so that one of its optical elements M1 corresponds to two rows of pixels, consisting of one row of pixels P1 and one row of pixels P2, on the imaging surface Ni. Microlenses Ms are provided on the imaging surface Ni so as to cover the surfaces of the pixels P1 and P2.
 The arrayed optical element K is designed so that most of the light beam that has passed through the optical surface region D1 on the optical element L1 (the light beam B1 indicated by solid lines in FIG. 1) reaches the pixels P1 on the imaging surface Ni, and most of the light beam that has passed through the optical surface region D2 (the light beam B2 indicated by broken lines in FIG. 1) reaches the pixels P2 on the imaging surface Ni. Specifically, this configuration is realized by appropriately setting parameters such as the refractive index of the arrayed optical element K, its distance from the imaging surface Ni, and the radius of curvature of the surfaces of the optical elements M1.
 When the imaging optical system is an image-side non-telecentric optical system, the angle of a ray at the focal point is determined by the position at which the ray passes through the stop and by its angle with respect to the optical axis. The arrayed optical element also has the function of distributing the outgoing direction of rays according to their incident angles. Therefore, by arranging the optical surface regions D1 and D2 in the vicinity of the stop and arranging the arrayed optical element K in the vicinity of the focal point as described above, the light beams B1 and B2 that have passed through the respective optical surface regions can be guided separately to the pixels P1 and P2. If the optical surface regions D1 and D2 are placed far from the position of the stop, the light that has passed through the optical surface region D1 and the light that has passed through the optical surface region D2 can no longer be fully separated into the pixels P1 and P2, and a large amount of crosstalk occurs. When the imaging optical system is an image-side telecentric optical system, the rays passing through the stop are parallel, so the angle of a ray at the focal point is uniquely determined by the position at which the ray passes through the stop.
 With the above configuration, the pixels P1 and the pixels P2 generate image information corresponding to light of mutually different wavelength bands. Since the light beams B1 and B2 are obtained from the subject at the same time, there is no time difference between the image information detected by the pixels P1 and P2. In other words, the imaging apparatus A can acquire a plurality of pieces of image information formed by light of mutually different wavelength bands with a single imaging optical system and a single exposure. Therefore, even when a moving object is photographed, an image having spectral information for each pixel can be obtained without color shift due to a time difference.
 Specific examples of the first wavelength band and the second wavelength band are given below.
 In one example, the first optical surface region D1 is an optical filter having the characteristic of transmitting visible light as the light of the first wavelength band and substantially blocking near-infrared light, and the second optical surface region D2 is an optical filter having the characteristic of substantially blocking visible light and transmitting near-infrared light as the light of the second wavelength band. This makes it possible to realize a day-and-night imaging apparatus or an imaging apparatus for biometric authentication. In such an imaging apparatus, when acquiring a near-infrared image, a light source having spectral radiation characteristics that include the near-infrared band may be provided.
 In another example, the first optical surface region D1 is an optical filter that transmits light of a predetermined wavelength bandwidth, and the second optical surface region D2 is an optical filter that transmits light of a bandwidth narrower than the predetermined wavelength bandwidth. In other words, the width of the second wavelength band is made narrower than the width of the first wavelength band. This makes it possible to realize an imaging apparatus for endoscopes or capsule endoscopes with which lesions can be observed in a narrow band. In this example, the second wavelength band may or may not be included in the first wavelength band. Such an imaging apparatus may include one type of light source having spectral radiation characteristics covering both the first and second wavelength bands, or a plurality of types of light sources having spectral radiation characteristics corresponding to the first and second wavelength bands, respectively. In such applications, lesions can easily be distinguished by displaying the image acquired in the wide band and the image acquired in the narrow band on a monitor in different colors.
 The optical filters arranged in the first optical surface region D1 and the second optical surface region D2 may be color filters using organic materials or color filters using dielectric multilayer films. When dielectric multilayer films are used, narrow-band spectral characteristics can be realized, and imaging that extracts color information in a narrow wavelength band becomes possible. In addition, compared with providing a dielectric multilayer film on each pixel of the imaging element, color filters made of dielectric multilayer films can be provided in the first optical surface region D1 and the second optical surface region D2 at lower cost and relatively easily. For this reason, adjusting or changing the spectral characteristics of the first optical surface region D1 and the second optical surface region D2 is also relatively easy.
 In this embodiment, every other pixel value in the y direction is missing in each separated image, so the aspect ratio of the image changes. For this reason, the missing pixel values may be generated by interpolation from the pixel values of the adjacent pixels in the y direction, or pixel values may be generated by adding the pixel values of every two pixels in the x direction.
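As a concrete illustration of this separation and restoration (the even/odd-row assignment of the pixels P1 and P2 and the use of simple row repetition in place of the interpolation mentioned above are assumptions made only for this sketch):

```python
import numpy as np

def split_and_restore(raw):
    """Split an interleaved frame into the P1 and P2 row sets and restore the aspect ratio
    by repeating each kept row (a crude stand-in for the y-direction interpolation)."""
    img_p1 = raw[0::2, :]  # rows assumed to receive light from region D1
    img_p2 = raw[1::2, :]  # rows assumed to receive light from region D2
    return np.repeat(img_p1, 2, axis=0), np.repeat(img_p2, 2, axis=0)

raw = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 sensor frame
band1, band2 = split_and_restore(raw)
print(band1.shape, band2.shape)  # (4, 4) (4, 4)
```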
 Next, the effect of the diffraction grating G formed on the stop S side of the lens L2 shown in FIG. 1 will be described.
 The purpose of forming the diffraction grating G is to correct the chromatic aberration of the optical system and to reduce the curvature of field. In general, the Abbe number νd of a diffraction grating is given by the following:

νd = λd/(λF − λC)    (Equation 1)

 Here, λd, λF, and λC are the wavelengths of the d-line, F-line, and C-line, respectively.
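 As a quick numerical check of (Equation 1), substituting the standard d-, F-, and C-line wavelengths of 587.6 nm, 486.1 nm, and 656.3 nm (values assumed here; they are not stated in the text) gives νd = 587.6/(486.1 − 656.3) ≈ −3.45. The negative value expresses the inverse dispersion of the diffraction grating referred to below.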
 (Equation 1) shows that the diffraction grating has inverse dispersion and anomalous dispersion. The order of the wavelengths of the chromatic aberration that the diffraction grating produces along the optical axis is therefore opposite to that of the chromatic aberration produced by a refractive aspherical surface, so the two can cancel each other. In other words, by combining a diffraction grating with an aspherical lens, the chromatic aberration generated by the refractive lens can be corrected.
 The diffraction grating lens also has the ability to correct field curvature. In general, the smaller the annular-zone pitch of the diffraction grating, the stronger the diffractive power. By making the zone pitch small and obtaining part of the total power by diffraction, the refractive power can be kept relatively small. Since the Petzval sum of a diffraction grating is almost zero, keeping the refractive power small reduces the Petzval sum of the entire optical system and thus reduces field curvature. Therefore, using a diffraction grating makes it possible to correct not only chromatic aberration but also field curvature.
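 A minimal way to see why the diffractive contribution to the Petzval sum vanishes (a standard thin-element argument, stated here under the usual high-index model of a diffractive surface and not spelled out in the text): for thin elements in air the Petzval sum is P = Σ φi/ni, where φi and ni are the power and refractive index of each element. A diffractive surface can be modeled as a refractive surface whose effective index tends to infinity, so its term φdiff/neff tends to zero; only the refractive power, which this design deliberately keeps small, contributes to P.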
 With a general diffraction grating, when imaging over the visible range, the diffraction efficiency is 100% only at one specific wavelength, and at other wavelengths diffracted light of unnecessary orders is generated and degrades image quality. With the configuration of this embodiment, however, an image can be obtained in which the generation of unnecessary diffracted light is reduced over the entire visible range. The reason is explained below.
 The diffraction grating G is formed on the lens surface on the stop S side. By forming the diffraction grating G on the lens surface near the stop S, the region in which the light beam B1 passing through the first optical surface region D1 crosses the diffraction grating surface can be separated from the region in which the light beam B2 passing through the second optical surface region D2 crosses it. The depth d1 of the diffraction steps on the diffraction grating G through which the light beam B1 passes is preferably set so that the diffraction efficiency of light in the first wavelength band transmitted through the optical surface region D1 is high (higher than that of light in the second wavelength band). Similarly, the depth d2 of the diffraction steps through which the light beam B2 passes is preferably set so that the diffraction efficiency of light in the second wavelength band transmitted through the optical surface region D2 is high (higher than that of light in the first wavelength band). In the diffraction grating G, the condition under which the diffraction efficiency is theoretically 100% for a light ray incident at an incident angle of 0° is expressed, using the wavelength λ, the depth d of the diffraction step, and the refractive index n, by the following equation:

d = mλ/(n − 1)    (Equation 2)

 In (Equation 2), if λ = 500 nm, m = 1, and n = 1.526, then d = 0.95 μm.
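A minimal numerical check of (Equation 2) (the helper name is arbitrary, and the single refractive index value is taken from the example above):

```python
def blaze_depth_um(wavelength_um, n, m=1):
    """Diffraction step depth d = m*lambda/(n - 1) from (Equation 2), in micrometres."""
    return m * wavelength_um / (n - 1.0)

# Reproduces the example in the text: lambda = 500 nm, m = 1, n = 1.526 -> d = 0.95 um.
print(round(blaze_depth_um(0.500, 1.526), 2))  # 0.95
```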
 The diffraction grating G diffracts the incident rays and thereby changes the wavefront. For example, under the condition that (Equation 2) is satisfied, all of the light incident on the diffraction grating G becomes m-th order diffracted light and its direction changes. When the bandwidth is wide, the value used for the wavelength λ may be a value near the center of the bandwidth, or a wavelength may be selected such that the loss due to diffracted light of unnecessary orders is reduced in a well-balanced manner over the entire bandwidth.
 For example, in the present embodiment, when the center value of the wavelength band of the light reaching the first diffraction grating portion G1 is denoted λ1, the depth d1 of the first diffraction steps satisfies the following relation:

  0.9λ1/{n(λ1) - 1} ≤ d1 ≤ 1.1λ1/{n(λ1) - 1}

 Likewise, when the center value of the wavelength band of the light reaching the second diffraction grating portion G2 is denoted λ2, the depth d2 of the second diffraction steps satisfies the following relation:

  0.9λ2/{n(λ2) - 1} ≤ d2 ≤ 1.1λ2/{n(λ2) - 1}
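These tolerance conditions are easy to verify numerically. The sketch below is a minimal check under assumed band-center wavelengths and refractive indices; the numeric values and the helper name are illustrative, not taken from the embodiment.

```python
def within_tolerance(d_m, center_wavelength_m, n_at_center, tol=0.1):
    """True if 0.9*lambda/(n-1) <= d <= 1.1*lambda/(n-1)."""
    ideal = center_wavelength_m / (n_at_center - 1.0)
    return (1.0 - tol) * ideal <= d_m <= (1.0 + tol) * ideal

# Assumed example: band center 475 nm, n(475 nm) = 1.60, candidate depth 0.76 um
print(within_tolerance(0.76e-6, 475e-9, 1.60))  # True
```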
 According to the configuration of the present embodiment, by combining diffraction gratings whose step depths d1 and d2 are chosen appropriately for the respective wavelength bands of the light beams B1 and B2, an image with a diffraction efficiency of nearly 100% can be obtained over, for example, the entire visible range. For example, with the refractive index of the lens L2 set to nd = 1.585 and its Abbe number to νd = 27.9, the wavelength bands transmitted by the optical surface regions D1 and D2 set to 400 nm to 550 nm and 550 nm to 700 nm respectively, and the diffraction step depths set to d1 = 0.76 μm and d2 = 1.06 μm, the first-order diffraction efficiencies of the light beams B1 and B2 are as shown in FIGS. 5(a) and 5(b): each is 90% or more within its wavelength band, so that the optical system as a whole achieves a diffraction efficiency of nearly 100% over the entire visible range.
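The efficiency curves referred to above can be approximated with scalar diffraction theory, in which the m-th-order efficiency of a stepped (kinoform) profile of depth d is sinc²(m − d·(n(λ) − 1)/λ). The sketch below applies this approximation to the depth quoted for the D1 band; it assumes a constant refractive index with no dispersion, so it is only indicative of the curves in FIG. 5.

```python
import numpy as np

def first_order_efficiency(wavelength_m, depth_m, n):
    """Scalar-theory 1st-order efficiency: sinc^2(1 - d*(n - 1)/lambda)."""
    phase = depth_m * (n - 1.0) / wavelength_m
    return np.sinc(1.0 - phase) ** 2  # np.sinc(x) = sin(pi*x)/(pi*x)

wavelengths = np.linspace(400e-9, 550e-9, 4)                 # band assigned to region D1
print(first_order_efficiency(wavelengths, 0.76e-6, 1.585))   # roughly 0.89 to 1.0
```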
 As a way of correcting chromatic aberration to some extent without adding a diffraction grating, it is also conceivable to make the lens L2 of FIG. 1 aspheric on both surfaces and to design the surface shape asymmetrically in each of the two regions divided with the optical axis V at the center of the boundary. By adjusting the surface shape in each region for the wavelengths of the corresponding band, the imaging positions of the regions can be made to coincide approximately. In that case, however, the chromatic aberration of the wavelengths within each band cannot be corrected sufficiently. For example, if the band of the light beam B1 is 550 nm to 700 nm, chromatic aberration remains because the imaging positions at 550 nm and 700 nm differ. A diffraction grating, by contrast, can align the imaging positions even over a band of some width by controlling the number of annular zones, and can therefore correct chromatic aberration better than an optical system consisting only of aspheric surfaces.
 Furthermore, the pitch of the annular zones of the diffraction grating G may be adjusted as appropriate in each of the regions through which the light beams B1 and B2 pass. By adjusting the zone pitch, the allocation of diffractive power can be tuned for each band, and any mismatch between the imaging positions of the two regions can be corrected.

 As described above, the configuration of Embodiment 1 makes it possible to correct chromatic aberration and increase resolution, and to obtain an image in which the diffraction efficiency is nearly 100% and flare caused by diffracted light of unwanted orders is reduced.
(Embodiment 2)
 Embodiment 2 differs from Embodiment 1 in that the optical element L1 is divided into three regions, in that the plurality of diffraction steps of the diffraction grating G have three different depths, one per region, and in that the imaging element includes three types of pixels. To this end, the optical element L1 further has an optical region D3 that selectively transmits light in a third wavelength band different from the first and second wavelength bands, the diffraction grating G further has a third diffraction step G3, and the imaging element further includes a plurality of pixels P3. Detailed description of matters common to Embodiment 1 is omitted here.
 FIG. 6 is a schematic diagram showing the imaging apparatus A of Embodiment 2. In FIG. 6, the light beam B1 passes through the optical surface region D1 on the optical element L1, the light beam B2 passes through the optical surface region D2 on the optical element L1, and the light beam B3 passes through the optical surface region D3 on the optical element L1. The light beams B1, B2 and B3 pass through the stop S, the optical element L1, the lens L2 and the arrayed optical element K in this order, and reach the imaging surface Ni on the imaging element N (shown in FIG. 8 and elsewhere).
 FIG. 7(a) is a front view of the optical element L1 seen from the subject side. The optical surface regions D1, D2 and D3 are obtained by dividing the optical element L1 into three parts in the vertical direction within a plane perpendicular to the optical axis V, and the wavelength bands of the light they transmit differ from one another. FIG. 7(b) shows the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L. As shown in FIG. 7(b), first to third diffraction grating portions G1 to G3 are provided in the regions of the diffraction grating G corresponding to the optical surface regions D1 to D3 of the optical element L1. Each of the diffraction grating portions G1 to G3 has a plurality of annular zones R1 to R3, and diffraction steps A1, A2 and A3 are provided between adjacent zones R1, between adjacent zones R2 and between adjacent zones R3, respectively. A plurality of each of the diffraction steps A1 to A3 are provided, and typically the diffraction steps A1, A2 and A3 have the same depths d1, d2 and d3, respectively. The depth d1 of the steps A1 in the first diffraction grating portion G1, the depth d2 of the steps A2 in the second diffraction grating portion G2, and the depth d3 of the steps A3 in the third diffraction grating portion G3 differ from one another. The widths of the annular zones R1, R2 and R3 are set according to the desired optical power and the like.
 FIG. 8(a) is an enlarged view of the arrayed optical element K and the imaging element N shown in FIG. 6, and FIG. 8(b) shows the positional relationship between the arrayed optical element K and the pixels on the imaging element N. The arrayed optical element K is arranged so that the surface on which the optical elements M1 are formed faces the imaging surface Ni. Pixels P are arranged in a matrix on the imaging surface Ni and can be classified into pixels P1, pixels P2 and pixels P3. As shown in FIG. 8(a), in this embodiment no color filters corresponding to the pixels P1, P2 and P3 are provided on the imaging surface Ni of the imaging element N.

 The pixels P1, the pixels P2 and the pixels P3 are each arranged in rows in the horizontal direction (row direction), and in the vertical direction (column direction) the pixels P1, P2 and P3 are arranged repeatedly. The arrayed optical element K is arranged so that one of its optical elements M1 corresponds to three rows of pixels on the imaging surface Ni, namely one row of pixels P1, one row of pixels P2 and one row of pixels P3. Microlenses Ms are provided on the imaging surface Ni so as to cover the surfaces of the pixels P1, P2 and P3.

 The arrayed optical element K is designed so that most of the light beam B1 (shown by the dotted line in FIG. 6) that has passed through the optical surface region D1 on the optical element L1 (shown in FIGS. 6 and 7) reaches the pixels P1 on the imaging surface Ni, most of the light beam B2 (shown by the solid line in FIG. 6) that has passed through the optical surface region D2 reaches the pixels P2, and most of the light beam B3 (shown by the broken line in FIG. 6) that has passed through the optical surface region D3 reaches the pixels P3. Specifically, this is achieved by appropriately setting parameters such as the refractive index of the arrayed optical element K, its distance from the imaging surface Ni, and the radius of curvature of the surfaces of the optical elements M1.
 With the above configuration, the pixels P1, P2 and P3 each generate image information corresponding to light in mutually different wavelength bands. In other words, the imaging apparatus A can acquire a plurality of items of image information formed by light of different wavelength bands with a single imaging optical system and in a single exposure.

 Whereas Embodiment 1 acquires images of two wavelength bands simultaneously, Embodiment 2 can acquire images of three wavelength bands simultaneously. Here, "simultaneously" means that images formed by light of the three wavelength bands obtained from the subject at the same time can be acquired; the image signals of the three wavelength bands need not be generated at the same time.

 Specific examples of the three wavelength bands are given below.

 In one example, the first optical surface region D1 is a blue color filter that transmits light in the blue band and substantially blocks light of other bands, the second optical surface region D2 is a green color filter that transmits light in the green band and substantially blocks light of other bands, and the third optical surface region D3 is a red color filter that transmits light in the red band and substantially blocks light of other bands. This realizes an imaging apparatus that can acquire a full-color image using a monochrome imaging element. The filters are not limited to the primary-color filters described above; complementary-color filters (cyan, magenta, yellow) can also be used. Furthermore, by using dielectric multilayer films as these color filters, images with better color reproducibility than with organic filters can be obtained.
 In this embodiment, two out of every three pixel values in the y direction are missing in each sub-image. The missing pixel values may therefore be generated by interpolation from the pixel values of pixels adjacent in the y direction, or the image may be generated by adding the pixel values of every three pixels in the x direction.
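As a minimal sketch of the two options just mentioned, assume the raw frame is a NumPy array in which the rows of pixels P1, P2 and P3 repeat every three rows; the array layout and the function names are illustrative only.

```python
import numpy as np

def extract_band(raw, offset):
    """Keep every third row belonging to one pixel type (P1: offset 0, P2: 1, P3: 2)."""
    return raw[offset::3, :]

def fill_missing_rows(band, factor=3):
    """Option 1: fill the missing rows from vertically adjacent rows (nearest neighbour)."""
    return np.repeat(band, factor, axis=0)

def bin_columns(band, factor=3):
    """Option 2: add every three horizontal pixels so the aspect ratio is restored."""
    h, w = band.shape
    return band[:, : w - w % factor].reshape(h, -1, factor).sum(axis=2)
```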
 Alternatively, each pixel of the imaging element may be given an aspect ratio of 3:1 in the x and y directions. With such a configuration, the interpolation or addition processing described above becomes unnecessary.
 Next, the diffraction grating G is described in detail. The diffraction grating G is formed on the lens surface on the stop S side. By forming the diffraction grating G on a lens surface near the stop S, the region of the grating surface through which the light beam B1 passing through the first optical surface region D1 travels, the region through which the light beam B2 passing through the second optical surface region D2 travels, and the region through which the light beam B3 passing through the third optical surface region D3 travels can be separated from one another. The depth d1 of the diffraction steps through which the light beam B1 passes is preferably set so that the diffraction efficiency for light in the first wavelength band transmitted by the optical surface region D1 is high. Similarly, the depth d2 of the steps through which the light beam B2 passes is preferably set so that the diffraction efficiency for light in the second wavelength band transmitted by the optical surface region D2 is high, and the depth d3 of the steps through which the light beam B3 passes is preferably set so that the diffraction efficiency for light in the third wavelength band transmitted by the optical surface region D3 is high.

 According to the configuration of the present embodiment, by combining diffraction gratings whose step depths d1, d2 and d3 are chosen appropriately for the respective wavelength bands of the light beams B1, B2 and B3, an image with a diffraction efficiency of nearly 100% can be obtained over, for example, the entire visible range. For example, with the refractive index of the lens L2 set to nd = 1.585 and its Abbe number to νd = 27.9, the wavelength bands transmitted by the optical surface regions D1, D2 and D3 set to 400 nm to 500 nm, 500 nm to 600 nm and 600 nm to 700 nm respectively, and the diffraction step depths set to d1 = 0.73 μm, d2 = 0.93 μm and d3 = 1.12 μm, the first-order diffraction efficiencies of the light beams B1, B2 and B3 are as shown in FIGS. 9(a), (b) and (c): each is 90% or more within its wavelength band, so that the optical system as a whole achieves a diffraction efficiency of nearly 100% over the entire visible range.
(Embodiment 3)
 Embodiment 3 differs from Embodiment 1 in that the optical element L1 of FIG. 1 is divided into four regions, in that the plurality of diffraction steps of the diffraction grating G have four different depths, one per region, in that the imaging element includes four types of pixels, and further in that the arrayed optical element is changed from a lenticular lens to a microlens array. To this end, the optical element L1 further has an optical region D3 that selectively transmits light in a third wavelength band different from the first and second wavelength bands and an optical region D4 that selectively transmits light in a fourth wavelength band different from the first, second and third wavelength bands. The diffraction grating G further has third and fourth diffraction steps G3 and G4, and the imaging element further includes a plurality of pixels P3 and a plurality of pixels P4. Detailed description of matters common to Embodiment 1 is omitted here.
 FIG. 10(a) is a front view of the optical element L1 seen from the subject side. The optical surface regions D1, D2, D3 and D4 are obtained by dividing the optical element L1 into four parts, vertically and horizontally, within a plane perpendicular to the optical axis V, with the optical axis V at the center of the boundaries; the wavelength bands of the light they transmit differ from one another. FIG. 10(b) shows the shape of the diffraction grating G in a plane perpendicular to the optical axis V of the lens optical system L. As shown in FIG. 10(b), first to fourth diffraction grating portions G1 to G4 are provided in the regions of the diffraction grating G corresponding to the optical surface regions D1 to D4 of the optical element L1. Each of the diffraction grating portions G1 to G4 has a plurality of annular zones R1 to R4, and diffraction steps A1, A2, A3 and A4 are provided between adjacent zones R1, between adjacent zones R2, between adjacent zones R3 and between adjacent zones R4, respectively. A plurality of each of the diffraction steps A1 to A4 are provided, and typically the diffraction steps A1, A2, A3 and A4 have the same depths d1, d2, d3 and d4, respectively, where the depths d1, d2, d3 and d4 differ from one another. The widths of the annular zones R1, R2, R3 and R4 are set according to the desired optical power and the like.

 FIG. 11 is a perspective view of the arrayed optical element K. Optical elements M2 are arranged in a grid on the surface of the arrayed optical element K facing the imaging element N. Each optical element M2 has a curved cross section in both the vertical and horizontal directions and protrudes toward the imaging element N. Each optical element M2 is thus a microlens, and the arrayed optical element K is a microlens array.

 FIG. 12(a) is an enlarged view of the arrayed optical element K and the imaging element N, and FIG. 12(b) shows the positional relationship between the arrayed optical element K and the pixels on the imaging element N. The arrayed optical element K is arranged so that the surface on which the optical elements M2 are formed faces the imaging surface Ni. Pixels P are arranged in a matrix on the imaging surface Ni and can be classified into pixels P1, P2, P3 and P4. As shown in FIG. 12(a), in this embodiment no color filters corresponding to the pixels P1, P2, P3 and P4 are provided on the imaging surface Ni of the imaging element N.
 As in Embodiment 1, the arrayed optical element K is disposed near the focal point of the lens optical system L, at a position separated from the imaging surface Ni by a predetermined distance. Microlenses Ms are provided on the imaging surface Ni so as to cover the surfaces of the pixels P1, P2, P3 and P4.

 The arrayed optical element K is arranged so that the surface on which the optical elements M2 are formed faces the imaging surface Ni, and so that one of its optical elements M2 corresponds to the four pixels P1 to P4 in two rows and two columns on the imaging surface Ni.

 The arrayed optical element K is designed so that most of the light beams that have passed through the optical surface regions D1, D2, D3 and D4 on the optical element L1 reach the pixels P1, P2, P3 and P4 on the imaging surface Ni, respectively. Specifically, this is achieved by appropriately setting parameters such as the refractive index of the arrayed optical element K, its distance from the imaging surface Ni, and the radius of curvature of the surfaces of the optical elements M2.

 With the above configuration, the pixels P1, P2, P3 and P4 each generate image information corresponding to light in mutually different wavelength bands. In other words, the imaging apparatus A can acquire a plurality of items of image information formed by light of different wavelength bands with a single imaging optical system and in a single exposure.

 Whereas Embodiments 1 and 2 acquire images of two and three wavelength bands simultaneously, Embodiment 3 can acquire images of four wavelength bands simultaneously.
 Specific examples of the four wavelength bands are given below.

 In one example, in addition to the blue, green and red color filters described in Embodiment 2, a near-infrared filter is provided that substantially blocks visible light including the blue, green and red bands and transmits near-infrared light. This realizes an imaging apparatus usable both day and night, or an imaging apparatus for biometric authentication. When such an imaging apparatus acquires a near-infrared image, it may be provided with a light source whose spectral radiation characteristics include the near-infrared band.

 In another example, in addition to the blue, green and red color filters described in Embodiment 2, a filter is provided that transmits only a wavelength band narrower than each of the bandwidths in the spectral transmittance characteristics of the blue, green and red color filters. This realizes an imaging apparatus for endoscopes or capsule endoscopes that can observe lesions in a narrow band. The narrow band may or may not be contained in one of the blue, green and red filter bands. Such an imaging apparatus may be provided with a single light source whose spectral radiation characteristics include the blue, green, red and narrow bands, or with plural types of light sources whose spectral radiation characteristics correspond respectively to the blue, green, red and narrow bands. A light source combining a white light source with a light source whose spectral radiation characteristics include the narrow band may also be used.

 In this embodiment, every other pixel value is missing in each of the x and y directions of each sub-image, so the missing pixel values may be generated by interpolation from the pixel values of pixels adjacent in the x and y directions.

 Alternatively, two of the four divided regions facing each other across the optical axis may be given the same green color filter. With such a configuration the number of green pixels increases, so the resolution of the green image component can be improved.
 Next, the diffraction grating G is described in detail. The diffraction grating G is formed on the lens surface on the stop S side. By forming the diffraction grating G on a lens surface near the stop S, the region of the grating surface through which the light beam B1 passing through the first optical surface region D1 travels, the region through which the light beam B2 passing through the second optical surface region D2 travels, the region through which the light beam B3 passing through the third optical surface region D3 travels, and the region through which the light beam B4 passing through the fourth optical surface region D4 travels can be separated from one another. The depth d1 of the diffraction steps through which the light beam B1 passes is preferably set so that the diffraction efficiency for light in the first wavelength band transmitted by the optical surface region D1 is high; similarly, the depths d2, d3 and d4 of the steps through which the light beams B2, B3 and B4 pass are preferably set so that the diffraction efficiencies for light in the second, third and fourth wavelength bands transmitted by the optical surface regions D2, D3 and D4, respectively, are high.

 According to the configuration of the present embodiment, by combining diffraction gratings whose step depths d1, d2, d3 and d4 are chosen appropriately for the respective wavelength bands of the light beams B1, B2, B3 and B4, an image with a diffraction efficiency of nearly 100% can be obtained over, for example, the entire visible and infrared ranges.
(Embodiment 4)
 Embodiment 4 differs from Embodiment 1 in that each of the first and second regions is divided into parts arranged on opposite sides of the optical axis, and in that the arrayed optical element is changed from a lenticular lens to a microlens array. Detailed description of matters common to Embodiment 1 is omitted here.
 FIG. 13(a) is a front view of the optical element L1 seen from the subject side; the optical surface regions D1 and D2 are each divided into two parts that are separated from each other and arranged symmetrically about the optical axis V. FIG. 13(b) shows the positional relationship between the arrayed optical element K and the pixels on the imaging element N. In Embodiment 4, the rays that have passed through the optical surface region D1 reach the pixels in odd rows and odd columns and in even rows and even columns, so an image is generated by adding the odd-row odd-column pixels and the even-row even-column pixels; the rays that have passed through the optical surface region D2 reach the pixels in even rows and odd columns and in odd rows and even columns, so an image is generated by adding the even-row odd-column pixels and the odd-row even-column pixels.
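The pixel regrouping described here can be expressed compactly. The sketch below assumes the raw frame is a NumPy array with even height and width in which the microlens array assigns a 2 × 2 pixel group per optical element, and uses 0-based indices, so "odd rows and odd columns" in the text corresponds to indices [0::2, 0::2]; the function name is illustrative.

```python
import numpy as np

def split_by_region(raw):
    """Separate a raw frame of embodiment 4 into the two region images.

    Image for D1: (odd row, odd col) + (even row, even col) pixels.
    Image for D2: (even row, odd col) + (odd row, even col) pixels.
    Assumes raw has even height and width.
    """
    img_d1 = raw[0::2, 0::2] + raw[1::2, 1::2]
    img_d2 = raw[1::2, 0::2] + raw[0::2, 1::2]
    return img_d1, img_d2
```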
 In Embodiment 1, the optical element is divided into the first optical surface region D1 and the second optical surface region D2, each of which is a single semicircular region. For this reason, the centroid of the spot formed on the image plane by light passing through each optical surface region changes with the subject distance, and parallax can arise.

 FIG. 14 explains the ray diagrams for different subject distances and the resulting point images and their centroids in Embodiment 1. In FIG. 14, (a1), (b1) and (c1) are ray diagrams for the respective subject distances, O is an object point, and the other symbols are the same as in FIG. 1. FIGS. 14(a2) and (a3), (b2) and (b3), and (c2) and (c3) show the point images (drawn as semicircles) captured through the lenticular lens and their centroids (drawn as black dots), corresponding to the subject distances of FIGS. 14(a1), (b1) and (c1), respectively.

 Each point image is shown schematically as the image information extracted from the odd columns (a2, b2, c2) or from the even columns (a3, b3, c3), stretched twofold in the Y direction by interpolation. As illustrated, the spot diameter increases as the object point O approaches, and because each point image is semicircular, when the acquired image is separated into an odd-column image and an even-column image, the distance d between the centroids of the point images of the two images increases as the object point approaches. This centroid separation d appears as parallax and is therefore undesirable.

 In Embodiment 4, by contrast, the optical surface regions D1 and D2 are each divided into parts arranged symmetrically about the optical axis, so the centroid separation d of the point images does not change even when the subject distance changes.
 FIG. 15 explains the point images and their centroids for different subject distances in Embodiment 4. In FIG. 15, (a1) and (a2), (b1) and (b2), and (c1) and (c2) show the point images (drawn as semicircles) captured through the microlenses and their centroids (black dots), corresponding to the subject distances of FIGS. 14(a1), (b1) and (c1), respectively.

 Each point image is shown schematically as the image information obtained by adding the odd-row odd-column and even-row even-column pixels (a1, b1, c1) and as the image information obtained by adding the even-row odd-column and odd-row even-column pixels (a2, b2, c2). As illustrated, in Embodiment 4 each point image consists of two opposing circular arcs, so when the acquired image is separated into these two images, the distance d between the centroids of their point images does not change even when the subject distance changes.

 Thus, in Embodiment 4, by dividing each of the first and second regions into parts arranged on opposite sides of the optical axis, it is possible to prevent parallax from arising between the acquired images even when the subject distance changes.
(Embodiment 5)
 Embodiment 5 differs from Embodiments 1, 2 and 3 in that, instead of providing the diffraction grating G on the lens L2, the diffraction grating G is provided on the surface of the optical element L1 that faces the lens L2. Detailed description of matters common to Embodiments 1, 2 and 3 is omitted here.
 FIG. 16 is a schematic diagram showing the imaging apparatus A of Embodiment 5. Providing the diffraction grating G on the surface of the flat optical element L1 that faces the lens L2 makes manufacture easier than providing it on an aspheric surface. When the diffraction grating G is provided on the optical element L1, it may be formed by processing the surface of the optical element L1 by semiconductor processes such as photolithography and etching; alternatively, the diffraction grating G can be formed on the surface of the optical element L1 with an electron-beam drawing apparatus (EB lithography or the like).

 Although in FIG. 16 the diffraction grating G is provided on the surface of the optical element L1 facing the lens L2, it may instead be provided on the subject-side surface of the optical element L1.
(Embodiment 6)
 Embodiment 6 differs from Embodiments 1, 2 and 3 in that a lens L3 is added in addition to the lens L2. Detailed description of matters common to Embodiments 1, 2 and 3 is omitted here.

 FIG. 17 is a schematic diagram showing the imaging apparatus A of Embodiment 6. Adding the lens L3 further reduces the aberrations generated in the optical system, so an optical system of higher resolution can be realized.
(Embodiment 7)
 Embodiment 7 differs from Embodiments 1, 2 and 3 in that, in addition to the lens L2, a lens L3 on which a diffraction grating GA is formed is added. Detailed description of matters common to Embodiments 1, 2 and 3 is omitted here.

 FIG. 18 is a schematic diagram showing the imaging apparatus A of Embodiment 7. The lens L3 carrying the diffraction grating GA is added near the stop S, and the optical element L1 is disposed between the lens L3 and the lens L2. The surface carrying the diffraction grating GA is preferably the lens surface on the stop S side. By forming the diffraction grating on a lens surface near the stop S, the region of the grating surface through which the light beam B1 passing through the first optical surface region D1 travels, the region through which the light beam B2 passing through the second optical surface region D2 travels, and the region through which the light beam B3 passing through the third optical surface region D3 travels can be separated from one another. Although not shown, the diffraction grating GA has first to third diffraction grating portions G1 to G3 in the same way as the diffraction grating G.

 Providing two diffraction grating surfaces in the optical system allows the power allocation of the diffraction gratings to be fine-tuned further, so chromatic aberration can be reduced still more.

 In this embodiment, only the diffraction grating GA may be provided, without providing the diffraction grating G.
(Embodiment 8)
 Embodiment 8 differs from Embodiments 1, 2 and 3 in that an optical adjustment layer F1 is added so as to cover the surface of the diffraction grating G. Detailed description of matters common to Embodiments 1, 2 and 3 is omitted here.
 FIG. 19 is a schematic diagram showing the imaging apparatus A of Embodiment 8. In Embodiment 8, when the refractive index of the lens L2 is n1(λ) and the refractive index of the optical adjustment layer F1 is n2(λ), the depth d of the diffraction steps is determined so as to satisfy the following equation:

  d = mλ/|n2(λ) - n1(λ)|    (Equation 3)
 By setting the diffraction step depth d so as to satisfy (Equation 3), diffracted light of unwanted orders can be reduced even further than in Embodiments 1, 2 and 3. For example, with the refractive index of the lens L2 set to nd = 1.585 and its Abbe number to νd = 27.9, the refractive index of the optical adjustment layer F1 set to nd = 1.623 and its Abbe number to νd = 40, the wavelength bands transmitted by the optical surface regions D1, D2 and D3 set to 400 nm to 500 nm, 500 nm to 600 nm and 600 nm to 700 nm respectively, and the diffraction step depths set to d1 = 14.0 μm, d2 = 15.0 μm and d3 = 16.5 μm, the first-order diffraction efficiencies of the light beams B1, B2 and B3 are as shown in FIGS. 20(a), (b) and (c): each is nearly 100% within its wavelength band, so that a diffraction efficiency of nearly 100% can be secured over the entire visible range for the optical system.
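The step depths quoted above follow from (Equation 3) in the same way as from (Equation 2), except that the small index difference between the lens and the optical adjustment layer appears in the denominator, which is why the depths are on the order of 14 to 16 μm. A minimal sketch follows; the function name is illustrative, and constant d-line indices are assumed, whereas the values in the text use the wavelength-dependent indices.

```python
def step_depth_with_layer(wavelength_m, n_lens, n_layer, order=1):
    """Diffraction step depth d = m * lambda / |n2 - n1| from (Equation 3)."""
    return order * wavelength_m / abs(n_layer - n_lens)

# lambda = 550 nm, n1 = 1.585, n2 = 1.623  ->  roughly 14.5 um
print(step_depth_with_layer(550e-9, 1.585, 1.623) * 1e6)
```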
(Embodiment 9)
 Embodiment 9 differs from Embodiments 1 to 8 in that the lenticular lens or microlens array is formed on the imaging surface. Detailed description of matters common to Embodiments 1 to 8 is omitted here.

 FIGS. 21(a) and (b) are enlarged views of the arrayed optical element K and the imaging element N. In this embodiment a lenticular lens (or microlens array) Md is formed on the imaging surface Ni of the imaging element N. As in Embodiment 1 and the other embodiments, pixels P are arranged in a matrix on the imaging surface Ni, and one lenticular optical element or one microlens corresponds to a plurality of these pixels P. In this embodiment too, as in Embodiments 1 to 8, light beams that have passed through different regions on the optical element L1 can be guided to different pixels. FIG. 21(b) shows a modification of this embodiment: microlenses Ms are formed on the imaging surface Ni so as to cover the pixels P, and the arrayed optical element is stacked on the surfaces of the microlenses Ms. The configuration of FIG. 21(b) can achieve higher light-collection efficiency than that of FIG. 21(a).

 When the arrayed optical element is separate from the imaging element as in Embodiments 1 to 8, it is difficult to align the arrayed optical element with the imaging element. By forming the arrayed optical element K on the imaging element as in Embodiment 9, the alignment can be performed in a wafer process, so alignment becomes easier and the alignment accuracy can be increased.
(Other Embodiments)
 In the embodiments described above, examples in which each optical surface is divided into two, three or four regions have been described, but a larger number of divisions may be used.
 Many of the lenses used with imaging elements such as cameras are non-telecentric on the image side. When an image-side non-telecentric optical system is used as the lens optical system L of the present embodiments, the chief rays enter the arrayed optical element K obliquely as the angle of view changes. FIG. 22(a) is an enlarged view of the vicinity of the imaging section away from the optical axis, showing only the light beam that passes through one optical surface region among the light passing through the arrayed optical element K. As shown in FIG. 22(a), when the lens optical system L is non-telecentric on the image side, light easily leaks into adjacent pixels and crosstalk occurs; however, as shown in FIG. 22(b), crosstalk can be reduced by offsetting the arrayed optical element by Δ with respect to the pixel array, so deterioration of color purity can be suppressed. Since the incidence angle varies with the image height, the offset amount Δ may be set according to the incidence angle of the light beam on the imaging surface.
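One simple way to see how Δ might be chosen (this geometric estimate is an assumption, not something stated in the embodiment) is to take Δ as roughly the lateral displacement of the chief ray between the arrayed optical element and the pixel plane, g·tanθ for a gap g and incidence angle θ:

```python
import math

def microlens_offset(gap_m, incidence_deg):
    """Approximate offset: delta = gap * tan(theta). Illustrative estimate only."""
    return gap_m * math.tan(math.radians(incidence_deg))

# Assumed example: 20 um gap, 15 degree chief-ray angle  ->  about 5.4 um
print(microlens_offset(20e-6, 15.0) * 1e6)
```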
 An image-side telecentric optical system may also be used as the lens optical system L of the present embodiments. In an image-side telecentric optical system, the chief rays enter the arrayed optical element K at angles close to 0° even when the angle of view changes, so crosstalk can be reduced over the entire imaging region. FIG. 23 is a schematic diagram showing the imaging apparatus A when an image-side telecentric optical system is used; the lens L3 is adjusted so that the chief rays enter the arrayed optical element K at angles close to 0° even when the angle of view changes.

 In Embodiments 3 and 4 the arrayed optical element K is a microlens array; each optical element of the microlens array may have a rotationally symmetric shape.

 One method of fabricating microlenses is to pattern a resist into rectangles and form the lens curvature by heat treatment; a perspective view of such a microlens is shown in FIG. 24(a1). The contour lines of this microlens are as shown in FIG. 24(a2), so the radii of curvature differ between the vertical/horizontal directions and the diagonal directions. FIG. 24(a3) shows the result of a ray-tracing simulation in which the microlens of FIGS. 24(a1) and (a2) is applied to the arrayed optical element of the present embodiments; only the light beam passing through one optical surface region among the light passing through the arrayed optical element K is shown. With such a rotationally asymmetric microlens, light leaks into adjacent pixels and crosstalk occurs. A perspective view of a rotationally symmetric microlens is shown in FIG. 24(b1); its contour lines, shown in FIG. 24(b2), have equal radii of curvature in the vertical/horizontal and diagonal directions. Such a rotationally symmetric microlens can be formed by thermal imprinting or UV imprinting. FIG. 24(b3) shows the ray-tracing result when the microlens of FIGS. 24(b1) and (b2) is applied to the arrayed optical element of the present embodiments; again only the light beam passing through one optical surface region is shown, and the crosstalk seen in FIG. 24(a3) does not occur. Making each optical element of the microlens array rotationally symmetric in this way reduces crosstalk and thus suppresses deterioration of color purity.

 In addition, to reduce crosstalk between the light beams B1 and B2, the stop S desirably has a light-shielding region at the position corresponding to the boundary between the regions.
 As a method of manufacturing the lens L2, it is preferable to form, on a mold 11, the relief pattern 12 for forming the diffraction grating G using an electron-beam drawing apparatus 10, as shown in FIG. 25(a). With an electron-beam drawing apparatus, structures that are not rotationally symmetric can also be formed easily. After the mold has been made, the lens L2 can be mass-produced by injection molding in the case of a resin material or press molding in the case of a glass material, as shown in FIG. 25(b). As another processing method, as shown in FIG. 26, molds 13A to 13D may be formed separately for the respective regions and joined together to form a single mold.

 Embodiments 1 to 9 include the signal processing section C, but the imaging apparatus need not include this signal processing section; in that case, the processing performed by the signal processing section C may be carried out using a PC or the like outside the imaging apparatus. That is, according to one aspect of the present invention, a system comprising an imaging apparatus that includes the lens optical system L, the arrayed optical element K and the imaging element N, together with an external signal processing device, can also be realized.

 The imaging apparatus disclosed in the present application is useful as an imaging apparatus such as a digital still camera or a digital video camera. It can also be applied to vehicle-mounted cameras, security cameras, medical cameras such as endoscopes and capsule endoscopes, and cameras for spectral image acquisition for biometric authentication, microscopes, astronomical telescopes and the like.
A    imaging apparatus
L    lens optical system
L1    optical element
L2    lens
D1, D2, D3, D4    optical surface regions
S    stop
K    arrayed optical element
N    imaging element
Ni    imaging surface
Ms, Md    microlenses on the imaging element
M1, M2    optical elements of the arrayed optical element
P1, P2, P3, P4, P    pixels on the imaging element
C    signal processing section
G, GA    diffraction gratings
G1, G2, G3, G4    diffraction grating portions
R1, R2, R3, R4    annular zones
A1, A2, A3, A4    diffraction steps
d1, d2, d3, d4    diffraction step depths
B1, B2, B3    light beams
F1    optical adjustment layer
10    electron-beam drawing apparatus
11    mold
12    relief pattern for forming the diffraction grating G
13A, 13B, 13C, 13D    molds

Claims (19)

  1.  An imaging apparatus comprising:
     an optical member having, in a predetermined plane, at least a first region that transmits light in a first wavelength band and a second region that transmits light in a second wavelength band different from the first wavelength band;
     a diffraction grating having a first diffraction step provided in a region on which light transmitted through the first region is incident, and a second diffraction step provided in a region on which light transmitted through the second region is incident and having a depth different from that of the first diffraction step;
     an imaging element having at least a plurality of first pixels and a plurality of second pixels; and
     an arrayed optical element which is disposed between the diffraction grating and the imaging element and which causes the light transmitted through the first region to enter the plurality of first pixels and the light transmitted through the second region to enter the plurality of second pixels.
  2.  The imaging apparatus of claim 1, wherein
     the optical member further has at least a third region other than the first and second regions,
     the diffraction grating further has a third diffraction step provided in a region on which light transmitted through the third region is incident and having a depth different from those of the first and second diffraction steps,
     the imaging element further has a plurality of third pixels,
     the third region transmits light in a third wavelength band different from the first wavelength band and the second wavelength band, and
     the arrayed optical element causes the light transmitted through the third region to enter the plurality of third pixels.
  3.  The imaging apparatus according to claim 2, wherein
     the optical member further has a fourth region other than the first, second and third regions,
     the diffraction grating further has a fourth diffraction step provided in a region on which light transmitted through the fourth region is incident, the fourth diffraction step having a depth different from those of the first, second and third diffraction steps,
     the imaging element further has a plurality of fourth pixels,
     the fourth region transmits light in a fourth wavelength band different from the first, second and third wavelength bands, and
     the arrayed optical element causes light transmitted through the fourth region to be incident on the plurality of fourth pixels.
  4.  The imaging apparatus according to any one of claims 1 to 3, further comprising a first lens that is provided between the optical member and the arrayed optical element and on which light having passed through the optical member is incident.
  5.  The imaging apparatus according to claim 4, wherein the diffraction grating is provided on a surface of the first lens that faces the optical member.
  6.  The imaging apparatus according to claim 4, wherein the diffraction grating is provided on a surface of the optical member that faces the first lens.
  7.  The imaging apparatus according to claim 4, wherein the first and second regions of the optical member are each composed of a plurality of regions separated from each other across the optical axis of the first lens.
  8.  The imaging apparatus according to any one of claims 1 to 7, wherein at least one of the first region and the second region has a spectral transmittance characteristic of transmitting light in a near-infrared wavelength band and blocking light at wavelengths in the visible region.
  9.  The imaging apparatus according to any one of claims 1 to 7, wherein at least one of the first region and the second region has a spectral transmittance characteristic of transmitting light of a wavelength bandwidth narrower than the wavelength bandwidth of the other region.
  10.  The imaging apparatus according to claim 4, wherein the first region and the second region are regions divided with the optical axis of the first lens as the center of the boundary between them.
  11.  The imaging apparatus according to claim 4, further comprising a second lens,
     wherein the optical member is disposed between the second lens and the first lens, and
     a diffraction grating is provided on a surface of the second lens.
  12.  The imaging apparatus according to any one of claims 1 to 11, further comprising an optical adjustment layer formed on the surface on which the diffraction grating is provided.
  13.  The imaging apparatus according to any one of claims 1 to 12, further comprising a light-shielding region provided at a position corresponding to the boundary between the first region and the second region.
  14.  The imaging apparatus according to any one of claims 1 to 13, further comprising a stop, wherein the first region and the second region are disposed in the vicinity of the stop.
  15.  The imaging apparatus according to claim 14, wherein the diffraction grating is disposed in the vicinity of the stop.
  16.  The imaging apparatus according to any one of claims 1 to 15, wherein,
     where λ1 is the central value of the wavelength bandwidth of the light reaching the first diffraction step, the depth d1 of the first diffraction step satisfies
     0.9λ1/{n(λ1)-1} ≤ d1 ≤ 1.1λ1/{n(λ1)-1}, and,
     where λ2 is the central value of the wavelength bandwidth of the light reaching the second diffraction step, the depth d2 of the second diffraction step satisfies
     0.9λ2/{n(λ2)-1} ≤ d2 ≤ 1.1λ2/{n(λ2)-1}.
  17.  The imaging apparatus according to any one of claims 1 to 15, wherein
     the diffraction efficiency for light in the first wavelength band at the first diffraction step is higher than the diffraction efficiency for light in the second wavelength band at the first diffraction step, and
     the diffraction efficiency for light in the second wavelength band at the second diffraction step is higher than the diffraction efficiency for light in the first wavelength band at the second diffraction step.
  18.  The imaging apparatus according to any one of claims 1 to 17, further comprising a signal processing unit,
     wherein the signal processing unit generates first image information corresponding to the first wavelength band from pixel values obtained at the plurality of first pixels, and generates second image information corresponding to the second wavelength band from pixel values obtained at the plurality of second pixels.
  19.  An imaging system comprising:
     the imaging apparatus according to any one of claims 1 to 17; and
     a signal processing device that generates first image information corresponding to the first wavelength band from pixel values obtained at the plurality of first pixels of the imaging apparatus, and generates second image information corresponding to the second wavelength band from pixel values obtained at the plurality of second pixels.
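
As a non-authoritative illustration of the depth condition recited in claim 16, the short sketch below evaluates the nominal step depth λ/(n(λ)-1) and the claimed 0.9x-1.1x tolerance band, assuming that n(λ) denotes the refractive index of the diffraction grating material at wavelength λ. The center wavelengths and refractive-index values are example assumptions and are not taken from the present disclosure.

# Minimal sketch of the claim-16 depth condition (illustrative values only).

def depth_bounds(center_wavelength_nm: float, refractive_index: float):
    """Return (lower, nominal, upper) diffraction-step depths in nm.

    The nominal depth is lambda / (n(lambda) - 1); claim 16 allows
    depths between 0.9x and 1.1x of this value.
    """
    nominal = center_wavelength_nm / (refractive_index - 1.0)
    return 0.9 * nominal, nominal, 1.1 * nominal

# Assumed example values (not from the source document):
# a visible band centered at 550 nm and a near-infrared band at 850 nm.
for label, wl, n in [("band 1", 550.0, 1.52), ("band 2", 850.0, 1.51)]:
    lo, nom, hi = depth_bounds(wl, n)
    print(f"{label}: d in [{lo:.0f} nm, {hi:.0f} nm], nominal {nom:.0f} nm")

Under these assumed values the first step depth would fall near 1.06 µm and the second near 1.67 µm, consistent with the general point that a longer center wavelength calls for a deeper step.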
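
Claims 18 and 19 recite signal processing that forms first and second image information from the pixel values of the first and second pixel groups. A minimal sketch of that demultiplexing step follows; it assumes, purely for illustration, that the arrayed optical element directs light from the first region onto even sensor rows and light from the second region onto odd sensor rows, an arrangement not specified by the claims themselves.

# Minimal sketch of the per-band image formation in claims 18/19.
# Assumption (illustrative only): light through the first region reaches
# even sensor rows and light through the second region reaches odd rows.
import numpy as np

def split_band_images(raw: np.ndarray):
    """Split a raw sensor frame into first- and second-band images.

    raw: 2-D array of pixel values read from the imaging element.
    Returns (first_band_image, second_band_image).
    """
    first_band = raw[0::2, :]   # pixels receiving light from the first region
    second_band = raw[1::2, :]  # pixels receiving light from the second region
    return first_band, second_band

# Example with a synthetic 4x6 frame.
frame = np.arange(24, dtype=np.float32).reshape(4, 6)
img1, img2 = split_band_images(frame)
print(img1.shape, img2.shape)  # (2, 6) (2, 6)

An actual implementation would follow whatever pixel arrangement the arrayed optical element imposes.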
PCT/JP2012/005095 2011-09-16 2012-08-10 Image-capturing device WO2013038595A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-203809 2011-09-16
JP2011203809 2011-09-16

Publications (1)

Publication Number Publication Date
WO2013038595A1 true WO2013038595A1 (en) 2013-03-21

Family

ID=47882848

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/005095 WO2013038595A1 (en) 2011-09-16 2012-08-10 Image-capturing device

Country Status (1)

Country Link
WO (1) WO2013038595A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08220482A (en) * 1994-12-13 1996-08-30 Olympus Optical Co Ltd Optical system including diffraction optical element
JP2000098225A (en) * 1999-08-18 2000-04-07 Matsushita Electric Ind Co Ltd Method for designing diffraction means integrated type lens
JP2004157059A (en) * 2002-11-08 2004-06-03 Minolta Co Ltd Imaging device and lens optical system
US7433042B1 (en) * 2003-12-05 2008-10-07 Surface Optics Corporation Spatially corrected full-cubed hyperspectral imager
JP2010134042A (en) * 2008-12-02 2010-06-17 Fujifilm Corp Method for producing color filter and solid-state imaging apparatus
WO2010087208A1 (en) * 2009-02-02 2010-08-05 パナソニック株式会社 Diffractive optical element and manufacturing method thereof
JP2011075562A (en) * 2009-09-30 2011-04-14 Ricoh Co Ltd Adjustable multimode lightfield imaging system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019504325A (en) * 2016-02-02 2019-02-14 ケーエルエー−テンカー コーポレイション Hyperspectral imaging metrology system and method
WO2022264488A1 (en) * 2021-06-15 2022-12-22 ソニーグループ株式会社 Light-condensing element

Similar Documents

Publication Publication Date Title
US8854525B2 (en) Imaging device, imaging system, and imaging method
US8717483B2 (en) Imaging device, imaging system, and imaging method
US20210082988A1 (en) Image-capture element and image capture device
US8836825B2 (en) Imaging apparatus
JP4077510B2 (en) Diffraction imaging lens, diffractive imaging lens optical system, and imaging apparatus using the same
EP2083447B1 (en) Image pickup apparatus
JP5144841B1 (en) Imaging device
TWI443366B (en) Imaging lens, and imaging module
WO2016098640A1 (en) Solid-state image pickup element and electronic device
US9099369B2 (en) Solid-state image sensor
JP6008300B2 (en) Imaging device
KR20100059896A (en) Image sensor
US8902339B2 (en) Solid-state imaging element and dispersing element array for improved color imaging
WO2013038595A1 (en) Image-capturing device
US9179114B2 (en) Solid-state image sensor
JP2005341301A (en) Double eye imaging device
US11930256B2 (en) Imaging device, imaging optical system, and imaging method
US20220360759A1 (en) Image Sensor and Image Apparatus
JP6911353B2 (en) Manufacturing method of solid-state image sensor
JP2012169673A (en) Solid-state imaging device and electronic apparatus
JP2009182550A (en) Camera module
JP6563243B2 (en) Imaging apparatus and camera system
JP2005338505A (en) Compound eye imaging apparatus
Radl Optimum pixel design for dispersive filtering

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12832583

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12832583

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP