WO2017051876A1 - Imaging element and imaging device - Google Patents
Imaging element and imaging device
- Publication number
- WO2017051876A1 (PCT application PCT/JP2016/078034)
- Authority
- WO
- WIPO (PCT)
Classifications
- H01L27/146—Imager structures
- H01L27/14621—Colour filter arrangements
- H01L27/14627—Microlenses
- H01L27/14634—Assemblies, i.e. Hybrid structures
- H01L27/14645—Colour imagers
- H01L25/18—Assemblies of a plurality of individual semiconductor or other solid-state devices of types provided for in two or more different subgroups of groups H01L27/00 - H01L33/00, or in a single subclass of H10K, H10N
- H04N25/42—Extracting pixel data by switching between different modes of operation using different resolutions or aspect ratios, e.g. switching between interlaced and non-interlaced mode
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/134—Colour filter arrays [CFA] based on three different wavelength filter elements
- H04N25/136—Colour filter arrays [CFA] based on four or more different wavelength filter elements using complementary colours
- H04N23/16—Optical arrangements associated with camera modules, e.g. for beam-splitting or for colour correction
- H04N23/957—Light-field or plenoptic cameras or camera modules
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
- H10K39/32—Organic image sensors
Definitions
- the present invention relates to an imaging element and an imaging apparatus.
- An imaging apparatus having a normal shooting mode and a refocus shooting mode is known (see Patent Document 1). In the prior art, however, there is a problem that the colors of light received by the two image sensors have not been considered.
- An image pickup device according to a first aspect includes: a first imaging unit having a plurality of first photoelectric conversion units that photoelectrically convert light of some wavelengths of the incident light and transmit light of other wavelengths; a plurality of lenses on which the light transmitted through the first imaging unit is incident; and a second imaging unit that is provided for each of the plurality of lenses and includes a second photoelectric conversion unit that photoelectrically converts the incident light.
- An imaging device according to a second aspect of the invention includes the image sensor according to the first aspect and an image processing unit that generates first image data based on a signal from the first imaging unit and generates, based on a signal from the second imaging unit, second image data having a smaller number of pixels than the first image data.
- FIG. 5A is a diagram illustrating a wavelength range in which photoelectric conversion is performed in the pixels of the photoelectric conversion element array.
- FIG. 5B is a diagram illustrating a wavelength range in which photoelectric conversion is performed in the pixels of the light receiving element array. FIG. 6 is a flowchart illustrating the flow of the camera processing executed by the control unit. A further cross-sectional view schematically shows one structure among the several micromirrors constituting a micromirror array.
- FIG. 1 is a diagram illustrating a configuration of main parts of a camera 100 according to an embodiment.
- In the coordinate axes shown in FIG. 1, light from the subject travels in the Z-axis minus direction. The upward direction perpendicular to the Z axis is defined as the Y-axis plus direction, and the direction perpendicular to the paper surface and to both the Z axis and the Y axis is defined as the X-axis plus direction. In each figure, orientation is expressed with reference to the coordinate axes in FIG. 1.
- the imaging lens 201 is configured to be replaceable, and is used by being mounted on the body of the camera 100. Note that the imaging lens 201 may be integrated with the body of the camera 100.
- the camera 100 includes a first image sensor (imaging unit) 202 and a second image sensor (imaging unit) 204, and can capture a plurality of images in one shot.
- the imaging lens 201 guides light from the subject to the first imaging element 202.
- The first image sensor 202 is translucent; it photoelectrically converts (absorbs) part of the incident subject light and transmits the remaining part (the light that has not been absorbed).
- A microlens array 203 is provided in close proximity to (or in contact with) the Z-axis minus side surface of the first image sensor 202.
- the light transmitted through the first image sensor 202 is incident on the microlens array 203.
- the microlens array 203 is configured by two-dimensionally arranging microlenses (microlens L described later) in a lattice shape or a honeycomb shape.
- a second image sensor 204 is provided in the negative Z-axis direction with respect to the microlens array 203.
- the subject light that has passed through the microlens array 203 is incident on the second image sensor 204.
- the second image sensor 204 photoelectrically converts incident subject light.
- The control unit 205 controls the imaging operation of the camera 100. That is, it performs drive control during photoelectric conversion for the first image sensor 202 and the second image sensor 204, and control such as reading out the pixel signals after photoelectric conversion from the first image sensor 202 and the second image sensor 204.
- Pixel signals read from the first image sensor 202 and the second image sensor 204 are sent to the image processing unit 207.
- the image processing unit 207 performs predetermined image processing on both pixel signals.
- the image data after the image processing is recorded on a recording medium 206 such as a memory card. Note that the pixel signals read from the first image sensor 202 and the second image sensor 204 may be recorded on the recording medium 206 as so-called RAW data without performing image processing.
- Display unit 208 reproduces and displays an image based on the image data, and displays an operation menu screen and the like. Display control for the display unit 208 is performed by the control unit 205.
- FIG. 2 is a perspective view of the optical system of the camera 100, that is, the image pickup lens 201, the first image pickup device 202, the microlens array 203, and the second image pickup device 204.
- the first image sensor 202 is disposed on the planned focal plane of the imaging lens 201.
- In FIG. 2, the first imaging element 202, the microlens array 203, and the second imaging element 204 are illustrated with wider intervals than the actual ones; in practice, the first imaging element 202 and the microlens array 203 are in close contact.
- the distance between the first image sensor 202 and the second image sensor 204 is a distance corresponding to the focal length of the microlens L constituting the microlens array 203.
- the first image sensor 202 of the camera 100 described above captures a subject image projected onto the first image sensor 202 by the imaging lens 201.
- an image captured by the first image sensor 202 is referred to as a normal image.
- the second image sensor 204 of the camera 100 captures an image of light that has passed through the first image sensor 202.
- the second imaging element 204 is configured to capture a plurality of images with different viewpoints using a light field photography technique.
- FIG. 2 shows an example in which the microlens array 203 includes 5 ⁇ 5 microlenses L, but the number of microlenses L constituting the microlens array 203 is not limited to the illustrated number.
- The light that has passed through each microlens L is received by the pixel group PXs of the second image sensor 204 arranged behind that microlens L (in the Z-axis minus direction), and is thereby divided among a plurality of pixels. That is, each pixel constituting the pixel group PXs receives light from a certain part of the subject that has passed through a different region of the imaging lens 201.
- As many small images as there are microlenses L are obtained, each being a light intensity distribution indicating, for a different part of the subject, the regions of the imaging lens 201 through which the subject light has passed. In this specification, this collection of small images is referred to as a light field (LF) image.
- The incident direction of light to each pixel is determined by the position of that pixel among the plurality of pixels arranged behind each microlens L (in the Z-axis minus direction). That is, since the positional relationship between the microlens L and each pixel of the second imaging element 204 behind it is known as design information, the incident direction (direction information) of the light ray entering each pixel via the microlens L can be obtained. For this reason, the pixel signal of each pixel of the second image sensor 204 represents the intensity of light (light ray information) from a predetermined incident direction. In this specification, light from a predetermined direction that enters a pixel of the second image sensor 204 is referred to as a light ray.
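- Since the lens-to-pixel geometry is known as design information, the direction information can be computed from a pixel's offset from its microlens center. The following is an illustrative sketch only (the function name, units, and the lens-to-sensor gap taken equal to the microlens focal length are assumptions, not part of the disclosure):

```python
import math

def ray_direction(pixel_xy, lens_center_xy, gap_mm):
    """Unit vector of the light ray reaching a pixel behind a microlens,
    derived from the pixel's lateral offset from the lens center.
    gap_mm: lens-to-sensor distance, here taken equal to the microlens
    focal length as described in the text."""
    dx = pixel_xy[0] - lens_center_xy[0]
    dy = pixel_xy[1] - lens_center_xy[1]
    dz = gap_mm
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Subject light travels in the Z-axis minus direction
    return (dx / norm, dy / norm, -dz / norm)
```

A pixel located exactly at the lens center receives the on-axis ray (0, 0, -1); off-center pixels receive rays that passed through correspondingly off-center regions of the imaging lens.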
- <Refocus processing> The data of an LF image is used to perform image refocus processing.
- the refocus processing is processing for generating an image at an arbitrary focus position or viewpoint by performing a calculation (calculation for rearranging the light rays) based on the light ray information and the direction information of the LF image.
- In this specification, an image at an arbitrary focus position or viewpoint generated by the refocus processing is referred to as a refocus image. Since such refocus processing (also referred to as reconstruction processing) is known, a detailed description of it is omitted.
- The refocus processing may be performed in the camera 100 by the image processing unit 207, or the LF image data recorded in the recording medium 206 may be transmitted to an external device such as a personal computer and processed there.
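- The "rearranging the light rays" calculation is commonly realized as a shift-and-add over per-viewpoint sub-images. The following minimal sketch assumes that form (integer-pixel shifts, uniform weighting); it is illustrative and not taken from the patent:

```python
import numpy as np

def refocus(sub_images, offsets, alpha):
    """Shift-and-add refocusing: each viewpoint sub-image is shifted in
    proportion to its viewpoint offset and the focus parameter alpha,
    then the shifted images are averaged.
    sub_images: list of 2-D arrays; offsets: (du, dv) per viewpoint."""
    out = np.zeros_like(sub_images[0], dtype=float)
    for img, (du, dv) in zip(sub_images, offsets):
        # integer-pixel shift for simplicity; real code would interpolate
        out += np.roll(img, (round(alpha * du), round(alpha * dv)), axis=(0, 1))
    return out / len(sub_images)
```

With alpha = 0 no rays are shifted and the result is simply the average over viewpoints; varying alpha moves the synthetic focal plane.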
- FIG. 3 is a cross-sectional view of the first image sensor 202, the microlens array 203, and the second image sensor 204, showing a cross section parallel to the XZ plane.
- FIG. 4 is a front view of the image sensor shown in FIG. 3 as seen from the Z-axis plus direction.
- the image sensor has a structure in which a first image sensor 202, a microlens array 203, and a second image sensor 204 are combined. From the first image sensor 202, a pixel signal of a normal image is read out. In addition, the pixel signal of the LF image is read from the second image sensor 204.
- the first imaging element 202 has a configuration in which a readout circuit layer 202C, a photoelectric conversion element array 202B, and a transparent electrode layer 202A formed on a transparent substrate are stacked in order from the Z-axis minus direction.
- the transparent electrode layer 202A is provided to apply a voltage to the photoelectric conversion elements in the photoelectric conversion element array 202B.
- various optical materials with high transparency that transmit visible light can be used.
- Examples of such optical materials include inorganic transparent electrode films such as indium tin oxide (ITO) and organic transparent conductive films such as poly(3,4-ethylenedioxythiophene):polystyrene sulfonate (PEDOT:PSS).
- FIG. 5 (a) is a diagram showing a wavelength range in which photoelectric conversion is performed in the pixels of the photoelectric conversion element array 202B.
- The photoelectric conversion element array 202B has, for example, a structure in which a plurality of photoelectric conversion elements, each having peak sensitivity to light in the Ye (yellow), Mg (magenta), or Cy (cyan) wavelength range, are arranged in a two-dimensional array as shown in FIG. 5(a).
- Each pixel of the photoelectric conversion element array 202B is configured by a photoelectric conversion element formed of an organic photoelectric conversion material.
- Organic photoelectric films that photoelectrically convert Ye and Mg light are alternately arranged at the pixel positions of the odd rows, and organic photoelectric films that photoelectrically convert Mg and Cy light are alternately arranged at the pixel positions of the even rows.
- the photoelectric conversion element at each pixel position absorbs light in a wavelength region where photoelectric conversion is performed, and transmits light in a wavelength region where photoelectric conversion is not performed. That is, a pixel that photoelectrically converts Ye light transmits B (blue) light that is a complementary color of Ye. A pixel that photoelectrically converts Mg light transmits G (green) light that is a complementary color of Mg. Further, a pixel that photoelectrically converts Cy light transmits R (red) light that is a complementary color of Cy.
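- The complementary relationship above (a Ye pixel passes B, an Mg pixel passes G, a Cy pixel passes R) can be expressed compactly. This small sketch is illustrative only and not part of the disclosure:

```python
# Each first-sensor pixel absorbs two primaries (its complementary colour)
# and transmits the remaining primary toward the microlens array behind it.
PRIMARIES = {"R", "G", "B"}
ABSORBED = {"Ye": {"R", "G"}, "Mg": {"R", "B"}, "Cy": {"G", "B"}}

def transmitted(pixel_colour):
    """Primary colour that passes through a Ye, Mg, or Cy pixel."""
    (remaining,) = PRIMARIES - ABSORBED[pixel_colour]
    return remaining
```

For example, `transmitted("Ye")` yields `"B"`, matching the statement that a pixel photoelectrically converting Ye light transmits its complement, B light.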
- the symbol L in FIG. 5A indicates one microlens that constitutes the microlens array 203 provided behind the first image sensor 202 (in the negative Z-axis direction).
- the B, G, and R light transmitted through the photoelectric conversion elements of the photoelectric conversion element array 202B is incident on the rear microlens L. That is, photoelectric conversion elements at a plurality of pixel positions of the first image sensor 202 correspond to one microlens L.
- The readout circuit layer 202C includes pixel electrodes (not shown) and a readout circuit that reads out the pixel signals after photoelectric conversion by the photoelectric conversion element array 202B.
- the pixel electrode is made of a highly transparent optical material that transmits visible light.
- Examples of such optical materials include inorganic transparent electrode materials such as the indium tin oxide (ITO) described above and organic transparent conductive films such as PEDOT:PSS.
- the readout circuit is configured by, for example, a thin film transistor (TFT) array.
- A microlens, separate from those of the microlens array 203, may be provided for each pixel position of the photoelectric conversion element array 202B (on the imaging lens side) so as to increase the amount of light incident on the photoelectric conversion element at each pixel position.
- the microlenses L1 to L6 are formed integrally with the transmission substrate 203A.
- the transmission substrate 203A for example, a glass substrate, a plastic substrate, a silica substrate, or the like is used.
- the microlens array 203 can be formed by, for example, injection molding, pressure molding, or the like. Note that the microlenses L1 to L6 may be formed separately from the transmission substrate 203A.
- the surface on the negative side of the Z axis of the microlens array 203 may be bonded to the second image sensor 204 so as to function as a package member of the second image sensor 204. Accordingly, the second imaging element 204 can omit a package member such as glass or resin on the Z axis plus side with respect to the microlens array 203.
- the transmission substrate 203A of the microlens array 203 has a thickness corresponding to the focal length of each of the microlenses L1 to L6.
- the thickness of the transmissive substrate 203A is about 0.3 mm to several mm.
- As the second image sensor 204 in FIG. 3, a commonly used CCD image sensor, CMOS image sensor, or the like can be used.
- the second image sensor 204 has a configuration in which a light receiving element array 204B and a color filter array 204A formed on the silicon substrate 204C are stacked in order from the negative direction of the Z axis.
- FIG. 5B is a diagram illustrating a wavelength range in which photoelectric conversion is performed in the pixels of the light receiving element array 204B.
- The light transmitted through each photoelectric conversion element in the photoelectric conversion element array 202B is one of B, G, and R. As shown in FIG. 5(b), this B, G, and R light is mixed and incident on each pixel PX constituting the pixel group PXs arranged behind the microlens L (in the Z-axis minus direction).
- the second image sensor 204 is provided with a color filter array 204A.
- a plurality of filters that selectively transmit light in the RGB (red, green, blue) wavelength region are arranged in a two-dimensional array as shown in FIG. 5B.
- Filters that respectively transmit B and G light are alternately arranged at the pixel positions of the odd-numbered rows, corresponding to the positions of the pixels PX of the light receiving element array 204B, and filters that respectively transmit G and R light are alternately arranged at the pixel positions of the even-numbered rows.
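- The filter mosaic just described (B/G alternating on one row parity, G/R on the other) can be generated procedurally. The 0-indexed convention below is an assumption of this sketch, not of the document:

```python
def cfa_colour(row, col):
    """Colour filter of array 204A at pixel (row, col), 0-indexed:
    even-index rows alternate B, G; odd-index rows alternate G, R."""
    if row % 2 == 0:
        return "B" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "R"
```

Every 2x2 block then contains one B, one R, and two G filters, the familiar Bayer-like arrangement.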
- a light receiving element such as a photodiode is disposed in each pixel PX of the light receiving element array 204B.
- the light receiving element array 204B has a plurality of pixels PX formed in a two-dimensional array. Between the pixels PX, a charge transfer electrode (not shown) and a light shielding film formed on the charge transfer electrode are provided. Each pixel PX receives one of B, G, and R light through the color filter array 204A. Each pixel PX generates a charge corresponding to the amount of light incident on the photodiode. The charges accumulated in each pixel PX are sequentially transferred to the charge transfer electrode by a transfer transistor (not shown) and sequentially read out.
- the second image sensor 204 has a back-illuminated configuration, and the photodiode of the pixel PX is provided on the back side (Z-axis plus side) of the charge transfer electrode.
- Since the opening to the photodiode can be made larger than in the front-illuminated case, a decrease in the amount of light photoelectrically converted by the second image sensor 204 can be suppressed. For this reason, sufficient light intensity can be obtained even without providing a condensing lens for every pixel PX.
- microlenses L1 to L6 are located behind the first image sensor 202 (in the negative Z-axis direction).
- the color filter array 204A of the second image sensor 204 is located behind the microlenses L1 to L6 (Z-axis minus direction).
- An enlarged view of the configuration per microlens corresponds to FIG.
- In the light receiving element array 204B of the second image sensor 204, a plurality of pixels PX are formed in a two-dimensional array, and a pixel group PXs including a predetermined number of pixels PX is assigned to each of the microlenses L1 to L6.
- the pixel group PXs is indicated by a white background, and pixels not included in the pixel group PXs are indicated by oblique lines.
- FIG. 4 and 5B show an example in which 8 ⁇ 8 pixel groups PXs are assigned to each of the microlenses L1 to L6.
- The number of pixels PX constituting the pixel group PXs is not limited to the number shown in the figure.
- the number of microlenses L1 to L6 in FIG. 4 is not limited to the number shown.
- The pixels PX in the light receiving element array 204B may be arranged with the pixel groups PXs separated for each microlens L as shown in FIG. 2, or, as shown in FIGS. 4 and 5(b), a plurality of pixels PX may be arranged in a two-dimensional array without isolating the pixel groups PXs.
- As for the relationship between the two pixel pitches, the pixel interval (pitch) of the first image sensor 202 is configured to be wider than the pixel interval of the second image sensor 204. The reason is to suppress the generation of diffracted light in the visible light region.
- the pixel interval of the first image sensor 202 is preferably 4 ⁇ m or more, and more preferably 20 ⁇ m or more.
- the pixel interval is an interval between the central portions of two adjacent pixels.
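- Treating the first sensor's periodic pixel grid as a diffraction grating gives a feel for why a wider pitch helps: the first-order diffraction angle satisfies sin(theta) = lambda/d, so increasing the pitch d shrinks the angle. A hedged numerical sketch (not from the patent):

```python
import math

def first_order_angle_deg(pitch_um, wavelength_nm):
    """First-order diffraction angle (degrees) of a grating with period
    pitch_um for the given wavelength: sin(theta) = lambda / d."""
    sin_theta = (wavelength_nm * 1e-3) / pitch_um  # both in micrometres
    return math.degrees(math.asin(sin_theta))
```

For green light (550 nm) the angle drops from roughly 7.9 degrees at the 4 um minimum pitch to roughly 1.6 degrees at the preferred 20 um pitch, illustrating how the wider pitch reduces visible-light diffraction.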
- The control unit 205 raises the pixel signal level of the LF image acquired by the second image sensor 204 by increasing the sensitivity of the second image sensor 204 or by lengthening the exposure time (charge accumulation time).
- This is because, although the second image sensor 204 has a back-illuminated configuration, its pixel size is smaller than that of the first image sensor 202, so the pixel signal level of the acquired LF image is lower than the pixel signal level of the normal image.
- The control unit 205 determines the sensitivity of the second image sensor 204 based on the pixel signal level obtained by the first image sensor 202. For example, the lower the pixel signal level obtained by the first image sensor 202, the higher the sensitivity of the second image sensor 204 is set; alternatively, the sensitivity of the second image sensor 204 is adjusted so that the pixel signal level obtained by the second image sensor 204 approaches the pixel signal level obtained by the first image sensor 202.
- Similarly, the control unit 205 determines the charge accumulation time of the second image sensor 204 based on the pixel signal level obtained by the first image sensor 202. For example, the lower the pixel signal level obtained by the first image sensor 202, the longer the charge accumulation time of the second image sensor 204 is set; alternatively, the charge accumulation time of the second image sensor 204 is adjusted so that the pixel signal level obtained by the second image sensor 204 approaches the pixel signal level obtained by the first image sensor 202.
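- One way to read the control rule above is as a simple feedback step: raise the second sensor's gain, or lengthen its accumulation time, while its signal level stays below the first sensor's. The function below is a hypothetical sketch; the names, the step factor, and the gain cap are invented for illustration:

```python
def adjust_second_sensor(level_first, level_second, gain, accum_ms,
                         max_gain=16.0, step=1.25):
    """Return updated (gain, accumulation_time_ms) for the second sensor
    so its pixel signal level approaches that of the first sensor."""
    if level_second < level_first:
        if gain * step <= max_gain:
            return gain * step, accum_ms      # prefer raising sensitivity
        return gain, accum_ms * step          # then lengthen accumulation
    return gain, accum_ms                     # already at or above target
```

Calling this once per frame nudges the LF image's level toward the normal image's level, switching from gain to accumulation time once the gain cap is reached.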
- the control unit 205 generates an image file to be recorded on the recording medium 206.
- In the normal shooting mode, in which only the normal image is recorded, the control unit 205 includes in the image file normal image data based on the pixel signals read from the first image sensor 202.
- The control unit 205 includes in the image file LF image data based on the pixel signals read from the second image sensor 204.
- Image data at an arbitrary focus position or viewpoint (a refocus image) generated from the LF image data by refocus processing may also be included in the image file.
- the data of the LF image and the data of the refocus image can be included in the image file as a plurality of related image data.
- Alternatively, a plurality of image files having the same file name but different extensions may be generated, with one of the plurality of related image data items included in each file.
- For example, an image file containing LF image data and an image file containing refocus image data are given the same file name with different extensions. Using the same file name makes it easy for the user to understand that related image data are included.
- a plurality of refocus images corresponding to a plurality of focus positions can be generated from LF image data.
- When a plurality of refocus image data corresponding to a plurality of focus positions are generated based on the LF image data, the number of related images increases, so the multi-picture format image file described above, or a plurality of image files sharing the same file name, can be used.
- The control unit 205 may also select the both-shooting mode, in which data of both the normal image and the LF image are recorded.
- In the both-shooting mode, since both normal image data and LF image data are recorded, the normal image data and the LF image data are treated as a plurality of related image data.
- Using the multi-picture format image file described above, or a plurality of image files having the same file name but different extensions, makes the image data easier for the user to handle.
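The same-stem, different-extension naming described above can be sketched with the standard library; the extension names `.LFR` and `.RFC` are hypothetical placeholders, not formats named in the text:

```python
from pathlib import Path

def related_file_names(base: str, kinds: dict) -> dict:
    """Build one file name per related image, all sharing the same stem so
    the user can see they belong together. Extensions are hypothetical."""
    return {kind: str(Path(base).with_suffix(ext)) for kind, ext in kinds.items()}

names = related_file_names("DSC0001.JPG",
                           {"normal": ".JPG", "lf": ".LFR", "refocus": ".RFC"})
```

All three names share the stem `DSC0001`, which is exactly the grouping cue the text says helps the user recognize related data.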
- FIG. 6 is a flowchart illustrating the flow of camera processing executed by the control unit 205.
- the control unit 205 executes a program for performing the processing of FIG. 6 when the main switch is turned on or when the operation for returning from the sleep state is performed.
- In step S10, the control unit 205 performs mode selection. For example, based on the setting state of an operation member (not shown), the control unit 205 determines whether the mode is the normal shooting mode, the LF shooting mode, or the both-shooting mode, and the process proceeds to step S20.
- The control unit 205 may determine the mode automatically instead of basing the mode selection on the setting state of the operation member. For example, when the mode is determined automatically based on the shooting scene mode, the control unit 205 selects the normal shooting mode for distant or astronomical shooting. This is based on the idea that, for such shooting, there is less need to generate a refocus image from the LF image.
- The control unit 205 selects the LF shooting mode when the mode is determined automatically according to the state of the camera 100, for example when the remaining battery level falls below a predetermined value.
- In that case, the autofocus (AF) operation is omitted and a power-saving operation is performed to slow the decline of the remaining battery level. This is because, as long as LF image data exist, a refocus image at an arbitrary focus position can be generated later.
- In step S20, the control unit 205 selects the image sensor to be driven and proceeds to step S30.
- The control unit 205 sets the first image sensor 202 as a driving target in the normal shooting mode and the both-shooting mode.
- The control unit 205 sets the second image sensor 204 as a driving target in the LF shooting mode and the both-shooting mode. That is, in the both-shooting mode, both the first image sensor 202 and the second image sensor 204 are driven.
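The sensor selection of steps S10 and S20 above can be summarized as a small lookup table; the string keys are informal labels for the three modes named in the text, not identifiers from the patent:

```python
# Which sensors the control unit drives in each shooting mode (step S20).
# Mode names are informal labels, not terms defined by the patent.
DRIVE_TARGETS = {
    "normal": {"first"},          # normal shooting mode: first image sensor 202
    "lf": {"second"},             # LF shooting mode: second image sensor 204
    "both": {"first", "second"},  # both-shooting mode: both sensors
}

def sensors_to_drive(mode: str) -> set:
    """Return the set of sensors to drive for the selected mode."""
    return DRIVE_TARGETS[mode]
```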
- In step S30, the control unit 205 performs an imaging operation by driving the image sensor(s) selected in step S20, and the process proceeds to step S40.
- In step S40, the control unit 205 instructs the image processing unit 207 to perform predetermined image processing on the pixel signals read from the first image sensor 202 and/or the second image sensor 204, and the process proceeds to step S50.
- The image processing includes, for example, contour enhancement processing, color interpolation processing, and white balance processing. Note that before step S30 there may be a step of determining whether the shutter has been released; if it has, the process may proceed to step S30.
- When the LF shooting mode or the both-shooting mode is selected in the mode selection of step S10, the control unit 205 generates a refocus image at a predetermined focus position and viewpoint by having the image processing unit 207 perform refocus processing on the image signal read from the second image sensor 204.
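The patent treats refocus processing as known art (a computation that rearranges rays). One common concrete form is shift-and-add refocusing over sub-aperture views, sketched here as an assumption about one possible implementation, not necessarily what the image processing unit 207 actually does:

```python
import numpy as np

def shift_and_add_refocus(views: np.ndarray, shift_px: int) -> np.ndarray:
    """Refocus a light field by shifting each sub-aperture view in
    proportion to its (u, v) offset from the array center, then averaging.
    `views` has shape (U, V, H, W); `shift_px` selects the focal plane."""
    U, V, H, W = views.shape
    cu, cv = (U - 1) / 2, (V - 1) / 2
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            dy = int(round((u - cu) * shift_px))
            dx = int(round((v - cv) * shift_px))
            # roll (circular shift) keeps the sketch simple; real code
            # would crop or pad the borders instead
            out += np.roll(views[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)
```

With `shift_px = 0` this degenerates to a plain average of all views (focus at the plane sampled by the microlens array); nonzero shifts move the synthetic focal plane.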
- In step S50, the control unit 205 causes the display unit 208 to reproduce and display an image based on the image-processed data.
- When the both-shooting mode is selected in the mode selection of step S10, the control unit 205 causes the display unit 208 to display both the normal image and the refocus image.
- The normal image and the refocus image may be displayed side by side, or the display may be switched between the two images one at a time.
- When the LF shooting mode or the both-shooting mode is selected in step S10 and a refocus image is displayed on the display unit 208 in step S50, the control unit 205 may have the image processing unit 207 perform refocus processing again based on a user operation and display the newly generated refocus image on the display unit 208.
- The display unit 208 displays a refocus image focused on the subject displayed at the tapped position.
- The control unit 205 may also cause the display unit 208 to display a refocus image whose focus position is changed in accordance with the amount of movement of an operation bar.
- In step S60, the control unit 205 generates an image file and proceeds to step S70.
- In the normal shooting mode, the control unit 205 generates an image file including normal image data.
- In the LF shooting mode, an image file including LF image data, or LF image data and refocus image data, is generated.
- In the both-shooting mode, an image file including normal image data and LF image data, or normal image data, LF image data, and refocus image data, is generated.
- In step S70, the control unit 205 records the image file on the recording medium 206, and the process proceeds to step S80.
- In step S80, the control unit 205 determines whether to end. For example, when the main switch is turned off or when a predetermined time has elapsed without operation, the control unit 205 makes an affirmative determination in step S80 and ends the processing of FIG. 6. On the other hand, for example if an operation is performed on the camera 100, the control unit 205 makes a negative determination in step S80, returns to step S10, and repeats the processing described above.
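The flow of steps S10 to S80 above can be rendered as a simple control loop; every method name here is a placeholder for illustration, not an API from the patent:

```python
def camera_loop(camera):
    """Illustrative rendering of steps S10-S80 in FIG. 6."""
    while True:
        mode = camera.select_mode()                  # S10: normal / LF / both
        sensors = camera.select_sensors(mode)        # S20: first, second, or both
        raw = camera.capture(sensors)                # S30: imaging operation
        images = camera.process(raw, mode)           # S40: incl. refocus for LF modes
        camera.display(images)                       # S50: reproduce and display
        image_file = camera.make_file(images, mode)  # S60: build image file
        camera.record(image_file)                    # S70: write to recording medium
        if camera.should_end():                      # S80: end determination
            break
```

Any object exposing these eight methods can stand in for the control unit 205 in this sketch.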
- The image sensor of the camera 100 includes: a first image sensor 202 having a plurality of first imaging pixels that photoelectrically convert part of the incident light and transmit another part, in which the colors of the photoelectrically converted light (Ye, Mg, Cy) differ from the colors of the transmitted light (B, G, R); a microlens array 203 having a plurality of microlenses L, in which light of different colors (B, G, R) transmitted through a plurality of the first imaging pixels is incident on one microlens L among the plurality of microlenses L; and a second image sensor 204 having a plurality of second imaging pixels PX on which light transmitted through one microlens L among the plurality of microlenses L is incident. Thereby, both the first image sensor 202 and the second image sensor 204 can capture images in color.
- The number of first imaging pixels of the first image sensor 202 is larger than the number of microlenses L. That is, since light of different colors (B, G, R) transmitted through a plurality of the first imaging pixels is incident on each microlens L, color unevenness can be suppressed.
- Since the interval between the first imaging pixels of the first image sensor 202 is wider than the interval between the second imaging pixels PX of the second image sensor 204, the generation of diffracted light can be suppressed. As a result, degradation of the quality of the acquired image can be prevented.
- The interval between the first imaging pixels of the first image sensor 202 (4 μm or more) is an interval that reduces, for example, the diffracted light of visible light generated by the light incident on the first image sensor 202, so degradation of the quality of the acquired image can be prevented.
- The plurality of second imaging pixels PX of the second image sensor 204 photoelectrically convert light of different colors (B, G, R), so a color LF image can be acquired.
- The colors (B, G, R) of light photoelectrically converted by the second imaging pixels PX of the second image sensor 204 differ from the colors (Ye, Mg, Cy) of light photoelectrically converted by the first imaging pixels. This makes it possible to exploit the characteristics of the organic photoelectric film.
- The first image sensor 202, the microlens array 203, and the second image sensor 204 are stacked. The image sensors can thus be integrated and made easier to handle.
- The camera 100 includes an image processing unit 207 that generates normal image data based on the first pixel signals generated by the first imaging pixels of the first image sensor 202, and generates, based on the second pixel signals generated by the second imaging pixels PX of the second image sensor 204, LF image data having fewer pixels than the normal image data. Thereby, two different kinds of images can be obtained in a single shot.
- A control unit 205 is provided that switches between a normal shooting mode for generating a normal image based on the normal image data generated by the image processing unit 207 and an LF shooting mode for generating a refocus image based on the LF image data generated by the image processing unit 207. This makes it possible to switch appropriately between the shooting modes for obtaining the two different types of images.
- The control unit 205 switches between the normal shooting mode and the LF shooting mode according to the set shooting scene mode. Since the shooting mode is switched automatically based on the setting state of the camera 100, such as the shooting scene mode, the camera 100 is easy for the user to operate.
- the image sensor of the embodiment described above can also be described as follows.
- (1) The image sensor includes: a first imaging unit 202 having a plurality of first photoelectric conversion units 202B that photoelectrically convert light of some wavelengths of the incident light and transmit light of other wavelengths; a plurality of lenses L on which light transmitted through the first imaging unit is incident, namely the individual lenses L constituting the microlens array 203; and a second imaging unit 204 having second photoelectric conversion units 204B that photoelectrically convert incident light, a plurality of which are provided for each of the plurality of lenses.
- (2) In the image sensor of (1), the number of first photoelectric conversion units 202B of the first imaging unit 202 is smaller than the number of second photoelectric conversion units 204B of the second imaging unit 204.
- (3) In the image sensor of (1) or (2), the distance between the centers of two adjacent first photoelectric conversion units 202B is wider than the distance between the centers of two adjacent second photoelectric conversion units 204B.
- (4) In the image sensor of any one of (1) to (3), the resolution of the first imaging unit 202 is lower than the resolution of the second imaging unit 204.
- (5) In the image sensor of (3), the distance between the centers of two adjacent first photoelectric conversion units 202B is 4 μm or more.
- (6) In the image sensor of any one of (1) to (5), the plurality of second photoelectric conversion units 204B photoelectrically convert light of mutually different wavelengths.
- (7) In the image sensor of (6), the wavelength of light photoelectrically converted by the second photoelectric conversion units 204B differs from the wavelength of light photoelectrically converted by the first photoelectric conversion units 202B.
- (8) In the image sensor of (6), the wavelength of light photoelectrically converted by the second photoelectric conversion units 204B is the same as the wavelength of light photoelectrically converted by the first photoelectric conversion units 202B.
- (9) In the image sensor of any one of (6) to (8), the plurality of first photoelectric conversion units 202B are constituted by organic photoelectric films that photoelectrically convert light of different wavelengths, and the plurality of second photoelectric conversion units 204B are constituted by a color filter and a photoelectric conversion unit, or by a photoelectric conversion unit that receives light of different wavelengths in the depth direction.
- (10) The image sensor of any one of (1) to (9) includes a lens array 203 having the plurality of lenses L, and the first imaging unit 202, the lens array 203, and the second imaging unit 204 are stacked.
- (11) An image-capturing device includes the image sensor of any one of (1) to (10) and an image processing unit that generates first image data based on a signal from the first imaging unit 202 and generates, based on a signal from the second imaging unit 204, second image data having fewer pixels than the first image data.
- (12) The image-capturing device of (11) includes a mode switching unit that switches between a first mode for generating a first image based on the first image data generated by the image processing unit and a second mode for generating a second image based on the second image data generated by the image processing unit.
- (13) In the image-capturing device of (12), the mode switching unit switches between the first mode and the second mode according to the set imaging scene mode.
- (Modification 1) The functions of the microlenses L1 to L6 may be provided by forming, inside the transmissive substrate 203A, lens regions having a refractive index higher than that of the substrate. This allows the Z-axis plus side surface of the microlens array 203 to be flattened.
- A wide bonding surface can thus be secured when the Z-axis minus side of the first image sensor 202 is bonded to the Z-axis plus side surface of the microlens array 203. This makes it easy to construct an integrated image sensor in which the first image sensor 202, the microlens array 203, and the second image sensor 204 are stacked.
- (Modification 2) As another example of flattening the Z-axis plus side surface of the microlens array 203, flattening may be performed by filling the concave surfaces around the microlenses L1 to L6 in FIG. 3 with a transparent member having a refractive index lower than that of the members constituting the microlenses L1 to L6.
- the lens array may be made thin by using a Fresnel lens instead of the microlenses L1 to L6. Also in this case, it is preferable to perform flattening by filling the concave surface around the Fresnel lens with a transparent member having a lower refractive index than the refractive index of the member constituting the Fresnel lens.
- the lens array may be constituted by lenticular lenses instead of the microlenses L1 to L6. Also in this case, it is preferable to perform planarization by filling the concave surface around the lenticular lens with a transparent member having a refractive index lower than that of the member constituting the lenticular lens.
- FIG. 7 is a cross-sectional view schematically showing a configuration of one of the plurality of micromirrors 23B constituting the micromirror array.
- a large number of micromirrors 23B shown in FIG. 7 are arranged in a two-dimensional array.
- The micromirror 23B is configured by laminating a reflective linear polarizing plate 122, a quarter-wave (λ/4) plate 123, and a reflecting mirror 124, in order from the side closer to the first image sensor 202.
- the reflective linearly polarizing plate 122 reflects the S-polarized component of the incident light and transmits the P-polarized component.
- the quarter-wave plate 123 is installed at an angle of 45 degrees with respect to the axis of the reflective linear polarizing plate 122.
- the reflecting mirror 124 is formed by forming a concave surface on a transparent substrate and then filling the concave surface with an optical adhesive having the same refractive index as that of the transparent substrate. Cholesteric liquid crystal is applied to the concave surface (or the convex surface on the back side) to form a circularly polarized light separating layer.
- the circularly polarized light separation layer made of cholesteric liquid crystal allows the left circularly polarized light to pass through and reflects the right circularly polarized light as the right circularly polarized light.
- the reflecting mirror 124 is provided so that the second image sensor 204 is positioned at the focal position.
- For the reflecting mirror 124, the focal length f is R/2, where R is the radius of curvature of the concave surface.
- For a microlens of the same radius of curvature, the focal length f is about 2R. Therefore, by using the reflecting mirror 124, the focal length can be shortened to about one quarter of that of a microlens.
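The factor-of-four relation follows from f = R/2 for a concave mirror versus f = R/(n − 1) ≈ 2R for a thin plano-convex lens with refractive index n ≈ 1.5; the index value is an assumption consistent with the quarter-length result stated in the text:

```python
def mirror_focal_length(R: float) -> float:
    """Concave mirror: f = R / 2."""
    return R / 2

def planoconvex_lens_focal_length(R: float, n: float = 1.5) -> float:
    """Thin plano-convex lens (lensmaker's equation, one curved surface):
    f = R / (n - 1). With n = 1.5 this gives f = 2R."""
    return R / (n - 1)

R = 10e-6  # illustrative radius of curvature
ratio = mirror_focal_length(R) / planoconvex_lens_focal_length(R)  # 1/4
```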
- the micromirror array in which the micromirrors 23B formed as described above are arranged in a two-dimensional array can flatten the surface on the Z-axis plus side and the surface on the Z-axis minus side. Thereby, it is possible to secure a wide joint surface when the surface on the Z axis minus side of the first imaging element 202 is joined to the surface on the Z axis plus side of the micromirror array. In addition, it is possible to secure a wide joint surface when the surface on the Z axis plus side of the second image sensor 204 is joined to the surface on the Z axis minus side of the micromirror array.
- The wavelength ranges photoelectrically converted in the first image sensor 202 may be changed from YeMgCy to RGB.
- the wavelength range in the case of RGB is narrower than the wavelength range in the case of YeMgCy.
- In that case, the complementary transmission wavelength ranges (YeMgCy) are wider than the absorbed (photoelectrically converted) wavelength ranges (RGB).
- the amount of light photoelectrically converted by the second image sensor 204 can be increased.
- the pixel signal level of the LF image acquired by the second image sensor 204 can be increased.
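The complementary relation behind the YeMgCy/RGB discussion above (Ye = R + G, Mg = R + B, Cy = G + B in an idealized linear model) can be inverted to recover RGB; this is a textbook identity used for illustration, not a processing step stated in the patent:

```python
def rgb_from_ymc(ye: float, mg: float, cy: float):
    """Invert the idealized complementary-color model
    Ye = R + G, Mg = R + B, Cy = G + B."""
    r = (ye + mg - cy) / 2
    g = (ye + cy - mg) / 2
    b = (mg + cy - ye) / 2
    return r, g, b
```

Since each complementary channel sums two primaries, a YeMgCy sensor absorbs a wider slice of the spectrum per pixel, which is the intuition behind the transmitted light increasing when the first sensor absorbs only RGB.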
- The wavelength ranges photoelectrically converted in the second image sensor 204 may remain RGB, or may be YeMgCy.
- an image sensor in which the wavelength range for photoelectric conversion differs depending on the thickness direction (Z-axis direction) of the element may be used.
- the color filter array 204A is not necessary, and color interpolation processing for the pixel signal read from the second image sensor 204 can be eliminated.
- an effect of increasing the amount of light photoelectrically converted by the second image sensor 204 is obtained.
- the pixel signal level of the LF image acquired by the second image sensor 204 can be increased.
- an effect of reducing the processing load on the image processing unit 207 can be obtained.
- The first image sensor 202 and the second image sensor 204 are electrically connected by providing micro holes 211 in the transmissive substrate 203A of the microlens array 203 of FIG. 3 and forming conductors in the micro holes 211. Since this connects the circuits of the first image sensor 202 and the second image sensor 204, the photoelectrically converted pixel signals of both sensors can be read out by a common readout circuit.
- a partition wall 210 that blocks light may be provided at the boundary between the microlenses L1 to L6.
- The partition walls 210 are provided so that light that has passed through one of the microlenses L1 to L6 is received by the pixel group PXs arranged behind that microlens (in the Z-axis minus direction) and does not enter the pixel group PXs arranged behind an adjacent microlens.
- the partition wall 210 is formed, for example, by forming a grid-like deep groove in the microlens array 203 by machining or etching and filling the groove with a light shielding resin.
- (1) An image sensor comprising: a first imaging unit having a plurality of first imaging pixels that photoelectrically convert part of the incident light and transmit another part, the color of the photoelectrically converted light differing from the color of the transmitted light; a microlens array having a plurality of microlenses, in which light of different colors transmitted through a plurality of the first imaging pixels is incident on one microlens among the plurality of microlenses; and a second imaging unit having a plurality of second imaging pixels on which light transmitted through one microlens among the plurality of microlenses is incident.
- (2) The image sensor of (1), wherein the number of first imaging pixels is larger than the number of microlenses.
- (3) The image sensor of (1) or (2), wherein the interval between the first imaging pixels is wider than the interval between the second imaging pixels.
- (4) The image sensor of (3), wherein the interval between the first imaging pixels is an interval at which diffracted light caused by light incident on the first imaging unit is reduced.
- (5) The image sensor of any one of (1) to (4), wherein the plurality of second imaging pixels photoelectrically convert light of mutually different colors.
- (6) The image sensor of (5), wherein the color of light photoelectrically converted by the second imaging pixels differs from the color of light photoelectrically converted by the first imaging pixels.
- (7) The image sensor of (5), wherein the color of light photoelectrically converted by the second imaging pixels is the same as the color of light photoelectrically converted by the first imaging pixels.
- (8) The image sensor of any one of (5) to (7), wherein the first imaging unit is constituted by an organic photoelectric film that photoelectrically converts light of different colors, or by a color filter and an organic photoelectric film, and the second imaging unit is constituted by a color filter and a light-receiving unit, or by a light-receiving unit that receives light of different colors in the depth direction.
- (9) The image sensor of any one of (1) to (8), wherein the first imaging unit, the microlens array, and the second imaging unit are stacked.
- (10) An image-capturing device comprising: the image sensor of any one of (1) to (9); a first image data generation unit that generates first image data based on a first signal generated by the first imaging pixels; and a second image data generation unit that generates, based on a second signal generated by the second imaging pixels, second image data having fewer pixels than the first image data.
- (11) The image-capturing device of (10), comprising a mode switching unit that switches between a first mode for generating a first image based on the first image data generated by the first image data generation unit and a second mode for generating a second image based on the second image data generated by the second image data generation unit.
- (12) The image-capturing device of (11), wherein the mode switching unit switches between the first mode and the second mode according to the set imaging scene mode.
- DESCRIPTION OF SYMBOLS: 100: camera; 201: imaging lens; 202: first image sensor; 203: microlens array; 204: second image sensor; 205: control unit; 207: image processing unit; 208: display unit; L, L1 to L6: microlens; PX: pixel of the second image sensor 204; PXs: pixel group.
Abstract
Description
An image-capturing device according to a second aspect of the invention comprises the image sensor according to the first aspect and an image processing unit that generates first image data based on a signal from the first imaging unit and generates, based on a signal from the second imaging unit, second image data having fewer pixels than the first image data.
FIG. 1 is a diagram illustrating the main configuration of a camera 100 according to an embodiment. In the coordinate axes shown in FIG. 1, light from the subject travels in the Z-axis minus direction. The upward direction orthogonal to the Z axis is the Y-axis plus direction, and the direction toward the viewer, perpendicular to the page and orthogonal to the Z and Y axes, is the X-axis plus direction. In several of the subsequent figures, orientations are expressed with reference to the coordinate axes of FIG. 1.
The pixel signals read from the first image sensor 202 and the second image sensor 204 may also be recorded on the recording medium 206 as so-called RAW data, without applying image processing to them.
For clarity of illustration, the first image sensor 202, the microlens array 203, and the second image sensor 204 are drawn with gaps between them, but in practice the first image sensor 202 and the microlens array 203 are in close contact. The distance between the first image sensor 202 and the second image sensor 204 corresponds to the focal length of the microlenses L constituting the microlens array 203.
The first image sensor 202 of the camera 100 described above captures the subject image projected onto it by the imaging lens 201. In this specification, an image captured by the first image sensor 202 is called a normal image.
The second image sensor 204 of the camera 100 captures an image of the light transmitted through the first image sensor 202. The second image sensor 204 is configured to capture a plurality of images from different viewpoints using light field photography technology.
Although FIG. 2 shows an example in which the microlens array 203 has 5 × 5 microlenses L, the number of microlenses L constituting the microlens array 203 is not limited to the illustrated number.
In this specification, light from a given direction incident on a pixel of the second image sensor 204 is called a ray.
In general, refocus processing is applied to an LF image using its data. Refocus processing generates an image at an arbitrary focus position or viewpoint by performing a computation (rearranging the rays) based on the ray information and direction information contained in the LF image. In this specification, an image at an arbitrary focus position or viewpoint generated by refocus processing is called a refocus image. Since such refocus processing (also called reconstruction processing) is well known, a detailed description is omitted.
Refocus processing may be performed inside the camera 100 by the image processing unit 207, or the LF image data recorded on the recording medium 206 may be sent to an external device such as a personal computer and processed there.
Next, a specific configuration example of the image sensors of the camera 100 will be described. In this embodiment, an example is described in which the photoelectrically converted pixel signals are read out independently from the first image sensor 202 and the second image sensor 204. FIG. 3 is a cross-sectional view of the first image sensor 202, the microlens array 203, and the second image sensor 204, showing a cross section parallel to the X-Z plane. FIG. 4 is a front view of the image sensor of FIG. 3 viewed from the Z-axis plus direction.
The first image sensor 202 has a structure in which a readout circuit layer 202C formed on a transparent substrate, a photoelectric conversion element array 202B, and a transparent electrode layer 202A are stacked in order from the Z-axis minus direction.
A microlens separate from the microlens array 203 may be provided at each pixel position of the photoelectric conversion element array 202B (on the imaging lens side) to increase the amount of light incident on the photoelectric conversion element at each pixel position.
In the microlens array 203 of FIGS. 3 and 4, the microlenses L1 to L6 are formed integrally with the transmissive substrate 203A. The transmissive substrate 203A is, for example, a glass substrate, a plastic substrate, or a silica substrate. The microlens array 203 can be formed by, for example, injection molding or pressure molding.
The microlenses L1 to L6 may also be formed separately from the transmissive substrate 203A.
A commonly used CCD or CMOS image sensor can be used as the second image sensor 204 of FIG. 3. The second image sensor 204 has a structure in which a light-receiving element array 204B formed on a silicon substrate 204C and a color filter array 204A are stacked in order from the Z-axis minus direction.
In FIG. 5(b), the pixel group PXs among the plurality of pixels PX is shown in white, and pixels not included in the pixel group PXs are hatched.
The control unit 205 performs control to raise the pixel signal level of the LF image acquired by the second image sensor 204 by increasing the sensitivity of the second image sensor 204 or lengthening the exposure time (charge accumulation time).
This is because, although the second image sensor 204 has a back-illuminated structure, its pixel size is smaller than that of the first image sensor 202, so the pixel signal level of the acquired LF image is lower than the pixel signal level of the normal image.
The control unit 205 generates an image file to be recorded on the recording medium 206. In the normal shooting mode, in which only the normal image is recorded, the control unit 205 includes in the image file normal image data based on the pixel signals read from the first image sensor 202.
FIG. 6 is a flowchart illustrating the flow of camera processing executed by the control unit 205. The control unit 205 executes a program that performs the processing of FIG. 6 when the main switch is turned on or when a return from the sleep state is performed. In step S10 of FIG. 6, the control unit 205 performs mode selection. For example, based on the setting state of an operation member (not shown), the control unit 205 determines whether the mode is the normal shooting mode, the LF shooting mode, or the both-shooting mode, and proceeds to step S20.
Before step S30, there may be a step of determining whether the shutter has been released; if it has, the process may proceed to step S30.
(1) The image sensor of the camera 100 includes: a first image sensor 202 having a plurality of first imaging pixels that photoelectrically convert part of the incident light and transmit another part, in which the colors of the photoelectrically converted light (Ye, Mg, Cy) differ from the colors of the transmitted light (B, G, R); a microlens array 203 having a plurality of microlenses L, in which light of different colors (B, G, R) transmitted through a plurality of the first imaging pixels is incident on one microlens L among the plurality of microlenses L; and a second image sensor 204 having a plurality of second imaging pixels PX on which light transmitted through one microlens L among the plurality of microlenses L is incident. Thereby, both the first image sensor 202 and the second image sensor 204 can capture images in color.
(1) The image sensor includes: a first imaging unit 202 having a plurality of first photoelectric conversion units 202B that photoelectrically convert light of some wavelengths of the incident light and transmit light of other wavelengths; a plurality of lenses L on which light transmitted through the first imaging unit is incident, namely the individual lenses L constituting the microlens array 203; and a second imaging unit 204 having second photoelectric conversion units 204B that photoelectrically convert incident light, a plurality of which are provided for each of the plurality of lenses.
(2) In the image sensor of (1), the number of first photoelectric conversion units 202B of the first imaging unit 202 is smaller than the number of second photoelectric conversion units 204B of the second imaging unit 204.
(3) In the image sensor of (1) or (2), the distance between the centers of two adjacent first photoelectric conversion units 202B is wider than the distance between the centers of two adjacent second photoelectric conversion units 204B.
(4) In the image sensor of any one of (1) to (3), the resolution of the first imaging unit 202 is lower than the resolution of the second imaging unit 204.
(5) In the image sensor of (3), the distance between the centers of two adjacent first photoelectric conversion units 202B is 4 μm or more.
(6) In the image sensor of any one of (1) to (5), the plurality of second photoelectric conversion units 204B photoelectrically convert light of mutually different wavelengths.
(7) In the image sensor of (6), the wavelength of light photoelectrically converted by the second photoelectric conversion units 204B differs from the wavelength of light photoelectrically converted by the first photoelectric conversion units 202B.
(8) In the image sensor of (6), the wavelength of light photoelectrically converted by the second photoelectric conversion units 204B is the same as the wavelength of light photoelectrically converted by the first photoelectric conversion units 202B.
(9) In the image sensor of any one of (6) to (8), the plurality of first photoelectric conversion units 202B are constituted by organic photoelectric films that photoelectrically convert light of different wavelengths, and the plurality of second photoelectric conversion units 204B are constituted by a color filter and a photoelectric conversion unit, or by a photoelectric conversion unit that receives light of different wavelengths in the depth direction.
(10) The image sensor of any one of (1) to (9) includes a lens array 203 having the plurality of lenses L, and the first imaging unit 202, the lens array 203, and the second imaging unit 204 are stacked.
(11) An image-capturing device includes the image sensor of any one of (1) to (10) and an image processing unit that generates first image data based on a signal from the first imaging unit 202 and generates, based on a signal from the second imaging unit 204, second image data having fewer pixels than the first image data. (12) The image-capturing device of (11) includes a mode switching unit that switches between a first mode for generating a first image based on the first image data generated by the image processing unit and a second mode for generating a second image based on the second image data generated by the image processing unit. (13) In the image-capturing device of (12), the mode switching unit switches between the first mode and the second mode according to the set imaging scene mode.
(Modification 1)
In the embodiment described above, the functions of the microlenses L1 to L6 may be provided by forming, inside the transmissive substrate 203A, lens regions having a refractive index higher than that of the substrate. This allows the Z-axis plus side surface of the microlens array 203 to be flattened.
As another example of flattening the Z-axis plus side surface of the microlens array 203, the concave surfaces around the microlenses L1 to L6 in FIG. 3 may be flattened by filling them with a transparent member having a refractive index lower than that of the members constituting the microlenses L1 to L6.
Instead of the microlens array 203 having a plurality of microlenses L, a micromirror array having a plurality of micromirrors, described in WO14/129630 previously filed by the present applicant and published internationally, may be used. FIG. 7 is a cross-sectional view schematically showing the configuration of one of the plurality of micromirrors 23B constituting the micromirror array. The micromirror array has a large number of the micromirrors 23B shown in FIG. 7 arranged in a two-dimensional array.
The wavelength ranges photoelectrically converted in the first image sensor 202 may be changed from YeMgCy to RGB. In general, the RGB wavelength ranges are narrower than the YeMgCy wavelength ranges. Therefore, if the wavelength ranges photoelectrically converted in the first image sensor 202 are set to RGB, the complementary transmission wavelength ranges (YeMgCy) become wider than the absorbed (photoelectrically converted) wavelength ranges (RGB), so the amount of light photoelectrically converted by the second image sensor 204 can be increased. As a result, the pixel signal level of the LF image acquired by the second image sensor 204 can be raised.
The wavelength ranges photoelectrically converted in the second image sensor 204 may remain RGB, or may be YeMgCy.
As the second image sensor 204, an image sensor in which the photoelectrically converted wavelength range differs along the thickness direction (Z-axis direction) of the element may be used. With such an image sensor, the color filter array 204A is unnecessary, and color interpolation processing on the pixel signals read from the second image sensor 204 can also be eliminated. Not using the color filter array 204A has the effect of increasing the amount of light photoelectrically converted by the second image sensor 204. As a result, the pixel signal level of the LF image acquired by the second image sensor 204 can be raised.
If color interpolation processing becomes unnecessary, the processing load on the image processing unit 207 is also reduced.
In the above embodiment, an example was described in which the photoelectrically converted pixel signals are read out from the first image sensor 202 and the second image sensor 204 by independent readout circuits. Instead, the photoelectrically converted pixel signals may be read out from the first image sensor 202 and the second image sensor 204 by a common readout circuit.
In the microlens array 203 of FIG. 3, partition walls 210 that block light may be provided at the boundaries between the microlenses L1 to L6. The partition walls 210 are provided so that light that has passed through one of the microlenses L1 to L6 is received by the pixel group PXs arranged behind that microlens (in the Z-axis minus direction) and does not enter the pixel group PXs arranged behind an adjacent microlens. The partition walls 210 are formed, for example, by forming deep grid-like grooves in the microlens array 203 by machining or etching and filling the grooves with a light-shielding resin.
Although the above embodiment illustrates the capture of still images, the invention may also be applied to capturing moving images.
Accordingly, the following image sensors and image-capturing devices are also included in the present invention.
(1) An image sensor comprising: a first imaging unit having a plurality of first imaging pixels that photoelectrically convert part of the incident light and transmit another part, the color of the photoelectrically converted light differing from the color of the transmitted light; a microlens array having a plurality of microlenses, in which light of different colors transmitted through a plurality of the first imaging pixels is incident on one microlens among the plurality of microlenses; and a second imaging unit having a plurality of second imaging pixels on which light transmitted through one microlens among the plurality of microlenses is incident.
(2) The image sensor of (1), wherein the number of first imaging pixels is larger than the number of microlenses.
(3) The image sensor of (1) or (2), wherein the interval between the first imaging pixels is wider than the interval between the second imaging pixels.
(4) The image sensor of (3), wherein the interval between the first imaging pixels is an interval at which diffracted light caused by light incident on the first imaging unit is reduced.
(5) The image sensor of any one of (1) to (4), wherein the plurality of second imaging pixels photoelectrically convert light of mutually different colors.
(6) The image sensor of (5), wherein the color of light photoelectrically converted by the second imaging pixels differs from the color of light photoelectrically converted by the first imaging pixels.
(7) The image sensor of (5), wherein the color of light photoelectrically converted by the second imaging pixels is the same as the color of light photoelectrically converted by the first imaging pixels.
(8) The image sensor of any one of (5) to (7), wherein the first imaging unit is constituted by an organic photoelectric film that photoelectrically converts light of different colors, or by a color filter and an organic photoelectric film, and the second imaging unit is constituted by a color filter and a light-receiving unit, or by a light-receiving unit that receives light of different colors in the depth direction.
(9) The image sensor of any one of (1) to (8), wherein the first imaging unit, the microlens array, and the second imaging unit are stacked.
(10) An image-capturing device comprising: the image sensor of any one of (1) to (9); a first image data generation unit that generates first image data based on a first signal generated by the first imaging pixels; and a second image data generation unit that generates, based on a second signal generated by the second imaging pixels, second image data having fewer pixels than the first image data.
(11) The image-capturing device of (10), comprising a mode switching unit that switches between a first mode for generating a first image based on the first image data generated by the first image data generation unit and a second mode for generating a second image based on the second image data generated by the second image data generation unit.
(12) The image-capturing device of (11), wherein the mode switching unit switches between the first mode and the second mode according to the set imaging scene mode.
Japanese Patent Application No. 2015-188248 (filed September 25, 2015)
Claims (13)
- An image sensor comprising: a first imaging section having a plurality of first photoelectric conversion units that photoelectrically convert light of some wavelengths of incident light and transmit light of other wavelengths; a plurality of lenses on which the light transmitted through the first imaging section is incident; and a second imaging section having second photoelectric conversion units, a plurality of which are provided for each of the plurality of lenses, that photoelectrically convert incident light.
- The image sensor according to claim 1, wherein the number of the first photoelectric conversion units in the first imaging section is smaller than the number of the second photoelectric conversion units in the second imaging section.
- The image sensor according to claim 1 or 2, wherein the spacing between the centers of two adjacent first photoelectric conversion units is wider than the spacing between the centers of two adjacent second photoelectric conversion units.
- The image sensor according to any one of claims 1 to 3, wherein the resolution of the first imaging section is lower than the resolution of the second imaging section.
- The image sensor according to claim 3, wherein the spacing between the centers of the two adjacent first photoelectric conversion units is 4 μm or more.
- The image sensor according to any one of claims 1 to 5, wherein the plurality of second photoelectric conversion units each photoelectrically convert light of a different wavelength.
- The image sensor according to claim 6, wherein the wavelengths of light photoelectrically converted by the second photoelectric conversion units differ from the wavelengths of light photoelectrically converted by the first photoelectric conversion units.
- The image sensor according to claim 6, wherein the wavelengths of light photoelectrically converted by the second photoelectric conversion units are the same as the wavelengths of light photoelectrically converted by the first photoelectric conversion units.
- The image sensor according to any one of claims 6 to 8, wherein the plurality of first photoelectric conversion units are constituted by organic photoelectric films that photoelectrically convert light of different wavelengths, and the plurality of second photoelectric conversion units are constituted by color filters and photoelectric conversion sections, or by photoelectric conversion sections that receive light of different wavelengths in the depth direction.
- The image sensor according to any one of claims 1 to 9, comprising a lens array having the plurality of lenses, wherein the first imaging section, the lens array, and the second imaging section are stacked.
- An imaging device comprising: the image sensor according to any one of claims 1 to 10; and an image processing unit that generates first image data based on signals from the first imaging section, and generates, based on signals from the second imaging section, second image data having fewer pixels than the first image data.
- The imaging device according to claim 11, comprising a mode switching unit that switches between a first mode that generates a first image from the first image data generated by the image processing unit and a second mode that generates a second image from the second image data generated by the image processing unit.
- The imaging device according to claim 12, wherein the mode switching unit switches between the first mode and the second mode according to a set imaging scene mode.
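Aspect (4) and claim 5 motivate the wide first-section pixel pitch through diffraction: for a periodic structure of pitch d, the grating equation sin θ = λ/d gives the first-order diffraction angle of the transmitted light, so a wider pitch sends diffracted light through a smaller angle. A minimal numeric check of that relation, taking the 4 μm pitch from claim 5 and assuming a 550 nm (green) wavelength and a 1.5 μm comparison pitch, neither of which is specified in the publication:

```python
import math

def first_order_diffraction_deg(wavelength_m, pitch_m):
    """First-order diffraction angle from the grating equation
    sin(theta) = wavelength / pitch (valid while the ratio is <= 1)."""
    return math.degrees(math.asin(wavelength_m / pitch_m))

wavelength = 550e-9  # assumed green light; not stated in the patent

# Claim 5 pitch vs. an assumed finer pitch for comparison.
coarse = first_order_diffraction_deg(wavelength, 4.0e-6)  # 4 um pitch
fine = first_order_diffraction_deg(wavelength, 1.5e-6)    # assumed 1.5 um

# A wider pitch diffracts the transmitted light through a smaller
# angle, which is the rationale stated in aspect (4) for spacing the
# first imaging pixels more widely than the second imaging pixels.
assert coarse < fine
```

This is only a sketch of the stated physical rationale; the publication claims the pitch relation itself, not any particular wavelength or angle budget.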
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201680055421.1A CN108140655A (zh) | 2015-09-25 | 2016-09-23 | Image sensor and image-capturing device |
JP2017541590A JPWO2017051876A1 (ja) | 2015-09-25 | 2016-09-23 | Image sensor and image-capturing device |
US15/761,799 US20180278859A1 (en) | 2015-09-25 | 2016-09-23 | Image sensor and image-capturing device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-188248 | 2015-09-25 | ||
JP2015188248 | 2015-09-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017051876A1 (ja) | 2017-03-30 |
Family
ID=58386861
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/078034 WO2017051876A1 (ja) | Image sensor and image-capturing device | 2015-09-25 | 2016-09-23 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20180278859A1 (ja) |
JP (1) | JPWO2017051876A1 (ja) |
CN (1) | CN108140655A (ja) |
WO (1) | WO2017051876A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10810717B2 (en) * | 2017-08-31 | 2020-10-20 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and image processing system |
JP7416605B2 (ja) | 2019-11-14 | 2024-01-17 | NHK (Japan Broadcasting Corporation) | Image sensor |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018211601A1 (ja) * | 2017-05-16 | 2018-11-22 | Olympus Corporation | Imaging device and imaging system |
CN110690237B (zh) * | 2019-09-29 | 2022-09-02 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image sensor, signal processing method, and storage medium |
TWI739431B (zh) * | 2019-12-09 | 2021-09-11 | 廣州印芯半導體技術有限公司 | Data transmission system and data transmission method thereof |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007228368A (ja) * | 2006-02-24 | 2007-09-06 | Institute Of National Colleges Of Technology Japan | Color filter block for image sensor |
JP2009017079A (ja) * | 2007-07-03 | 2009-01-22 | Sony Corp | Imaging device and imaging method |
JP2013145292A (ja) * | 2012-01-13 | 2013-07-25 | Nikon Corp | Solid-state imaging device and electronic camera |
JP2014039078A (ja) * | 2012-08-10 | 2014-02-27 | Olympus Corp | Solid-state imaging device and imaging device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004064554A (ja) * | 2002-07-30 | 2004-02-26 | Olympus Corp | Camera and image sensor unit used therein |
US20100157117A1 (en) * | 2008-12-18 | 2010-06-24 | Yu Wang | Vertical stack of image sensors with cutoff color filters |
US9197804B1 (en) * | 2011-10-14 | 2015-11-24 | Monolithic 3D Inc. | Semiconductor and optoelectronic devices |
US9532033B2 (en) * | 2010-11-29 | 2016-12-27 | Nikon Corporation | Image sensor and imaging device |
JP6168994B2 (ja) * | 2011-10-17 | 2017-07-26 | キヤノン株式会社 | 撮像装置、およびその制御方法 |
US9184198B1 (en) * | 2013-02-20 | 2015-11-10 | Google Inc. | Stacked image sensor with cascaded optical edge pass filters |
2016
- 2016-09-23 WO: application PCT/JP2016/078034 filed (WO2017051876A1; active, Application Filing)
- 2016-09-23 CN: application 201680055421.1A filed (CN108140655A; active, Pending)
- 2016-09-23 JP: application 2017-541590 filed (JPWO2017051876A1; not active, Withdrawn)
- 2016-09-23 US: application 15/761,799 filed (US20180278859A1; not active, Abandoned)
Also Published As
Publication number | Publication date |
---|---|
CN108140655A (zh) | 2018-06-08 |
JPWO2017051876A1 (ja) | 2018-08-09 |
US20180278859A1 (en) | 2018-09-27 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16848664; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 15761799; Country of ref document: US |
| ENP | Entry into the national phase | Ref document number: 2017541590; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 16848664; Country of ref document: EP; Kind code of ref document: A1 |