WO2013164901A1 - Solid-state imaging device - Google Patents
Solid-state imaging device
- Publication number
- WO2013164901A1 (PCT/JP2013/001728)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- photosensitive cell
- incident
- signal
- photosensitive
- Prior art date
Classifications
- H04N 9/646 — Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
- H01L 27/14627 — Microlenses
- H01L 27/14621 — Colour filter arrangements
- H01L 27/14625 — Optical elements or arrangements associated with the device
- H01L 27/14629 — Reflectors
- H04N 23/84 — Camera processing pipelines; Components thereof for processing colour signals
Definitions
- This application relates to a technology for increasing the sensitivity of a solid-state imaging device and for color imaging.
- In recent years, there has been a remarkable increase in the functionality and performance of digital cameras and digital movie cameras that use solid-state image sensors such as CCDs and CMOS sensors (hereinafter sometimes referred to as "image sensors"). In particular, rapid progress in semiconductor manufacturing technology has miniaturized the pixel structure of image sensors. As a result, the pixels and drive circuits of image sensors have become highly integrated, and their performance continues to improve. In particular, cameras using backside illumination type image sensors, which receive light on the back surface side rather than on the surface (front surface) side on which the wiring layer is formed, have recently been developed, and their characteristics are attracting attention. On the other hand, as the number of pixels of an image sensor increases, the amount of light received by each pixel decreases, which reduces camera sensitivity.
- Camera sensitivity is reduced not only by the increase in the number of pixels but also by the use of color filters for color separation.
- In a typical color image sensor, a subtractive color filter using an organic pigment as a colorant is arranged facing each photosensitive cell. Since a color filter absorbs light other than the color component to be used, using such filters lowers the light utilization rate of the camera.
- For example, in a color camera with a Bayer array, each of the R, G, and B color filters transmits only the R, G, or B light and absorbs the remaining light. Therefore, the light actually used by such a camera is only about 1/3 of the total incident light.
- Japanese Patent Application Laid-Open No. H10-228561 discloses a technique for increasing the amount of received light by attaching a microlens array to the light receiving portion of the image sensor so as to capture as much incident light as possible.
- With this technique, the effective aperture ratio of the image sensor can be substantially improved by condensing light onto each photosensitive cell with a microlens.
- This technique is currently used in most solid-state image sensors. Although it certainly improves the effective aperture ratio, it does not solve the problem of the reduced light utilization rate caused by color filters.
- Patent Document 2 discloses a technique for making maximum use of light by combining multilayer color filters (dichroic mirrors) and microlenses.
- In this technique, a plurality of dichroic mirrors that selectively transmit light in a specific wavelength range and reflect light in the other wavelength ranges, without absorbing light, are used.
- FIG. 10 is a diagram schematically showing a cross section in a direction perpendicular to the imaging surface of the imaging device disclosed in Patent Document 2.
- The imaging device includes condensing microlenses 4a and 4b, a light shielding portion 20, photosensitive cells 2a, 2b, and 2c, and dichroic mirrors 17, 18, and 19, which are disposed on and inside the imaging device.
- the dichroic mirrors 17, 18, and 19 are disposed so as to face the photosensitive cells 2a, 2b, and 2c, respectively.
- the dichroic mirror 17 has characteristics of transmitting R light and reflecting G light and B light.
- the dichroic mirror 18 has a characteristic of reflecting G light and transmitting R light and B light.
- the dichroic mirror 19 has characteristics of reflecting B light and transmitting R light and G light.
- The light incident on the microlens 4a enters the first dichroic mirror 17 after its flux is adjusted by the microlens 4b.
- the first dichroic mirror 17 transmits R light but reflects G light and B light.
- the light transmitted through the first dichroic mirror 17 is incident on the photosensitive cell 2a.
- the G light and B light reflected by the first dichroic mirror 17 are incident on the adjacent second dichroic mirror 18.
- the second dichroic mirror 18 reflects G light out of incident light and transmits B light.
- the G light reflected by the second dichroic mirror 18 enters the photosensitive cell 2b.
- the B light transmitted through the second dichroic mirror 18 is reflected by the third dichroic mirror 19 and is incident on the photosensitive cell 2c immediately below it.
- Patent Document 3 discloses an image sensor that can prevent light loss by using a microprism.
- This imaging device has a structure in which different photosensitive cells receive light separated into red, green, and blue by a microprism. Such an image sensor can also prevent light loss.
- However, in the techniques of Patent Document 2 and Patent Document 3, it is necessary to provide as many photosensitive cells as the number of dichroic mirrors or spectral bands used. For example, to detect light of the three RGB colors, the number of photosensitive cells must be tripled compared with the case where conventional color filters are used.
- Patent Document 4 discloses a technique for increasing the utilization factor of light using a dichroic mirror and reflection.
- FIG. 11 shows a part of a cross-sectional view of an image sensor using the technique.
- dichroic mirrors 22 and 23 are disposed in a translucent resin 21.
- the dichroic mirror 22 has characteristics of transmitting G light and reflecting R light and B light.
- the dichroic mirror 23 has characteristics of transmitting R light and reflecting G light and B light.
- R light and G light can all be detected by the following principle.
- the R light is reflected by the dichroic mirror 22 and transmitted by the dichroic mirror 23.
- the R light reflected by the dichroic mirror 22 is further reflected at the interface between the translucent resin 21 and air and enters the dichroic mirror 23.
- the R light passes through the dichroic mirror 23, and further passes through the organic dye filter 25 and the microlens 26 having R light transmittance. In this way, although part of the light is reflected by the metal layer 27, most of the R light incident on the dichroic mirrors 22 and 23 is incident on the light sensing unit.
- the G light when the G light is incident on the dichroic mirrors 22 and 23, the G light is transmitted through the dichroic mirror 22 and reflected by the dichroic mirror 23.
- the G light reflected by the dichroic mirror 23 is further totally reflected at the interface between the translucent resin 21 and air, and enters the dichroic mirror 22.
- the G light passes through the dichroic mirror 22, and further passes through the organic dye filter 24 and the micro lens 26 having G light transmittance. In this way, although part of the light is reflected by the metal layer 27, most of the G light incident on the dichroic mirrors 22 and 23 is incident on the light sensing unit without any loss.
- As described above, the technique disclosed in Patent Document 4 loses one of the three RGB colors but can receive the other two colors with almost no loss. For this reason, it is not necessary to provide separate light sensing units for all three RGB colors.
- While the light utilization rate with organic dye filters alone is about 1/3 of the total incident light, the light utilization rate with this technique is about 2/3. That is, this technique roughly doubles the imaging sensitivity. However, even with this technique, one of the three colors is lost.
- Patent Document 5 discloses a colorization technique that uses spectral elements to increase the light utilization rate without significantly increasing the number of photosensitive cells. According to this technique, the spectral elements arranged corresponding to the photosensitive cells direct incident light to different photosensitive cells depending on the wavelength range. Each photosensitive cell receives light in which components of different wavelength ranges from a plurality of spectral elements are superimposed. As a result, color signals can be generated by signal calculation using the photoelectric conversion signals output from the photosensitive cells.
- One embodiment of the present invention provides a color imaging technology that increases the light utilization rate without significantly increasing the number of photosensitive cells and that maintains good color reproducibility even when crosstalk of incident light to each photosensitive cell increases due to the generation of oblique light.
- A solid-state imaging device according to one embodiment includes: a photosensitive cell array in which a plurality of unit blocks, each including a first photosensitive cell, a second photosensitive cell, a third photosensitive cell, and a fourth photosensitive cell arranged in 2 rows and 2 columns, are two-dimensionally arranged on an imaging surface, wherein in each unit block the first photosensitive cell is disposed in the first row and first column, the second photosensitive cell in the second row and first column, the third photosensitive cell in the first row and second column, and the fourth photosensitive cell in the second row and second column; a spectral element array including a first spectral element and a second spectral element having a spectral characteristic different from that of the first spectral element, wherein at least part of the first spectral element is arranged to face at least one of the first and third photosensitive cells and is configured to separate incident light in the arrangement direction of the first and third photosensitive cells according to wavelength, and at least part of the second spectral element is arranged to face at least one of the second and fourth photosensitive cells and is configured to separate incident light in the arrangement direction of the second and fourth photosensitive cells according to wavelength; and a plurality of condensing elements, each provided in a region covering the first and third photosensitive cells and configured to focus light onto the first and third photosensitive cells.
- An imaging apparatus according to one embodiment includes the above-described solid-state imaging device, an optical system that forms an image on the imaging surface of the solid-state imaging device, and a signal processing unit that generates color signals based on first to fourth photoelectric conversion signals output from the first to fourth photosensitive cells, respectively.
- According to one embodiment of the present invention, the light utilization rate is high without significantly increasing the number of photosensitive cells, and color imaging with good color reproducibility is possible even if crosstalk of incident light to each photosensitive cell increases due to the generation of oblique light.
- FIG. 1 is a perspective view schematically showing the arrangement relationship between the photosensitive cell array 200 and the spectral element array 100 in a solid-state imaging device according to an embodiment of the present invention.
- FIG. 2A is a plan view showing an example of a unit block of a solid-state imaging device according to an embodiment of the present invention.
- FIG. 2B is a cross-sectional view taken along line A-A' in FIG. 2A.
- FIG. 2C is a cross-sectional view taken along line B-B' in FIG. 2A.
- FIG. 2D is a cross-sectional view taken along line C-C' in FIG. 2A.
- FIG. 2E is a cross-sectional view taken along line D-D' in FIG. 2A.
- FIG. 3 is a block diagram illustrating a schematic configuration of an imaging apparatus according to Embodiment 1.
- FIG. 4 is a diagram schematically showing the lens and the image sensor in Embodiment 1.
- FIG. 5 is a diagram illustrating an example of the pixel array of the image sensor according to Embodiment 1.
- FIG. 6A is a plan view showing the basic structure of the image sensor in Embodiment 1.
- FIG. 6B is a cross-sectional view taken along line A-A' in FIG. 6A.
- FIG. 6C is a cross-sectional view taken along line B-B' in FIG. 6A.
- FIG. 6D is a diagram showing the intensity of light received by each photosensitive cell in Embodiment 1.
- FIG. 7 is a flowchart illustrating the procedure of color information generation processing in Embodiment 1.
- FIG. 8A is a plan view showing the basic structure of the image sensor in Embodiment 2.
- FIG. 8B is a cross-sectional view taken along line A-A' in FIG. 8A.
- FIG. 8C is a cross-sectional view taken along line B-B' in FIG. 8A.
- A diagram showing the intensity of light received by each photosensitive cell in Embodiment 2.
- A diagram showing another example of a spectral element.
- FIG. 10 is a cross-sectional view of a conventional image sensor using microlenses and multilayer filters (dichroic mirrors).
- FIG. 11 is a cross-sectional view of a conventional image sensor using multilayer filters (dichroic mirrors) and reflection.
- A solid-state imaging device according to one aspect includes: a photosensitive cell array in which a plurality of unit blocks, each including a first photosensitive cell, a second photosensitive cell, a third photosensitive cell, and a fourth photosensitive cell arranged in 2 rows and 2 columns, are two-dimensionally arranged on an imaging surface, wherein in each unit block the first photosensitive cell is arranged in the first row and first column, the second photosensitive cell in the second row and first column, the third photosensitive cell in the first row and second column, and the fourth photosensitive cell in the second row and second column; a spectral element array including a first spectral element and a second spectral element having a spectral characteristic different from that of the first spectral element, wherein at least part of the first spectral element is arranged to face at least one of the first and third photosensitive cells so that incident light is separated in the arrangement direction of the first and third photosensitive cells according to wavelength, and at least part of the second spectral element is arranged to face at least one of the second and fourth photosensitive cells so that incident light is separated in the arrangement direction of the second and fourth photosensitive cells according to wavelength; and a plurality of condensing elements, each provided in a region covering the first and third photosensitive cells and configured to focus light onto the first and third photosensitive cells.
- In one embodiment, the first spectral element faces the first or third photosensitive cell and makes light of different wavelength ranges incident on the first photosensitive cell and the third photosensitive cell, and the second spectral element faces the second or fourth photosensitive cell and makes light of different wavelength ranges incident on the second photosensitive cell and the fourth photosensitive cell.
- In one embodiment, the first spectral element causes light in a first wavelength range to be incident on the third photosensitive cell and causes complementary color light of the light in the first wavelength range to be incident on the first photosensitive cell, and the second spectral element causes light in a second wavelength range to be incident on the second photosensitive cell and causes complementary color light of the light in the second wavelength range to be incident on the fourth photosensitive cell.
- In one embodiment, the spectral element array further includes a third spectral element and a fourth spectral element, and the first to fourth spectral elements are arranged to face the first to fourth photosensitive cells, respectively, and are designed to make light of different wavelength ranges incident on the first to fourth photosensitive cells.
- In one embodiment, the first to fourth spectral elements have different characteristics of separating incident light into primary color light and its complementary color light, or different characteristics of separating incident light into three wavelength ranges.
- In one embodiment, the first spectral element causes light in a first wavelength range to be incident on the third photosensitive cell and causes complementary color light of the light in the first wavelength range to be incident on the first photosensitive cell, and the third spectral element causes light in a second wavelength range to be incident on the first photosensitive cell and causes complementary color light of the light in the second wavelength range to be incident on the third photosensitive cell.
- In one embodiment, the first spectral element causes light in a first wavelength range to be incident on the first photosensitive cell and causes complementary color light of the light in the first wavelength range to be incident on the third photosensitive cell, and the third spectral element likewise causes light in the first wavelength range to be incident on the first photosensitive cell and causes complementary color light of the light in the first wavelength range to be incident on the third photosensitive cell.
- In one embodiment, the second spectral element causes light in a third wavelength range to be incident on the fourth photosensitive cell and causes complementary color light of the light in the third wavelength range to be incident on the second photosensitive cell, and the fourth spectral element likewise causes light in the third wavelength range to be incident on the fourth photosensitive cell and causes complementary color light of the light in the third wavelength range to be incident on the second photosensitive cell.
- In one embodiment, the first wavelength range is one of the red, green, and blue wavelength ranges, the second wavelength range is another of the red, green, and blue wavelength ranges, and the third wavelength range is the remaining one of the red, green, and blue wavelength ranges.
- In one embodiment, each of the plurality of condensing elements is a cylindrical lens having a curved surface on the light incident side.
- An imaging apparatus according to one aspect includes the solid-state imaging device according to any one of items (1) to (10), an optical system that forms an image on the imaging surface of the solid-state imaging device, and a signal processing unit that generates color signals based on the first to fourth photoelectric conversion signals output from the first to fourth photosensitive cells, respectively.
- In one embodiment, the signal processing unit generates a first luminance signal by a process including an addition of the signal output from the first photosensitive cell and the signal output from the third photosensitive cell, generates a second luminance signal by a process including an addition of the signal output from the second photosensitive cell and the signal output from the fourth photosensitive cell, and corrects the color signals based on the difference between the first luminance signal and the second luminance signal.
- In one embodiment, the imaging apparatus further includes a recording medium on which are recorded a coefficient k1 that defines the light collection characteristic of the condensing elements, and information indicating a correlation between the reduction amount of the coefficient k1 caused by oblique light incident on the solid-state imaging device and a correction coefficient k2 indicating the degree of decrease in the spectral characteristics of the first and second spectral elements caused by the oblique light. The signal processing unit corrects the color signals based on the difference between the first luminance signal and the second luminance signal, the coefficient k1, and the information indicating the correlation.
- In one embodiment, the signal processing unit generates a first color difference signal and a second color difference signal by signal calculation using a first difference signal indicating the difference between the first photoelectric conversion signal and the third photoelectric conversion signal and a second difference signal indicating the difference between the second photoelectric conversion signal and the fourth photoelectric conversion signal, and corrects the first color difference signal and the second color difference signal using the difference between the first luminance signal and the second luminance signal.
- A solid-state imaging device according to an embodiment of the present invention includes a photosensitive cell array including a plurality of photosensitive cells (pixels) arranged two-dimensionally on an imaging surface, and a spectral element array including a plurality of spectral elements.
- FIG. 1 is a perspective view schematically showing a part of the photosensitive cell array 200 and the spectral element array 100 formed on the imaging surface of the solid-state imaging device 10.
- the spectral element array 100 is disposed on the light incident side facing the photosensitive cell array 200.
- the arrangement, shape, size, and the like of the photosensitive cells 2 are not limited to the example shown in this figure, and may be any known arrangement, shape, and size.
- Although the spectral element array 100 is represented as a quadrangular prism in the figure for convenience, it does not actually have such a shape and can have various structures.
- When each photosensitive cell 2 receives light, it outputs, by photoelectric conversion, an electrical signal (hereinafter referred to as a "photoelectric conversion signal" or a "pixel signal") corresponding to the intensity of the received light (the amount of incident light).
- each photosensitive cell 2 receives light in a plurality of wavelength regions (color components) whose traveling directions are changed by the spectral element array 100. As a result, the light received by each photosensitive cell 2 has a spectral distribution (intensity distribution for each wavelength range) different from the light received when it is assumed that there is no spectral element.
- FIG. 2A is a plan view showing an example of a basic pixel configuration (unit block) 40 of the photosensitive cell array 200.
- The photosensitive cell array 200 has a structure in which a plurality of unit blocks 40, each including four photosensitive cells 2a, 2b, 2c, and 2d, are two-dimensionally arranged on the imaging surface. In the example shown in the figure, the four photosensitive cells are arranged in two rows and two columns in one unit block.
- In FIG. 2A, reference numeral 4c denotes a horizontally extending micro lenticular lens configured to collect light only in the vertical direction. This micro lenticular lens is a cylindrical lens having a curved surface on the light incident side, and is arranged every other row in the vertical direction.
- In the following description, the horizontal direction in FIG. 2A is referred to as the "horizontal direction" and the vertical direction in FIG. 2A as the "vertical direction".
- FIGS. 2B and 2C are diagrams schematically showing a cross section taken along line A-A' and a cross section taken along line B-B' in FIG. 2A, respectively.
- These figures show that when light passes through the spectral element array 100, its traveling direction changes depending on the color component, and as a result the spectral distributions of the light received by the individual photosensitive cells differ from one another.
- In the following, the light that each photosensitive cell would receive if the spectral element array 100 did not exist is referred to as the "cell incident light" of that photosensitive cell.
- Since the photosensitive cells 2a to 2d included in one unit block are close to each other, the light intensity and spectral distribution of their cell incident light can be considered to be almost the same.
- FIGS. 2D and 2E are diagrams schematically showing a cross section taken along line C-C' and a cross section taken along line D-D' in FIG. 2A, respectively.
- the light incident on the image sensor 10 is collected in the vertical direction (left and right direction in FIG. 2D) by the microlenticular lens 4c.
- the traveling direction does not change depending on the color component when passing through the spectral element array 100.
- the light incident on the region where the micro lenticular lens 4c is not disposed travels straight without change in the traveling direction by the spectral element array 100.
- 2D and 2E show how such light is incident.
- the micro lenticular lens 4c is designed such that the vertical width is slightly wider than the interval (pixel pitch) between adjacent photosensitive cells.
- the light receiving amount increases in the photosensitive cells 2a and 2b, and conversely, the light receiving amount decreases in the photosensitive cells 2c and 2d.
- First, the intensity (light quantity) of the light received by each photosensitive cell will be described on the assumption that the micro lenticular lens 4c is absent and that light enters the image sensor 10 perpendicularly.
- the light intensity is expressed using symbols.
- the intensity of the visible light component of the cell incident light of the photosensitive cell is represented by the symbol “W”.
- The visible light component excluding a given color component is referred to as the "complementary color" of that color component, and light of the complementary color is referred to as "complementary color light".
- The complementary color of the first color component C1 is represented by C2 + C3, the complementary color of the second color component C2 by C1 + C3, and the complementary color of the third color component C3 by C1 + C2.
- In the following, the complementary color of a color component Cn (Cn is any one of C1, C2, and C3) and its intensity may be represented by Cn^-.
- the combination of the first to third color components is typically a combination of three primary colors of red (R), green (G), and blue (B), but divides visible light into three wavelength ranges. Any other combination of color components may be used.
- Next, the operation of the spectral element array 100 in this embodiment will be described. Assume for the moment that the micro lenticular lens 4c is absent and that light is incident on the image sensor 10 perpendicularly to the imaging surface. In this case, the spectral element array 100 causes the first photosensitive cell 2a to receive light obtained by removing the complementary color light (intensity C1^-) of the first color component from the cell incident light (intensity W) of the cell 2a and adding the light of the first color component (intensity C1).
- Similarly, the second photosensitive cell 2b receives light obtained by removing the light of the first color component (intensity C1) from the cell incident light (intensity W) of the cell 2b and adding the complementary color light (intensity C1^-) of the first color component. The third photosensitive cell 2c receives light obtained by removing the light of the third color component (intensity C3) from the cell incident light (intensity W) of the cell 2c and adding the complementary color light (intensity C3^-) of the third color component.
- The fourth photosensitive cell 2d receives light obtained by removing the complementary color light (intensity C3^-) of the third color component from the cell incident light (intensity W) of the cell 2d and adding the light of the third color component (intensity C3).
- In other words, the cells 2a to 2d receive light of intensities (W − C1^- + C1), (W − C1 + C1^-), (W − C3 + C3^-), and (W − C3^- + C3), respectively. Since the photosensitive cells 2a to 2d output photoelectric conversion signals (pixel signals) corresponding to the amount of received light, signals indicating C1, C2, and C3, that is, color signals, can be obtained by calculation between these signals.
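- To illustrate this signal calculation, the following minimal sketch (Python; the numeric values and variable names are chosen for this illustration only) assumes the idealized case described above, that is, no micro lenticular lens, perpendicular incidence, and identical cell incident light for all four cells, and shows how W, C1, C2, and C3 can be recovered from the four pixel signals.

```python
# Idealized 2x2 unit block. With W = C1 + C2 + C3 and the complement C1^- = C2 + C3,
# the received intensities described above reduce to:
#   cell 2a: W - C1^- + C1 = 2*C1        cell 2b: W - C1 + C1^- = 2*(C2 + C3)
#   cell 2c: W - C3 + C3^- = 2*(C1 + C2) cell 2d: W - C3^- + C3 = 2*C3
C1, C2, C3 = 0.30, 0.50, 0.20   # example color-component intensities of the cell incident light
W = C1 + C2 + C3                # visible-light intensity of the cell incident light

S2a = W - (C2 + C3) + C1        # photoelectric conversion signals in the ideal case
S2b = W - C1 + (C2 + C3)
S2c = W - C3 + (C1 + C2)
S2d = W - (C1 + C2) + C3

Ws  = (S2a + S2b) / 2           # luminance, recovered by addition (= W)
C1s = S2a / 2                   # = C1
C3s = S2d / 2                   # = C3
C2s = Ws - C1s - C3s            # = C2, the remaining component

print(Ws, C1s, C2s, C3s)        # -> approximately 1.0 0.3 0.5 0.2
```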
- However, since the micro lenticular lens 4c is actually arranged, its influence must be considered.
- Compared with the case where the micro lenticular lens 4c is not provided, the amount of light incident on the photosensitive cells 2a and 2b increases and the amount of light incident on the photosensitive cells 2c and 2d decreases. If the rate of change in the amount of incident light is k1 (k1 is a real number satisfying 0 < k1 < 1), the amount of light incident on the photosensitive cells 2a and 2b becomes (1 + k1) times, and the amount of light incident on the photosensitive cells 2c and 2d becomes (1 − k1) times.
- FIG. 2F shows the light received by the photosensitive cells 2a-2d in this case.
- FIG. 2G shows the light received by the photosensitive cells 2a-2d in this case.
- FIG. 2H shows the light received by the photosensitive cells 2a-2d when this effect is taken into account.
- Here, k3 is a real number satisfying 0 < k3 < 1 and is a coefficient representing the degree of relaxation of the spectral separation, as described later.
- the influence of the micro lenticular lens 4c and the generation of oblique light affects only the vertical direction of imaging.
- As a result, the photosensitive cells 2a and 2b receive light of intensities (1 + k1 − k2)(W − C1^- + C1) and (1 + k1 − k2)(W − C1 + C1^-), respectively, and the photosensitive cells 2c and 2d receive light of intensities (1 − k1 + k2)(W − C3 + C3^-) and (1 − k1 + k2)(W − C3^- + C3), respectively.
- Each photosensitive cell outputs a photoelectric conversion signal (pixel signal) corresponding to these intensities.
- The photoelectric conversion signals S2a to S2d output at this time can be expressed by the following Formulas 1 to 4, respectively.
- D1 and D2 are expressed by the following formulas 5 and 6, respectively.
- Next, the influence of the micro lenticular lens 4c and of the generation of oblique light in the horizontal direction will be considered. Since the micro lenticular lens 4c collects light only in the vertical direction, it has no influence in the horizontal direction. However, as for the generation of oblique light, as described above, the mixing of light increases due to crosstalk and the color separability decreases. That is, the absolute values of D1 and D2 decrease due to this influence. When the degree of this decrease is represented by the coefficient k3, D1 and D2 are expressed by the following Formulas 7 and 8, respectively.
- D1 = 2·k3·(1 + k1 − k2)·(2·C1s − Ws)
- D2 = 2·k3·(1 − k1 + k2)·(2·C3s − Ws)
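- For reference, the above expressions for D1 and D2 can be derived as follows, assuming that the photoelectric conversion signals take the same general form as Equations 22 to 25 of Embodiment 1 (Formulas 1 to 4 are not reproduced in this text, so this form is stated here as an assumption): with S2a = (1 + k1 − k2)(Ws − k3·C1s^- + k3·C1s) and S2b = (1 + k1 − k2)(Ws − k3·C1s + k3·C1s^-), and using C1s^- = Ws − C1s, the difference S2a − S2b = 2·k3·(1 + k1 − k2)·(C1s − C1s^-) = 2·k3·(1 + k1 − k2)·(2·C1s − Ws), which is the expression for D1 above; the expression for D2 follows in the same way from S2d and S2c, with C3s in place of C1s and the factor (1 − k1 + k2).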
- the signals D1 and D2 expressed by Expressions 13 and 14 can be used as color difference signals.
- color signals C1s, C2s, and C3s are obtained by matrix operation.
- the color signal can be calculated by signal calculation based on the four photoelectric conversion signals S2a to S2d output from the photosensitive cells 2a to 2d.
- The color difference signals D1 and D2 expressed by Expressions 13 and 14 are affected by the micro lenticular lens 4c and by the generation of oblique light. If k1, k2, and k3 are known, the signals that would be obtained without these influences can be recovered by inverse calculation. Hereinafter, an example of how to obtain these coefficients will be described.
- k1 is a parameter determined by the shape and arrangement of the micro lenticular lens 4c, and can be obtained by optical design.
- On the other hand, k2 and k3 are parameters determined by the influence of oblique light; since both are determined by the oblique light, they are correlated with each other.
- The value of k2 can be obtained by substituting the values of the signals S2a to S2d into Equations 15 to 17.
- k3 can be determined from the value of k2 obtained using Equations 15 to 17 by previously obtaining the correlation between k2 and k3 through experiments or simulations. Since the amount of oblique light depends on the optical aperture value, the correlation between k2 and k3 can be stored in a database by, for example, an experiment for measuring a color change while changing the optical aperture value. Instead of creating a database, a function indicating the correlation between k2 and k3 may be obtained.
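- The following sketch (Python; a minimal illustration, not the patented implementation, with hypothetical table values and function names) shows one way these coefficients could be applied in practice: k2 is estimated from the two row-wise addition signals using the known design value k1, assuming as above that S2a + S2b = 2(1 + k1 − k2)Ws and S2c + S2d = 2(1 − k1 + k2)Ws (cf. Equations 28 and 29 of Embodiment 1), and k3 is then obtained by interpolating a previously measured k2-k3 correlation table.

```python
# Hypothetical k2-k3 correlation table measured in advance (e.g., per aperture value).
K2_TO_K3 = [(0.00, 0.00), (0.05, 0.10), (0.10, 0.18), (0.20, 0.30)]

def estimate_k2(s2a, s2b, s2c, s2d, k1):
    """Estimate k2 from the two addition signals, assuming
    s2a + s2b = 2*(1 + k1 - k2)*Ws and s2c + s2d = 2*(1 - k1 + k2)*Ws."""
    total = (s2a + s2b) + (s2c + s2d)   # = 4*Ws
    diff = (s2a + s2b) - (s2c + s2d)    # = 4*(k1 - k2)*Ws
    return k1 - diff / total

def lookup_k3(k2):
    """Piecewise-linear interpolation of the stored k2-k3 correlation."""
    pts = K2_TO_K3
    if k2 <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if k2 <= x1:
            return y0 + (y1 - y0) * (k2 - x0) / (x1 - x0)
    return pts[-1][1]

# Example with hypothetical signal values:
# k2 = estimate_k2(1.15, 1.10, 0.95, 0.90, k1=0.20)   # about 0.10
# k3 = lookup_k3(k2)                                   # about 0.18
```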
- As described above, according to this embodiment, color information can be obtained by signal calculation using spectral elements, without using color filters that absorb part of the light, and even if crosstalk of incident light to each photosensitive cell increases due to the generation of oblique light, the influence can be corrected. As a result, light loss can be prevented, imaging sensitivity can be increased, and good color reproducibility can be realized.
- In FIG. 1, the spectral element array 100 is illustrated as a continuous element covering a plurality of photosensitive cells, but the spectral element array 100 may be an aggregate of a plurality of spatially separated spectral elements.
- As each spectral element, for example, a high refractive index transparent member described later, a microprism, or the like can be used.
- The spectral element array 100 according to the present disclosure may be configured in any manner as long as the photoelectric conversion signals represented by the above Formulas 1 to 4 or 9 to 12 can be obtained.
- In the above example, the spectral element array 100 divides incident light into C1 light and its complementary color light C1^- for the first row in FIG. 2A, and into C3 light and its complementary color light C3^- for the second row, but the method of spectral separation is not limited to this example.
- the spectral element array 100 may be configured so that the light sensing cells 2a to 2d receive light having different spectral distributions.
- the spectral element array 100 may be configured such that the photosensitive cells 2a to 2d output signals S2a to S2d shown in the following equations 18 to 21, respectively.
- FIG. 3 is a block diagram illustrating the overall configuration of the imaging apparatus according to the first embodiment.
- The imaging apparatus according to the present embodiment is a digital electronic camera and includes an imaging unit 300 and a signal processing unit 400 that generates a signal indicating an image (image signal) based on the signal transmitted from the imaging unit 300. The imaging apparatus may generate only still images or may also have a function of generating moving images.
- The imaging unit 300 includes an optical lens 12, an optical diaphragm 50, an optical filter 11, and a solid-state image sensor 10 (image sensor) that converts the optical information formed through the optical lens 12 and the optical filter 11 into an electrical signal by photoelectric conversion.
- The imaging unit 300 further includes a signal generation/reception unit 13 that generates a basic signal for driving the image sensor 10 and that receives the output signal from the image sensor 10 and sends it to the signal processing unit 400, and an element driving unit 14 that drives the image sensor 10 based on the basic signal generated by the signal generation/reception unit 13.
- the optical lens 12 is a known lens and may be a lens unit having a plurality of lenses.
- the optical filter 11 is a combination of a quartz low-pass filter for reducing moire patterns generated due to pixel arrangement and an infrared cut filter for removing infrared rays.
- the image sensor 10 is typically a CMOS or a CCD, and is manufactured by a known semiconductor manufacturing technique.
- the signal generation / reception unit 13 and the element driving unit 14 are configured by an LSI such as a CCD driver, for example.
- The signal processing unit 400 includes an image signal generation unit 15 that generates an image signal by processing the signal transmitted from the imaging unit 300, a memory 30 that stores various data generated in the process of generating the image signal, and an image signal output unit 16 that sends the generated image signal to the outside.
- the image signal generation unit 15 can be suitably realized by a combination of hardware such as a known digital signal processor (DSP) and software that executes image processing including image signal generation processing.
- the memory 30 is configured by a DRAM or the like. The memory 30 records the signal transmitted from the imaging unit 300 and temporarily records the image data generated by the image signal generation unit 15 and the compressed image data. These image data are sent to a recording medium (not shown) or a display unit via the image signal output unit 16.
- the imaging apparatus may include known components such as an electronic shutter, a viewfinder, a power source (battery), and a flashlight, but a description thereof is omitted because it is not particularly necessary for understanding the present embodiment.
- the above configuration is merely an example, and publicly known elements can be used in appropriate combinations for the constituent elements other than the image sensor 10 and the image signal generation unit 15.
- FIG. 4 is a diagram schematically illustrating a state in which light transmitted through the lens 12 is incident on the image sensor 10 during exposure.
- the lens 12 can be generally composed of a plurality of lenses arranged in the optical axis direction, but is drawn as a single lens for simplicity.
- a photosensitive cell array 200 including a plurality of photosensitive cells (pixels) arranged two-dimensionally is disposed on the imaging surface 10a of the imaging element 10. Each photosensitive cell is typically a photodiode, and outputs a photoelectric conversion signal (pixel signal) corresponding to the amount of incident light by photoelectric conversion.
- Light (visible light) transmitted through the lens 12, the optical diaphragm 50, and the optical filter 11 is incident on the imaging surface 10a.
- the intensity of light incident on the imaging surface 10a and the distribution (spectral distribution) of the amount of incident light for each wavelength range differ depending on the incident position.
- As the opening of the optical diaphragm 50 becomes larger, the amount of oblique light increases, and as the opening becomes smaller, the amount of oblique light decreases.
- FIG. 5 is a plan view showing an example of a pixel array in the present embodiment.
- the photosensitive cell array 200 includes, for example, a plurality of photosensitive cells arranged in a square lattice pattern on the imaging surface 10a as shown in FIG.
- The photosensitive cell array 200 includes a plurality of unit blocks 40, and each unit block 40 includes four photosensitive cells 2a, 2b, 2c, and 2d. As shown in the figure, the four photosensitive cells 2a to 2d included in each unit block are close to each other, but even if they were separated from each other, color information could still be obtained by appropriately configuring the spectral element array 100 described later.
- Each unit block may include five or more photosensitive cells.
- the spectral element array 100 including a plurality of spectral elements is arranged on the light incident side facing the photosensitive cell array 200.
- one spectral element is provided for each of the four photosensitive cells included in each unit block.
- the spectroscopic element in the present embodiment is an optical element that directs incident light in different directions according to the wavelength range by using diffraction of light generated at the boundary between two types of translucent members having different refractive indexes.
- This type of spectral element consists of a high refractive index transparent member (core portion) formed of a material having a relatively high refractive index, and a low refractive index transparent member (cladding portion) formed of a material having a relatively low refractive index and in contact with each side surface of the core portion. Due to the difference in refractive index between the core portion and the cladding portion, a phase difference arises between the light transmitted through the core portion and the light transmitted through the cladding portion, and diffraction occurs.
- In this specification, such a high refractive index transparent member may be referred to as a "spectral element" because spectral separation is achieved by the difference in refractive index between the core portion and the cladding portion. Details of such a diffractive spectral element are disclosed in, for example, Japanese Patent No. 4264465.
- the spectral element array 100 having the spectral elements as described above can be manufactured by performing thin film deposition and patterning by a known semiconductor manufacturing technique.
- By appropriately setting the material (refractive index), shape, size, arrangement pattern, and the like of the spectral elements, light of desired wavelength ranges can be separated and made incident on the individual photosensitive cells.
- a signal corresponding to a necessary color component can be calculated from a set of photoelectric conversion signals output from each photosensitive cell.
- FIG. 6A is a plan view showing the basic structure of the image sensor 10.
- the spectroscopic elements 1a, 1b, 1c, and 1d are disposed to face the four photosensitive cells 2a, 2b, 2c, and 2d, respectively.
- a plurality of patterns having such a basic structure are repeatedly formed on the imaging surface 10a.
- The micro lenticular lenses 4c are arranged above the spectral element array 100 every other row in the vertical direction.
- a micro lenticular lens 4c is disposed on the spectral elements 1a and 1b, and the micro lenticular lens 4c has a convex surface.
- The imaging device 10 includes a semiconductor substrate 7 made of a material such as silicon, photosensitive cells 2a to 2d formed inside the semiconductor substrate 7, and, on the surface side of the semiconductor substrate 7 (the side on which light is incident), a wiring layer 5, a transparent layer 6a, the spectral elements 1a to 1d, and the micro lenticular lens 4c.
- the structure shown in FIGS. 6A to 6C can be manufactured by a known semiconductor manufacturing technique.
- the image sensor 10 shown in FIGS. 6A to 6C has a surface irradiation type structure in which light is incident on each photosensitive cell from the wiring layer 5 side.
- the imaging device 10 of the present embodiment is not limited to such a structure, and may have a back-illuminated structure that receives light from the opposite side of the wiring layer 5.
- The spectral elements 1a and 1b have a rectangular cross section that is elongated in the direction in which light travels, and separate light using the difference in refractive index between themselves and the transparent layer 6a.
- The spectral element 1a makes yellow (Ye) light incident on the opposing photosensitive cell 2a, and makes half of the blue (B) light incident on the photosensitive cell 2b and the other half on the photosensitive cell (not shown) of the adjacent unit block.
- yellow (Ye) light is light composed of red (R) light and green (G) light.
- The spectral element 1b makes cyan (Cy) light incident on the opposing photosensitive cell 2b, and makes half of the R light incident on the photosensitive cell 2a and the other half on the photosensitive cell (not shown) of the adjacent unit block.
- the cyan (Cy) light is light composed of green (G) light and blue (B) light.
- the length and thickness of the spectral elements 1a and 1b are designed so that the spectral elements 1a and 1b have the spectral characteristics described above.
- the intensities of the R light and the G light included in the incident light do not always match, so the Ye light is not necessarily light that makes yellow, which is a mixed color of red and green, visible.
- the intensities of the G light and the B light included in the incident light do not necessarily match, and thus the Cy light is not necessarily light that makes a cyan color, which is a mixed color of green and blue, visible.
- the Cy light is light that makes green visible as in the G light.
- As a result, the photosensitive cell 2a receives Ye light (W − B) from the spectral element 1a, and also receives half of the R light from the spectral element 1b and half from the spectral element (not shown) of the adjacent unit block.
- Similarly, the photosensitive cell 2b receives Cy light (W − R) from the spectral element 1b, and also receives half of the B light from the spectral element 1a and half from the spectral element (not shown) of the adjacent unit block.
- The spectral elements 1c and 1d also have a rectangular cross section that is elongated in the direction in which light travels, and separate light using the difference in refractive index between themselves and the transparent layer 6a.
- The spectral element 1c makes magenta (M) light incident on the opposing photosensitive cell 2c, and makes half of the green (G) light incident on the photosensitive cell 2d and the other half on the photosensitive cell (not shown) of the adjacent unit block.
- magenta (M) light is light composed of red (R) light and blue (B) light.
- The spectral element 1d makes green (G) light incident on the opposing photosensitive cell 2d, and makes half of the magenta (M) light incident on the photosensitive cell 2c and the other half on the photosensitive cell (not shown) of the adjacent unit block.
- the length and thickness of the spectral elements 1c and 1d are designed so that the spectral elements 1c and 1d have the spectral characteristics described above.
- the magenta light is not necessarily light for visually recognizing the magenta color that is a mixed color of red and blue.
- the M light is light that causes red to be visually recognized, like the R light.
- Therefore, if neither the micro lenticular lens 4c nor oblique light is taken into account, the photosensitive cells 2a to 2d receive light of the intensities represented by (W − B + R), (W − R + B), (W − G + M), and (W − M + G), respectively, as shown in FIGS. 6B and 6C. In practice, however, the light shown in FIG. 6D is incident on the photosensitive cells 2a to 2d because of the influence of the micro lenticular lens 4c and of the oblique light caused by the opening of the optical diaphragm 50.
- k1 is a coefficient determined by the optical structure of the microlenticular lens 4c
- k2 and k3 are coefficients determined by the influence of oblique light.
- signals corresponding to the intensities of red light, green light, and blue light are respectively represented by Rs, Gs, and Bs.
- the signal Cs corresponding to the intensity of cyan light is Gs + Bs
- the signal Ys corresponding to the intensity of yellow light is Rs + Gs
- the signal Ms corresponding to the intensity of magenta light is Rs + Bs
- the signal Ws corresponding to the intensity of white light is Rs + Gs + Bs.
- In this case, the photosensitive cells 2a to 2d output the photoelectric conversion signals S2a to S2d represented by the following Equations 22 to 25, respectively.
- S2a = (1 + k1 − k2)(Ws − k3·Bs + k3·Rs)  (Equation 22)
- S2b = (1 + k1 − k2)(Ws − k3·Rs + k3·Bs)  (Equation 23)
- S2c = (1 − k1 + k2)(Ws − k3·Gs + k3·Ms)  (Equation 24)
- S2d = (1 − k1 + k2)(Ws − k3·Ms + k3·Gs)  (Equation 25)
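- As a concrete illustration of Equations 22 to 25, the following sketch (Python; the coefficient and signal values are illustrative assumptions, not values from the specification) forward-models the four photoelectric conversion signals of one unit block from given red, green, and blue signal levels and given coefficients k1, k2, and k3.

```python
# Forward model of Equations 22 to 25 for one unit block (illustrative values).
Rs, Gs, Bs = 0.40, 0.35, 0.25          # example red/green/blue signal levels
Ws = Rs + Gs + Bs                      # white (luminance) signal
Ms = Rs + Bs                           # magenta signal (Rs + Bs)

k1 = 0.20                              # determined by the micro lenticular lens 4c
k2 = 0.05                              # reduction of k1 caused by oblique light
k3 = 0.90                              # remaining degree of spectral separation under oblique light

S2a = (1 + k1 - k2) * (Ws - k3 * Bs + k3 * Rs)   # Equation 22
S2b = (1 + k1 - k2) * (Ws - k3 * Rs + k3 * Bs)   # Equation 23
S2c = (1 - k1 + k2) * (Ws - k3 * Gs + k3 * Ms)   # Equation 24
S2d = (1 - k1 + k2) * (Ws - k3 * Ms + k3 * Gs)   # Equation 25

print(S2a, S2b, S2c, S2d)
```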
- FIG. 7 is a flowchart showing the procedure of color information generation processing in the present embodiment.
- First, in step S10, the image signal generation unit 15 acquires the photoelectric conversion signals S2a to S2d. Subsequently, in step S12, the difference signal 2(1 + k1 − k2)·k3·(Rs − Bs) is generated by the calculation (S2a − S2b), and the difference signal 2(1 − k1 + k2)·k3·(Ms − Gs) is generated by the calculation (S2c − S2d). Next, in step S14, these difference signals are added to and subtracted from each other to generate the color difference signals D11 and D12 shown in the following Equations 26 and 27.
- Next, in step S16, S2a and S2b are added, and S2c and S2d are added, to generate the addition signals 2(1 + k1 − k2)Ws and 2(1 − k1 + k2)Ws shown in the following Equations 28 and 29, respectively.
- S2a + S2b = 2(1 + k1 − k2)Ws  (Equation 28)
- S2c + S2d = 2(1 − k1 + k2)Ws  (Equation 29)
- In step S18, the color difference signals D11 and D12 are corrected using the coefficient k1, the difference between the addition signals shown in Equations 28 and 29, and the previously measured correlation between k2 and k3, to obtain the corrected color difference signals 2(2Rs − Gs) and 2(2Bs − Gs).
- the coefficient k1 and the database or function indicating the correlation between k2 and k3 are recorded in advance in a recording medium such as the memory 30.
- in step S22, the RGB color signals, that is, Rs, Gs, and Bs, are obtained from the two color difference signals and one luminance signal by matrix calculation.
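- as a concrete illustration of steps S10 through S22, the following sketch (Python) applies equations 17 and 26 to 29 to a single unit block. The function names, the stubbed k2-to-k3 correlation, and the explicit algebra used to invert equations 26 and 27 are assumptions introduced for illustration; they are not the patented implementation.

```python
def k3_from_k2(k2):
    # Stand-in for the correlation between k2 and k3 measured in advance and
    # stored in the memory 30; the linear form and its constants are assumed.
    return 0.9 * k2 + 0.05

def decode_block(S2a, S2b, S2c, S2d, k1):
    """Sketch of steps S10-S22 for one unit block (illustrative only)."""
    # Step S12: difference signals
    d1 = S2a - S2b                    # 2(1+k1-k2)k3(Rs-Bs)
    d2 = S2c - S2d                    # 2(1-k1+k2)k3(Ms-Gs)
    # Step S14: color difference signals (equations 26 and 27)
    D11 = d1 + d2
    D12 = d2 - d1
    # Step S16: addition signals (equations 28 and 29)
    sum_ab = S2a + S2b                # 2(1+k1-k2)Ws
    sum_cd = S2c + S2d                # 2(1-k1+k2)Ws
    Ws = (sum_ab + sum_cd) / 4.0      # luminance, cf. equation 17
    c = (sum_ab - sum_cd) / (sum_ab + sum_cd)   # c = k1 - k2
    k2 = k1 - c                       # k2 from the known k1
    k3 = k3_from_k2(k2)               # k3 from the pre-measured correlation
    # Step S18: correct the color differences; X = 2Rs - Gs, Y = 2Bs - Gs
    u, v = D11 / (2 * k3), D12 / (2 * k3)
    X = (u + c * v) / (1 - c * c)
    Y = (v + c * u) / (1 - c * c)
    # Step S22: recover RGB from the two color differences and the luminance
    Gs = (2 * Ws - X - Y) / 4.0
    Rs = (X + Gs) / 2.0
    Bs = (Y + Gs) / 2.0
    return Rs, Gs, Bs
```

- feeding the output of the forward_model sketch above through decode_block reproduces the original Rs, Gs, Bs, provided the stubbed correlation returns the true k3.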
- the image signal generation unit 15 performs the above signal calculation for each unit block 40 of the photosensitive cell array 200, thereby generating signals (referred to as "color image signals") that represent the image of each of the R, G, and B color components.
- the generated color image signal is output to a recording medium (not shown) or a display unit by the image signal output unit 16.
- the coefficients k1, k2, and k3 may be set to different values depending on the unit block 40. By setting these coefficients to appropriate values according to the position of the unit block 40, the color reproducibility can be further improved.
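- as a minimal sketch of this idea (the table, its keys, and its values are hypothetical; real values would come from calibrating the actual sensor), the coefficients can be stored per unit-block position and looked up before running the per-block calculation described above.

```python
# Hypothetical calibration table: k1 indexed by the unit block's (row, column)
# position on the imaging surface; the influence of oblique light, and hence
# the best coefficient value, generally differs between image center and edge.
K1_TABLE = {
    (0, 0): 0.20,
    (0, 1): 0.19,
    (1, 0): 0.19,
    (1, 1): 0.18,
}

def k1_for_block(block_row, block_col, default=0.20):
    """Return the calibrated k1 for a given unit block position (sketch)."""
    return K1_TABLE.get((block_row, block_col), default)
```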
- a color image signal is obtained by the addition / subtraction process using the photoelectric conversion signals S2a to S2d and the color signal correction process using k1, k2, and k3.
- since no light-absorbing optical element is used in the image sensor 10 of the present embodiment, the loss of light can be reduced significantly compared with conventional techniques that use color filters or the like, and good color reproducibility can be achieved even when oblique light arises from an optical diaphragm or the like.
- the spectral element array 100 having a basic configuration of 2 rows and 2 columns is disposed facing the photosensitive cell array 200.
- micro lenticular lenses 4c, each collecting light in the vertical direction, are arranged every other row on top of the spectral element array 100.
- a spectral element 1a that divides light into blue light and non-blue light is arranged in the first row and the first column.
- next to it, in the first row and the second column, a spectral element 1b that separates light into red light and non-red light is arranged.
- in the second row, a spectral element 1c that divides light into green light and non-green light is arranged.
- also in the second row, a spectral element 1d that divides light into magenta light (that is, light other than green light) and light other than magenta light (that is, green light) is disposed. Since such an array pattern of spectral elements is formed repeatedly over the imaging surface, even if the way the unit block 40 is selected in the photosensitive cell array 200 is shifted by one row or one column, the four photoelectric conversion signals obtained are always a combination of the four signals represented by equations 22 to 25. That is, by performing the above signal calculation while shifting the pixel block to be calculated by one row or one column at a time, information on each of the R, G, and B color components can be obtained for as many positions as there are pixels. This means that the resolution of the color image can be increased up to the pixel count of the imaging device.
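- the following sketch (illustrative only; it reuses the decode_block function sketched above and assumes one fixed mapping of the 2x2 window cells to S2a through S2d, whereas a real implementation would track the phase of the repeating element pattern) shows how shifting the computation block by one pixel at a time yields an RGB estimate at essentially every pixel position.

```python
def demosaic_full_resolution(raw, k1):
    """Slide the 2x2 computation block across the raw signal array (sketch)."""
    H, W = len(raw), len(raw[0])
    rgb = [[None] * (W - 1) for _ in range(H - 1)]
    for i in range(H - 1):
        for j in range(W - 1):
            # Because the spectral element pattern repeats every 2x2 cells,
            # each window contains the four signal forms of equations 22-25;
            # the fixed assignment below is a simplification.
            S2a, S2b = raw[i][j], raw[i][j + 1]
            S2c, S2d = raw[i + 1][j], raw[i + 1][j + 1]
            rgb[i][j] = decode_block(S2a, S2b, S2c, S2d, k1)
    return rgb
```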
- the imaging apparatus of the present embodiment can therefore generate a high-resolution color image in addition to achieving higher sensitivity than conventional imaging apparatuses. Further, by correcting the color signals using the known coefficient k1, the coefficient k2 obtained by signal processing, and the coefficient k3 derived from k2 based on the correlation between k2 and k3 obtained in advance through experiments or the like, the influence of crosstalk caused by oblique light can also be eliminated.
- the image signal generation unit 15 does not necessarily have to generate the image signals of all three color components; it may be configured to generate only one or two color image signals depending on the application. Moreover, the generated signals may be amplified or combined as needed.
- in the above description, each spectral element is assumed to have exactly the spectral performance described above, but the actual spectral performance may deviate slightly from it. That is, the photoelectric conversion signals actually output from the photosensitive cells may deviate slightly from the photoelectric conversion signals shown in equations 22 to 25. Even when the spectral performance of each spectral element deviates from the ideal performance, good color information can be obtained by correcting the signals according to the degree of deviation.
- the signal calculation performed by the image signal generation unit 15 in the present embodiment may also be executed by a device other than the imaging apparatus itself.
- the color information can also be generated by causing an external device that has received an input of the photoelectric conversion signal output from the image sensor 10 to execute a program that defines the signal calculation processing in the present embodiment.
- the image signal generation unit 15 that performs the above arithmetic processing may not be provided in the imaging apparatus.
- the basic structure of the image sensor 10 is not limited to the configuration shown in FIGS. 6A to 6C.
- the effect of this embodiment is not changed even if the spectral element 1a and the spectral element 1b are interchanged or the spectral element 1c and the spectral element 1d are interchanged.
- the arrangement of the first row and the arrangement of the second row shown in FIG. 6A may be interchanged, and the effect is unchanged even if the spectral elements 1a, 1b and the spectral elements 1c, 1d are arranged in the column direction instead of the row direction. In that case, the micro lenticular lens 4c may be formed so as to extend in the column direction.
- in this embodiment, a cylindrical lens having a curved surface at its interface is used as the element that collects light.
- the present invention is not limited to this.
- an element that uses a diffraction effect may be used instead; any optical element whose optical function is to collect incident light in the vertical direction may be used.
- such an optical element is referred to as a "light-collecting element".
- the spectral characteristics of the spectral elements 1a to 1d may be different from the characteristics shown in FIGS. 6A to 6C.
- the spectral elements 1a to 1d may instead be configured so that, rather than making half of the light of a given color component enter the photosensitive cells on both sides, they make the light of that color component enter the adjacent photosensitive cell on one side only.
- the colors of the light into which the spectral elements 1a to 1d split incident light are not limited to the above example. As long as the spectral element array 100 is configured so that the photosensitive cells 2a to 2d receive light with mutually different spectral distributions, color information can be obtained by signal processing suited to that configuration.
- any type of spectral element may be used as long as light of the desired color components can be made incident on each photosensitive cell. For example, a microprism or the like may be used as the spectral element, and different types of spectral elements may also be used in combination.
- the imaging apparatus according to the present embodiment is different from the imaging apparatus according to the first embodiment only in the configuration and signal processing of the spectral elements of the imaging element 10, and the other components are the same. The following description will be focused on the differences from the imaging apparatus of the first embodiment, and the description of overlapping points will be omitted.
- FIG. 8A is a diagram illustrating a basic configuration of the image sensor 10 according to the present embodiment.
- 8B is a cross-sectional view taken along the line A-A ′ of FIG. 8A
- FIG. 8C is a cross-sectional view taken along the line B-B ′ of FIG. 8A.
- as for the configuration of the spectral elements of the image sensor 10 in the present embodiment, a spectral element 1a that divides light into blue light and non-blue light is placed in the first row and first column of the spectral element array 100, and a spectral element 1b that divides light into red light and non-red light is arranged in the second row and second column.
- the spectral elements are not arranged on the photosensitive cells in the first row, second column and the second row, first column.
- as shown in FIGS. 8B and 8C, light of the intensities represented by (W - B), (W + B), (W + R), and (W - R) is incident on the photosensitive cells 2a to 2d, respectively.
- the image signal generation unit 15 uses the difference signals S2c - S2d and S2b - S2a, as shown in equations 34 and 35 below, to generate the R color signal 2k3(1 - k1 + k2)Rs and the B color signal 2k3(1 + k1 - k2)Bs, respectively.
- (Equation 34) S2c - S2d = 2k3(1 - k1 + k2)Rs
- (Equation 35) S2b - S2a = 2k3(1 + k1 - k2)Bs
- a color image signal is obtained by the addition / subtraction process using the photoelectric conversion signals S2a to S2d and the color signal correction process using k1, k2, and k3.
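- a corresponding sketch for this embodiment (illustrative; it reuses the assumed k3_from_k2 stub from the first-embodiment sketch) recovers Rs and Bs from the difference signals of equations 34 and 35, Ws from the addition signals of equations 36 and 37, and Gs as Ws - Rs - Bs.

```python
def decode_block_embodiment2(S2a, S2b, S2c, S2d, k1):
    """Sketch of the second-embodiment calculation (equations 34-37)."""
    sum_ab = S2a + S2b                          # 2(1+k1-k2)Ws (equation 37)
    sum_cd = S2c + S2d                          # 2(1-k1+k2)Ws (equation 36)
    Ws = (sum_ab + sum_cd) / 4.0                # luminance signal
    c = (sum_ab - sum_cd) / (sum_ab + sum_cd)   # c = k1 - k2
    k2 = k1 - c
    k3 = k3_from_k2(k2)                         # assumed correlation, as above
    Rs = (S2c - S2d) / (2 * k3 * (1 - c))       # from equation 34
    Bs = (S2b - S2a) / (2 * k3 * (1 + c))       # from equation 35
    Gs = Ws - Rs - Bs                           # green from the luminance
    return Rs, Gs, Bs
```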
- in this embodiment, the spectral element 1a faces the photosensitive cell 2a and the spectral element 1b faces the photosensitive cell 2d; alternatively, the spectral element 1a may face the photosensitive cell 2b and the spectral element 1b may be disposed so as to face the photosensitive cell 2c.
- each spectral element may be configured to split light in the vertical direction instead of in the horizontal direction.
- in that case, the micro lenticular lens 4c may be provided so as to extend in the vertical direction.
- in the above example, each spectral element is disposed so as to face one of the photosensitive cells, but a spectral element may instead be arranged so that part of it lies on the boundary between a plurality of photosensitive cells. For example, a first spectral element that separates incident light in the horizontal direction may be disposed on the boundary between the photosensitive cells 2a and 2b, and a second spectral element that separates incident light in the horizontal direction may be disposed on the boundary between the photosensitive cells 2c and 2d. Even in such a configuration, as long as the spectral characteristics of the first and second spectral elements differ and, as a result, light having different spectral distributions is incident on the photosensitive cells 2a to 2d, color information can be generated by signal processing.
- each spectral element may also have a characteristic of separating incident light into light of three different wavelength ranges, instead of separating incident light into primary-color light (R, G, B) and complementary-color light (Cy, Mg, Ye).
- for example, a spectral element may be used that separates incident light into three components C1, C2, and C3 and causes the C1 light to enter the opposing photosensitive cell, the C2 light to enter one adjacent photosensitive cell, and the C3 light to enter the other adjacent photosensitive cell.
- the solid-state imaging device and the imaging device of the present disclosure are effective for all cameras using the solid-state imaging device.
- it can be used for consumer cameras such as digital still cameras and digital video cameras, and industrial solid-state surveillance cameras.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Power Engineering (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Condensed Matter Physics & Semiconductors (AREA)
- Computer Hardware Design (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Color Television Image Signal Generators (AREA)
- Solid State Image Pick-Up Elements (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
Abstract
Description
(Equation 1) S2a=(1+k1-k2)(Ws-C1^s+C1s)=2(1+k1-k2)C1s
(Equation 2) S2b=(1+k1-k2)(Ws-C1s+C1^s)=2(1+k1-k2)C1^s
(Equation 3) S2c=(1-k1+k2)(Ws-C3s+C3^s)=2(1-k1+k2)C3^s
(Equation 4) S2d=(1-k1+k2)(Ws-C3^s+C3s)=2(1-k1+k2)C3s
(Equation 5) D1=S2a-S2b=2(1+k1-k2)(C1s-C1^s)
(Equation 6) D2=S2d-S2c=2(1-k1+k2)(C3s-C3^s)
(Equation 7) D1=k3(S2a-S2b)=2k3(1+k1-k2)(C1s-C1^s)
(Equation 8) D2=k3(S2d-S2c)=2k3(1-k1+k2)(C3s-C3^s)
(Equation 9) S2a=(1+k1-k2)(Ws-k3C1^s+k3C1s)=(1+k1-k2){(1-k3)Ws+2k3C1s}
(Equation 10) S2b=(1+k1-k2)(Ws-k3C1s+k3C1^s)=(1+k1-k2){(1-k3)Ws+2k3C1^s}
(Equation 11) S2c=(1-k1+k2)(Ws-k3C3s+k3C3^s)=(1-k1+k2){(1-k3)Ws+2k3C3^s}
(Equation 12) S2d=(1-k1+k2)(Ws-k3C3^s+k3C3s)=(1-k1+k2){(1-k3)Ws+2k3C3s}
(Equation 13) D1=2k3(1+k1-k2)(2C1s-Ws)
(Equation 14) D2=2k3(1-k1+k2)(2C3s-Ws)
The signals D1 and D2 expressed by Equations 13 and 14 can be used as color difference signals.
(Equation 15) S2a+S2b=2(1+k1-k2)Ws
(Equation 16) S2c+S2d=2(1-k1+k2)Ws
(Equation 17) S2a+S2b+S2c+S2d=4Ws
(Equation 18) S2a=(1+k1-k2)(Ws-k3C1s+k3C2s)
(Equation 19) S2b=(1+k1-k2)(Ws-k3C2s+k3C1s)
(Equation 20) S2c=(1-k1+k2)(Ws-k3C3s+k3C3^s)
(Equation 21) S2d=(1-k1+k2)(Ws-k3C3^s+k3C3s)
FIG. 3 is a block diagram showing the overall configuration of the imaging apparatus according to the first embodiment. The imaging apparatus of this embodiment is a digital electronic camera, and includes an imaging unit 300 and a signal processing unit 400 that generates a signal representing an image (an image signal) based on the signals sent from the imaging unit 300. The imaging apparatus may generate only still images, or may also have a function of generating moving images.
(Equation 22) S2a=(1+k1-k2)(Ws-k3·Bs+k3·Rs)
(Equation 23) S2b=(1+k1-k2)(Ws-k3·Rs+k3·Bs)
(Equation 24) S2c=(1-k1+k2)(Ws-k3·Gs+k3·Ms)
(Equation 25) S2d=(1-k1+k2)(Ws-k3·Ms+k3·Gs)
(Equation 26) D11=(S2a-S2b)+(S2c-S2d)
=2(1+k1-k2)k3(Rs-Bs)+2(1-k1+k2)k3(Ms-Gs)
=2k3{(2Rs-Gs)-(k1-k2)(2Bs-Gs)}
(Equation 27) D12=(S2c-S2d)-(S2a-S2b)
=2(1-k1+k2)k3(Ms-Gs)-2(1+k1-k2)k3(Rs-Bs)
=2k3{(2Bs-Gs)-(k1-k2)(2Rs-Gs)}
(Equation 28) S2a+S2b=2(1+k1-k2)Ws
(Equation 29) S2c+S2d=2(1-k1+k2)Ws
Next, the second embodiment will be described with reference to FIGS. 8A to 8C. Compared with the imaging apparatus of the first embodiment, the imaging apparatus of this embodiment differs only in the configuration of the spectral elements of the image sensor 10 and in the signal processing; the other components are the same. The following description focuses on the differences from the imaging apparatus of the first embodiment, and descriptions of overlapping points are omitted.
(Equation 30) S2a=(1+k1-k2)(Ws-k3·Bs)
(Equation 31) S2b=(1+k1-k2)(Ws+k3·Bs)
(Equation 32) S2c=(1-k1+k2)(Ws+k3·Rs)
(Equation 33) S2d=(1-k1+k2)(Ws-k3·Rs)
(Equation 34) S2c-S2d=2k3(1-k1+k2)Rs
(Equation 35) S2b-S2a=2k3(1+k1-k2)Bs
(Equation 36) S2c+S2d=2(1-k1+k2)Ws
(Equation 37) S2b+S2a=2(1+k1-k2)Ws
2, 2a, 2b, 2c, 2d photosensitive cells of the image sensor
4a, 4b microlenses
4c micro lenticular lens
5 wiring layer of the image sensor
6a, 6b transparent layers
7 silicon substrate
9 fixed substrate
10 image sensor
11 optical filter
12 optical lens
13 signal generation/reception unit
14 element driving unit
15 image signal generation unit
16 image signal output unit
17 multilayer film filter (dichroic mirror) that reflects light other than red (R) light
18 multilayer film filter (dichroic mirror) that reflects only green (G) light
19 multilayer film filter (dichroic mirror) that reflects only blue (B) light
20 light-shielding portion
21 light-transmitting resin
22 multilayer film filter (dichroic mirror) that transmits G light
23 multilayer film filter (dichroic mirror) that transmits R light
24 organic dye filter that transmits G light
25 organic dye filter that transmits R light
26 microlens
27 metal layer
30 memory
40 unit block of photosensitive cells
50 optical diaphragm
100 spectral element array
200 photosensitive cell array
300 imaging unit
400 signal processing unit
Claims (15)
- A solid-state image sensor comprising:
a photosensitive cell array in which a plurality of unit blocks, each including a first photosensitive cell, a second photosensitive cell, a third photosensitive cell, and a fourth photosensitive cell arranged in two rows and two columns, are arrayed two-dimensionally on an imaging surface, wherein in each unit block the first photosensitive cell is arranged in the first row, first column, the second photosensitive cell in the second row, first column, the third photosensitive cell in the first row, second column, and the fourth photosensitive cell in the second row, second column;
a spectral element array including a first spectral element and a second spectral element having spectral characteristics different from those of the first spectral element, wherein the first spectral element is arranged so that at least a part of it faces at least one of the first and third photosensitive cells and is configured to separate incident light according to wavelength in the direction in which the first and third photosensitive cells are arrayed, and the second spectral element is arranged so that at least a part of it faces at least one of the second and fourth photosensitive cells and is configured to separate incident light according to wavelength in the direction in which the second and fourth photosensitive cells are arrayed; and
a plurality of light-collecting elements, each provided in a region covering the first and third photosensitive cells and configured to focus light onto the first and third photosensitive cells.
- The solid-state image sensor according to claim 1, wherein the first spectral element faces the first or the third photosensitive cell and is arranged so that light of different wavelength ranges enters the first photosensitive cell and the third photosensitive cell, and
the second spectral element faces the second or the fourth photosensitive cell and is arranged so that light of different wavelength ranges enters the second photosensitive cell and the fourth photosensitive cell.
- The solid-state image sensor according to claim 2, wherein the first spectral element causes light of a first wavelength range to enter the third photosensitive cell and causes complementary-color light of the first wavelength range to enter the first photosensitive cell, and
the second spectral element causes light of a second wavelength range to enter the second photosensitive cell and causes complementary-color light of the second wavelength range to enter the fourth photosensitive cell.
- The solid-state image sensor according to claim 1, wherein the spectral element array further includes a third spectral element and a fourth spectral element, and
the first to fourth spectral elements are arranged to face the first to fourth photosensitive cells, respectively, and are designed to cause light of different wavelength ranges to enter the first to fourth photosensitive cells.
- The solid-state image sensor according to claim 4, wherein the first to fourth spectral elements have a characteristic of separating incident light into primary-color light and complementary-color light, or a characteristic of separating incident light into light of three different wavelength ranges.
- The solid-state image sensor according to claim 4 or 5, wherein the first spectral element causes light of a first wavelength range to enter the third photosensitive cell and causes complementary-color light of the first wavelength range to enter the first photosensitive cell, and
the third spectral element causes light of a second wavelength range to enter the first photosensitive cell and causes complementary-color light of the second wavelength range to enter the third photosensitive cell.
- The solid-state image sensor according to any one of claims 4 to 6, wherein the first spectral element causes light of the first wavelength range to enter the first photosensitive cell and causes complementary-color light of the first wavelength range to enter the third photosensitive cell, and
the third spectral element causes light of the first wavelength range to enter the first photosensitive cell and causes complementary-color light of the first wavelength range to enter the third photosensitive cell.
- The solid-state image sensor according to any one of claims 5 to 7, wherein the second spectral element causes light of a third wavelength range to enter the fourth photosensitive cell and causes complementary-color light of the third wavelength range to enter the second photosensitive cell, and
the fourth spectral element causes light of the third wavelength range to enter the fourth photosensitive cell and causes complementary-color light of the third wavelength range to enter the second photosensitive cell.
- The solid-state image sensor according to claim 8, wherein the first wavelength range is one of the red, green, and blue wavelength ranges, the second wavelength range is another one of the red, green, and blue wavelength ranges, and the third wavelength range is the remaining one of the red, green, and blue wavelength ranges.
- The solid-state image sensor according to any one of claims 1 to 9, wherein each of the plurality of light-collecting elements is a cylindrical lens having a curved surface on the light incident side.
- An imaging apparatus comprising:
the solid-state image sensor according to any one of claims 1 to 10;
an optical system that forms an image on the imaging surface of the solid-state image sensor; and
a signal processing unit that generates color signals based on first to fourth photoelectric conversion signals output from the first to fourth photosensitive cells, respectively.
- The imaging apparatus according to claim 11, wherein the signal processing unit generates a first luminance signal produced by processing that includes adding the signal output from the first photosensitive cell and the signal output from the third photosensitive cell, and a second luminance signal produced by processing that includes adding the signal output from the second photosensitive cell and the signal output from the fourth photosensitive cell, and corrects the color signals based on the result of the difference between the first luminance signal and the second luminance signal.
- The imaging apparatus according to claim 12, further comprising a recording medium on which are recorded the coefficient k1 that defines the light-collecting characteristics of the light-collecting elements and information indicating the correlation between a correction coefficient k2, which indicates the amount of decrease in the coefficient k1 due to the influence of oblique light incident on the solid-state image sensor, and a coefficient k3, which indicates the degree of reduction in the spectral characteristics of the first and second spectral elements due to the influence of the oblique light.
- The imaging apparatus according to claim 13, wherein the signal processing unit corrects the color signals based on the result of the difference between the first luminance signal and the second luminance signal, the coefficient k1, and the information indicating the correlation.
- The imaging apparatus according to any one of claims 12 to 14, wherein the signal processing unit generates a first color difference signal and a second color difference signal by signal calculation using a first difference signal indicating the difference between the first photoelectric conversion signal and the third photoelectric conversion signal and a second difference signal indicating the difference between the second photoelectric conversion signal and the fourth photoelectric conversion signal, and corrects the first color difference signal and the second color difference signal using the result of the difference between the first luminance signal and the second luminance signal.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013537347A JP6039567B2 (ja) | 2012-05-02 | 2013-03-14 | 固体撮像装置 |
CN201380001203.6A CN103503144B (zh) | 2012-05-02 | 2013-03-14 | 固体摄像装置 |
US14/009,477 US9179114B2 (en) | 2012-05-02 | 2013-03-14 | Solid-state image sensor |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-105293 | 2012-05-02 | ||
JP2012105293 | 2012-05-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013164901A1 true WO2013164901A1 (ja) | 2013-11-07 |
Family
ID=49514308
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/001728 WO2013164901A1 (ja) | 2012-05-02 | 2013-03-14 | 固体撮像装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9179114B2 (ja) |
JP (1) | JP6039567B2 (ja) |
CN (1) | CN103503144B (ja) |
WO (1) | WO2013164901A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6295526B2 (ja) | 2013-07-11 | 2018-03-20 | ソニー株式会社 | 固体撮像装置および電子機器 |
US10148864B2 (en) | 2015-07-02 | 2018-12-04 | Pixart Imaging Inc. | Imaging device having phase detection pixels and regular pixels, and operating method thereof |
US9978154B2 (en) * | 2015-07-02 | 2018-05-22 | Pixart Imaging Inc. | Distance measurement device base on phase difference and distance measurement method thereof |
CN108347554B (zh) * | 2018-03-20 | 2024-06-14 | 浙江国自机器人技术有限公司 | 一种工业相机 |
JP7106363B2 (ja) * | 2018-06-18 | 2022-07-26 | キヤノン株式会社 | 画像形成装置 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5990467A (ja) | 1982-11-15 | 1984-05-24 | Mitsubishi Electric Corp | 固体撮像素子 |
JP2000151933A (ja) | 1998-11-06 | 2000-05-30 | Nec Corp | 撮像素子及びその製造方法 |
JP3821614B2 (ja) * | 1999-08-20 | 2006-09-13 | 独立行政法人科学技術振興機構 | 画像入力装置 |
JP2001309395A (ja) | 2000-04-21 | 2001-11-02 | Sony Corp | 固体撮像素子及びその製造方法 |
JP4652634B2 (ja) | 2001-08-31 | 2011-03-16 | キヤノン株式会社 | 撮像装置 |
JP5331107B2 (ja) * | 2008-12-19 | 2013-10-30 | パナソニック株式会社 | 撮像装置 |
JP5296077B2 (ja) * | 2009-01-14 | 2013-09-25 | パナソニック株式会社 | 撮像装置 |
US9099369B2 (en) * | 2012-05-02 | 2015-08-04 | Panasonic Intellectual Property Corporation Of America | Solid-state image sensor |
- 2013
- 2013-03-14 JP JP2013537347A patent/JP6039567B2/ja not_active Expired - Fee Related
- 2013-03-14 CN CN201380001203.6A patent/CN103503144B/zh not_active Expired - Fee Related
- 2013-03-14 WO PCT/JP2013/001728 patent/WO2013164901A1/ja active Application Filing
- 2013-03-14 US US14/009,477 patent/US9179114B2/en not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005244947A (ja) * | 2004-01-26 | 2005-09-08 | Matsushita Electric Ind Co Ltd | 固体撮像装置およびこれを用いたカメラ |
JP2007208817A (ja) * | 2006-02-03 | 2007-08-16 | Toshiba Corp | 固体撮像装置 |
WO2009019818A1 (ja) * | 2007-08-06 | 2009-02-12 | Panasonic Corporation | 撮像用光検出装置 |
WO2009153937A1 (ja) * | 2008-06-18 | 2009-12-23 | パナソニック株式会社 | 固体撮像装置 |
JP2011066097A (ja) * | 2009-09-16 | 2011-03-31 | Nec Corp | 裏面入射型受光素子および裏面入射型受光素子の製造方法 |
Also Published As
Publication number | Publication date |
---|---|
US9179114B2 (en) | 2015-11-03 |
CN103503144A (zh) | 2014-01-08 |
US20140055650A1 (en) | 2014-02-27 |
JP6039567B2 (ja) | 2016-12-07 |
JPWO2013164901A1 (ja) | 2015-12-24 |
CN103503144B (zh) | 2017-02-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5503459B2 (ja) | 固体撮像素子および撮像装置 | |
JP6039558B2 (ja) | 固体撮像装置 | |
JP5237998B2 (ja) | 固体撮像素子、撮像装置および信号処理方法 | |
TWI500330B (zh) | 攝像裝置 | |
JP5503458B2 (ja) | 固体撮像素子および撮像装置 | |
JP5331107B2 (ja) | 撮像装置 | |
WO2011010455A1 (ja) | 撮像装置および固体撮像素子 | |
JP5325117B2 (ja) | 固体撮像装置 | |
JP5113249B2 (ja) | 撮像装置 | |
JP6039567B2 (ja) | 固体撮像装置 | |
JP5894573B2 (ja) | 固体撮像素子、撮像装置および信号処理方法 | |
WO2014033976A1 (ja) | 固体撮像素子、撮像装置および信号処理方法 | |
JP5997149B2 (ja) | 固体撮像素子、撮像装置、および信号処理方法 | |
JP5852006B2 (ja) | 固体撮像素子、撮像装置および信号処理方法 | |
JP2014086743A (ja) | 固体撮像素子、撮像装置、および信号処理方法 | |
JP2014086742A (ja) | 固体撮像素子、撮像装置、および信号処理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2013537347 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14009477 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13784977 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13784977 Country of ref document: EP Kind code of ref document: A1 |