WO2021020156A1 - Imaging element, imaging device, signal processing device, and signal processing method - Google Patents
- Publication number
- WO2021020156A1 (PCT/JP2020/027787)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixel
- pixels
- image
- small
- area
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
Definitions
- The present technology relates to an imaging element, an imaging device, a signal processing device, and a signal processing method.
- An imaging device has been proposed in which the light from the subject is modulated by a light-shielding film covering the light receiving surface of each pixel of the image sensor, an image is captured, and a restored image in which the image of the subject is formed is restored by a predetermined arithmetic process (see, for example, Patent Document 1).
- The present technology was made in view of such a situation, and aims to expand the dynamic range in an imaging device that does not use an imaging lens.
- The image pickup element of the first aspect of the present technology includes a pixel array unit that is a pixel group including a plurality of pixels, each of which receives incident light from a subject without going through either an imaging lens or a pinhole and outputs one detection signal indicating an output pixel value modulated by the incident angle of the incident light, the pixel array unit containing at least a large pixel group including a plurality of large pixels having a large light receiving surface and a small pixel group including a plurality of small pixels having a small light receiving surface.
- The ratio of the area of the large detection area, set so that the incident light can be detected on the large light receiving surface, to the area of the large setting area set in the large light receiving surface of each large pixel is substantially equal to the ratio of the area of the small detection area, set so that the incident light can be detected on the small light receiving surface, to the area of the small setting area set in the small light receiving surface of each small pixel.
- The imaging device of the second aspect of the present technology includes:
- an image pickup element having a pixel array unit that is a pixel group including a plurality of pixels, each of which receives incident light from a subject without going through either an imaging lens or a pinhole and outputs one detection signal indicating an output pixel value modulated by the incident angle of the incident light, the pixel array unit containing at least a large pixel group including a plurality of large pixels having a large light receiving surface and a small pixel group including a plurality of small pixels having a small light receiving surface; and
- a restoration unit that restores a first restored image from a first detected image based on the detection signals output from the plurality of large pixels, restores a second restored image from a second detected image based on the detection signals output from the plurality of small pixels, and generates a composite restored image by synthesizing the first restored image and the second restored image.
- The signal processing device of the third aspect of the present technology processes the output of an image pickup element having a pixel array unit that is a pixel group including a plurality of pixels, each of which receives incident light from a subject without going through either an imaging lens or a pinhole and outputs one detection signal indicating an output pixel value modulated by the incident angle of the incident light, the pixel array unit containing at least a large pixel group including a plurality of large pixels having a large light receiving surface and a small pixel group including a plurality of small pixels having a small light receiving surface, in which the ratio of the area of the large detection area set so that the incident light can be detected on the large light receiving surface to the area of the large setting area set in the large light receiving surface of each large pixel is substantially equal to the ratio of the area of the small detection area set so that the incident light can be detected on the small light receiving surface to the area of the small setting area set in the small light receiving surface of each small pixel.
- The signal processing device includes a restoration unit that restores a first restored image from a first detected image based on the detection signals output from the plurality of large pixels, restores a second restored image from a second detected image based on the detection signals output from the plurality of small pixels, and generates a composite restored image by synthesizing the first restored image and the second restored image.
- The signal processing method of the fourth aspect of the present technology uses an image pickup element having a pixel array unit that is a pixel group including a plurality of pixels, each of which receives incident light from a subject without going through either an imaging lens or a pinhole and outputs one detection signal indicating an output pixel value modulated by the incident angle of the incident light, the pixel array unit containing at least a large pixel group including a plurality of large pixels having a large light receiving surface and a small pixel group including a plurality of small pixels having a small light receiving surface, in which the ratio of the area of the large detection area set so that the incident light can be detected on the large light receiving surface to the area of the large setting area set in the large light receiving surface of each large pixel is substantially equal to the ratio of the area of the small detection area set so that the incident light can be detected on the small light receiving surface to the area of the small setting area set in the small light receiving surface of each small pixel.
- In the method, a first restored image is restored from a first detected image based on the detection signals output from the plurality of large pixels, a second restored image is restored from a second detected image based on the detection signals output from the plurality of small pixels, and a composite restored image is generated by synthesizing the first restored image and the second restored image.
- In the first aspect of the present technology, an image pickup element includes a pixel array unit that is a pixel group including a plurality of pixels each outputting the detection signal, and that contains at least a large pixel group including a plurality of large pixels having a large light receiving surface and a small pixel group including a plurality of small pixels having a small light receiving surface; the ratio of the area of the large detection area to the area of the large setting area in each large pixel is substantially equal to the ratio of the area of the small detection area to the area of the small setting area in each small pixel. The detection signal is output from each of the plurality of large pixels and the plurality of small pixels.
- In the second aspect of the present technology, an imaging device includes an image pickup element having such a pixel array unit with such substantially equal area ratios. The detection signal is output from each of the plurality of large pixels and the plurality of small pixels, a first restored image is restored from a first detected image based on the detection signals output from the plurality of large pixels, a second restored image is restored from a second detected image based on the detection signals output from the plurality of small pixels, and a composite restored image is generated by synthesizing the first restored image and the second restored image.
- In the third and fourth aspects of the present technology, from an image pickup element having such a pixel array unit with such substantially equal area ratios, a first restored image is restored from a first detected image based on the detection signals output from the plurality of large pixels, a second restored image is restored from a second detected image based on the detection signals output from the plurality of small pixels, and a composite restored image is generated by synthesizing the first restored image and the second restored image.
- << Embodiment >> Embodiments of the present technology will be described with reference to FIGS. 1 to 20.
- FIG. 1 is a block diagram showing a configuration example of an image pickup apparatus 101 to which the present technology is applied.
- The imaging device 101 includes an image sensor 121, a restoration unit 122, a control unit 123, an input unit 124, a detection unit 125, an association unit 126, a display unit 127, a storage unit 128, a recording/playback unit 129, a recording medium 130, and a communication unit 131. The restoration unit 122, the control unit 123, the input unit 124, the detection unit 125, the association unit 126, the display unit 127, the storage unit 128, the recording/playback unit 129, the recording medium 130, and the communication unit 131 constitute a signal processing control unit 111 that performs signal processing and controls the imaging device 101 and the like.
- the imaging device 101 does not include an imaging lens (imaging lens free).
- The image sensor 121, the restoration unit 122, the control unit 123, the input unit 124, the detection unit 125, the association unit 126, the display unit 127, the storage unit 128, the recording/playback unit 129, and the communication unit 131 are connected to one another via a bus B1, and transmit and receive data via the bus B1.
- Hereinafter, when each unit of the imaging device 101 transmits or receives data via the bus B1, the mention of the bus B1 will be omitted. For example, when the input unit 124 supplies data to the control unit 123 via the bus B1, this will be described simply as the input unit 124 supplying the data to the control unit 123.
- the image sensor 121 is an image sensor in which the detection sensitivity of each pixel has an incident angle directivity, and outputs an image composed of a detection signal indicating a detection signal level according to the amount of incident light to the restoration unit 122 or the bus B1.
- Here, incident angle directivity means that the light receiving sensitivity characteristic with respect to the incident angle of the incident light differs for each pixel.
- the light-receiving sensitivity characteristics of all the pixels do not have to be completely different, and the light-receiving sensitivity characteristics of some pixels may be the same.
- the image sensor 121 may have the same basic structure as that of a general image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
- However, the image sensor 121 differs from a general image sensor in the configuration of each pixel constituting the pixel array unit: it has a configuration that provides incident angle directivity, for example, as will be described later with reference to FIGS. 3 to 5.
- That is, in the image sensor 121 the light receiving sensitivity differs (changes) from pixel to pixel according to the incident angle of the incident light, so that each pixel has incident angle directivity with respect to the incident angle of the incident light.
- For example, assume that the subject surface 102 of the subject in the upper left of FIG. 2 is composed of point light sources PA to PC, and that the point light sources PA to PC each emit a plurality of light rays having light intensities a to c, respectively. Also assume that the image sensor 121 includes, at positions Pa to Pc, pixels having different incident angle directivities (hereinafter referred to as pixels Pa to Pc).
- light rays of the same light intensity emitted from the same point light source are incident on each pixel of the image sensor 121.
- light rays having a light intensity a emitted from the point light source PA are incident on the pixels Pa to the pixels Pc of the image sensor 121, respectively.
- the light rays emitted from the same point light source are incident at different angles of incidence for each pixel.
- the light rays from the point light source PA are incident on the pixels Pa to the pixels Pc at different angles of incidence.
- Here, since the incident angle directivities of the pixels Pa to Pc differ, a light ray of the same light intensity emitted from the same point light source is detected with a different sensitivity by each pixel. As a result, light rays of the same light intensity are detected at different detection signal levels in different pixels. For example, the detection signal level for the light ray of light intensity a from the point light source PA has a different value in each of the pixels Pa to Pc.
- The detection signal level of each pixel for the light ray from each point light source is obtained by multiplying the light intensity of the light ray by a coefficient indicating the light receiving sensitivity (that is, the incident angle directivity) for the incident angle of that ray. For example, the detection signal level of the pixel Pa for the light ray from the point light source PA is obtained by multiplying the light intensity a of that ray by a coefficient indicating the incident angle directivity of the pixel Pa for the incident angle of that ray on the pixel Pa.
- Therefore, the detection signal levels DA, DB, and DC of the pixels Pa, Pb, and Pc are represented by the following equations (1) to (3), respectively.
- DA = α1 × a + β1 × b + γ1 × c ... (1)
- DB = α2 × a + β2 × b + γ2 × c ... (2)
- DC = α3 × a + β3 × b + γ3 × c ... (3)
- Here, the coefficient α1 indicates the incident angle directivity of the pixel Pa for the incident angle of the light ray from the point light source PA on the pixel Pa, and is set according to that incident angle; α1 × a is the detection signal level of the pixel Pa for the light ray from the point light source PA.
- The coefficient β1 indicates the incident angle directivity of the pixel Pa for the incident angle of the light ray from the point light source PB on the pixel Pa, and is set according to that incident angle; β1 × b is the detection signal level of the pixel Pa for the light ray from the point light source PB.
- The coefficient γ1 indicates the incident angle directivity of the pixel Pa for the incident angle of the light ray from the point light source PC on the pixel Pa, and is set according to that incident angle; γ1 × c is the detection signal level of the pixel Pa for the light ray from the point light source PC.
- In this way, the detection signal level DA of the pixel Pa is obtained as the sum of the products of the light intensities a, b, and c of the rays from the point light sources PA, PB, and PC at the pixel Pa and the coefficients α1, β1, and γ1 indicating the incident angle directivity according to their respective incident angles.
- Similarly, the detection signal level DB of the pixel Pb is obtained as the sum of the products of the light intensities a, b, and c of the rays from the point light sources PA, PB, and PC at the pixel Pb and the coefficients α2, β2, and γ2 indicating the incident angle directivity according to their respective incident angles.
- Also, the detection signal level DC of the pixel Pc is obtained as the sum of the products of the light intensities a, b, and c of the rays from the point light sources PA, PB, and PC at the pixel Pc and the coefficients α3, β3, and γ3 indicating the incident angle directivity according to their respective incident angles.
- As shown in equations (1) to (3), in the detection signal levels DA, DB, and DC of the pixels Pa, Pb, and Pc, the light intensities a, b, and c of the rays emitted from the point light sources PA, PB, and PC are mixed. Therefore, as shown in the upper right of FIG. 2, the detection signal levels in the image sensor 121 differ from the light intensities of the point light sources on the subject surface 102, and the image obtained by the image sensor 121 differs from the image of the subject surface 102.
- On the other hand, by creating simultaneous equations from equations (1) to (3) using the coefficients α1 to γ3 and the detection signal levels DA to DC, and solving them, the light intensities a to c of the rays from the point light sources PA to PC can be obtained. Then, by arranging pixels having pixel values corresponding to the obtained light intensities a to c according to the arrangement (relative positions) of the point light sources PA to PC, a restored image in which the image of the subject surface 102 is formed is restored, as shown in the lower right of FIG. 2.
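As an informal illustration of this recovery (not part of the patent), the following sketch solves a 3 × 3 version of equations (1) to (3) with NumPy; the coefficient values and light intensities are invented placeholders:

```python
import numpy as np

# Rows hold the coefficient sets of equations (1) to (3):
# (alpha1, beta1, gamma1) for pixel Pa, and so on. Placeholder values.
A = np.array([
    [0.9, 0.4, 0.2],   # alpha1, beta1, gamma1
    [0.3, 0.8, 0.5],   # alpha2, beta2, gamma2
    [0.1, 0.5, 0.7],   # alpha3, beta3, gamma3
])

intensities = np.array([1.0, 2.0, 3.0])   # true light intensities a, b, c
D = A @ intensities                        # detection signal levels DA, DB, DC

recovered = np.linalg.solve(A, D)          # solve the simultaneous equations
print(np.round(recovered, 6))              # -> [1. 2. 3.]
```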
- Hereinafter, a set of coefficients grouped for each equation constituting the simultaneous equations (for example, the coefficients α1, β1, and γ1) is referred to as a coefficient set.
- Also, a group of a plurality of coefficient sets corresponding to the plurality of equations included in the simultaneous equations (for example, the coefficient set α1, β1, γ1, the coefficient set α2, β2, γ2, and the coefficient set α3, β3, γ3) is referred to as a coefficient set group.
- Here, when the subject distance from the image sensor 121 to the subject surface 102 differs, the incident angles of the rays from the point light sources of the subject surface 102 on the image sensor 121 differ, so a different coefficient set group is needed for each subject distance.
- Therefore, a coefficient set group is prepared in advance for each distance (subject distance) from the image sensor 121 to the subject surface, and simultaneous equations are created by switching the coefficient set group for each subject distance. By solving the created simultaneous equations, it is possible to obtain restored images of subject surfaces at various subject distances based on a single detected image. For example, after capturing and recording a detected image once, it is possible to generate a restored image of the subject surface at an arbitrary subject distance by switching the coefficient set group according to the distance to the subject surface and restoring the restored image using the recorded detected image.
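A minimal sketch of this coefficient-set-group switching, assuming invented placeholder matrices keyed by subject distance (the real groups would come from calibration of the device):

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder coefficient set groups, one per subject distance (mm). In the
# patent these are prepared in advance for each subject distance.
groups = {d: rng.uniform(0.1, 1.0, (3, 3)) for d in (500, 1000, 2000)}

scene = np.array([1.0, 2.0, 3.0])        # toy light intensities at 1000 mm
detected = groups[1000] @ scene          # one recorded detected image

# The same recorded detected image can be re-restored for any prepared
# distance simply by switching the coefficient set group; only the group
# matching the true subject distance reproduces the scene.
for distance, A in groups.items():
    print(distance, np.round(np.linalg.solve(A, detected), 2))
```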
- However, the incident angle directivity of each pixel 121a needs to be set so as to ensure the independence of the simultaneous equations described above.
- Further, since the image output by the image sensor 121 is an image composed of detection signals in which the image of the subject is not formed, as shown in the upper right of FIG. 2, the subject cannot be visually recognized in it. That is, the detected image composed of the detection signals output by the image sensor 121 is a set of pixel signals, but it is an image in which the user cannot recognize the subject even by viewing it. Hereinafter, an image composed of detection signals in which the image of the subject is not formed, that is, an image captured by the image sensor 121, is referred to as a detected image.
- incident angle directivity does not necessarily have to be different for each pixel, and pixels having the same incident angle directivity may be included.
- The restoration unit 122 acquires from the storage unit 128 a coefficient set group corresponding to the subject distance, that is, the distance from the image sensor 121 in FIG. 2 to the subject surface 102 (the subject surface corresponding to the restored image), for example, the coefficient set group consisting of the coefficients α1 to α3, β1 to β3, and γ1 to γ3 described above.
- The restoration unit 122 creates the simultaneous equations represented by equations (1) to (3) above, using the detection signal level of each pixel of the detected image output from the image sensor 121 and the acquired coefficient set group.
- Then, the restoration unit 122 solves the created simultaneous equations to obtain the pixel values of the pixels constituting the image in which the image of the subject is formed, as shown in the lower right of FIG. 2. As a result, an image in which the user can visually recognize the subject is restored from the detected image.
- the image restored from this detected image will be referred to as a restored image.
- Note that, in some cases, the restored image is not an image in which the subject can be identified as in a normal image; in such cases as well, it is referred to as a restored image.
- Hereinafter, a restored image that is an image in which the image of the subject is formed but has not yet undergone color separation or synchronization processing such as demosaic processing is referred to as a RAW image, and the detected image captured by the image sensor 121 is distinguished from it as an image that follows the arrangement of color filters but is not a RAW image.
- Note that the number of pixels of the image sensor 121 and the number of pixels constituting the restored image do not necessarily have to be the same.
- Further, the restoration unit 122 performs demosaic processing, γ correction, white balance adjustment, conversion processing to a predetermined compression format, and the like on the restored image as necessary. Then, the restoration unit 122 outputs the restored image to the bus B1.
- the control unit 123 includes, for example, various processors, controls each unit of the image pickup apparatus 101, and executes various processes.
- the input unit 124 includes an input device (for example, a key, a switch, a button, a dial, a touch panel, a remote controller, etc.) for operating the image pickup device 101 and inputting data used for processing.
- the input unit 124 outputs an operation signal, input data, and the like to the bus B1.
- The detection unit 125 includes various sensors used for detecting the state of the imaging device 101, the state of the subject, and the like. For example, the detection unit 125 includes an acceleration sensor or a gyro sensor that detects the posture or movement of the imaging device 101, a position detection sensor that detects the position of the imaging device 101 (for example, a GNSS (Global Navigation Satellite System) receiver), a distance measuring sensor that detects the distance to the subject, and the like.
- the detection unit 125 outputs a signal indicating the detection result to the bus B1.
- the association unit 126 associates the detected image obtained by the image sensor 121 with the metadata corresponding to the detected image.
- the metadata includes, for example, a coefficient set group for restoring a restored image using a target detected image, a subject distance, and the like.
- The method of associating the detected image with the metadata is not particularly limited as long as the correspondence between a detected image and its metadata can be specified. For example, the detected image and the metadata are associated by adding metadata to the image data including the detected image, by giving the same ID to the detected image and the metadata, or by recording the detected image and the metadata on the same recording medium 130.
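As a non-authoritative sketch of one of the association methods named above (giving the detected image and the metadata the same ID), with illustrative field names that are not from the patent:

```python
import uuid
from dataclasses import dataclass, field
from typing import List

@dataclass
class DetectedImage:
    pixels: List[float]                  # detection signal levels
    record_id: str = field(default_factory=lambda: uuid.uuid4().hex)

@dataclass
class Metadata:
    record_id: str                       # same ID as the detected image
    subject_distance_mm: float           # used later to pick a coefficient set group

image = DetectedImage(pixels=[0.8, 0.6, 0.9])
meta = Metadata(record_id=image.record_id, subject_distance_mm=1000.0)

# The shared ID lets the correspondence be specified later, e.g. after both
# have been recorded separately on the recording medium.
assert image.record_id == meta.record_id
```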
- the display unit 127 is composed of, for example, a display, and displays various information (for example, a restored image, etc.). It is also possible that the display unit 127 is provided with an audio output unit such as a speaker to output audio.
- the storage unit 128 includes one or more storage devices such as a ROM (Read Only Memory), a RAM (Random Access Memory), and a flash memory, and stores, for example, programs and data used for processing of the image pickup device 101.
- For example, the storage unit 128 stores coefficient set groups corresponding to the coefficients α1 to α3, β1 to β3, and γ1 to γ3 described above, in association with various subject distances. More specifically, for example, for each subject surface 102 at each subject distance, the storage unit 128 stores a coefficient set group including a coefficient for each pixel 121a of the image sensor 121 for each point light source set on that subject surface 102.
- the recording / reproducing unit 129 records the data on the recording medium 130 and reproduces (reads) the data recorded on the recording medium 130. For example, the recording / reproducing unit 129 records the restored image on the recording medium 130 or reads it out from the recording medium 130. Further, for example, the recording / reproducing unit 129 records the detected image and the corresponding metadata on the recording medium 130 or reads the detected image from the recording medium 130.
- The recording medium 130 is composed of, for example, any of an HDD (Hard Disk Drive), an SSD (Solid State Drive), a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, or a combination thereof.
- the communication unit 131 communicates with another device (for example, another image pickup device, signal processing device, etc.) by a predetermined communication method.
- the communication method of the communication unit 131 may be either wired or wireless. It is also possible for the communication unit 131 to support a plurality of communication methods.
- FIG. 3 shows a front view of a part of the pixel array portion of the image sensor 121. Note that FIG. 3 shows an example in which the number of pixels in the pixel array unit is 6 vertical pixels × 6 horizontal pixels, but the number of pixels in the pixel array unit is not limited to this.
- In the image sensor 121, a light-shielding film 121b, which is one of the modulation elements, is provided for each pixel 121a so as to cover a part of the light receiving region (light receiving surface) of its photodiode, and the incident light on each pixel 121a is optically modulated according to the incident angle. Then, for example, by providing the light-shielding film 121b over a different range for each pixel 121a, the light receiving sensitivity with respect to the incident angle of the incident light differs for each pixel 121a, and each pixel 121a has a different incident angle directivity.
- For example, in the pixels 121a-1 and 121a-2, the light-shielded ranges of the light receiving regions of the photodiodes differ depending on the provided light-shielding films 121b-1 and 121b-2 (at least one of the light-shielded position and the light-shielded area differs). That is, in the pixel 121a-1, the light-shielding film 121b-1 is provided so as to shield a part of the left side of the light receiving region of the photodiode by a predetermined width, while in the pixel 121a-2, the light-shielding film 121b-2 is provided so as to shield a part of the right side of the light receiving region by a predetermined width. The width by which the light-shielding film 121b-1 shields the light receiving region of the photodiode and the width by which the light-shielding film 121b-2 shields the light receiving region of the photodiode may be different or the same.
- the light-shielding film 121b is randomly arranged in the pixel array unit so as to block light in a different range of the light-receiving region for each pixel.
- the upper part of FIG. 4 is a side sectional view of the image sensor 121 in the first configuration example
- the middle part of FIG. 4 is a top view of the image sensor 121 in the first configuration example
- The side sectional view in the upper part of FIG. 4 is a cross section taken along line A-B in the middle part of FIG. 4.
- the lower part of FIG. 4 is an example of a circuit configuration of the image sensor 121.
- the adjacent pixels 121a-1 and 121a-2 are of a so-called back-illuminated type in which a wiring layer Z12 is provided at the bottom layer in the drawing and a photoelectric conversion layer Z11 is provided above the wiring layer Z12.
- Hereinafter, when it is not necessary to distinguish between the pixels 121a-1 and 121a-2, the number at the end of the reference numeral is omitted and they are simply referred to as the pixel 121a. The same applies to other configurations: the numbers and letters at the end of reference numerals may be omitted.
- Also, FIG. 4 shows only a side view and a top view of two of the pixels constituting the pixel array portion of the image sensor 121; it goes without saying that a larger number of pixels 121a are actually arranged, although their illustration is omitted.
- the pixels 121a-1 and 121a-2 are provided with photodiodes 121e-1 and 121e-2 in the photoelectric conversion layer Z11, respectively.
- On the photodiodes 121e-1 and 121e-2, on-chip lenses 121c-1 and 121c-2 and color filters 121d-1 and 121d-2 are laminated, respectively, from above.
- the on-chip lenses 121c-1 and 121c-2 collect the incident light on the photodiodes 121e-1 and 121e-2.
- The color filters 121d-1 and 121d-2 are optical filters that transmit light of specific wavelengths, such as red, green, blue, infrared, and white. In the case of white, the color filters 121d-1 and 121d-2 may be transparent filters or may be omitted.
- Light-shielding films 121g-1 to 121g-3 are formed at the boundaries between the pixels and, as shown in FIG. 4, suppress the occurrence of crosstalk when the incident light L is incident on adjacent pixels.
- The light-shielding films 121b-1 and 121b-2 shield a part of the light receiving surface S when viewed from above. On the light receiving surfaces S of the photodiodes 121e-1 and 121e-2 in the pixels 121a-1 and 121a-2, different ranges are shielded by the light-shielding films 121b-1 and 121b-2, whereby a different incident angle directivity is set independently for each pixel.
- the light-shielding range does not have to be different for all the pixels 121a of the image sensor 121, and there may be pixels 121a in which the same range is partially light-shielded.
- the light-shielding film 121b-1 and the light-shielding film 121g-1 are connected to each other and are L-shaped when viewed from the side surface.
- the light-shielding film 121b-2 and the light-shielding film 121g-2 are connected to each other and are formed in an L shape when viewed from the side surface.
- The light-shielding films 121b-1 and 121b-2 and the light-shielding films 121g-1 to 121g-3 are made of metal, for example, tungsten (W), aluminum (Al), or an alloy of Al and copper (Cu).
- The light-shielding films 121b-1 and 121b-2 and the light-shielding films 121g-1 to 121g-3 may be formed simultaneously, of the same metal as the wiring, in the same process in which the wiring is formed in the semiconductor process.
- Note that the film thicknesses of the light-shielding films 121b-1 and 121b-2 and the light-shielding films 121g-1 to 121g-3 do not have to be the same at every position.
- As shown in the lower part of FIG. 4, the pixel 121a includes a photodiode 161 (corresponding to the photodiode 121e), a transfer transistor 162, an FD (Floating Diffusion) unit 163, a selection transistor 164, an amplification transistor 165, and a reset transistor 166, and is connected to a current source 168 via a vertical signal line 167.
- the anode electrode of the photodiode 161 is grounded, and the cathode electrode is connected to the gate electrode of the amplification transistor 165 via the transfer transistor 162.
- the transfer transistor 162 is driven according to the transfer signal TG. For example, when the transfer signal TG supplied to the gate electrode of the transfer transistor 162 becomes high level, the transfer transistor 162 is turned on. As a result, the electric charge stored in the photodiode 161 is transferred to the FD unit 163 via the transfer transistor 162.
- the FD unit 163 is a floating diffusion region having a charge capacitance C1 provided between the transfer transistor 162 and the amplification transistor 165, and temporarily stores the charge transferred from the photodiode 161 via the transfer transistor 162.
- the FD unit 163 is a charge detection unit that converts an electric charge into a voltage, and the electric charge stored in the FD unit 163 is converted into a voltage in the amplification transistor 165.
- the selection transistor 164 is driven according to the selection signal SEL, and turns on when the selection signal SEL supplied to the gate electrode reaches a high level to connect the amplification transistor 165 and the vertical signal line 167.
- The amplification transistor 165 serves as the input section of a source follower, a readout circuit that reads out the signal obtained by photoelectric conversion in the photodiode 161, and outputs to the vertical signal line 167 a detection signal (pixel signal) at a level corresponding to the charge stored in the FD unit 163.
- the value of this detection signal (output pixel value) is modulated according to the incident angle of the incident light from the subject, and the characteristics (directivity) differ depending on the incident angle (having incident angle directivity).
- the reset transistor 166 is driven according to the reset signal RST. For example, the reset transistor 166 is turned on when the reset signal RST supplied to the gate electrode reaches a high level, and the electric charge accumulated in the FD unit 163 is discharged to the power supply VDD to reset the FD unit 163.
- the shape of the light-shielding film 121b of each pixel 121a is not limited to the example of FIG. 3, and can be set to any shape.
- For example, a shape extending in the horizontal direction as in FIG. 3, an L-shape extending in the vertical and horizontal directions, a shape provided with a rectangular opening, and the like are possible.
- FIG. 5 is a diagram showing a second configuration example of the image sensor 121.
- a side sectional view of the pixel 121a of the image sensor 121 which is a second configuration example, is shown in the upper part of FIG. 5, and a top view of the image sensor 121 is shown in the middle part of FIG.
- The side sectional view in the upper part of FIG. 5 is a cross section taken along line A-B in the middle part of FIG. 5.
- the lower part of FIG. 5 is an example of a circuit configuration of the image sensor 121.
- In the image sensor 121 of FIG. 5, four photodiodes 121f-1 to 121f-4 are formed in one pixel 121a, and a light-shielding film 121g is formed in the region separating the photodiodes 121f-1 to 121f-4; in this respect, the configuration differs from that of the image sensor 121 of FIG. 4. That is, in the image sensor 121 of FIG. 5, the light-shielding film 121g is formed in a "+" shape when viewed from above.
- the common configurations thereof are designated by the same reference numerals as those in FIG. 4, and detailed description thereof will be omitted.
- In the image sensor 121 of FIG. 5, the photodiodes 121f-1 to 121f-4 are separated by the light-shielding film 121g, so that electrical and optical crosstalk between the photodiodes 121f-1 to 121f-4 is prevented. That is, the light-shielding film 121g of FIG. 5, like the light-shielding film 121g of the image sensor 121 of FIG. 4, is for preventing crosstalk, and is not for providing incident angle directivity.
- one FD unit 163 is shared by four photodiodes 121f-1 to 121f-4.
- the lower part of FIG. 5 shows an example of a circuit configuration in which one FD unit 163 is shared by four photodiodes 121f-1 to 121f-4. The description of the same configuration as that of the lower part of FIG. 4 in the lower part of FIG. 5 will be omitted.
- The difference from the circuit configuration in the lower part of FIG. 4 is that, instead of the photodiode 161 (corresponding to the photodiode 121e in the upper part of FIG. 4) and the transfer transistor 162, photodiodes 161-1 to 161-4 (corresponding to the photodiodes 121f-1 to 121f-4 in the upper part of FIG. 5) and transfer transistors 162-1 to 162-4 are provided, and the FD unit 163 is shared among them.
- The charges accumulated in the photodiodes 121f-1 to 121f-4 are transferred to the common FD unit 163, which has a predetermined capacitance and is provided at the connection portion between the photodiodes 121f-1 to 121f-4 and the gate electrode of the amplification transistor 165. Then, a signal corresponding to the level of the charge held in the FD unit 163 is read out as a detection signal (pixel signal).
- However, the charges accumulated in the photodiodes 121f-1 to 121f-4 can be made to contribute selectively, in various combinations, to the output of the pixel 121a, that is, to the detection signal. That is, the charge can be read out independently for each of the photodiodes 121f-1 to 121f-4, and by making the photodiodes 121f-1 to 121f-4 that contribute to the output (or the degree to which each photodiode contributes to the output) different from one another, different incident angle directivities can be obtained.
- For example, incident angle directivity in the left-right direction can be obtained by transferring the charges of the photodiodes 121f-1 and 121f-3 to the FD unit 163 and adding the signals obtained by reading them out. Similarly, incident angle directivity in the up-down direction can be obtained by transferring the charges of the photodiodes 121f-1 and 121f-2 to the FD unit 163 and adding the signals obtained by reading them out.
- the signal obtained based on the electric charge selectively read out independently from the four photodiodes 121f-1 to 121f-4 is a detection signal corresponding to one pixel constituting the detection image.
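The selective-readout idea can be sketched numerically as follows; the 2 × 2 layout of the photodiodes 121f-1 to 121f-4 and the charge values are assumptions for illustration only:

```python
import numpy as np

# Assumed 2x2 layout of the four photodiodes in one pixel (placeholder charges):
#   [[121f-1, 121f-2],
#    [121f-3, 121f-4]]
charges = np.array([[0.7, 0.3],
                    [0.6, 0.2]])

# Adding the signals of 121f-1 and 121f-3 (left column) yields a left/right
# directivity; adding 121f-1 and 121f-2 (top row) yields an up/down one.
left_right_signal = charges[:, 0].sum()   # 121f-1 + 121f-3
up_down_signal = charges[0, :].sum()      # 121f-1 + 121f-2
print(left_right_signal, up_down_signal)
```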
- The contribution of each photodiode 121f (of its charge) to the detection signal can be controlled not only by whether or not the charge (detected value) of each photodiode 121f is transferred to the FD unit 163, but also by resetting, with the electronic shutter function, the charge accumulated in the photodiode 121f before the transfer to the FD unit 163. For example, if the charge of the photodiode 121f is reset immediately before the transfer to the FD unit 163, that photodiode 121f does not contribute to the detection signal at all. On the other hand, if time is allowed between resetting the charge of the photodiode 121f and transferring the charge to the FD unit 163, the photodiode 121f partially contributes to the detection signal.
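Assuming, for illustration, that accumulated charge grows linearly with the time left between reset and transfer, the partial-contribution mechanism can be sketched as a per-photodiode weighting (this model is an assumption, not the patent's):

```python
import numpy as np

# Charge each photodiode 121f would accumulate over the full exposure
# (placeholder values), assuming charge grows linearly with integration time.
full_charge = np.array([0.7, 0.3, 0.6, 0.2])

# Fraction of the exposure left between each photodiode's reset and the
# transfer to the FD unit: 0.0 = reset immediately before transfer (no
# contribution), 1.0 = never reset early (full contribution).
contribution = np.array([1.0, 0.0, 0.5, 0.0])

detection_signal = float(np.sum(full_charge * contribution))
print(detection_signal)   # 0.7*1.0 + 0.6*0.5 = 1.0
```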
- In this way, each pixel can be given a different incident angle directivity.
- As in FIG. 4, the detection signal output from each pixel 121a of the image sensor 121 of FIG. 5 is a value (output pixel value) modulated according to the incident angle of the incident light from the subject, and its characteristics (directivity) differ depending on the incident angle (it has incident angle directivity).
- In this case, since the incident light is not modulated by a light-shielding film, the detection signal is not a signal obtained by optical modulation. Further, hereinafter, the photodiode 121f that does not contribute to the detection signal will also be referred to as a photodiode 121f that does not contribute to the pixel or to the output.
- FIG. 5 shows an example in which the light receiving surface of the pixel (pixel 121a) is divided into four equal parts and the photodiode 121f having the same light receiving surface is arranged in each region, that is, the photodiode is divided into four equal parts.
- the number of divisions and the division position of the photodiode can be set arbitrarily.
- Also, the division position of the photodiode may be different for each pixel. For example, if the division position of the photodiode differs between pixels, the incident angle directivity will differ between the pixels. Further, by making the number of divisions different between the pixels, it becomes possible to set the incident angle directivity more freely.
- both the number of divisions and the division position may be different between the pixels.
- both the image sensor 121 of FIG. 4 and the image sensor 121 of FIG. 5 have a configuration in which each pixel can independently set the incident angle directivity.
- In the image sensor 121 of FIG. 4, the incident angle directivity of each pixel is set at the time of manufacture by the light-shielding film 121b.
- On the other hand, in the image sensor 121 of FIG. 5, the number of divisions and the division positions of the photodiodes of each pixel are set at the time of manufacture, but the incident angle directivity of each pixel (the combination of photodiodes that contribute to the output) can be set at the time of use (for example, at the time of imaging).
- Note that, in both the image sensor 121 of FIG. 4 and the image sensor 121 of FIG. 5, not all of the pixels necessarily need to have incident angle directivity.
- Hereinafter, in the image sensor 121 of FIG. 4, the shape of the light-shielding film 121b of each pixel 121a is referred to as a light-shielding pattern. Also, in the image sensor 121 of FIG. 5, the shape of the region of the photodiodes 121f that do not contribute to the output in each pixel 121a is referred to as a light-shielding pattern.
- the incident angle directivity of each pixel of the image sensor 121 is generated by, for example, the principle shown in FIG.
- The upper left and upper right parts of FIG. 6 are diagrams explaining the principle by which incident angle directivity arises in the image sensor 121 of FIG. 4, and the lower left and lower right parts of FIG. 6 are diagrams explaining the principle by which incident angle directivity arises in the image sensor 121 of FIG. 5.
- the upper left and upper right pixels of FIG. 6 both include one photodiode 121e.
- the lower left and lower right pixels of FIG. 6 both include two photodiodes 121f.
- Although an example in which one pixel includes two photodiodes 121f is shown here, this is for convenience of explanation, and the number of photodiodes 121f included in one pixel may be another number.
- In the upper left pixel of FIG. 6, a light-shielding film 121b-11 is formed so as to shield the right half of the light receiving surface of the photodiode 121e-11. In the upper right pixel of FIG. 6, a light-shielding film 121b-12 is formed so as to shield the left half of the light receiving surface of the photodiode 121e-12.
- the alternate long and short dash line in the figure is an auxiliary line that passes through the horizontal center of the light receiving surface of the photodiode 121e and is perpendicular to the light receiving surface.
- In the upper left pixel of FIG. 6, the incident light from the upper right direction, forming the incident angle θ1 with respect to the alternate long and short dash line in the figure, is easily received by the left-half range of the photodiode 121e-11 that is not shielded by the light-shielding film 121b-11. On the other hand, the incident light from the upper left direction, forming the incident angle θ2 with respect to the alternate long and short dash line, is hard to be received by that unshielded left-half range. Therefore, the upper left pixel of FIG. 6 has an incident angle directivity with high light receiving sensitivity to incident light from the upper right in the figure and low light receiving sensitivity to incident light from the upper left.
- On the other hand, in the upper right pixel of FIG. 6, the incident light from the upper right direction forming the incident angle θ1 is hard to be received by the left-half range shielded by the light-shielding film 121b-12 of the photodiode 121e-12, whereas the incident light from the upper left direction forming the incident angle θ2 is easily received by the right-half range that is not shielded by the light-shielding film 121b-12. Therefore, the upper right pixel of FIG. 6 has an incident angle directivity with low light receiving sensitivity to incident light from the upper right in the figure and high light receiving sensitivity to incident light from the upper left.
- The lower left pixel of FIG. 6 is provided with photodiodes 121f-11 and 121f-12 on the left and right sides of the figure, and is configured to have incident angle directivity without providing a light-shielding film 121b, by reading out the detection signal of only one of them.
- That is, by reading out only the signal of the photodiode 121f-11, an incident angle directivity similar to that of the upper left pixel of FIG. 6 can be obtained. The incident light from the upper right direction, forming the incident angle θ1 with respect to the alternate long and short dash line in the figure, is incident on the photodiode 121f-11, and the signal corresponding to the amount of received light is read out from the photodiode 121f-11 and thus contributes to the detection signal output from this pixel. On the other hand, the incident light from the upper left direction, forming the incident angle θ2 with respect to the alternate long and short dash line, is incident on the photodiode 121f-12 but is not read out from it, and therefore does not contribute to the detection signal output from this pixel.
- Similarly, the lower right pixel of FIG. 6 is provided with two photodiodes 121f-13 and 121f-14, and by reading out only the signal of the photodiode 121f-14, an incident angle directivity similar to that of the upper right pixel of FIG. 6 can be obtained. That is, the incident light from the upper left direction forming the incident angle θ2 is incident on the photodiode 121f-14, and the signal corresponding to the amount of received light is read out from the photodiode 121f-14 and thus contributes to the detection signal output from this pixel.
- Note that, in the upper pixels of FIG. 6, an example is shown in which the light-shielded range and the non-shielded range are separated at the horizontal center position of the pixel (the light receiving surface of the photodiode 121e), but they may be separated at another position. Also, in the lower pixels of FIG. 6, an example is shown in which the two photodiodes 121f are separated at the horizontal center position of the pixel, but they may be separated at another position. By thus changing the light-shielding range or the position at which the photodiode 121f is divided, different incident angle directivities can be generated.
- The upper graph of FIG. 7 shows the incident angle directivities of the pixels shown in the middle and lower rows of FIG. 7.
- the horizontal axis represents the incident angle θ, and
- the vertical axis represents the detection signal level.
- The incident angle θ is 0 degrees when the direction of the incident light coincides with the alternate long and short dash line on the left side of the middle row of FIG. 7; the incident angle θ21 side on the left of the middle row of FIG. 7 is the positive direction, and the incident angle θ22 side on the right is the negative direction. That is, the incident light entering the on-chip lens 121c from the upper right has a larger incident angle than the incident light entering from the upper left: the incident angle θ increases as the traveling direction of the incident light tilts to the left (increases in the positive direction) and decreases as it tilts to the right (increases in the negative direction).
- The pixel in the middle left part of FIG. 7 is the pixel in the upper left part of FIG. 6 to which an on-chip lens 121c-11 that condenses the incident light and a color filter 121d-11 that transmits light of a predetermined wavelength have been added. That is, in this pixel, the on-chip lens 121c-11, the color filter 121d-11, the light-shielding film 121b-11, and the photodiode 121e-11 are laminated in this order from the light incident direction at the top of the figure.
- Similarly, the pixel in the middle right part of FIG. 7, the pixel in the lower left part of FIG. 7, and the pixel in the lower right part of FIG. 7 are, respectively, the pixel in the upper right part of FIG. 6, the pixel in the lower left part of FIG. 6, and the pixel in the lower right part of FIG. 6 to which the on-chip lens 121c-11 and the color filter 121d-11, or the on-chip lens 121c-12 and the color filter 121d-12, have been added.
- In the pixel in the middle left part of FIG. 7, the detection signal level (light receiving sensitivity) of the photodiode 121e-11 changes according to the incident angle θ of the incident light, as shown by the solid-line waveform in the upper part of FIG. 7. That is, the larger the incident angle θ formed by the incident light with respect to the alternate long and short dash line in the figure (the larger the incident angle θ is in the positive direction (the more it tilts to the right in the figure)), the more the light is condensed in the range where the light-shielding film 121b-11 is not provided, and the higher the detection signal level of the photodiode 121e-11 becomes. Conversely, the smaller the incident angle θ of the incident light (the larger the incident angle θ is in the negative direction (the more it tilts to the left in the figure)), the more the light is condensed in the range where the light-shielding film 121b-11 is provided, and the lower the detection signal level of the photodiode 121e-11 becomes.
- Similarly, in the pixel in the middle right part of FIG. 7, the detection signal level (light receiving sensitivity) of the photodiode 121e-12 changes according to the incident angle θ of the incident light, as shown by the dotted-line waveform in the upper part of FIG. 7. That is, the larger the incident angle θ of the incident light (the larger the incident angle θ is in the positive direction), the more the light is condensed in the range where the light-shielding film 121b-12 is provided, and the lower the detection signal level of the photodiode 121e-12 becomes. Conversely, the smaller the incident angle θ of the incident light (the larger the incident angle θ is in the negative direction), the more the light is incident on the range where the light-shielding film 121b-12 is not provided, and the higher the detection signal level of the photodiode 121e-12 becomes.
- the waveforms of the solid line and the dotted line shown in the upper part of FIG. 7 can be changed according to the range of the light-shielding film 121b. Therefore, depending on the range of the light-shielding film 121b, it is possible to give different incident angle directivity to each pixel.
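A toy model of these waveforms (an assumption for illustration, not the patent's characteristic): treat each half-shielded pixel's detection level as a smooth monotone function of the incident angle θ, rising for the photodiode 121e-11 (solid line) and falling for the photodiode 121e-12 (dotted line):

```python
import numpy as np

# Toy model only: a logistic curve in theta stands in for the measured
# directivity; the slope value is an invented placeholder.
def detection_level(theta_deg, rises_with_theta, slope=0.15):
    s = 1.0 / (1.0 + np.exp(-slope * theta_deg))
    return s if rises_with_theta else 1.0 - s

for theta in (-20, 0, 20):
    print(theta,
          round(detection_level(theta, True), 2),    # 121e-11 (solid line)
          round(detection_level(theta, False), 2))   # 121e-12 (dotted line)
```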
- As described above, the incident angle directivity is the characteristic of the light receiving sensitivity of each pixel according to the incident angle θ, but for the middle-row pixels of FIG. 7 it can also be said to be a characteristic of the light-shielding level according to the incident angle θ. That is, the light-shielding film 121b blocks incident light from a specific direction at a high level but cannot sufficiently block incident light from other directions. This change in the level of shielding produces detection signal levels that differ according to the incident angle θ, as shown in the upper part of FIG. 7.
- Therefore, saying that each pixel has a different incident angle directivity means, in other words, that each pixel has a different light-shielding direction.
- Further, in the pixel in the lower left part of FIG. 7, as in the pixel in the lower left part of FIG. 6, by using the signal of only the photodiode 121f-11 in the left part of the figure, it is possible to obtain the same incident angle directivity as the pixel in the middle left part of FIG. 7. That is, when the incident angle θ of the incident light increases (when the incident angle θ increases in the positive direction), the light is condensed in the range of the photodiode 121f-11 from which the signal is read out, so the detection signal level increases. Conversely, the smaller the incident angle θ of the incident light (the larger the incident angle θ is in the negative direction), the more the light is condensed in the range of the photodiode 121f-12 from which the signal is not read out, and the lower the detection signal level becomes.
- Similarly, in the pixel in the lower right part of FIG. 7, as in the pixel in the lower right part of FIG. 6, by using the signal of only the photodiode 121f-14 in the right part of the figure, it is possible to obtain the same incident angle directivity as the pixel in the middle right part of FIG. 7. That is, when the incident angle θ of the incident light increases (when the incident angle θ increases in the positive direction), the light is condensed in the range of the photodiode 121f-13 that does not contribute to the output (detection signal), so the detection signal level of the pixel decreases. Conversely, the smaller the incident angle θ of the incident light (the larger the incident angle θ is in the negative direction), the more the light is condensed in the range of the photodiode 121f-14 that contributes to the output (detection signal), and the higher the detection signal level of the pixel becomes.
- the center of gravity of the incident angle directivity of the pixel 121a is defined as follows.
- the center of gravity of the incident angle directivity is the center of gravity of the distribution of the intensity of the incident light incident on the light receiving surface of the pixel 121a.
- Note that the light receiving surface of the pixel 121a is the light receiving surface of the photodiode 121e for the middle-row pixels 121a of FIG. 7, and the light receiving surface of the photodiodes 121f for the lower-row pixels 121a of FIG. 7.
- For example, let the detection signal level on the vertical axis of the upper graph of FIG. 7 be a(θ); then the ray having the incident angle θg calculated by the following equation (4) is the center-of-gravity ray.
- θg = Σ(a(θ) × θ) / Σa(θ) ... (4)
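A discretized numerical sketch of equation (4), with invented sample values for a(θ):

```python
import numpy as np

# Discretized equation (4): theta_g = sum(a(theta) * theta) / sum(a(theta)).
thetas = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])   # incident angles (degrees)
a = np.array([0.1, 0.3, 0.8, 1.0, 0.6])              # detection levels a(theta)

theta_g = np.sum(a * thetas) / np.sum(a)             # centroid of the directivity
print(round(theta_g, 2))   # the ray at this angle is the center-of-gravity ray
```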
- Note that, in order to give each photodiode directivity with respect to the incident angle of the incident light, the on-chip lens 121c is indispensable for each pixel.
- For example, as shown in FIG. 8, the pixel 121a is shielded by the light-shielding film 121b over a width d1 from each of its four sides, while the pixel 121a' is shielded by the light-shielding film 121b' over a wider width d2 (> d1) from each of its four sides, as shown in the lower part of FIG. 8.
- FIG. 9 shows an example of the incident angle of the incident light from the subject surface 102 on the central position C1 of the image sensor 121.
- Although FIG. 9 shows an example of the incident angle of the incident light in the horizontal direction, the same applies to the vertical direction. Further, the pixels 121a and 121a' of FIG. 8 are shown on the right side of FIG. 9.
- For example, when the pixel 121a of FIG. 8 is arranged at the center position C1 of the image sensor 121, the range of the incident angle of the incident light from the subject surface 102 onto the pixel 121a is the angle A1, as shown in the left part of FIG. 9. Therefore, the pixel 121a can receive incident light over the width W1 in the horizontal direction of the subject surface 102.
- On the other hand, when the pixel 121a' of FIG. 8 is arranged at the center position C1 of the image sensor 121, the pixel 121a' has a wider light-shielding range than the pixel 121a, so the range of the incident angle of the incident light from the subject surface 102 onto the pixel 121a' is the angle A2 (< A1), as shown in the left part of FIG. 9. Therefore, the pixel 121a' can receive incident light over the width W2 (< W1) in the horizontal direction of the subject surface 102.
- That is, the pixel 121a with a narrow light-shielding range is a wide-angle-of-view pixel suitable for capturing a wide range on the subject surface 102, whereas the pixel 121a' with a wide light-shielding range is a narrow-angle-of-view pixel suitable for capturing a narrow range on the subject surface 102.
- the wide angle of view pixel and the narrow angle of view pixel referred to here are expressions for comparing both the pixels 121a and 121a'in FIG. 8, and are not limited to this when comparing pixels with other angles of view.
- Therefore, for example, the pixel 121a is used to restore the image I1 of FIG. 10. The image I1 is an image of the angle of view SQ1 corresponding to the subject width W1, which includes the whole of the person H101 as the subject in the upper part of FIG. 10. On the other hand, the pixel 121a' is used, for example, to restore the image I2 of FIG. 10. The image I2 is an image of the angle of view SQ2 corresponding to the subject width W2, in which the area around the face of the person H101 in the upper part of FIG. 10 is zoomed in.
- For example, it is conceivable to gather and arrange a predetermined number of the pixels 121a of FIG. 8 in the range ZA of the image sensor 121 surrounded by the dotted line, and a predetermined number of the pixels 121a' in the range ZB surrounded by the alternate long and short dash line. Then, when restoring the image of the angle of view SQ1 corresponding to the subject width W1, the image of the angle of view SQ1 can be appropriately restored by using the detection signals of the pixels 121a in the range ZA. On the other hand, when restoring the image of the angle of view SQ2 corresponding to the subject width W2, the image of the angle of view SQ2 can be appropriately restored by using the detection signals of the pixels 121a' in the range ZB.
- since the angle of view SQ2 is narrower than the angle of view SQ1, restoring the image of the angle of view SQ2 rather than the image of the angle of view SQ1 makes it possible to obtain a higher quality restored image.
- the right part of FIG. 11 shows a configuration example within the range ZA of the image sensor 121 of FIG.
- the left part of FIG. 11 shows a configuration example of the pixel 121a in the range ZA.
- the range shown in black is the light-shielding film 121b, and the light-shielding range of each pixel 121a is determined according to, for example, the rule shown on the left side of FIG. 11.
- the main light-shielding portion Z101 on the left side of FIG. 11 is a range that is commonly shaded by each pixel 121a.
- the main light-shielding portion Z101 extends over a range of width dx1 from the left side and the right side of the pixel 121a toward the inside of the pixel 121a, and over a range of height dy1 from the upper side and the lower side of the pixel 121a toward the inside of the pixel 121a.
- a rectangular opening Z111 that is not shaded by the light-shielding film 121b is provided in the range Z102 inside the main light-shielding portion Z101. Therefore, in each pixel 121a, the range other than the opening Z111 is shielded by the light-shielding film 121b.
- the openings Z111 of each pixel 121a are regularly arranged. Specifically, the horizontal position of the opening Z111 in each pixel 121a is the same in the pixels 121a in the same vertical row. Further, the vertical position of the opening Z111 in each pixel 121a is the same in the pixel 121a in the same horizontal row.
- the horizontal position of the opening Z111 in each pixel 121a is shifted at a predetermined interval according to the horizontal position of the pixel 121a. That is, as the position of the pixel 121a advances to the right, the left side of the opening Z111 moves to a position shifted to the right by the widths dx1, dx2, ..., dxn from the left side of the pixel 121a, respectively.
- the distance between the width dx1 and the width dx2, ..., and the distance between the width dxn-1 and the width dxn are each the value obtained by dividing the length, obtained by subtracting the width of the opening Z111 from the horizontal width of the range Z102, by the number of pixels n-1 in the horizontal direction.
- the vertical position of the opening Z111 in each pixel 121a is shifted at a predetermined interval according to the vertical position of the pixel 121a. That is, as the position of the pixel 121a advances downward, the upper side of the opening Z111 moves downward by the heights dy1, dy2, ..., dyn from the upper side of the pixel 121a, respectively.
- the distance between the height dy1 and the height dy2, the distance between the height dy2 and the height dy3, ..., and the distance between the height dyn-1 and the height dyn are each the value obtained by dividing the length, obtained by subtracting the height of the opening Z111 from the vertical height of the range Z102, by the number of pixels m-1 in the vertical direction.
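- a minimal sketch of this equal-interval rule (NumPy; the range, opening, and grid sizes are assumed example values, not taken from the patent):

```python
import numpy as np

def opening_offsets(range_size, opening_size, num_pixels):
    """Equal-interval offsets of the opening inside the range Z102.

    The step is (range_size - opening_size) / (num_pixels - 1), so the
    opening sits at the left/top edge in the first pixel and reaches the
    right/bottom edge in the last pixel.
    """
    return np.linspace(0.0, range_size - opening_size, num_pixels)

# Assumed dimensions in arbitrary units: range Z102 is 8 wide x 6 high,
# opening Z111 is 2 x 2, and the array has n = 7 columns and m = 5 rows.
dx = opening_offsets(8.0, 2.0, 7)  # horizontal offsets dx1 .. dx7
dy = opening_offsets(6.0, 2.0, 5)  # vertical offsets dy1 .. dy5
print(dx)  # [0. 1. 2. 3. 4. 5. 6.]
print(dy)  # [0. 1. 2. 3. 4.]
```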
- the right part of FIG. 12 shows a configuration example within the range ZB of the image sensor 121 of FIG.
- the left part of FIG. 12 shows a configuration example of the pixels 121a'in the range ZB.
- the range shown in black is the light-shielding film 121b', and the light-shielding range of each pixel 121a' is determined according to, for example, the rule shown on the left side of FIG. 12.
- the main light-shielding portion Z151 on the left side of FIG. 12 is a range that is commonly shaded in each pixel 121a'.
- the main light-shielding portion Z151 extends over a range of width dx1' from the left side and the right side of the pixel 121a' toward the inside of the pixel 121a', and over a range of height dy1' from the upper side and the lower side of the pixel 121a' toward the inside of the pixel 121a'.
- in each pixel 121a', a rectangular opening Z161 that is not shielded by the light-shielding film 121b' is provided in the range Z152 inside the main light-shielding portion Z151. Therefore, in each pixel 121a', the range other than the opening Z161 is shielded by the light-shielding film 121b'.
- the openings Z161 of each pixel 121a' are regularly arranged in the same manner as the openings Z111 of each pixel 121a in FIG. 11. Specifically, the horizontal position of the opening Z161 in each pixel 121a' is the same in the pixels 121a' in the same vertical row. Further, the vertical position of the opening Z161 in each pixel 121a' is the same in the pixels 121a' in the same horizontal row.
- the horizontal position of the opening Z161 in each pixel 121a' is shifted at a predetermined interval according to the horizontal position of the pixel 121a'. That is, as the position of the pixel 121a' moves to the right, the left side of the opening Z161 moves to a position shifted to the right by the widths dx1', dx2', ..., dxn' from the left side of the pixel 121a', respectively.
- the vertical position of the opening Z161 in each pixel 121a' is shifted at a predetermined interval according to the vertical position of the pixel 121a'. That is, as the position of the pixel 121a' advances downward, the upper side of the opening Z161 moves downward by the heights dy1', dy2', ..., dyn' from the upper side of the pixel 121a', respectively. The distance between the height dy1' and the height dy2', the distance between the height dy2' and the height dy3', ..., and the distance between the height dyn-1' and the height dyn' are each the value obtained by dividing the length, obtained by subtracting the height of the opening Z161 from the vertical height of the range Z152, by the number of pixels m-1 in the vertical direction.
- here, the length obtained by subtracting the width of the opening Z111 from the horizontal width of the range Z102 of the pixel 121a in FIG. 11 is larger than the length obtained by subtracting the width of the opening Z161 from the horizontal width of the range Z152 of the pixel 121a' in FIG. 12. Therefore, the interval of change of the widths dx1, dx2, ..., dxn in FIG. 11 is larger than the interval of change of the widths dx1', dx2', ..., dxn' in FIG. 12.
- similarly, the length obtained by subtracting the height of the opening Z111 from the vertical height of the range Z102 of the pixel 121a in FIG. 11 is larger than the length obtained by subtracting the height of the opening Z161 from the vertical height of the range Z152 of the pixel 121a' in FIG. 12. Therefore, the interval of change of the heights dy1, dy2, ..., dyn in FIG. 11 is larger than the interval of change of the heights dy1', dy2', ..., dyn' in FIG. 12.
- that is, the interval between changes in the horizontal and vertical positions of the opening Z111 of the light-shielding film 121b of each pixel 121a in FIG. 11 differs from the interval between changes in the horizontal and vertical positions of the opening Z161 of the light-shielding film 121b' of each pixel 121a' in FIG. 12.
- the difference in this interval appears as a difference in the subject resolution (angular resolution) of the restored image. That is, the interval between changes in the horizontal and vertical positions of the opening Z161 of the light-shielding film 121b' of each pixel 121a' in FIG. 12 is narrower than the interval between changes in the horizontal and vertical positions of the opening Z111 of the light-shielding film 121b of each pixel 121a in FIG. 11.
- therefore, the restored image restored by using the detection signal of each pixel 121a' in FIG. 12 has a higher subject resolution and higher image quality than the restored image restored by using the detection signal of each pixel 121a in FIG. 11.
- in this way, it becomes possible to realize the image sensor 121 composed of pixels having various angles of view (that is, having various incident angle directivities).
- the example in which the pixels 121a and the pixels 121a' are arranged separately in the range ZA and the range ZB is shown above for the sake of simplicity; in practice, it is desirable that the pixels 121a corresponding to different angles of view are mixed and arranged in the same area.
- for example, each unit U is composed of four pixels: a pixel 121a-W having a wide angle of view, a pixel 121a-M having a medium angle of view, a pixel 121a-N having a narrow angle of view, and a pixel 121a-AN having an extremely narrow angle of view.
- in this case, suppose the total number of the pixels 121a is X. Then, four types of coefficient set groups, one for each angle of view, are used, and restored images having different angles of view are restored by solving four different sets of simultaneous equations.
- further, an image of an angle of view intermediate between the four types of angles of view, or images of angles of view before and after them, may be generated by interpolation from the images of the four types of angles of view, and a pseudo optical zoom may be realized by seamlessly generating images of various angles of view.
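- a rough sketch of such a pseudo zoom (NumPy; it assumes, for illustration only, two restored images where the narrow-angle image magnifies the center of the wide-angle one by a factor of 2, and uses nearest-neighbor resampling):

```python
import numpy as np

def crop_zoom(img, zoom):
    """Center-crop an image by `zoom` (>= 1) and upscale it back to the
    original size with nearest-neighbor resampling."""
    h, w = img.shape[:2]
    ch, cw = max(1, int(round(h / zoom))), max(1, int(round(w / zoom)))
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = img[y0:y0 + ch, x0:x0 + cw]
    yi = np.minimum((np.arange(h) * ch / h).astype(int), ch - 1)
    xi = np.minimum((np.arange(w) * cw / w).astype(int), cw - 1)
    return crop[yi][:, xi]

def pseudo_zoom(wide, narrow, zoom, zoom_narrow=2.0):
    """Seamless zoom across two restored angles of view: interpolate from
    the wide image up to zoom_narrow, then continue from the narrow one."""
    if zoom < zoom_narrow:
        return crop_zoom(wide, zoom)
    return crop_zoom(narrow, zoom / zoom_narrow)

wide = np.arange(64.0).reshape(8, 8)         # stand-in restored images
narrow = np.arange(64.0).reshape(8, 8)
frame = pseudo_zoom(wide, narrow, zoom=1.5)  # intermediate angle of view
```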
- when restoring an image with a wide angle of view, all the wide angle of view pixels may be used, or a part of the wide angle of view pixels may be used.
- similarly, when restoring an image with a narrow angle of view, all the narrow angle of view pixels may be used, or a part of the narrow angle of view pixels may be used.
- FIG. 14 shows a part of an arrangement example of the pixels 121a of the pixel array unit.
- the pixel 121a of the image sensor 121 is composed of two types of pixels PL1 (large pixels) and pixels PS1 (small pixels) having different sizes.
- the pixel PL1 has an octagonal shape with the four corners of the square cut off, and the area of the light receiving surface is larger than that of the pixel PS1.
- Each pixel PL1 is arranged in a grid pattern, and R (red), G (green), and B (blue) color filters are provided according to a Bayer array.
- Pixel PS1 has a shape in which a square is tilted by 45 degrees, and the area of the light receiving surface is smaller than that of pixel PL1.
- each pixel PS1 is arranged in the portion where four pixels PL1 arranged in a matrix adjoin one another, and R (red), G (green), and B (blue) color filters are provided according to a Bayer arrangement.
- FIG. 15 shows a configuration example of a light-shielding pattern of the pixel array portion.
- FIG. 16 is an enlarged view of the pixel PL1 and the pixel PS1 of FIG.
- the dotted line indicating the boundary between the pixels in FIGS. 15 and 16 is an auxiliary line, and is actually covered with a light-shielding film.
- the detection area AL1 is an area formed by an opening of the light-shielding film SL1 of each pixel PL1 and capable of detecting incident light.
- the detection region AL1 is set in the set region RL1 in the light receiving surface of each pixel PL1, and the region other than the set region RL1 of the light-shielding film SL1 of each pixel PL1 becomes the main light-shielding portion of the light-shielding film SL1.
- hereinafter, the region in the setting area RL1 other than the detection area AL1, which is a part of the light-shielding film SL1, is referred to as a light-shielding area SL1'. Therefore, the setting area RL1 consists of the detection area AL1 and the light-shielding area SL1'.
- the size, shape, and position of the setting area RL1 are common to each pixel PL1.
- the setting area RL1 is a square and is arranged in the center of each pixel PL1.
- the detection area AL1 is a square and has the same shape and size in each pixel PL1. Further, the detection area AL1 is arranged in the setting area RL1 of each pixel PL1 according to the same rules as those described above with reference to FIGS. 12 and 13, for example, as shown in FIG. 17.
- specifically, the detection area AL1 is arranged at the left end of the setting area RL1 in the pixels PL1 in the leftmost column of the pixel array unit, and at the upper end of the setting area RL1 in the pixels PL1 in the uppermost row of the pixel array unit. Then, as the position of the pixel PL1 advances to the right, the detection area AL1 shifts to the right at equal intervals in the setting area RL1, and is arranged at the right end of the setting area RL1 in the pixels PL1 in the rightmost column of the pixel array unit.
- similarly, as the position of the pixel PL1 advances downward, the detection area AL1 shifts downward at equal intervals in the setting area RL1, and is arranged at the lower end of the setting area RL1 in the pixels PL1 in the lowermost row of the pixel array unit.
- the horizontal position of the detection area AL1 is the same in the pixel PL1 of the same vertical column. Further, the vertical position of the detection area AL1 is the same in the pixel PL1 in the same horizontal row. Therefore, the position of the detection region AL1 in each pixel PL1, that is, the position of the detection region in which the incident light is incident in each pixel PL1 and the incident light can be detected is different for each pixel PL1. As a result, the incident angle directivity of each pixel PL1 is different.
- the setting area RL1 is defined by the logical sum of all the detection areas AL1 included in the pixel PL1 group of the same size. That is, the area in which all the detection areas AL1 included in the pixel PL1 group are overlapped becomes equal to the setting area RL1.
- the arrangement pattern of the detection area AL1 is not limited to the above configuration, and any arrangement may be used as long as the logical sum of all the detection areas AL1 is equal to the setting area RL1.
- for example, the position of the detection area AL1 in the setting area RL1 may be set randomly for each pixel PL1.
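- whatever arrangement is used, the logical-sum condition can be checked mechanically; a minimal sketch (NumPy, with assumed sizes) rasterizes every detection area AL1 at its per-pixel offset and verifies that the union equals the setting area RL1:

```python
import numpy as np

# Assumed sizes in grid cells: setting area RL1 is 60 x 60, each detection
# area AL1 is 20 x 20, and the offsets follow the equal-interval rule for
# a 5 x 5 group of pixels PL1.
R, A, N = 60, 20, 5
offsets = np.linspace(0, R - A, N).astype(int)  # [0, 10, 20, 30, 40]

union = np.zeros((R, R), dtype=bool)
for oy in offsets:          # vertical position of AL1 per pixel row
    for ox in offsets:      # horizontal position of AL1 per pixel column
        union[oy:oy + A, ox:ox + A] = True

# The OR of all detection areas must equal the whole setting area RL1.
print("union equals setting area RL1:", union.all())  # True
```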
- the detection area AS1 is an area formed by an opening of the light-shielding film SS1 of each pixel PS1 and can detect incident light.
- the detection region AS1 is set in the set region RS1 in the light receiving surface of each pixel PS1, and the region other than the set region RS1 of the light-shielding film SS1 of each pixel PS1 becomes the main light-shielding portion of the light-shielding film SS1.
- hereinafter, the region in the setting area RS1 other than the detection area AS1, which is a part of the light-shielding film SS1, is referred to as a light-shielding area SS1'. Therefore, the setting area RS1 consists of the detection area AS1 and the light-shielding area SS1'.
- the size, shape, and position of the setting area RS1 are common to each pixel PS1.
- the setting area RS1 is a square similar to the setting area RL1 and is arranged in the center of each pixel PS1.
- the detection area AS1 is a square similar to the detection area AL1 and has the same shape and size in each pixel PS1. Further, the detection area AS1 is arranged in the setting area RS1 of each pixel PS1 according to the same rule as the detection area AL1 in FIG. 17, for example.
- specifically, the detection area AS1 is arranged at the left end of the setting area RS1 in the pixels PS1 in the leftmost column of the pixel array unit, and at the upper end of the setting area RS1 in the pixels PS1 in the uppermost row of the pixel array unit. Then, as the position of the pixel PS1 advances to the right, the detection area AS1 shifts to the right at equal intervals in the setting area RS1, and is arranged at the right end of the setting area RS1 in the pixels PS1 in the rightmost column of the pixel array unit.
- similarly, as the position of the pixel PS1 advances downward, the detection area AS1 shifts downward at equal intervals in the setting area RS1, and is arranged at the lower end of the setting area RS1 in the pixels PS1 in the lowermost row of the pixel array unit.
- the horizontal position of the detection area AS1 is the same in the pixel PS1 in the same vertical row. Further, the vertical position of the detection area AS1 is the same in the pixel PS1 in the same horizontal row. Therefore, the position of the detection region AS1 in each pixel PS1, that is, the position of the detection region in which the incident light is incident in each pixel PS1 and the incident light can be detected is different for each pixel PS1. As a result, the incident angle directivity of each pixel PS1 is different.
- the setting area RS1 is defined by the logical sum of all the detection areas AS1 included in the pixel PS1 group of the same size. That is, the area in which all the detection areas AS1 included in the pixel PS1 group are overlapped becomes equal to the setting area RS1.
- the arrangement pattern of the detection area AS1 is not limited to the above configuration, and may be any arrangement as long as the logical sum of all the detection areas AS1 is equal to the setting area RS1.
- for example, the position of the detection area AS1 in the setting area RS1 may be set randomly for each pixel PS1.
- the ratio of the area of the setting area RL1 to the area of the light receiving surface of the pixel PL1 and the ratio of the area of the setting area RS1 to the area of the light receiving surface of the pixel PS1 are substantially equal. Further, the ratio of the area of the detection area AL1 to the area of the light receiving surface of the pixel PL1 and the ratio of the area of the detection area AS1 to the area of the light receiving surface of the pixel PS1 are substantially equal. As a result, the ratio of the area of the detection area AL1 to the area of the setting area RL1 of the pixel PL1 and the ratio of the area of the detection area AS1 to the area of the setting area RS1 of the pixel PS1 are substantially equal.
- therefore, of the incident light incident on the setting area RL1 of the pixel PL1, the ratio between the light that enters the detection area AL1 and is detected and the light that is shielded by the light-shielding area SL1' is substantially equal to the corresponding ratio for the pixel PS1, that is, of the incident light incident on the setting area RS1, the ratio between the light that enters the detection area AS1 and is detected and the light that is shielded by the light-shielding area SS1'.
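- a small numeric illustration of this invariant (all areas below are assumed example values): when the detection-to-surface ratios match between the two pixel types, the detected/shielded split inside the setting areas matches as well:

```python
# Assumed areas in arbitrary units for the large pixel PL1 and small pixel PS1.
pl1_surface, ps1_surface = 16.0, 4.0  # light receiving surfaces (4 : 1)
rl1_area, rs1_area = 8.0, 2.0         # setting areas, each 50% of the surface
al1_area, as1_area = 2.0, 0.5         # detection areas, each 12.5% of the surface

# The detection-to-setting ratios agree, so, of the light entering each
# setting area, the detected and shielded fractions also agree.
assert al1_area / rl1_area == as1_area / rs1_area == 0.25
print("detected fraction of each setting area:", al1_area / rl1_area)  # 0.25
```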
- in step S1, the image sensor 121 images the subject.
- a detection signal indicating a detection signal level according to the amount of incident light from the subject is output from each pixel PL1 and each pixel PS1 of the image sensor 121 having different incident angle directivities.
- further, the image sensor 121 performs A/D conversion of the detection signals of each pixel PL1 and each pixel PS1, generates a detection image composed of the detection signals of the pixels PL1 (hereinafter referred to as a high-sensitivity detection image) and a detection image composed of the detection signals of the pixels PS1 (hereinafter referred to as a high DR detection image), and supplies them to the restoration unit 122.
- in step S2, the restoration unit 122 obtains the coefficients used for image restoration. Specifically, the restoration unit 122 sets the distance to the subject surface to be restored, that is, the subject distance. Any method can be adopted as the method for setting the subject distance. For example, the restoration unit 122 sets the subject distance input by the user via the input unit 124, or the subject distance detected by the detection unit 125, as the distance to the subject surface 102 to be restored.
- the restoration unit 122 reads out the coefficient set group for the high-sensitivity detection image and the coefficient set group for the high DR detection image associated with the set subject distance from the storage unit 128.
- in step S3, the restoration unit 122 restores two types of images using the detected images and the coefficients. Specifically, the restoration unit 122 creates the simultaneous equations described above with reference to the equations (1) to (3), using the detection signal level of each pixel of the high-sensitivity detection image and the acquired coefficient set group. Next, the restoration unit 122 calculates the light intensity of each point light source on the subject surface corresponding to the set subject distance by solving the created simultaneous equations. Then, the restoration unit 122 generates a high-sensitivity restored image in which the image of the subject is formed, by arranging pixels having pixel values corresponding to the calculated light intensities according to the arrangement of the point light sources on the subject surface.
- similarly, the restoration unit 122 creates the above-mentioned simultaneous equations with reference to the equations (1) to (3), using the detection signal level of each pixel of the high DR detection image and the acquired coefficient set group.
- the restoration unit 122 calculates the light intensity of each point light source on the subject surface corresponding to the set subject distance by solving the created simultaneous equations.
- the restoration unit 122 generates a high DR restoration image in which the image of the subject is formed by arranging the pixels having the pixel values corresponding to the calculated light intensity according to the arrangement of the light sources at each point on the subject surface.
- the number of pixels of the high-sensitivity restored image and the number of pixels of the high-DR restored image are set to the same value.
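- a minimal sketch of step S3 under the linear model of equations (1) to (3) (NumPy least squares; the random coefficient matrices stand in for the coefficient set groups read in step S2, and all shapes are assumptions):

```python
import numpy as np

def restore(detected, coeffs, out_shape):
    """Solve the simultaneous equations detected = coeffs @ intensities for
    the point-light-source intensities, then arrange them as an image."""
    intensities, *_ = np.linalg.lstsq(coeffs, detected, rcond=None)
    return intensities.reshape(out_shape)

rng = np.random.default_rng(0)
n = 16 * 16                              # assumed number of point light sources
A_large = rng.uniform(0.0, 1.0, (n, n))  # coefficient set group, pixels PL1
A_small = rng.uniform(0.0, 1.0, (n, n))  # coefficient set group, pixels PS1

x_true = rng.uniform(0.0, 1.0, n)        # light intensities on the subject surface
y_hi = A_large @ x_true                  # high-sensitivity detection image
y_dr = A_small @ x_true                  # high DR detection image

hi_restored = restore(y_hi, A_large, (16, 16))
dr_restored = restore(y_dr, A_small, (16, 16))   # same pixel count as hi_restored
print(np.allclose(hi_restored.ravel(), x_true))  # True in this noise-free model
```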
- in step S4, the restoration unit 122 synthesizes the two types of restored images.
- FIG. 19 is a graph schematically showing the characteristics of the pixel values of the restored image.
- the horizontal axis shows the light intensity of the point light source on the subject surface
- the vertical axis shows the pixel values of the pixels of the restored image corresponding to the point light source on the horizontal axis.
- the characteristic curve CL shows the characteristics of the pixel values of the high-sensitivity restored image
- the characteristic curve CS shows the characteristics of the pixel values of the high DR restored image.
- since the pixel PL1 has a larger light receiving surface than the pixel PS1, the amount of light it receives from the incident light from each point light source is larger than that of the pixel PS1. Conversely, since the light receiving surface of the pixel PS1 is smaller than that of the pixel PL1, the amount of light it receives from the incident light from each point light source is smaller than that of the pixel PL1. Therefore, the high-sensitivity restored image based on the detection signals from the pixels PL1 has higher sensitivity than the high DR restored image based on the detection signals from the pixels PS1, but its pixels are more likely to saturate and its dynamic range is narrower. Conversely, the high DR restored image has lower sensitivity than the high-sensitivity restored image, but its pixels are less likely to saturate and its dynamic range is wider.
- the light intensity of the point light source at which the pixel value reaches the maximum Vmax in the high-sensitivity restored image is L1. Therefore, the dynamic range of the pixels of the high-sensitivity restored image is in the range of light intensity from 0 to L1.
- the light intensity of the point light source at which the pixel value reaches the maximum Vmax in the high DR restored image is L2 (> L1). Therefore, the dynamic range of the pixels of the high DR restored image is in the range of light intensity from 0 to L2.
- for example, the restoration unit 122 selects either the pixel of the high-sensitivity restored image or the pixel of the high DR restored image based on the pixel value of the high-sensitivity restored image, and synthesizes a restored image from the selected pixels.
- specifically, for the pixels at the same position in the high-sensitivity restored image and the high DR restored image, the restoration unit 122 selects the pixel of the high-sensitivity restored image when the pixel value of the high-sensitivity restored image is in the range from 0 to less than Vmax, and selects the pixel of the high DR restored image when the pixel value of the high-sensitivity restored image has reached Vmax.
- the restoration unit 122 keeps the pixel value of the pixel selected from the high-sensitivity restoration image as it is, and multiplies the pixel value of the pixel selected from the high-DR restoration image by a predetermined coefficient.
- this coefficient is set, for example, to the ratio of the area of the detection area AL1 to the area of the detection area AS1.
- the restoration unit 122 generates a composite restoration image in which the high-sensitivity restoration image and the high-DR restoration image are combined by arranging the selected pixels at the same positions in the original restoration image.
- alternatively, the pixel value of the pixel selected from the high DR restored image may be left as it is and the pixel value of the pixel selected from the high-sensitivity restored image may be multiplied by a predetermined coefficient, or the pixel values of the pixels selected from the two restored images may each be multiplied by a different coefficient.
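- a minimal sketch of this selection rule (NumPy; Vmax and the detection-area ratio are assumed example values): the composite image takes the high-sensitivity pixel where it is below Vmax, and otherwise takes the high DR pixel scaled by (area of detection area AL1) / (area of detection area AS1):

```python
import numpy as np

def composite(hi, dr, vmax, gain):
    """Per pixel: keep `hi` where it is below `vmax`, otherwise use
    `dr` multiplied by `gain` (the detection-area ratio AL1 / AS1)."""
    return np.where(hi < vmax, hi, gain * dr)

vmax, gain = 1023.0, 4.0                       # assumed example values
hi = np.array([100.0, 800.0, 1023.0, 1023.0])  # high-sensitivity restored image
dr = np.array([25.0, 200.0, 300.0, 500.0])     # high DR restored image

print(composite(hi, dr, vmax, gain))  # [ 100.  800. 1200. 2000.]
```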
- FIG. 20 is a graph schematically showing the characteristics of the pixel values of the composite restored image subjected to the composition processing described above with reference to FIG. 19.
- the horizontal axis shows the light intensity
- the vertical axis shows the pixel value.
- the dynamic range of the composite restored image is expanded to the range from light intensity 0 to light intensity L2 as compared with the high-sensitivity restored image.
- the sensitivity of the composite restored image is improved as compared with the high DR restored image. Specifically, for a dark portion (low-luminance subject) of the subject, the sensitivity is improved by using the pixel value of the high-sensitivity restored image based on the pixel signal of the pixel PL1 having a relatively large light receiving area.
- on the other hand, for a bright portion (high-luminance subject) of the subject, the occurrence of overexposure can be prevented by using the pixel value of the high DR restored image based on the pixel signal of the pixel PS1 having a relatively small light receiving area.
- in step S5, the image pickup apparatus 101 performs various processes on the restored image (composite restored image) after composition.
- the restoration unit 122 performs demosaic processing, ⁇ correction, white balance adjustment, conversion processing to a predetermined compression format, and the like on the composite restoration image, if necessary.
- further, the restoration unit 122, as needed, supplies the composite restored image to the display unit 127 for display, supplies it to the recording/reproduction unit 129 to record it on the recording medium 130, or outputs it to other devices via the communication unit 131.
- the process of restoring the restored image corresponding to the predetermined subject distance immediately after capturing the detected image has been described.
- alternatively, the detected image may be recorded without immediately performing the restoration process, and the restored image may be restored using the detected image at a desired timing.
- the restored image may be restored by the imaging device 101 or another device.
- in this case, a restored image for a subject surface at an arbitrary subject distance or angle of view can be obtained by solving the simultaneous equations created using the coefficient set group corresponding to that subject distance or angle of view, making it possible to realize refocusing and the like.
- FIG. 21 shows a first modification of the light-shielding pattern of the pixel array portion.
- pixel PL2 and pixel PS2 are provided instead of pixel PL1 and pixel PS1.
- Pixel PL2 and pixel PS2 have the same shape, size, and position as pixel PL1 and pixel PS1.
- the setting area RL2 of the pixel PL2 is different from the setting area RL1 of the pixel PL1.
- specifically, the setting area RL2 is obtained by rotating a setting area having the same shape and size as the setting area RL1 by 45 degrees.
- the setting area RS2 of the pixel PS2 is different from the setting area RS1 of the pixel PS1. Specifically, the setting area RS2 is obtained by rotating a setting area having the same shape and size as the setting area RS1 by 45 degrees.
- the ratio of the area of the setting area RL2 to the area of the light receiving surface of the pixel PL2 and the ratio of the area of the setting area RS2 to the area of the light receiving surface of the pixel PS2 are equal, respectively, to the ratio of the area of the setting area RL1 to the area of the light receiving surface of the pixel PL1 in FIG. 15 and the ratio of the area of the setting area RS1 to the area of the light receiving surface of the pixel PS1.
- the ratio of the area of the set area RL2 to the area of the light receiving surface of the pixel PL2 and the ratio of the area of the set area RS2 to the area of the light receiving surface of the pixel PS2 are substantially equal.
- in each pixel PL2, the detection area (not shown) is arranged at a slightly different position in the setting area RL2 for each pixel, as in the example described above with reference to FIG. 17. Further, in each pixel PS2, the detection area (not shown) is arranged at a slightly different position in the setting area RS2 for each pixel, in the same manner.
- further, the shapes and sizes of the detection areas of the pixel PL2 and the pixel PS2 are set so that the ratio of the area of the detection area to the area of the light receiving surface of the pixel PL2 and the ratio of the area of the detection area to the area of the light receiving surface of the pixel PS2 are substantially equal. Therefore, the ratio of the area of the detection area to the area of the setting area RL2 of the pixel PL2 and the ratio of the area of the detection area to the area of the setting area RS2 of the pixel PS2 are substantially equal.
- FIG. 22 shows a second modification of the light-shielding pattern of the pixel array portion.
- the example of FIG. 22 is different from the example of FIG. 21 in that the pixel PL3 is provided instead of the pixel PL2.
- Pixel PL3 has the same shape, size, and position as pixel PL2.
- the setting area RL3 of the pixel PL3 is different from the setting area RL2 of the pixel PL2.
- the setting area RL3 has an octagonal shape obtained by reducing the pixel PL3, and is arranged at the center of the pixel PL3.
- the area of the set area RL3 and the area of the set area RL2 are substantially the same.
- the ratio of the area of the setting area RL3 to the area of the light receiving surface of the pixel PL3 and the ratio of the area of the setting area RS2 to the area of the light receiving surface of the pixel PS2 are substantially equal.
- in each pixel PL3, the detection area (not shown) is arranged at a slightly different position in the setting area RL3 for each pixel, as in the example described above with reference to FIG. 17.
- further, the shapes and sizes of the detection areas of the pixel PL3 and the pixel PS2 are set so that the ratio of the area of the detection area to the area of the light receiving surface of the pixel PL3 and the ratio of the area of the detection area to the area of the light receiving surface of the pixel PS2 are substantially equal. Therefore, the ratio of the area of the detection area to the area of the setting area RL3 of the pixel PL3 and the ratio of the area of the detection area to the area of the setting area RS2 of the pixel PS2 are substantially equal.
- FIG. 23 shows a part of an arrangement example of the pixels 121a of the pixel array unit.
- the pixel 121a of the image sensor 121 is composed of two types of pixels PL4 and pixels PS4 having different sizes.
- Pixel PL4 and pixel PS4 are both square.
- the area of the light receiving surface of the pixel PL4 is about four times the area of the light receiving surface of the pixel PS4.
- the pixels PL4 and the pixels PS4 are arranged alternately in the vertical direction (column direction). Specifically, in the pixel array unit, rows in which the pixels PL4 are arranged and rows in which the pixels PS4 are arranged alternate, and the pixels PL4 and the pixels PS4 are each lined up in the row direction (horizontal direction) within their respective rows.
- FIG. 24 shows a configuration example of a shading pattern for the pixel array portion of FIG. 23.
- the dotted line indicating the boundary between the pixels in FIG. 24 is an auxiliary line, and is actually covered with a light-shielding film.
- a square setting area RL4 is arranged in the center of each pixel PL4.
- in the center of each pixel PS4, a square setting area RS4 similar to the setting area RL4 is arranged. Further, the ratio of the area of the setting area RL4 to the area of the light receiving surface of the pixel PL4 and the ratio of the area of the setting area RS4 to the area of the light receiving surface of the pixel PS4 are substantially equal.
- in each pixel PL4, the detection area (not shown) is arranged at a slightly different position in the setting area RL4 for each pixel, as in the example described above with reference to FIG. 17. Further, in each pixel PS4, a detection area (not shown) similar in shape to the detection area of the pixel PL4 is arranged at a slightly different position in the setting area RS4 for each pixel, in the same manner. Further, the ratio of the area of the detection area to the area of the light receiving surface of the pixel PL4 and the ratio of the area of the detection area to the area of the light receiving surface of the pixel PS4 are substantially equal. Therefore, the ratio of the area of the detection area to the area of the setting area RL4 of the pixel PL4 and the ratio of the area of the detection area to the area of the setting area RS4 of the pixel PS4 are substantially equal.
- alternatively, columns in which the pixels PL4 are arranged and columns in which the pixels PS4 are arranged may alternate, with the pixels PL4 and the pixels PS4 each lined up in the column direction (vertical direction) within their respective columns.
- FIGS. 25 and 26 schematically show a second modification of the pixel array portion of the image sensor 121.
- FIG. 25 shows a part of an arrangement example of the pixels 121a of the pixel array unit.
- the pixel 121a of the image sensor 121 is composed of two types of pixels PL5 and pixels PS5 having different sizes.
- Pixel PL5 and pixel PS5 are both square.
- the area of the light receiving surface of the pixel PL5 is about four times the area of the light receiving surface of the pixel PS5.
- Pixels PS5 are arranged in 4 rows vertically and 4 columns horizontally in the center.
- the pixel PL5 is arranged so as to surround the periphery of the area where the pixel PS5 is arranged.
- the pixel patterns shown in FIG. 25 are repeatedly arranged in the horizontal direction (row direction) and the vertical direction (column direction) of the pixel array portion.
- FIG. 26 shows a configuration example of a shading pattern for the pixel array portion of FIG. 25.
- the dotted line indicating the boundary between the pixels in FIG. 26 is an auxiliary line, and is actually covered with a light-shielding film.
- a square setting area RL5 is arranged in the center of each pixel PL5.
- in the center of each pixel PS5, a square setting area RS5 similar to the setting area RL5 is arranged. Further, the ratio of the area of the setting area RL5 to the area of the light receiving surface of the pixel PL5 and the ratio of the area of the setting area RS5 to the area of the light receiving surface of the pixel PS5 are substantially equal.
- in each pixel PL5, the detection area (not shown) is arranged at a slightly different position in the setting area RL5 for each pixel, as in the example described above with reference to FIG. 17.
- similarly, in each pixel PS5, a detection area (not shown) similar in shape to the detection area of the pixel PL5 is arranged at a slightly different position in the setting area RS5 for each pixel.
- further, the ratio of the area of the detection area to the area of the light receiving surface of the pixel PL5 and the ratio of the area of the detection area to the area of the light receiving surface of the pixel PS5 are substantially equal.
- the ratio of the area of the detection area to the area of the setting area RL5 of the pixel PL5 and the ratio of the area of the detection area to the area of the setting area RS5 of the pixel PS5 are substantially equal.
- the configuration of the pixel array unit described above is an example thereof, and other configurations can be adopted.
- the setting area and the detection area of each type of pixel are set so that the ratio of the area of the detection area to the area of the setting area is substantially equal among the pixels of each type.
- then, for the pixels at the same position in the plurality of restored images restored from the plurality of detected images based on the detection signals of the respective types of pixels, the pixels to be used in the composite restored image are selected based on the pixel values.
- for example, suppose that n types of pixels, pixels 1 to n in descending order of the area of the light receiving surface, are arranged in the pixel array unit, and that restored images 1 to n corresponding to the pixels 1 to n are restored.
- in this case, basically, the pixel of the restored image corresponding to the pixel having the largest light receiving area, within the range in which it is not saturated, is selected as the pixel used for the composite restored image.
- specifically, when the pixel value of the pixel of the restored image 1 is less than a predetermined threshold value 1, the pixel of the restored image 1 is selected.
- when the pixel value of the pixel of the restored image 1 is equal to or more than the threshold value 1 and the pixel value of the pixel of the restored image 2 is less than a predetermined threshold value 2, the pixel of the restored image 2 is selected.
- when the pixel value of the pixel of the restored image 2 is equal to or more than the threshold value 2 and the pixel value of the pixel of the restored image 3 is less than a predetermined threshold value 3, the pixel of the restored image 3 is selected.
- in general, when the pixel value of the pixel of the restored image i is equal to or more than the threshold value i and the pixel value of the pixel of the restored image i+1 is less than a predetermined threshold value i+1, the pixel of the restored image i+1 is selected.
- when the pixel value of the pixel of the restored image n-1 is equal to or more than the threshold value n-1, the pixel of the restored image n is unconditionally selected.
- the threshold values 1 to n-1 may be set to the same value or different values.
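- a minimal sketch of this generalized selection (NumPy; the thresholds and gains below are assumed example values): for each pixel position, the restored images are examined in descending order of light receiving area and the first unsaturated one wins, with restored image n as the unconditional fallback:

```python
import numpy as np

def select_cascade(images, thresholds, gains):
    """images: n restored images, largest light receiving area first.
    thresholds: n-1 saturation thresholds (threshold i applies to image i).
    gains: n per-image scale factors applied to the selected pixel values."""
    out = gains[-1] * images[-1]  # restored image n: unconditional fallback
    # Overwrite with image n-1, ..., image 1 wherever they are unsaturated,
    # so the largest unsaturated pixel ends up selected at each position.
    for img, thr, g in zip(images[-2::-1], thresholds[::-1], gains[-2::-1]):
        out = np.where(img < thr, g * img, out)
    return out

images = [np.array([10.0, 900.0, 950.0]),  # restored image 1 (largest pixels)
          np.array([5.0, 400.0, 800.0]),   # restored image 2
          np.array([2.0, 100.0, 300.0])]   # restored image 3 (smallest pixels)
print(select_cascade(images, thresholds=[512.0, 640.0], gains=[1.0, 2.0, 4.0]))
# [  10.  800. 1200.]  -> from image 1, image 2 (x2), image 3 (x4)
```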
- a part or all of the processing of the signal processing control unit 111 of the image pickup apparatus 101 may be performed by an external apparatus.
- the restored image may be restored by an external device.
- a flattening film may be provided between the color filter 121d and the light-shielding film 121b.
- a flattening film may be provided between the color filter 121d and the photodiode 121f.
- the surface-illuminated pixel 121a shown in FIG. 27 can be used for the image sensor 121.
- the stacking order of the photoelectric conversion layer Z11 and the wiring layer Z12 is reversed as compared with the pixel 121a of FIG. That is, in the pixel 121a of FIG. 27, the on-chip lens 121c, the color filter (CF) 121d, the wiring layer Z12, and the photoelectric conversion layer Z11 are laminated in this order from the top. In the photoelectric conversion layer Z11, a light-shielding film 121b and a photodiode (PD) 121e are laminated in this order from the top.
- FIG. 4 shows an example in which a light-shielding film 121b is used as a modulation element, or the combination of photodiodes contributing to the output is changed, so that each pixel is given a different incident angle directivity.
- however, for example, it is also possible to use an optical filter 902 covering the light receiving surface of the image sensor 901 as the modulation element, so that each pixel has incident angle directivity.
- the optical filter 902 is arranged so as to cover the entire surface of the light receiving surface 901A at a predetermined distance from the light receiving surface 901A of the image sensor 901.
- the light from the subject surface 102 is modulated by the optical filter 902 and then enters the light receiving surface 901A of the image sensor 901.
- for example, it is possible to use the optical filter 902BW having the black-and-white grid pattern shown in FIG. 29.
- in the optical filter 902BW, a white pattern portion that transmits light and a black pattern portion that blocks light are randomly arranged.
- the size of each pattern is set independently of the size of the pixels of the image sensor 901.
- FIG. 30 shows the light receiving sensitivity characteristics of the image sensor 901 with respect to the light from the point light source PA and the point light source PB on the subject surface 102 when the optical filter 902BW is used.
- the light from the point light source PA and the point light source PB is modulated by the optical filter 902BW, and then enters the light receiving surface 901A of the image sensor 901.
- the light receiving sensitivity characteristic of the image sensor 901 with respect to the light from the point light source PA is as in the waveform Sa. That is, since shadows are generated by the black pattern portion of the optical filter 902BW, a shading pattern is generated in the image on the light receiving surface 901A with respect to the light from the point light source PA.
- the light receiving sensitivity characteristic of the image sensor 901 with respect to the light from the point light source PB is as in the waveform Sb. That is, since shadows are generated by the black pattern portion of the optical filter 902BW, a shading pattern is generated in the image on the light receiving surface 901A with respect to the light from the point light source PB.
- each pixel of the image sensor 901 has incident angle directivity with respect to each point light source of the subject surface 102.
- further, the optical filter 902HW of FIG. 31 may be used instead of the optical filter 902BW.
- the optical filter 902HW includes a linear polarization element 911A and a linear polarization element 911B having the same polarization direction, and a 1/2 wavelength plate 912, and the 1/2 wavelength plate 912 is sandwiched between the linear polarization element 911A and the linear polarization element 911B.
- the 1/2 wavelength plate 912 is provided with a polarizing portion indicated by a diagonal line instead of the black pattern portion of the optical filter 902BW, and the white pattern portion and the polarizing portion are randomly arranged.
- the linearly polarized light element 911A transmits only the light in the predetermined polarization direction among the almost unpolarized light emitted from the point light source PA.
- the linear polarization element 911A transmits only light whose polarization direction is parallel to the drawing.
- of the polarized light transmitted through the linear polarization element 911A, the polarized light transmitted through the polarizing portion of the 1/2 wavelength plate 912 has its polarization direction changed to a direction perpendicular to the drawing due to rotation of the polarization plane.
- the linear polarization element 911B transmits the polarized light transmitted through the white pattern portion, and hardly transmits the polarized light transmitted through the polarizing portion. Therefore, the amount of the polarized light transmitted through the polarizing portion is smaller than that of the polarized light transmitted through the white pattern portion. As a result, a pattern of shading similar to that when the optical filter 902BW is used is generated on the light receiving surface 901A of the image sensor 901.
- the optical interference mask can be used as the optical filter 902LF.
- the light emitted from the point light sources PA and PB on the subject surface 102 is applied to the light receiving surface 901A of the image sensor 901 via the optical filter 902LF.
- the light incident surface of the optical filter 902LF is provided with irregularities on the order of the wavelength.
- the optical filter 902LF maximizes the transmission of light of a specific wavelength irradiated from the vertical direction; as the incident angle of the light with respect to the optical filter 902LF changes, the optical path length changes.
- therefore, the intensity of the light of the specific wavelength emitted from the point light sources PA and PB and transmitted through the optical filter 902LF is modulated according to the angle of incidence on the optical filter 902LF, and the light is then incident on the light receiving surface 901A of the image sensor 901. Therefore, the detection signal output from each pixel of the image sensor 901 is a signal obtained by synthesizing, for each pixel, the modulated light intensities of the respective point light sources.
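- a minimal sketch of this kind of mask modulation (NumPy, 1-D geometry; all counts and the random mask are assumptions): each point light source casts a differently shifted shadow of the mask, and each pixel outputs the sum of the source intensities weighted accordingly:

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_src = 32, 8  # assumed pixel count and point-light-source count

# Random black-and-white mask in the spirit of the optical filter 902BW:
# 0 = black pattern portion (blocks light), 1 = white portion (transmits).
mask = rng.integers(0, 2, n_pix + n_src).astype(float)

# Each point light source s illuminates the light receiving surface 901A
# through a shifted window of the mask (a different shadow per source).
weights = np.stack([mask[s:s + n_pix] for s in range(n_src)], axis=1)

intensities = rng.uniform(0.0, 1.0, n_src)  # point light sources PA, PB, ...
detection = weights @ intensities           # modulated detection signals
print(detection.shape)                      # (32,)
```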
- the present technology can also be applied to an image pickup device or an image sensor that images light having a wavelength other than visible light such as infrared light.
- further, the restored image may be an image in which the user cannot visually recognize the subject, rather than an image in which the user can visually recognize the subject.
- in this case, the image quality of the restored image is improved for an image processing device or the like capable of recognizing the subject. Since it is difficult for a normal imaging lens to transmit far-infrared light, this technique is effective, for example, when imaging far-infrared light. Therefore, the restored image may be an image of far-infrared light; it is not limited to far-infrared light, and may be an image of other visible or non-visible light.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
- FIG. 33 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a mobile control system to which the technique according to the present disclosure can be applied.
- the vehicle control system 7000 includes a plurality of electronic control units connected via the communication network 7010.
- the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an external information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600.
- the communication network 7010 connecting these plurality of control units may be an in-vehicle communication network conforming to any standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
- each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer or the parameters used for various calculations, and a drive circuit that drives various devices to be controlled.
- each control unit is provided with a network I/F for communicating with other control units via the communication network 7010, and with a communication I/F for communicating with devices or sensors inside or outside the vehicle by wired or wireless communication.
- in FIG. 33, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 are shown.
- Other control units also include a microcomputer, a communication I / F, a storage unit, and the like.
- the drive system control unit 7100 controls the operation of the device related to the drive system of the vehicle according to various programs.
- the drive system control unit 7100 provides a driving force generator for generating the driving force of the vehicle such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, and a steering angle of the vehicle. It functions as a control device such as a steering mechanism for adjusting and a braking device for generating a braking force of a vehicle.
- the drive system control unit 7100 may have a function as a control device such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
- the vehicle condition detection unit 7110 is connected to the drive system control unit 7100.
- the vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the operation amount of the accelerator pedal, the operation amount of the brake pedal, the steering angle of the steering wheel, the engine speed, the rotation speed of the wheels, and the like.
- the drive system control unit 7100 performs arithmetic processing using signals input from the vehicle state detection unit 7110 to control an internal combustion engine, a drive motor, an electric power steering device, a brake device, and the like.
- the body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs.
- the body system control unit 7200 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, blinkers or fog lamps.
- in this case, radio waves transmitted from a portable device that substitutes for a key, or signals of various switches, may be input to the body system control unit 7200.
- the body system control unit 7200 receives inputs of these radio waves or signals and controls a vehicle door lock device, a power window device, a lamp, and the like.
- the battery control unit 7300 controls the secondary battery 7310, which is the power supply source of the drive motor, according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining capacity of the battery is input to the battery control unit 7300 from the battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and controls the temperature control of the secondary battery 7310 or the cooling device provided in the battery device.
- the vehicle outside information detection unit 7400 detects information outside the vehicle equipped with the vehicle control system 7000.
- the image pickup unit 7410 and the vehicle exterior information detection unit 7420 is connected to the vehicle exterior information detection unit 7400.
- the imaging unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
- the vehicle exterior information detection unit 7420 is used to detect, for example, the current weather or an environmental sensor for detecting the weather, or other vehicles, obstacles, pedestrians, etc. around the vehicle equipped with the vehicle control system 7000. At least one of the ambient information detection sensors is included.
- the environmental sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall.
- the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
- the imaging unit 7410 and the vehicle exterior information detection unit 7420 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.
- FIG. 34 shows an example of the installation positions of the imaging unit 7410 and the vehicle exterior information detection unit 7420.
- the imaging units 7910, 7912, 7914, 7916, 7918 are provided, for example, at at least one of the front nose, side mirrors, rear bumpers, back door, and upper part of the windshield of the vehicle interior of the vehicle 7900.
- the image pickup unit 7910 provided on the front nose and the image pickup section 7918 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 7900.
- the imaging units 7912 and 7914 provided in the side mirrors mainly acquire images of the side of the vehicle 7900.
- the image pickup unit 7916 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 7900.
- the imaging unit 7918 provided on the upper part of the windshield in the vehicle interior is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
- FIG. 34 shows an example of the shooting range of each of the imaging units 7910, 7912, 7914, 7916.
- the imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose
- the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, respectively
- the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 as viewed from above can be obtained.
- the vehicle exterior information detection units 7920, 7922, 7924, 7926, 7928, 7930 provided on the front, rear, side, corners and the upper part of the windshield in the vehicle interior of the vehicle 7900 may be, for example, an ultrasonic sensor or a radar device.
- the vehicle exterior information detection units 7920, 7926, and 7930 provided on the front nose, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices.
- These external information detection units 7920 to 7930 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, or the like.
- the vehicle exterior information detection unit 7400 causes the image pickup unit 7410 to capture an image of the outside of the vehicle and receives the captured image data. Further, the vehicle exterior information detection unit 7400 receives the detection information from the connected vehicle exterior information detection unit 7420. When the vehicle exterior information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 transmits ultrasonic waves, electromagnetic waves, or the like, and receives the received reflected wave information.
- the vehicle exterior information detection unit 7400 may perform object detection processing or distance detection processing for a person, a vehicle, an obstacle, a sign, a character on a road surface, or the like based on the received information.
- the vehicle exterior information detection unit 7400 may perform an environment recognition process for recognizing rainfall, fog, road surface conditions, etc., based on the received information.
- the vehicle exterior information detection unit 7400 may calculate the distance to an object outside the vehicle based on the received information.
- the vehicle exterior information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing a person, a vehicle, an obstacle, a sign, a character on the road surface, or the like based on the received image data.
- the vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may synthesize the image data captured by different imaging units 7410 to generate a bird's-eye view image or a panoramic image.
- the vehicle exterior information detection unit 7400 may perform the viewpoint conversion process using the image data captured by different imaging units 7410.
- the in-vehicle information detection unit 7500 detects the in-vehicle information.
- a driver state detection unit 7510 that detects the driver's state is connected to the in-vehicle information detection unit 7500.
- the driver state detection unit 7510 may include a camera that captures the driver, a biosensor that detects the driver's biological information, a microphone that collects sound in the vehicle interior, and the like.
- the biosensor is provided on, for example, the seat surface or the steering wheel, and detects the biometric information of the passenger sitting on the seat or the driver holding the steering wheel.
- the in-vehicle information detection unit 7500 may calculate the degree of fatigue or the degree of concentration of the driver based on the detection information input from the driver state detection unit 7510, and may determine whether the driver is dozing off.
- the in-vehicle information detection unit 7500 may perform processing such as noise canceling processing on the collected audio signal.
- the integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs.
- An input unit 7800 is connected to the integrated control unit 7600.
- The input unit 7800 is realized by a device that can be operated by a passenger, such as a touch panel, a button, a microphone, a switch, or a lever. Data obtained by performing voice recognition on voice input via the microphone may be input to the integrated control unit 7600.
- The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports the operation of the vehicle control system 7000.
- The input unit 7800 may be, for example, a camera, in which case a passenger can input information by gesture. Alternatively, data obtained by detecting the movement of a wearable device worn by a passenger may be input. Further, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on the information input by a passenger or the like using the input unit 7800 and outputs the input signal to the integrated control unit 7600. By operating the input unit 7800, a passenger or the like inputs various data to the vehicle control system 7000 and instructs it to perform processing operations.
- The storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like. Further, the storage unit 7690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
- The general-purpose communication I/F 7620 is a general-purpose communication I/F that mediates communication with various devices existing in the external environment 7750.
- The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as Bluetooth (registered trademark).
- The general-purpose communication I/F 7620 may connect to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or a business-specific network) via a base station or an access point. Further, the general-purpose communication I/F 7620 may connect to a terminal existing in the vicinity of the vehicle (for example, a terminal of a driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
- The dedicated communication I/F 7630 is a communication I/F that supports a communication protocol formulated for use in vehicles.
- The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
- The dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle (Vehicle to Vehicle) communication, road-to-vehicle (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication.
- The positioning unit 7640 receives, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), executes positioning, and generates position information including the latitude, longitude, and altitude of the vehicle.
- The positioning unit 7640 may specify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal having a positioning function, such as a mobile phone, a PHS, or a smartphone.
- the beacon receiving unit 7650 receives radio waves or electromagnetic waves transmitted from a radio station or the like installed on the road, and acquires information such as the current position, traffic congestion, road closure, or required time.
- The function of the beacon receiving unit 7650 may be included in the dedicated communication I/F 7630 described above.
- The in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 existing in the vehicle.
- The in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
- The in-vehicle device I/F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-Definition Link) via a connection terminal (and, if necessary, a cable) (not shown).
- The in-vehicle device 7760 may include, for example, at least one of a mobile device or a wearable device possessed by a passenger, or an information device carried into or attached to the vehicle.
- the in-vehicle device 7760 may include a navigation device that searches for a route to an arbitrary destination.
- The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
- The in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
- The in-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
- The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various programs based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate a control target value of the driving force generator, the steering mechanism, or the braking device based on the acquired information inside and outside the vehicle, and may output a control command to the drive system control unit 7100.
- The microcomputer 7610 may perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including collision avoidance or impact mitigation of the vehicle, follow-up driving based on the inter-vehicle distance, vehicle-speed-maintaining driving, vehicle collision warning, vehicle lane departure warning, and the like. Further, the microcomputer 7610 may perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without relying on the driver's operation, by controlling the driving force generator, the steering mechanism, the braking device, and the like based on the acquired information on the surroundings of the vehicle.
- The microcomputer 7610 may generate three-dimensional distance information between the vehicle and surrounding objects such as structures and people based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including peripheral information on the current position of the vehicle. Further, the microcomputer 7610 may predict a danger such as a collision of the vehicle, approach of a pedestrian or the like, or entry into a closed road based on the acquired information, and may generate a warning signal.
- The warning signal may be, for example, a signal for generating a warning sound or turning on a warning lamp.
- The audio image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying passengers of the vehicle or the outside of the vehicle of information.
- an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are exemplified as output devices.
- the display unit 7720 may include, for example, at least one of an onboard display and a head-up display.
- the display unit 7720 may have an AR (Augmented Reality) display function.
- the output device may be other devices such as headphones, wearable devices such as eyeglass-type displays worn by passengers, projectors or lamps, in addition to these devices.
- The display device visually displays the results obtained by the various processes performed by the microcomputer 7610, or the information received from other control units, in various formats such as text, images, tables, and graphs.
- The audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs it audibly.
- At least two control units connected via the communication network 7010 may be integrated as one control unit.
- each control unit may be composed of a plurality of control units.
- the vehicle control system 7000 may include another control unit (not shown).
- the other control unit may have a part or all of the functions carried out by any of the control units. That is, as long as information is transmitted and received via the communication network 7010, predetermined arithmetic processing may be performed by any control unit.
- A sensor or device connected to any control unit may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
- The image pickup device 101 according to the present embodiment described with reference to FIG. 1 can be applied to the imaging unit 7410 of the application example shown in FIG. 33.
- The imaging unit 7410 can then supply a restored image with a high dynamic range to the vehicle exterior information detection unit 7400, which improves the detection accuracy of the vehicle exterior information detection unit 7400.
- the series of processes described above can be executed by hardware or software.
- When the series of processes is executed by software, the programs that make up the software are installed on a computer.
- Here, the computer includes a computer incorporated in dedicated hardware (for example, the control unit 123 or the like).
- the program executed by the computer can be provided by recording it on a recording medium (for example, recording medium 130 or the like) as a package medium or the like. Programs can also be provided via wired or wireless transmission media such as local area networks, the Internet, and digital satellite broadcasting.
- The program executed by the computer may be a program in which processing is performed in chronological order in the order described in this specification, or a program in which processing is performed in parallel or at a necessary timing, such as when a call is made.
- the embodiment of the present technology is not limited to the above-described embodiment, and various changes can be made without departing from the gist of the present technology.
- this technology can have a cloud computing configuration in which one function is shared by a plurality of devices via a network and processed jointly.
- each step described in the above flowchart can be executed by one device or can be shared and executed by a plurality of devices.
- When one step includes a plurality of processes, the plurality of processes included in that step can be executed by one device or shared and executed by a plurality of devices.
- (1) An image pickup element including a pixel array unit that includes, as pixel groups each composed of a plurality of pixels that receive incident light from a subject without passing through either an imaging lens or a pinhole and each output one detection signal indicating an output pixel value modulated by the incident angle of the incident light, at least a large pixel group including a plurality of large pixels having a large light receiving surface and a small pixel group including a plurality of small pixels having a small light receiving surface, in which the ratio of the area of the large detection area, which is set so that the incident light can be detected on the large light receiving surface, to the area of the large setting area set in the large light receiving surface of each of the large pixels is substantially equal to the ratio of the area of the small detection area, which is set so that the incident light can be detected on the small light receiving surface, to the area of the small setting area set in the small light receiving surface of each of the small pixels.
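Written symbolically (the symbols below are introduced for illustration and are not taken from the disclosure), the condition of item (1) is that the opening ratios of the two pixel types match:

```latex
\frac{S^{\mathrm{det}}_{L}}{S^{\mathrm{set}}_{L}} \;\approx\; \frac{S^{\mathrm{det}}_{S}}{S^{\mathrm{set}}_{S}}
```

Here S_L^det and S_L^set denote the areas of a large pixel's detection area and setting area, and S_S^det and S_S^set are the small-pixel counterparts. The ratios match while the absolute areas differ, so a large pixel collects more light than a small one; this sensitivity difference is what the composite restoration described later can exploit.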
- the large setting area is an area defined by the logical sum of the large detection areas of all the large pixels included in the large pixel group.
- The large pixel group includes a plurality of the large pixels having different positions of the large detection area with respect to the large setting area.
- The image pickup element according to any one of (1) to (3) above, in which the small pixel group includes a plurality of the small pixels having different positions of the small detection area with respect to the small setting area.
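A minimal sketch of how the setting area described in the preceding items relates to the per-pixel detection areas: each pixel's detection area is modeled as a binary mask, the masks sit at different positions from pixel to pixel, and the setting area is their logical sum. The 8×8 grid, the four pixels, and the quadrant-shaped openings are illustrative assumptions only.

```python
import numpy as np

# Four pixels of one group, each with its opening in a different quadrant.
det_masks = [np.zeros((8, 8), dtype=bool) for _ in range(4)]
det_masks[0][0:4, 0:4] = True  # opening in the upper-left
det_masks[1][0:4, 4:8] = True  # opening in the upper-right
det_masks[2][4:8, 0:4] = True  # opening in the lower-left
det_masks[3][4:8, 4:8] = True  # opening in the lower-right

# The setting area is the logical sum (union) of all detection areas.
setting_area = np.logical_or.reduce(det_masks)

# Every pixel has the same opening ratio with respect to the setting area.
ratio = det_masks[0].sum() / setting_area.sum()  # 16 / 64 = 0.25
```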
- The large setting area includes the large detection area and a large light-shielding area that blocks the incident light.
- The large light-shielding area and the small light-shielding area include a light-shielding portion that blocks the incident light.
- the shapes of the large setting area and the small setting area are substantially similar.
- the large pixels are arranged in a grid pattern.
- (11) The image pickup element according to any one of (1) to (7) above, in which, in the pixel array unit, a pattern in which the large pixels surround an area in which a plurality of the small pixels are arranged is repeatedly arranged.
- (12) An image pickup device including: an image pickup element that includes a pixel array unit having at least a large pixel group including a plurality of large pixels having a large light receiving surface and a small pixel group including a plurality of small pixels having a small light receiving surface, in which the ratio of the area of the large detection area, which is set so that the incident light can be detected on the large light receiving surface, to the area of the large setting area set in the large light receiving surface of each of the large pixels is substantially equal to the ratio of the area of the small detection area, which is set so that the incident light can be detected on the small light receiving surface, to the area of the small setting area set in the small light receiving surface of each of the small pixels; and a restoration unit that restores a first restored image from a first detected image based on the detection signals output from the plurality of large pixels, restores a second restored image from a second detected image based on the detection signals output from the plurality of small pixels, and generates a composite restored image by combining the first restored image and the second restored image.
- (13) The image pickup device according to (12) above, in which the restoration unit selects a pixel to be used for the composite restored image from the pixels at the same position in the first restored image and the second restored image based on the pixel value, and generates the composite restored image using the selected pixels.
- (14) The image pickup device according to (13) above, in which the restoration unit selects the pixel of the first restored image when the pixel value of the pixel of the first restored image is less than a threshold value, and selects the pixel of the second restored image when the pixel value of the pixel of the first restored image is equal to or greater than the threshold value.
- (15) The image pickup device according to (14) above, in which the restoration unit multiplies the pixel values of the pixels selected from the first restored image and the second restored image by a predetermined coefficient as necessary, and then generates the composite restored image by arranging the selected pixels at the same positions as in the original restored images.
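The selection-and-scaling rule of the last three items can be sketched per pixel as follows. The saturation threshold and the gain (standing in for the "predetermined coefficient", e.g. the large-to-small sensitivity ratio) are assumed values for illustration, not values from this disclosure.

```python
import numpy as np

def composite_restored(first, second, threshold=0.9, gain=8.0):
    """Combine the two restored images into one HDR result.

    `first` comes from the large (more sensitive) pixels and `second`
    from the small (less sensitive) pixels; both are float arrays
    normalized to [0, 1] and aligned pixel for pixel.
    """
    # Keep the large-pixel value where it is below the saturation
    # threshold; otherwise fall back to the scaled small-pixel value.
    return np.where(first < threshold, first, second * gain)
```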
- (16) A signal processing device including a restoration unit that restores a first restored image from a first detected image based on the detection signals output from the plurality of large pixels of an image pickup element, the image pickup element including a pixel array unit having at least a large pixel group including a plurality of large pixels having a large light receiving surface and a small pixel group including a plurality of small pixels having a small light receiving surface, in which the ratio of the area of the large detection area, which is set so that the incident light can be detected on the large light receiving surface, to the area of the large setting area set in the large light receiving surface of each of the large pixels is substantially equal to the ratio of the area of the small detection area, which is set so that the incident light can be detected on the small light receiving surface, to the area of the small setting area set in the small light receiving surface of each of the small pixels; restores a second restored image from a second detected image based on the detection signals output from the plurality of small pixels; and generates a composite restored image by combining the first restored image and the second restored image.
- (17) The signal processing device according to (16) above, in which the restoration unit selects a pixel to be used for the composite restored image from the pixels at the same position in the first restored image and the second restored image based on the pixel value, and generates the composite restored image using the selected pixels.
- (18) The signal processing device according to (17) above, in which the restoration unit selects the pixel of the first restored image when the pixel value of the pixel of the first restored image is less than a threshold value, and selects the pixel of the second restored image when the pixel value of the pixel of the first restored image is equal to or greater than the threshold value.
- (19) The signal processing device according to (18) above, in which the restoration unit multiplies the pixel values of the pixels selected from the first restored image and the second restored image by a predetermined coefficient as necessary, and then generates the composite restored image by arranging the selected pixels at the same positions as in the original restored images.
- (20) A signal processing method including: restoring a first restored image from a first detected image based on the detection signals output from the plurality of large pixels of an image pickup element, the image pickup element including a pixel array unit having, as pixel groups each composed of a plurality of pixels that receive incident light from a subject without passing through either an imaging lens or a pinhole and each output one detection signal indicating an output pixel value modulated by the incident angle of the incident light, at least a large pixel group including a plurality of large pixels having a large light receiving surface and a small pixel group including a plurality of small pixels having a small light receiving surface, in which the ratio of the area of the large detection area, which is set so that the incident light can be detected on the large light receiving surface, to the area of the large setting area set in the large light receiving surface of each of the large pixels is substantially equal to the ratio of the area of the small detection area, which is set so that the incident light can be detected on the small light receiving surface, to the area of the small setting area set in the small light receiving surface of each of the small pixels; restoring a second restored image from a second detected image based on the detection signals output from the plurality of small pixels; and generating a composite restored image by combining the first restored image and the second restored image.
- 101 image pickup device, 111 signal processing control unit, 121 image sensor, 121a, 121a' pixel, 121b light-shielding film, 122 restoration unit, 123 control unit, PL1 to PL5, PS1, PS2, PS4, PS5 pixel, RL1 to RL5, RS1, RS2, RS4, RS5 setting area, SL1, SS1 light-shielding film, AL1, AS1 detection area
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
The present technology relates to an image pickup element, an image pickup device, a signal processing device, and a signal processing method with which the dynamic range can be expanded in an image pickup device that does not use an imaging lens. The image pickup element includes a pixel array unit that includes, as pixel groups each composed of a plurality of pixels that receive incident light from a subject without passing through either an imaging lens or a pinhole and output a detection signal indicating an output pixel value modulated according to the incident angle of the incident light, a large pixel group including a plurality of large pixels each having a large light receiving surface and a small pixel group including a plurality of small pixels each having a small light receiving surface, in which the ratio of the area of a large detection region, set on the large light receiving surface so that the incident light can be detected, to the area of a large setting region set in each large light receiving surface is substantially equal to the ratio of the area of a small detection region, set on the small light receiving surface so that the incident light can be detected, to the area of a small setting region set in each small light receiving surface. The present technology can be applied, for example, to an image pickup device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021536937A JPWO2021020156A1 (fr) | 2019-07-31 | 2020-07-17 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-140675 | 2019-07-31 | ||
JP2019140675 | 2019-07-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021020156A1 true WO2021020156A1 (fr) | 2021-02-04 |
Family
ID=74229640
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/027787 WO2021020156A1 (fr) | 2019-07-31 | 2020-07-17 | Élément d'imagerie, dispositif d'imagerie, dispositif de traitement du signal et procédé de traitement du signal |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2021020156A1 (fr) |
WO (1) | WO2021020156A1 (fr) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015153859A (ja) * | 2014-02-13 | 2015-08-24 | 株式会社東芝 | 固体撮像装置 |
WO2017169754A1 (fr) * | 2016-03-29 | 2017-10-05 | ソニー株式会社 | Dispositif d'imagerie à semi-conducteurs et dispositif électronique |
WO2018012492A1 (fr) * | 2016-07-13 | 2018-01-18 | ソニー株式会社 | Dispositif d'imagerie, élément d'imagerie et dispositif de traitement d'image |
-
2020
- 2020-07-17 WO PCT/JP2020/027787 patent/WO2021020156A1/fr active Application Filing
- 2020-07-17 JP JP2021536937A patent/JPWO2021020156A1/ja active Pending
Also Published As
Publication number | Publication date |
---|---|
JPWO2021020156A1 (fr) | 2021-02-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11743604B2 (en) | Imaging device and image processing system | |
JP7310869B2 (ja) | 固体撮像装置、および電子機器 | |
JP7134952B2 (ja) | 撮像素子、及び、撮像素子を備えた電子機器 | |
US10885389B2 (en) | Image processing device, image processing method, learning device, and learning method | |
CN110959194B (zh) | 固态摄像器件及电子设备 | |
JP2018064007A (ja) | 固体撮像素子、および電子装置 | |
JP7484904B2 (ja) | 撮像素子、信号処理装置、信号処理方法、プログラム、及び、撮像装置 | |
WO2021020156A1 (fr) | Élément d'imagerie, dispositif d'imagerie, dispositif de traitement du signal et procédé de traitement du signal | |
US11650360B2 (en) | Color filter array patterns for enhancing a low-light sensitivity while preserving a color accuracy in image signal processing applications | |
JP7261168B2 (ja) | 固体撮像装置及び電子機器 | |
WO2023195395A1 (fr) | Dispositif de détection de lumière et appareil électronique | |
WO2023195392A1 (fr) | Dispositif de détection de lumière | |
US20240080587A1 (en) | Solid-state imaging device and electronic instrument | |
WO2021085152A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations, programme et système de traitement d'informations | |
WO2025013515A1 (fr) | Dispositif de détection de lumière, dispositif d'imagerie, et appareil électronique | |
WO2024048292A1 (fr) | Élément de détection de lumière, dispositif d'imagerie et système de commande de véhicule | |
JP2024092602A (ja) | 光検出素子及び電子機器 | |
JP2024073899A (ja) | 撮像素子 | |
WO2023219045A1 (fr) | Dispositif de réception de lumière, procédé de commande et système de mesure de distance | |
TW202433943A (zh) | 固態攝像裝置 | |
WO2022124092A1 (fr) | Élément d'imagerie à semi-conducteur, dispositif d'imagerie et procédé de commande d'unité d'imagerie à semi-conducteur | |
JP2022107201A (ja) | 撮像装置および電子機器 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20846014 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021536937 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20846014 Country of ref document: EP Kind code of ref document: A1 |