CN116456178A - Image sensor, imaging system and method of operating an image sensor

Publication number: CN116456178A
Application number: CN202310073700.2A
Authority: CN (China)
Prior art keywords: router, wavelength, photons, color, photosensitive region
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 李秉熙, S·伯萨克, 马克·艾伦·撒弗里奇
Current assignee: Semiconductor Components Industries LLC
Original assignee: Semiconductor Components Industries LLC
Priority claimed from US 18/065,992 (published as US 2023/0230987 A1)
Application filed by Semiconductor Components Industries LLC
Publication of CN116456178A

Abstract

An image sensor, an imaging system, and a method of operating an image sensor are disclosed. At least one example is an image sensor comprising a plurality of image pixels. Each image pixel may include: a color router defining a router collection area on an upper surface thereof; a first photosensitive area located below the color router; a second photosensitive area located below the color router; and a third photosensitive area positioned below the color router. The color router is configured to: routing photons of a first wavelength received at the router collection region to the first photosensitive region, routing photons of a second wavelength received at the router collection region to the second photosensitive region, and routing photons of a third wavelength received at the router collection region to the third photosensitive region.

Description

Image sensor, imaging system and method of operating an image sensor
Technical Field
The present application relates to the technical field of imaging systems and, in particular, to image sensors and related methods. More particularly, the present application relates to an image sensor, an imaging system, and a method of operating an image sensor.
Background
Image sensors are used in electronic devices such as cellular telephones, cameras, and computers to capture images. In particular, such electronic devices include an array of image pixels arranged in a grid pattern. Each image pixel receives incident photons, i.e., light, and converts the photons into an electrical signal. Many image sensors suffer from poor low-light sensitivity. That is, in weak light, such as at dawn or dusk, too few photons may be captured to reconstruct a proper image.
Disclosure of Invention
One example is an image sensor comprising a plurality of image pixels. Each image pixel may include: a color router defining a router collection area on an upper surface thereof; a first photosensitive region located below the color router; a second photosensitive region located below the color router; and a third photosensitive region located below the color router. The color router is configured to: route photons of a first wavelength received at the router collection region to the first photosensitive region, route photons of a second wavelength received at the router collection region to the second photosensitive region, and route photons of a third wavelength received at the router collection region to the third photosensitive region.
In an exemplary image sensor, when the color router routes photons of a first wavelength, the color router is further configured to route photons having a wavelength corresponding to red light to the first photosensitive region, and when the color router routes photons of a second wavelength, the color router is further configured to route photons having a wavelength corresponding to yellow light to the second photosensitive region, and when the color router routes photons of a third wavelength, the color router is further configured to route photons having a wavelength corresponding to green light to the third photosensitive region.
In an exemplary image sensor, when the color router routes photons of a first wavelength, the color router is further configured to route photons having a wavelength corresponding to red light to the first photosensitive region, and when the color router routes photons of a second wavelength, the color router is further configured to route photons having a wavelength corresponding to yellow light to the second photosensitive region; and when the color router routes photons of a third wavelength, the color router is further configured to route photons having a wavelength corresponding to blue light to the third photosensitive region.
In an exemplary image sensor, each image pixel may further include a fourth photosensitive region located below the color router. When the color router routes photons of a first wavelength, the color router is further configured to route photons having a wavelength corresponding to red light to the first photosensitive region, when the color router routes photons of a second wavelength, the color router is further configured to route photons having a wavelength corresponding to yellow light to the second photosensitive region, and when the color router routes photons of a third wavelength, the color router is further configured to route photons having a wavelength corresponding to green light to the third photosensitive region, and the color router is further configured to route photons having a wavelength corresponding to blue light to the fourth photosensitive region.
In an exemplary image sensor, each image pixel may further include a fourth photosensitive region located below the color router. When the color router routes photons of a first wavelength, the color router is further configured to route photons having a wavelength corresponding to red light to the first photosensitive region; when the color router routes photons of a second wavelength, the color router is further configured to route photons having a wavelength corresponding to green light to the second photosensitive region; when the color router routes photons of a third wavelength, the color router is further configured to route photons having a wavelength corresponding to blue light to the third photosensitive region; and the color router is further configured to route photons having a wavelength corresponding to infrared light to the fourth photosensitive region.
In the exemplary image sensor, the first photosensitive region may define a first collection area; the second photosensitive region may define a second collection area smaller than the first collection area; and the third photosensitive region may define a third collection area smaller than the second collection area. The first wavelength is longer than the second wavelength, and the second wavelength is longer than the third wavelength. The first wavelength may correspond to red light, the second wavelength may correspond to blue light, and the third wavelength may correspond to green light. Alternatively, the first wavelength may correspond to a first infrared wavelength, the second wavelength may correspond to a second infrared wavelength different from the first wavelength, and the third wavelength may correspond to a third infrared wavelength. The exemplary image sensor may further include a fourth photosensitive region located below the color router, the fourth photosensitive region defining a fourth collection area larger than the first collection area, and the color router may be configured to route photons of a fourth wavelength to the fourth photosensitive region, the fourth wavelength being longer than the first wavelength. The first wavelength may correspond to red light, the second wavelength may correspond to blue light, the third wavelength may correspond to green light, and the fourth wavelength may correspond to infrared light.
In an example image sensor, each image pixel may define a long dimension measured parallel to the router collection area, and each image pixel further comprises: the first photosensitive region defines a collection area defining a first shape; the second photosensitive region defining a collection area defining a second shape; and the third photosensitive region defines a collection area defining a third shape. The first shape, the second shape, and the third shape may be configured such that a longest horizontal distance that photons are routed through the color router is half of the long dimension.
In an exemplary image sensor, a color router may define a first quadrant and a second quadrant. The first photosensitive region may define a composite collection area comprised of a plurality of discrete photosensitive regions, and the plurality of discrete photosensitive regions may be divided equally between the first quadrant and the second quadrant. The color router may be further configured to route photons of the first wavelength that arrive within the first quadrant to a discrete photosensitive region directly under the first quadrant, and to route photons of the first wavelength that arrive within the second quadrant to a discrete photosensitive region directly under the second quadrant. The exemplary image sensor may also include an imaging controller operatively coupled to the first photosensitive region. The imaging controller may be configured to detect a phase imbalance based on a different number of photons reaching the first quadrant as compared to the second quadrant, and to modify a focus parameter based on the phase imbalance.
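The application does not specify an algorithm for this phase detection, but the idea can be illustrated with a short sketch. The following Python fragment computes a normalized imbalance from the two quadrant photon counts and nudges a focus parameter accordingly; the gain and threshold values are assumptions, and this is a minimal sketch rather than the claimed implementation.

```python
# Minimal sketch of phase-detection autofocus from quadrant photon counts.
# The gain and threshold are assumed values for illustration only.
def detect_phase_imbalance(count_q1: int, count_q2: int) -> float:
    """Return a signed, normalized imbalance between the two quadrants."""
    total = count_q1 + count_q2
    if total == 0:
        return 0.0
    return (count_q1 - count_q2) / total

def modify_focus(focus_parameter: float, count_q1: int, count_q2: int,
                 gain: float = 0.1, threshold: float = 0.02) -> float:
    """Adjust the focus parameter in proportion to the detected imbalance."""
    imbalance = detect_phase_imbalance(count_q1, count_q2)
    if abs(imbalance) > threshold:
        focus_parameter += gain * imbalance
    return focus_parameter

# In focus, the two quadrants see roughly equal counts; out of focus they differ.
new_focus = modify_focus(focus_parameter=0.0, count_q1=1200, count_q2=950)
```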
The exemplary image sensor may further include a collimator disposed over the color router.
Another example is an imaging system comprising an imaging controller and a camera module. The camera module may include: a lens system coupled to the imaging controller; and a plurality of image pixels in operative relationship with the lens system and communicatively coupled to the imaging controller. Each image pixel may include: a color router defining a router collection area on an upper surface thereof; a first photosensitive region located below the color router; a second photosensitive region located below the color router; and a third photosensitive region located below the color router. The color router is configured to: route photons of a first wavelength received at the router collection region to the first photosensitive region, route photons of a second wavelength received at the router collection region to the second photosensitive region, and route photons of a third wavelength received at the router collection region to the third photosensitive region.
In an exemplary imaging system, an imaging controller and a camera module may be associated with an automobile.
In an exemplary imaging system, when the color router routes photons of a first wavelength, the color router is further configured to route photons having a wavelength corresponding to red light to the first photosensitive region; when the color router routes photons of a second wavelength, the color router is further configured to route photons having a wavelength corresponding to yellow light to the second photosensitive region; and when the color router routes photons of a third wavelength, the color router is further configured to route photons having a wavelength corresponding to green light to the third photosensitive region.
In an exemplary imaging system, when the color router routes photons of a first wavelength, the color router is further configured to route photons having a wavelength corresponding to red light to the first photosensitive region; when the color router routes photons of a second wavelength, the color router is further configured to route photons having a wavelength corresponding to yellow light to the second photosensitive region; and when the color router routes photons of a third wavelength, the color router is further configured to route photons having a wavelength corresponding to blue light to the third photosensitive region.
In an exemplary imaging system, each image pixel may also include a fourth photosensitive region located below the color router. When the color router routes photons of a first wavelength, the color router is further configured to route photons having a wavelength corresponding to red light to the first photosensitive region; when the color router routes photons of a second wavelength, the color router is further configured to route photons having a wavelength corresponding to yellow light to the second photosensitive region; when the color router routes photons of a third wavelength, the color router is further configured to route photons having a wavelength corresponding to green light to the third photosensitive region; and the color router is further configured to route photons having a wavelength corresponding to blue light to the fourth photosensitive region.
In an exemplary imaging system, each image pixel may also include a fourth photosensitive region located below the color router. When the color router routes photons of a first wavelength, the color router is further configured to route photons having a wavelength corresponding to red light to the first photosensitive region; when the color router routes photons of a second wavelength, the color router is further configured to route photons having a wavelength corresponding to green light to the second photosensitive region; when the color router routes photons of a third wavelength, the color router is further configured to route photons having a wavelength corresponding to blue light to the third photosensitive region; and the color router may also be configured to route photons having a wavelength corresponding to infrared light to the fourth photosensitive region.
The first photosensitive region in an exemplary imaging system may define a first collection area, the second photosensitive region may define a second collection area that is smaller than the first collection area, and the third photosensitive region may define a third collection area that is smaller than the second collection area. The first wavelength is longer than the second wavelength, and the second wavelength is longer than the third wavelength. The first wavelength may correspond to red light, the second wavelength may correspond to blue light, and the third wavelength may correspond to green light. Alternatively, the first wavelength may correspond to a first infrared wavelength, the second wavelength may correspond to a second infrared wavelength different from the first wavelength, and the third wavelength may correspond to a third infrared wavelength.
The exemplary imaging system may further include a fourth photosensitive region located below the color router, the fourth photosensitive region defining a fourth collection area that is larger than the first collection area. The color router may be configured to route photons of a fourth wavelength to the fourth photosensitive region, the fourth wavelength being longer than the first wavelength. The first wavelength may correspond to red light, the second wavelength may correspond to blue light, the third wavelength may correspond to green light, and the fourth wavelength may correspond to infrared light.
In an exemplary imaging system, each image pixel may define a long dimension measured parallel to the router collection area, and each image pixel may further comprise: the first photosensitive region defining a collection area having a first shape; the second photosensitive region defining a collection area having a second shape; and the third photosensitive region defining a collection area having a third shape. The first shape, the second shape, and the third shape may be configured such that the longest horizontal distance that photons are routed through the color router is half of the long dimension.
In an exemplary imaging system, a color router may define a first quadrant and a second quadrant, the first photosensitive region may define a composite collection area comprised of a plurality of discrete photosensitive regions, and the plurality of discrete photosensitive regions may be divided equally between the first quadrant and the second quadrant. The color router may be further configured to route photons of the first wavelength that arrive within the first quadrant to a discrete photosensitive region directly under the first quadrant, and to route photons of the first wavelength that arrive within the second quadrant to a discrete photosensitive region directly under the second quadrant. The exemplary imaging system may also include an imaging controller operatively coupled to the first photosensitive region. The imaging controller may be configured to detect a phase imbalance based on a different number of photons reaching the first quadrant as compared to the second quadrant, and to modify a focus parameter based on the phase imbalance.
The example imaging system may also include a collimator disposed over the color router.
Still other examples are methods of operating an image sensor, the method comprising: directing photons from a scene into a color router located over a plurality of photosensitive regions; routing, by the color router, photons of a first wavelength to a first photosensitive region located below the color router; routing photons of a second wavelength to a second photosensitive region located below the color router; and routing photons of a third wavelength to a third photosensitive region located below the color router.
In an exemplary method, the first wavelength may correspond to red light, the second wavelength may correspond to yellow light, and the third wavelength may correspond to green light.
In an exemplary method, the first wavelength may correspond to red light, the second wavelength may correspond to yellow light, and the third wavelength may correspond to blue light.
In an exemplary method, the first wavelength may correspond to red light, the second wavelength may correspond to yellow light, and the third wavelength may correspond to green light. The method may further include routing photons of a fourth wavelength corresponding to blue light by the color router to a fourth photosensitive region below the color router.
In an example method, the first wavelength may correspond to red light, the second wavelength may correspond to green light, and the third wavelength may correspond to blue light. The method may further include the color router routing photons of a fourth wavelength corresponding to infrared light to a fourth photosensitive region below the color router.
In an exemplary method, the first photosensitive region may define a first collection area, the second photosensitive region may define a second collection area smaller than the first collection area, and the third photosensitive region may define a third collection area smaller than the second collection area. The first wavelength may be longer than the second wavelength, and the second wavelength may be longer than the third wavelength. The first wavelength may correspond to red light, the second wavelength may correspond to blue light, and the third wavelength may correspond to green light. The first wavelength may correspond to a first infrared wavelength, the second wavelength may correspond to a second infrared wavelength, and the third wavelength may correspond to a third infrared wavelength. The example method may also include routing photons of a fourth wavelength to a fourth photosensitive region, the fourth photosensitive region defining a fourth collection area that is larger than the first collection area. The first wavelength may correspond to red light, the second wavelength may correspond to blue light, the third wavelength may correspond to green light, and the fourth wavelength may correspond to infrared light.
In an exemplary method, each image pixel may define a long dimension measured parallel to the router collection area. Routing the photons of the first wavelength further includes routing the photons of the first wavelength horizontally no more than three-fourths of the long dimension to reach the first photosensitive region. Routing the photons of the second wavelength further includes routing the photons of the second wavelength horizontally no more than three-fourths of the long dimension to reach the second photosensitive region. Routing the photons of the third wavelength further includes routing the photons of the third wavelength horizontally no more than three-fourths of the long dimension to reach the third photosensitive region.
In an exemplary method: the color router may define a first quadrant and a second quadrant; the first photosensitive region may define a composite collection area comprised of a plurality of discrete photosensitive regions; and the plurality of discrete photosensitive regions may be divided equally between the first quadrant and the second quadrant. The color router may be further configured to route photons of the first wavelength that arrive within the first quadrant to a discrete photosensitive region directly under the first quadrant, and to route photons of the first wavelength that arrive within the second quadrant to a discrete photosensitive region directly under the second quadrant. The exemplary method may further comprise: detecting a phase imbalance based on a different number of photons reaching the first quadrant as compared to the second quadrant; and modifying a focus parameter based on the phase imbalance.
The example method may also include collimating photons between the scene and the color router.
Drawings
For a detailed description of exemplary embodiments, reference will now be made to the accompanying drawings in which:
FIG. 1A illustrates an imaging system in accordance with at least some embodiments;
FIG. 1B illustrates an implementation of an imaging system in accordance with at least some embodiments;
FIG. 2 illustrates an image sensor in accordance with at least some embodiments;
FIG. 3A shows a top view of a related art image pixel;
FIG. 3B shows a cross-sectional view of a related art image pixel;
FIG. 4 illustrates a perspective exploded view of an image pixel in accordance with at least some embodiments;
FIG. 5 illustrates a top view and simplified labeling of the exemplary image pixel of FIG. 4 having RYGB sensitivity (RYGB sensitivity) in accordance with at least some embodiments;
FIG. 6 illustrates a top view of an exemplary image pixel having RYGB sensitivity in accordance with at least some embodiments;
FIG. 7 illustrates a top view of an image pixel in accordance with at least some embodiments;
FIG. 8 illustrates a top view of an image pixel in accordance with at least some embodiments;
FIG. 9 illustrates a top view of an image pixel in accordance with at least some embodiments;
FIG. 10 illustrates a top view of an image pixel in accordance with at least some embodiments;
FIG. 11 illustrates a top view of an image pixel in accordance with at least some embodiments;
FIG. 12 illustrates a top view of an image pixel in accordance with at least some embodiments;
FIG. 13 illustrates a top view of an image pixel in accordance with at least some embodiments;
FIG. 14 illustrates a top view of an image pixel in accordance with at least some embodiments;
FIG. 15 illustrates a top view of an image pixel in accordance with at least some embodiments;
FIG. 16 illustrates a top view of an image pixel in accordance with at least some embodiments;
FIG. 17 illustrates a cross-sectional view of an image pixel including a collimator in accordance with at least some embodiments;
FIG. 18 illustrates a cross-sectional view of an image pixel including a collimator in accordance with at least some embodiments;
FIG. 19 illustrates an exploded side view of a color router in accordance with at least some embodiments; and
FIG. 20 illustrates an exploded side view of a color router in accordance with at least some embodiments.
Definitions
Various terms are used to refer to particular system components. Different companies may refer to a component by different names; this application does not intend to distinguish between components that differ in name but not function. In the following discussion and in the claims, the terms "comprise" and "include" are used in an open-ended fashion, and thus should be interpreted to mean "including, but not limited to…". Also, the term "coupled" is intended to mean an indirect or direct connection. Thus, if a first device couples to a second device, that connection may be through a direct connection or through an indirect connection via other devices and connections.
Terms defining relative height, such as "above," "over," "below," and "under," shall be read as positional terms referring to the direction of light incident on the pixel array and/or image pixels. Incident light shall be considered to interact with or pass through objects and/or structures that are "above" or "over" before interacting with or passing through objects and/or structures that are "below" or "under." The positional terms therefore may have no relationship to the direction of gravity.
"IR" refers to infrared.
In the case of an electrical device, whether it is stand-alone or as part of an integrated circuit, the terms "input" and "output" refer to electrical connections to the electrical device and are not to be construed as verbs requiring action. For example, a differential amplifier such as an operational amplifier may have a first differential input and a second differential input, and these "inputs" define electrical connections to the operational amplifier and should not be construed as requiring input of a signal to the operational amplifier.
"controller" shall mean, alone or in combination, a single circuit component, an Application Specific Integrated Circuit (ASIC), a microcontroller with control software, a Reduced Instruction Set Computer (RISC) with control software, a Digital Signal Processor (DSP), a processor with control software, a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), or a programmable system-on-a-chip (PSOC), the controller configured to read an input and drive an output in response to the input.
Detailed Description
The present application claims the benefit of U.S. Provisional Application No. 63/266,804, entitled "Nanophotonic Color Filter and Lens," filed January 14, 2022, which provisional application is incorporated herein by reference as if fully reproduced below.
The following discussion is directed to various embodiments of the invention. Although one or more of these embodiments may be preferred, the embodiments disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. Furthermore, those skilled in the art will appreciate that the following description has broad application, and the discussion of any embodiment is meant only to be exemplary of that embodiment, and not intended to intimate that the scope of the disclosure, including the claims, is limited to that embodiment.
Various examples are directed to imaging systems, image pixels, and related methods. More specifically, at least some examples are directed to image pixels that are designed and constructed to be more sensitive in low-light conditions by omitting color filters, which tend to absorb light at certain frequencies and thereby reduce the total number of photons available to the photosensitive regions. More specifically still, various examples are directed to using a color router to direct photons incident on the collection area of the color router to the underlying photosensitive regions of the image pixel, the routing based on the wavelength of each photon. Other examples are directed to image pixels in which each photosensitive region has a collection area associated with the overlying color router, and in which the collection area of each photosensitive region is proportional to the wavelength of the photons directed to that photosensitive region. That is, photons with shorter wavelengths (e.g., blue light) are directed to a photosensitive region with a smaller collection area, and photons with longer wavelengths (e.g., red light or infrared light) are directed to a photosensitive region with a larger collection area. In yet other examples, the underlying photosensitive regions are arranged such that the longest horizontal distance a photon travels through the color router is half the long dimension of the image pixel. The description now turns to an example system to orient the reader.
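Before turning to the figures, the routing concept can be summarized in a short sketch. The Python fragment below models a color router as a lookup from wavelength band to a designated photosensitive region with a wavelength-proportional collection area; the band edges, area values, and names are illustrative assumptions rather than values from this application (only the 595 to 655 nm red band echoes a range mentioned later in the description).

```python
# Hypothetical sketch of wavelength-based routing; bands and areas are assumed.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PhotosensitiveRegion:
    name: str
    collection_area_um2: float  # larger areas for longer (harder to absorb) bands
    photon_count: int = 0

# (band lower edge nm, band upper edge nm) -> designated region
REGIONS = [
    ((440, 500), PhotosensitiveRegion("blue", 0.5)),
    ((500, 570), PhotosensitiveRegion("green", 1.0)),
    ((595, 655), PhotosensitiveRegion("red", 1.5)),
    ((700, 1000), PhotosensitiveRegion("infrared", 2.0)),
]

def route_photon(wavelength_nm: float) -> Optional[PhotosensitiveRegion]:
    """Route a photon incident anywhere on the router collection area to the
    photosensitive region designated for its wavelength, regardless of where
    on the collection area the photon arrived."""
    for (lo, hi), region in REGIONS:
        if lo <= wavelength_nm < hi:
            region.photon_count += 1
            return region
    return None  # out-of-band photon: reflected or misrouted

route_photon(620)  # a red photon is credited to the "red" region
```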
FIG. 1A illustrates an exemplary imaging system. In particular, exemplary imaging system 100 may be a portable electronic device, such as a camera, a cellular telephone, a tablet computer, a webcam, a video camera, a video surveillance system, or a video game system with imaging capabilities. In other cases, the imaging system 100 may be an automotive imaging system. The camera module 102 may be used to convert incident light into digital image data. The camera module 102 may include one or more lens systems, hereinafter referred to simply as lens 104, and one or more corresponding image sensors 106. Lens 104 may include a fixed lens and/or an adjustable lens. During image capture operations, light from a scene may be focused through lens 104 onto image sensor 106. In the case of an adjustable lens, various focus parameters may be adjusted by the camera module 102 and/or the imaging controller 108. The image sensor 106 may include circuitry for converting analog pixel data into corresponding digital image data to be provided to the imaging controller 108. If desired, the camera module 102 may be provided with an array of lenses 104 and a corresponding array of image sensors 106.
The imaging controller 108 may include one or more integrated circuits, such as image processing circuitry, microprocessors, and storage devices, such as random access memory and non-volatile memory. The imaging controller 108 may be implemented using components separate from the camera module 102 or using components forming part of the camera module 102, e.g., circuitry forming part of the image sensor 106. The imaging controller 108 may be used to process and store digital image data captured by the camera module 102. The processed image data may be provided to an external device, such as a computer, external display, or other device, if desired, using a wired and/or wireless communication path coupled to imaging controller 108.
FIG. 1B illustrates an exemplary imaging system. Specifically, the example imaging system 100 includes an automobile or vehicle 110. Vehicle 110 is illustratively shown as a passenger vehicle, but exemplary imaging system 100 may be associated with any type of vehicle, including commercial vehicles, buses, tractor-trailer vehicles, on-highway vehicles, off-road vehicles, tractors, and crop harvesting equipment. In the example of FIG. 1B, the vehicle 110 includes a front-view camera module 102, the front-view camera module 102 being configured to capture an image of a scene in front of the vehicle 110. Such a front-view camera module 102 may be used for any suitable purpose, such as lane keeping assistance, collision warning systems, distance-cruise control systems, autonomous driving systems, and proximity detection. The exemplary vehicle 110 also includes a rear-view camera module 102, the rear-view camera module 102 being configured to capture an image of a scene behind the vehicle 110. Such a rear-view camera module 102 may be used for any suitable purpose, such as collision warning systems, backup video, autonomous driving systems, monitoring the position of overtaking vehicles, backing up, and proximity detection. The exemplary vehicle 110 also includes a side-view camera module 102, the side-view camera module 102 being configured to capture an image of a scene to the side of the vehicle 110. Such a side-view camera module may be used for any suitable purpose, such as blind spot monitoring, collision warning systems, autonomous driving systems, monitoring the position of overtaking vehicles, lane change detection, and proximity detection. In the case where imaging system 100 is a vehicle, imaging controller 108 may be a controller of vehicle 110. An exemplary image sensor 106 of the camera module 102 will now be discussed in more detail.
Fig. 2 illustrates an exemplary image sensor 106. In particular, fig. 2 shows that the image sensor 106 may include a substrate 200 of semiconductor material (e.g., silicon) encapsulated within a package to create a packaged semiconductor device or packaged semiconductor product. Bond pads or other connection points of substrate 200 are coupled to terminals of image sensor 106, e.g., serial communication channel 202 coupled to terminal 204, and capture input 206 coupled to terminal 208. Additional terminals, such as a ground terminal or a common terminal, and a power terminal, are present, but are omitted to avoid undue complications of the drawing. Although a single substrate 200 is shown, in other cases, multiple substrates may be combined to form image sensor 106, thereby forming a multi-chip module.
The image sensor 106 includes a pixel array 210, the pixel array 210 including a plurality of image pixels 212 arranged in rows and columns. The pixel array 210 may include, for example, hundreds or thousands of rows and columns of image pixels 212. Control and readout of the pixel array 210 may be achieved by an image sensor controller 214 coupled to a row controller 216 and a column controller 218. The exemplary row controller 216 may receive row addresses from the image sensor controller 214 and provide corresponding row control signals to the image pixels 212, such as reset signals, row select signals, charge transfer signals, dual conversion gain signals, and readout control signals. The row control signals may be transmitted through one or more conductors, such as row control path 220.
Column controller 218 may be coupled to pixel array 210 by one or more conductors such as column lines 222. The column controller may sometimes be referred to as a column control circuit, a readout circuit, and/or a column decoder. Column lines 222 may be used to read out image signals from image pixels 212 and to provide bias currents and/or bias voltages to image pixels 212. If desired, a row of pixels in pixel array 210 may be selected using row controller 216 during a pixel readout operation, and image signals generated by image pixels 212 in that row may be read out along column lines 222. Each image pixel 212 may include multiple photosensitive regions, e.g., four, nine, or sixteen, so although each column line 222 is shown as a single conductor, a plurality of such column lines may be associated with each image pixel 212 in a column.
The example column controller 218 may include sample and hold circuitry for sampling and temporarily storing image signals read out of the pixel array 210, amplifier circuitry, analog-to-digital conversion (ADC) circuitry, bias circuitry, column memory, latch circuitry for selectively enabling or disabling the column circuitry, or other circuitry coupled to one or more columns of pixels in the pixel array 210 to operate the image pixels 212 and read out image signals from the image pixels 212. The ADC circuitry in the column controller 218 may convert analog pixel values received from the pixel array 210 into corresponding digital image data. The column controller 218 may provide digital image data to the image sensor controller 214 and/or the imaging controller 108 (fig. 1) via, for example, the serial communication channel 202.
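As a conceptual aid only, the following Python sketch mimics this row-by-row readout flow; the function name, the toy array, and the 12-bit ADC resolution are assumptions, and real readout is of course implemented in hardware.

```python
# Toy model of row/column readout: the row controller selects one row at a
# time, and the column controller samples the column lines and digitizes them.
import numpy as np

def read_out_array(analog_pixels: np.ndarray, adc_bits: int = 12) -> np.ndarray:
    """Convert an array of analog pixel values (assumed in [0, 1)) to codes."""
    rows, cols = analog_pixels.shape
    digital = np.zeros((rows, cols), dtype=np.uint16)
    for r in range(rows):                  # row select, one row at a time
        row_signals = analog_pixels[r, :]  # image signals on the column lines
        # sample-and-hold followed by analog-to-digital conversion
        digital[r, :] = np.round(row_signals * (2**adc_bits - 1)).astype(np.uint16)
    return digital

digital_image = read_out_array(np.random.rand(4, 6))  # a toy 4x6 "pixel array"
```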
FIG. 3A shows a top view of a related art image pixel 312. Specifically, the related art image pixel 312 includes four photosensitive regions in total: a red light region 300; a first green light region 302; a second green light region 304; and a blue light region 306. FIG. 3B shows a partial cross-sectional view of the related art image pixel 312 taken along line 3B-3B of FIG. 3A. Specifically, FIG. 3B shows that the red sensor 300 is composed of a microlens 320, a red filter 322, and a photosensitive region 324. For purposes of discussion, consider incident light 326 (shown by the arrow) entering red sensor 300. Light 326 initially encounters microlens 320, which may be a convex lens designed and configured to direct incident light 326 to the lower regions of the sensor. The lens may be spherical. Light 326 then encounters an optical filter, here red filter 322. The material of red filter 322 is selected to pass light having a wavelength corresponding to red light, for example between about 595 and 655 nanometers (nm), and to filter out or absorb light of other colors. The remaining light 326 then passes through one or more oxide layers (not numbered) and enters the photosensitive region 324, where it is absorbed. The absorption of light produces a corresponding electrical signal having a parameter indicative of the intensity of the received red light, e.g., the number of photons received during the detection period. The parameter indicative of intensity may be any suitable parameter, for example, the magnitude of a current or the magnitude of a voltage. Accordingly, the red sensor 300 generates an electrical signal proportional to the amount of light entering the photosensitive region 324.
Similarly, fig. 3B shows a green light sensor 302, which is composed of a microlens 328, a green light filter 330, and a photosensitive region 332. For discussion purposes, consider incident light 334 as indicated by the arrow entering green sensor 302. Incident light 334 initially encounters microlens 328. Light 334 then encounters an optical filter that is green light filter 330. The material of green filter 330 is selected to pass light having a wavelength corresponding to green light, for example, between about 515 and 575nm, and to filter out or absorb light of other colors. The remaining light 334 then passes through one or more oxide layers (not numbered) and then enters the light sensitive region 332 where it is absorbed. The absorption of light produces a corresponding electrical signal having a parameter indicative of the intensity of the green light received, e.g., the number of photons received during the detection period. Also, the parameter indicative of the intensity may be any suitable parameter, for example, the magnitude of the current or the magnitude of the voltage. Accordingly, the green sensor 302 generates an electrical signal proportional to the amount of light entering the photosensitive region 332. Similar discussion regarding the second green sensor 304 and the blue sensor 306 (each of which may be configured in the same or similar manner) is omitted to avoid unduly lengthening the specification.
Referring to both FIGS. 3A and 3B, the performance of the related art image pixel 312 suffers in low-light conditions. The overall image pixel 312 defines a light collection area in the form of the four sensors. If each sensor defines one square unit of area, the exemplary image pixel 312 defines four square units of area for collecting light. However, because of the use of filters that absorb light having wavelengths outside the design passband, only a fraction of the light of a particular color incident on the entire image pixel 312 enters the photosensitive region associated with that color. For example, light having a wavelength corresponding to red light but incident on green sensor 302 or 304, or incident on blue sensor 306, is absorbed by the respective filter, and such absorbed light cannot contribute to the generation of an electrical signal in the photosensitive region. For the exemplary image pixel 312, only 25% of the red light incident on the entire image pixel 312 reaches the photosensitive region associated with the red sensor 300, only 25% of the blue light incident on the entire image pixel 312 reaches the photosensitive region associated with the blue sensor 306, and only 50% of the green light incident on the entire image pixel 312 reaches the photosensitive regions associated with the green sensors 302 and 304.
Another reason for poor performance in low-light conditions is that the filters themselves are not perfectly efficient at passing colors within the passband. For example, green filter 330 may be made of a material that passes only 90% of the green light incident on the filter; green filter 330 then absorbs approximately 10% of the green light. Thus, of the green light falling on the entire image pixel 312, only 50% falls on the two exemplary green light sensors 302 and 304, and only 90% of that 50%, or about 45% of the green light, reaches the photosensitive regions of green light sensors 302 and 304. The filters of the red sensor 300 and the blue sensor 306 have similar problems.
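The arithmetic can be made concrete with a short calculation; the 25%/50% area fractions follow from the 2x2 layout, and the 90% transmission is the assumed figure used above.

```python
# Worked version of the filter-pixel efficiency estimate above.
area_fraction = {"red": 0.25, "green": 0.50, "blue": 0.25}  # of the 2x2 pixel
filter_transmission = 0.90  # assumed in-band transmission of each filter

for color, frac in area_fraction.items():
    efficiency = frac * filter_transmission
    print(f"{color}: {frac:.0%} of area x {filter_transmission:.0%} "
          f"transmission = {efficiency:.1%} collection efficiency")
# green: 50% x 90% = 45.0%, matching the estimate in the text
```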
The performance of the image sensor may be improved by using a nanophotonic structure or color router. In particular, a color router is a semiconductor structure that accepts photons incident on a router collection area defined by the top surface of the router. The color router then routes photons from the router collection area to the underlying photosensitive areas, the routing being based on the wavelength of each particular photon. The router collection area may correspond to an entire image pixel and is therefore larger in area than the area above any single photosensitive region. The present specification defines and employs the following notation to refer to the color represented by the wavelength of a photon: a red photon is a photon having a wavelength corresponding to red light; a yellow photon is a photon having a wavelength corresponding to yellow light; a blue photon is a photon having a wavelength corresponding to blue light; a cyan photon is a photon having a wavelength corresponding to cyan light; an infrared photon is a photon having a wavelength corresponding to infrared light; and so on. Consider, for example, a red-yellow-yellow-cyan (RYYCy) image pixel. Using this terminology, of all the photons incident on the router collection area of the color router of an example RYYCy image pixel: the red photons are directed to a photosensitive region designated for receiving red light; the yellow photons are directed to a photosensitive region designated for receiving yellow light; and the cyan photons are directed to a photosensitive region designated for receiving cyan light. Thus, for example, red photons that happen to arrive over the photosensitive area designated for cyan light or the photosensitive area designated for yellow light are not lost to absorption, but are instead routed to the photosensitive area designated for red light.
Fig. 4 shows an exploded perspective view of an exemplary image pixel 212. Specifically, FIG. 4 shows an exemplary image pixel 212 defining a color router 400 and four photodiodes or photosensitive regions 402, 404, 406, and 408. In the exploded view of fig. 4, color router 400 is separated from photosensitive areas 402, 404, 406, and 408 for purposes of explanation. However, in practice, color router 400 may be above and directly adjacent to the upper surfaces of photosensitive regions 402, 404, 406, and 408, or one or more additional layers may be present between the lower surface of color router 400 and the upper surfaces of photosensitive regions 402, 404, 406, and 408. For example, an oxide layer may be present between the lower surface of color router 400 and the upper surfaces of photosensitive regions 402, 404, 406, and 408. The oxide layer may also include a metal gate for draining unwanted electrical current, such as electrostatic charge. Intermediate layers between the lower surface of color router 400 and the upper surfaces of photosensitive areas 402, 404, 406, and 408 are not shown to avoid unduly complicating the drawing.
Photosensitive regions 402, 404, 406, and 408 are regions of semiconductor material, such as silicon, in which photons of light can be captured or absorbed to produce an electrical signal, such as a voltage or current. The photosensitive regions 402, 404, 406, and 408 are not themselves selective as to the wavelength of the absorbed photons; photons of any wavelength suitable for an image pixel, spanning from the visible region to the infrared region, may be absorbed once they reach the semiconductor material. In many cases, each photosensitive region is designed and constructed as a photodiode that generates a voltage and current in response to the capture or absorption of photons of light.
The exemplary color router 400 defines a router collection area 410 on its upper surface. That is, the upper surface of color router 400 defines a length L_CR and a width W_CR, and the length L_CR and width W_CR are collectively considered to define the router collection area. Each photosensitive region 402, 404, 406, and 408 likewise defines a collection aperture or collection area. For example, photosensitive region 408 defines a length L_PD and a width W_PD, and the length L_PD and width W_PD are considered to define its collection area. Thus, photosensitive region 402 defines a collection area 412, photosensitive region 404 defines a collection area 414, photosensitive region 406 defines a collection area 416, and photosensitive region 408 defines a collection area 418. Considering that photosensitive regions 402, 404, 406, and 408 are located below and within the footprint of color router 400, the collection area of each photosensitive region is smaller than router collection area 410. In other examples, the router collection area 410 may span more or fewer photosensitive-region collection areas than shown in FIG. 4. Different arrangements, sizes, shapes, photosensitivities, and/or other characteristics of the photosensitive regions located below color router 400 may be used in place of the arrangements and characteristics shown in FIG. 4, some of which will be discussed further herein.
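To make the notation concrete, the following trivial check uses assumed dimensions, not values from this application, for a four-region pixel:

```python
# Assumed example dimensions, micrometers; illustration only.
L_CR, W_CR = 2.0, 2.0   # upper surface of color router 400
L_PD, W_PD = 1.0, 1.0   # one of the four photosensitive regions, e.g., 408

router_collection_area = L_CR * W_CR   # 4.0 um^2 (router collection area 410)
region_collection_area = L_PD * W_PD   # 1.0 um^2 (e.g., collection area 418)
assert region_collection_area < router_collection_area
```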
In various examples, photons of light incident on the router collection area 410 of the color router 400 enter the structure of the color router 400 and are routed to a particular collection area of an underlying photosensitive region. In image pixels 212 having four underlying photosensitive regions, color router 400 may be designed and configured to: route photons of a first wavelength received at router collection area 410 to photosensitive region 402; route photons of a second wavelength received at router collection area 410 to second photosensitive region 404; route photons of a third wavelength received at router collection area 410 to third photosensitive region 406; and route photons of a fourth wavelength received at router collection area 410 to fourth photosensitive region 408. The representative example of FIG. 4 is an image pixel 212 having red, yellow, green, blue (RYGB) photosensitivity. Thus, for the example RYGB photosensitivity, color router 400 may be designed and constructed to: route red photons to photosensitive region 402; route yellow photons to photosensitive region 404; route green photons to photosensitive region 406; and route blue photons to photosensitive region 408.
Exemplary image pixels 212 having RYGB photosensitivity may be particularly suitable for automotive applications. That is, one of the more difficult tasks performed by an autonomous driving system is distinguishing between the red and yellow lights of, for example, a stop light, because the wavelengths corresponding to red light are close to the wavelengths corresponding to yellow light. Related art image pixels using filters not only have the collection-area drawbacks described above; in addition, yellow filters designed to pass yellow photons tend to absorb a high percentage of the desired photons at yellow wavelengths close to red. Using color router 400 reduces the collection-area disadvantage, as yellow photons incident anywhere on router collection area 410 can be routed to the underlying photosensitive region 404. And because no yellow filter is needed, the percentage of yellow photons reaching photosensitive region 404 is significantly higher than in related art image pixels using filters. Of course, what is said here about yellow light holds for all of the color sensitivities of the image pixel 212.
Color router 400 may take any suitable form. In many cases, color router 400 includes a structure of several tiers or layers, where each layer is designed and constructed to perform at least a portion of the routing. Each layer may be designed and constructed from a plurality of three-dimensional structures, such as rectangular solids of materials having different refractive indices and/or different dimensions. For example, the three-dimensional structure of a particular layer may be selectively made of silicon dioxide and silicon nitride to at least partially deflect and reflect photons toward a designated underlying photosensitive region. For the exemplary image pixel 212 of FIG. 4 having RYGB sensitivity, there may be several different designs for color router 400 that operate in the same manner.
The design of color router 400 may not be 100% efficient at routing photons received at router collection area 410 to the respective collection areas 412, 414, 416, and 418. For example, some photons may be reflected back out through router collection area 410. Other photons (e.g., photons with high angles of incidence) may be misrouted. In addition, refraction and reflection within color router 400 may cause some photons to be lost as they travel between the router collection area 410 and the various collection areas 412, 414, 416, and 418. However, even considering the potential inefficiency of such a color router, the overall image pixel 212 may still have better collection efficiency, and thus better low-light sensitivity, than image pixels using filters. For example, suppose color router 400 is only 50% efficient at routing red photons; that is, only half of the red photons incident on router collection area 410 reach photosensitive region 402. The hypothetical 50% efficiency is still higher than the 25% red-photon collection efficiency of a color-filtered image pixel of the same area. Further, an exemplary image pixel with a color router 400 that is only 50% efficient at routing red photons may collect more than twice as many photons as the related art image pixel 312 having red filter 322, considering that the red filter may pass only about 90% of red photons.
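That comparison checks out numerically; both figures below are the assumed values from this paragraph and the filter discussion above.

```python
# Router at an assumed 50% routing efficiency vs. a 25%-area red filter
# passing an assumed 90% of in-band light.
router_efficiency = 0.50
filter_efficiency = 0.25 * 0.90               # area fraction x filter transmission
print(router_efficiency / filter_efficiency)  # ~2.22, i.e., more than twice
```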
Fig. 5 shows a top view and simplified labeling of the exemplary image pixel 212 of fig. 4 having RYGB photosensitivity. In particular, color router 400 is not visible in FIG. 5, but FIG. 5 does show photosensitive region 402 designated for receiving red photons, photosensitive region 404 designated for receiving yellow photons, photosensitive region 406 designated for receiving green photons, and photosensitive region 408 designated for receiving blue photons. In some cases, each image pixel 212 may contain only four photosensitive regions. However, in other cases, the layout of FIG. 5 may be considered a unit cell within an entire image pixel, and the image pixel may replicate the unit cell and its color router multiple times within the image pixel. Fig. 6 shows an exemplary image pixel 212 having RYGB photosensitivity, wherein an exemplary unit cell is replicated four times to create an overall image pixel 212 having 16 photosensitive regions.
Likewise, the exemplary image pixels 212 may be particularly well suited for automotive applications. However, the use of image pixels having both red light sensitivity and yellow light sensitivity and color routers is not limited to RYGB sensitivity. Fig. 7 shows a top view of another exemplary image pixel 212, represented in simplified notation, designed for RYYB photosensitivity. Specifically, the exemplary image pixel 212 defines a photosensitive region 402 designated for receiving red photons, a photosensitive region 404 designated for receiving yellow photons, a photosensitive region 406 also designated for receiving yellow photons, and a photosensitive region 408 designated for receiving blue photons. As previously described, the image pixel 212 may contain only four photosensitive regions, or four photosensitive regions may be considered as unit cells within the entire image pixel and duplicated, for example, four unit cells total or sixteen photosensitive regions total.
FIG. 8 shows, in simplified notation, a top view of yet another exemplary image pixel 212 designed for red-yellow-yellow-cyan (RYYCy) photosensitivity. Specifically, the exemplary image pixel 212 defines a photosensitive region 402 designated for receiving red photons, a photosensitive region 404 designated for receiving yellow photons, a photosensitive region 406 also designated for receiving yellow photons, and a photosensitive region 408 designated for receiving cyan photons. As previously described, the image pixel 212 may contain only four photosensitive regions, or the four photosensitive regions may be considered a unit cell within the entire image pixel and duplicated, for example, four unit cells total or sixteen photosensitive regions total.
The various specific embodiments discussed so far have been directed to automotive applications in the visible spectrum; however, the use of a color router may also be beneficial when image pixels are designed and constructed to receive infrared wavelengths. FIG. 9 shows, in simplified notation, a top view of an exemplary image pixel 212 designed for RGB-infrared (RGB-IR) photosensitivity. Specifically, the exemplary image pixel 212 defines a photosensitive region 402 designated for receiving red photons, a photosensitive region 404 designated for receiving green photons, a photosensitive region 406 designated for receiving blue photons, and a photosensitive region 408 designated for receiving infrared photons, e.g., wavelengths longer than 700 nm, or longer than about 850 nm, or longer than about 905 nm, or longer than about 940 nm. As previously described, the image pixel 212 may contain only four photosensitive regions, or the four photosensitive regions may be considered a unit cell within the entire image pixel and duplicated, for example, four unit cells total or sixteen photosensitive regions total.
In yet further cases, image pixels may be designed and constructed for hyperspectral use. Hyperspectral imaging can be used in industrial applications for counterfeit detection, powder analysis, and/or checking package integrity. In agriculture, hyperspectral imaging can be used for precision watering, fertilization, and weed and pest control. Hyperspectral imaging also finds use in medical and surveillance applications. FIG. 10 shows, in simplified notation, a top view of an exemplary image pixel 212 designed for hyperspectral applications. Specifically, the exemplary image pixel 212 of FIG. 10 defines a photosensitive region 1000 designated for receiving green photons, a photosensitive region 1002 designated for receiving yellow photons, a photosensitive region 1004 designated for receiving blue photons, a photosensitive region 1006 designated for receiving orange photons, a photosensitive region 1008 designated for receiving violet photons, and a photosensitive region 1010 designated for receiving red photons. Beyond the visible region, the exemplary image pixel 212 of FIG. 10 includes a photosensitive region 1012 designated for receiving infrared photons of a first range, a photosensitive region 1014 designated for receiving infrared photons of a second range, and a photosensitive region 1016 designated for receiving infrared photons of a third range.
Still referring to fig. 10, in some cases, each image pixel 212 may contain only nine photosensitive regions. However, in other cases, the layout of FIG. 10 may be considered a unit cell within an entire image pixel, and an image pixel may replicate a unit cell (including its color router) multiple times within an image pixel. If the unit cell in fig. 10 is duplicated four times, the entire image pixel 212 may have a total of 36 photosensitive areas.
Figs. 11-13 show examples of image pixel layouts. These layouts are designed based at least in part on the routing considerations of the color router and the capture considerations of the photosensitive regions. For example, the ability of a color router to route photons incident on the router collection area to an underlying photosensitive region may depend on the wavelength of the photons and the distance, primarily the horizontal distance, the photons travel within the color router. Furthermore, the ability of a photosensitive region to capture or absorb photons may be related to the wavelength of the photons and the volume of the photosensitive region. For a given volume of photosensitive region, photons with shorter wavelengths, such as violet and blue light, are absorbed more readily and more quickly than photons with longer wavelengths, such as red and infrared light.
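This wavelength dependence follows the Beer-Lambert relation. The Python sketch below illustrates it with rough 1/e absorption depths for silicon; the depths are assumed values for illustration, not values taken from this disclosure.

```python
import math

# Approximate 1/e absorption depths in silicon, in micrometers (illustrative
# assumptions, not values from this disclosure).
ABSORPTION_DEPTH_UM = {
    "violet (420 nm)": 0.2,
    "blue (450 nm)": 0.4,
    "green (530 nm)": 1.0,
    "red (630 nm)": 3.0,
    "near-IR (850 nm)": 15.0,
}

def absorbed_fraction(region_depth_um: float, absorption_depth_um: float) -> float:
    """Beer-Lambert fraction of photons absorbed within the region depth."""
    return 1.0 - math.exp(-region_depth_um / absorption_depth_um)

# For an equally deep (here, 3 um) photosensitive region, shorter wavelengths
# are absorbed far more completely than longer ones.
for label, l_abs in ABSORPTION_DEPTH_UM.items():
    print(f"{label}: {absorbed_fraction(3.0, l_abs):.1%} absorbed in 3 um")
```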
Fig. 11 shows a top view of an exemplary image pixel 212 in simplified notation, designed to at least partially address the wavelength-based ability of a photosensitive region to capture or absorb photons and the wavelength-based ability of the color router to route photons. Specifically, the exemplary image pixel 212 of FIG. 11 defines a photosensitive region 1100 designated for receiving red photons, a photosensitive region 1102 designated for receiving green photons, a photosensitive region 1104 likewise designated for receiving green photons, and a photosensitive region 1106 designated for receiving blue photons. Although not visible in fig. 11, each photosensitive region 1100, 1102, 1104, and 1106 has a uniform thickness or depth, wherein the depth is measured perpendicular to the plane of the page.
The exemplary photosensitive region 1100 designated for red light defines a length L_R and a width W_R, and the length L_R and the width W_R together define a collection area for photosensitive region 1100. Considering that the overlying color router (not shown in the simplified notation) spans the entire image pixel 212, the collection area of photosensitive region 1100 is smaller than the router collection area 410 (fig. 4).
The exemplary photosensitive region 1102 designated for green light defines a length L_G and a width W_G, and the length L_G and the width W_G together define a collection area for photosensitive region 1102. The collection area of photosensitive region 1102 is smaller than the router collection area 410 (fig. 4). In addition, the collection area of photosensitive region 1102 is smaller than the collection area of photosensitive region 1100. The photosensitive region 1104, which is also designated for green light, has a collection area of about the same size as the collection area of photosensitive region 1102. The exemplary photosensitive region 1106 designated for blue light defines a length L_B and a width W_B, and the length L_B and the width W_B together define a collection area for photosensitive region 1106. The collection area of photosensitive region 1106 is smaller than the router collection area 410 (fig. 4). In addition, the collection area of photosensitive region 1106 is smaller than the collection areas of photosensitive regions 1100 and 1102.
Still referring to fig. 11, consider an exemplary red photon incident on router collection area 410. On average, such a red photon travels some horizontal distance, measured in the page plane of fig. 11, to reach the red photosensitive region 1100. However, because the ratio of the collection area of the red photosensitive region 1100 to the collection area of the color router is larger than in other pixels (e.g., the pixels of figs. 7-9), the horizontal distance to the red photosensitive region may be shorter than in image pixels having a smaller ratio of red photosensitive region to color router collection area.
Consider now a blue photon incident on the router collection area 410. On average, because the blue light-sensitive region 1106 is smaller than the red light-sensitive region, the horizontal distance such blue light photons travel to reach the blue light-sensitive region 1106 may be longer than the horizontal distance red light photons travel to reach the red light-sensitive region. But because the wavelength of blue light is shorter than that of red light, blue photons may be absorbed by the silicon of the photodiode more efficiently than red photons.
In some cases, each image pixel 212 of fig. 11 may contain only four photosensitive regions. However, in other cases, the layout of FIG. 11 may be considered a unit cell within an entire image pixel, and the image pixel may replicate the unit cell and its color router multiple times within the image pixel. If the unit cell in fig. 11 is duplicated four times, the entire image pixel 212 may have a total of 16 photosensitive areas.
The size considerations of the collection area of the photosensitive region are not limited to wavelengths in the visible spectrum. The same collection area and wavelength considerations may be applied to image pixels having mixed photosensitivity, including, for example, both visible and infrared wavelengths. Fig. 12 shows a top view of an exemplary image pixel 212 in simplified notation, in this case with photosensitive regions for visible and infrared light. Specifically, the exemplary image pixel 212 of FIG. 12 defines a photosensitive region 1200 designated to receive red photons, a photosensitive region 1202 designated to receive green photons, a photosensitive region 1204 designated to receive blue photons, and a photosensitive region 1206 designated to receive infrared photons. As previously described, each photosensitive region 1200, 1202, 1204, and 1206 has a uniform thickness or depth (depth being measured perpendicular to the plane of the page), although the depth is not visible in fig. 12.
In the example of fig. 12, the collection area associated with photosensitive region 1206 designated for infrared light is larger than the collection area of photosensitive region 1200 designated for red light. The collection area associated with photosensitive region 1200 designated for red light is larger than the collection area of photosensitive region 1202 designated for green light. The collection area associated with photosensitive region 1202 designated for green light is larger than the collection area of photosensitive region 1204 designated for blue light. The size of each collection area may be based on the wavelength of light designated to be routed to it. In some embodiments, the sizes may be proportional to the designated wavelengths. In some cases, each image pixel 212 of fig. 12 may contain only four photosensitive regions. However, in other cases, the layout of FIG. 12 may be considered a unit cell within an entire image pixel, and the image pixel may replicate the unit cell and its color router multiple times within the image pixel. If the unit cell in fig. 12 is duplicated four times, the entire image pixel 212 may have a total of 16 photosensitive regions.
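One possible reading of the proportional sizing is sketched below; the wavelength values, and the rule that collection areas within a unit cell scale linearly with the designated wavelength, are assumptions for illustration, since the disclosure does not fix a formula.

```python
# Hypothetical sizing rule: divide a unit cell's area among the photosensitive
# regions in proportion to each region's designated wavelength. The center
# wavelengths below are assumed values.
DESIGNATED_WAVELENGTHS_NM = {"blue": 450, "green": 530, "red": 630, "ir": 940}

def collection_areas(cell_area_um2: float) -> dict[str, float]:
    total = sum(DESIGNATED_WAVELENGTHS_NM.values())
    return {name: cell_area_um2 * wl / total
            for name, wl in DESIGNATED_WAVELENGTHS_NM.items()}

# e.g., a 4 um x 4 um unit cell: the infrared region gets the largest share.
print(collection_areas(16.0))
```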
The size considerations of the collection area of the photosensitive region are likewise not limited to visible wavelengths or mixed visible and infrared light. The same collection area and wavelength considerations may apply to image pixels dedicated to infrared light only. Fig. 13 shows a top view of an exemplary image pixel 212 in simplified notation, in this case with only photosensitive regions for infrared light. Specifically, the exemplary image pixel 212 of FIG. 13 defines a photosensitive region 1300 designated for receiving infrared photons of a first range, a photosensitive region 1302 designated for receiving infrared photons of a second range, a photosensitive region 1304 designated for receiving infrared photons of a third range, and a photosensitive region 1306 designated for receiving infrared photons of a fourth range. As previously described, each photosensitive region 1300, 1302, 1304, and 1306 has a uniform thickness or depth, measured perpendicular to the plane of the page, although the depth is not visible in fig. 13.
In the example of fig. 13, the collection area associated with photosensitive region 1300 designated for the first range of infrared light is greater than the collection area of photosensitive region 1302 designated for the second range of infrared light. The collection area associated with photosensitive region 1302 designated for the second range of infrared light is greater than the collection area of photosensitive region 1306 designated for the fourth range of infrared light. The collection area associated with photosensitive region 1304 designated for the third range of infrared light is greater than the collection area of photosensitive region 1306 designated for the fourth range of infrared light. In this example, the first range of infrared light may have a longer wavelength than the second range of infrared light; the second range of infrared light may have a longer wavelength than the third range of infrared light; and the third range of infrared light may have a longer wavelength than the fourth range of infrared light. In some cases, each image pixel 212 of fig. 13 may contain only four photosensitive regions. However, in other cases, the layout of FIG. 13 may be considered a unit cell within an entire image pixel, and the image pixel may replicate the unit cell (including its color router) multiple times within the image pixel. If the unit cell in fig. 13 is duplicated four times, the entire image pixel 212 may have a total of 16 photosensitive regions.
In the above-described embodiments, each photosensitive region defines a cuboid having a square collection aperture or square collection area (e.g., figs. 4-10 and 13), or a cuboid having a substantially square collection area (e.g., fig. 12). The size of the collection area may be based in part on the volume of the corresponding photosensitive region, and may also facilitate routing by the color router by presenting a larger target area for longer wavelengths. In other embodiments, the layout of the photosensitive regions for a particular wavelength range may take a different shape. Furthermore, the shape of each collection area may be selected to assist the color router in routing photons, and that shape may be selected regardless of the wavelength designated for a given photosensitive region.
In some examples, multiple discrete photosensitive regions may be placed together to produce a desired shape for a given wavelength range. Thus, the collection areas of the plurality of discrete photosensitive regions, taken together, define an overall collection area for the given wavelength range. The detection signal for a given wavelength range may be produced by summing the signals generated by the discrete photosensitive regions, in either the analog domain or the digital domain. For example, in fig. 13, photosensitive region 1306 may be a single discrete photosensitive region, an example of a standard-size or unit-size photosensitive region. The integral photosensitive region 1302 may be created by using two discrete photosensitive regions of unit size. The integral photosensitive region 1304 may likewise be created by using two discrete photosensitive regions of unit size. The integral photosensitive region 1300 may be created by using four discrete photosensitive regions of unit size. By implementing photosensitive regions of unit size, process integration may be simpler.
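A minimal sketch of the digital-domain summation follows, assuming hypothetical ADC counts; the region-to-unit mapping mirrors the fig. 13 example (four, two, two, and one unit-size regions), but the array indices are chosen only for illustration.

```python
import numpy as np

# Assumed per-frame signals of the discrete unit-size photosensitive regions,
# in ADC counts (values made up for illustration).
unit_signals = np.array([12, 11, 13, 12, 7, 8, 6, 7, 4])

# Which unit-size regions compose each integral region of fig. 13.
REGION_UNITS = {
    1300: [0, 1, 2, 3],  # four unit-size regions
    1302: [4, 5],        # two unit-size regions
    1304: [6, 7],        # two unit-size regions
    1306: [8],           # a single unit-size region
}

# Digital-domain summation: one detection signal per wavelength range.
detection = {region: int(unit_signals[idx].sum())
             for region, idx in REGION_UNITS.items()}
print(detection)  # {1300: 48, 1302: 15, 1304: 13, 1306: 4}
```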
Fig. 14 shows a top view of an exemplary image pixel 212, represented in simplified notation, designed to at least partially address the color router's wavelength-based ability to route photons. Specifically, the exemplary image pixel 212 of FIG. 14 defines a photosensitive region 1400 designated for receiving red photons, a photosensitive region 1402 designated for receiving green photons, a photosensitive region 1404 also designated for receiving green photons, and a photosensitive region 1406 designated for receiving blue photons. The exemplary image pixel 212 defines a length L, a width W, and a longest dimension H, where H is the hypotenuse of the triangle defined by the length L and the width W.
The exemplary photosensitive region 1400 designated for red light defines a collection area having a first shape. In the example of fig. 14, the shape is an L-shape with a long portion defining a portion of the first boundary of image pixel 212 and a short portion extending toward the center of image pixel 212; the L-shape has a width defined by a short dimension. The exemplary photosensitive region 1402 designated for green light defines an L-shaped collection area with a long portion defining a portion of the second boundary of the image pixel and a short portion extending toward the center of the image pixel. The long portion of photosensitive region 1402 and the short dimension of the long portion of photosensitive region 1400 together define the second boundary of image pixel 212. The exemplary photosensitive region 1406 designated for blue light defines an L-shape with a long portion defining a portion of the third boundary of image pixel 212 and a short portion extending toward the center of image pixel 212. The long portion of photosensitive region 1406 and the short dimension of the long portion of photosensitive region 1402 together define the third boundary of image pixel 212. The exemplary photosensitive region 1404 designated for green light defines an L-shape with a long portion defining a portion of the fourth boundary of image pixel 212 and a short portion extending toward the center of image pixel 212. The long portion of photosensitive region 1404 and the short dimension of the long portion of photosensitive region 1406 together define the fourth boundary of image pixel 212. Finally, the long portion of photosensitive region 1400 and the short dimension of the long portion of photosensitive region 1404 together define the first boundary of image pixel 212.
Still referring to fig. 14, consider an exemplary red photon incident on router collection area 410. On average, such a red photon travels some horizontal distance, measured in the page plane of fig. 14, to reach the red photosensitive region 1400. However, given the shape of the collection area of red photosensitive region 1400, color router 400 (fig. 4) may more easily route such photons, because the longest horizontal distance such a red photon may travel within color router 400 is half the longest dimension of image pixel 212. For the worst case, consider a red photon entering color router 400 at location 1410 in the upper right corner of image pixel 212 in the view of FIG. 14 (i.e., at the outermost point of the long portion of L-shaped photosensitive region 1402); that red photon need only be routed a distance of half the longest dimension of the image pixel (here, half of hypotenuse H). The same description applies to the longest horizontal travel distance of a blue photon. For green photons, the travel distance is shorter still: for square image pixels, one quarter of the length or width. Such a layout may therefore make it easier for color router 400 to route photons, and thus may make the design of the color router simpler. Other layout shapes are possible, as are other color photosensitivities, such as RYYCy. And, as previously described, the exemplary layout of fig. 14 may be a unit cell that is replicated multiple times, e.g., four times, throughout image pixel 212.
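As a quick numerical check of these routing distances, the sketch below assumes a hypothetical square pixel with a 2.0 um side; the side length is illustrative only.

```python
import math

# Hypothetical square image pixel; the 2.0 um side is an assumed value.
L = W = 2.0
H = math.hypot(L, W)       # longest dimension of the pixel (the hypotenuse)
max_red_or_blue = H / 2    # worst case: photon enters at the far corner
green_distance = L / 4     # shorter distance for the two green L-shapes

print(f"hypotenuse H = {H:.3f} um")
print(f"longest red/blue routing distance = {max_red_or_blue:.3f} um")
print(f"green routing distance = {green_distance:.3f} um")
```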
In the exemplary image pixel 212 of FIG. 14, photosensitive regions 1400, 1402, 1404, and 1406 are assumed to be contiguous regions within their respective L-shapes. However, in further instances, the underlying photosensitive region designated for receiving photons of a particular wavelength may be a composite collection region comprised of a plurality of discrete regions. Fig. 15 shows a top view of an exemplary image pixel 212 in simplified notation, wherein the image pixel has a plurality of discrete regions that collectively define the L-shaped photosensitive regions of fig. 14. Specifically, the exemplary L-shaped photosensitive region 1400 is defined by discrete photosensitive regions 1500, 1502, 1504, and 1506. In the example of fig. 15, each of the discrete photosensitive regions 1500, 1502, 1504, and 1506 defines a square collection area having sides of the short-dimension length. However, the collection areas need not be square, so long as the collection areas, taken together, define the desired shape (here, an L-shape). Further, while fig. 15 shows four discrete photosensitive regions, two or more discrete photosensitive regions may together form the exemplary photosensitive region 1400. The exemplary image pixel 212 similarly shows photosensitive regions 1402, 1404, and 1406 made up of discrete photosensitive regions, but the description of their discrete composition is the same as for photosensitive region 1400 and is not repeated to avoid unduly lengthening the description.
Other considerations for the design of color router 400 may include phase detection autofocus (PDAF) considerations. That is, color router 400 may be designed and constructed to route photons to respective photosensitive regions based not only on wavelength, but also on the physical location at which the photons enter the router collection area of color router 400. Fig. 16 shows a top view of an exemplary image pixel 212 in simplified notation, wherein the image pixel 212 has the same photosensitive regions as shown in fig. 14 and the same discrete elements as shown in fig. 15. Although color router 400 is not visible in the simplified notation of fig. 16, color router 400 may nevertheless be designed and constructed to route photons of a particular color designated for PDAF based on the location at which the photons reach the router collection area.
To describe this operation, fig. 16 includes a vertical dashed line 1600 and a horizontal dashed line 1602. The exemplary vertical line 1600 and horizontal line 1602 conceptually (although not necessarily physically) divide the collection area of color router 400 and the underlying photosensitive regions into four sub-portions, namely quadrants 1604, 1606, 1608, and 1610. In this example, green is the color designated for PDAF and is routed as a function of location within the collection area, although any color may be selected in other implementations. In the example of fig. 16, for the given PDAF color (green), the four sub-portions are each composed of an equal number of discrete photosensitive regions. That is, quadrants 1604, 1606, 1608, and 1610 each include two discrete photosensitive regions designated for green photons. Having two discrete photosensitive regions in each sub-portion is merely one example. Each sub-portion need only have the same collection area for the designated color as the corresponding sub-portion across the dividing line (e.g., the vertical or horizontal dashed line). Further, the collection area in each sub-portion may be defined by a single photosensitive region or a plurality of discrete photosensitive regions.
Using image pixel 212 of fig. 16, green photons reaching or crossing router collection area 410 (fig. 4) within quadrants 1604 and 1608 are routed to the photosensitive regions designated for green light below quadrants 1604 and 1608, respectively. Exemplary green photons reaching or crossing router collection area 410 (fig. 4) within quadrants 1606 and 1610 are routed to the photosensitive regions designated for green light directly below quadrants 1606 and 1610, respectively. In the case where image pixel 212 of fig. 16 is one cell of the entire pixel array 210 (fig. 2), focus problems may be detected based on a non-uniform phase or spatial distribution of the exemplary green photons reaching image pixel 212. For example, based on the routing described with respect to fig. 16, the phase or spatial distribution of green photons arriving at the underlying photosensitive regions may be weighted more toward one side of the vertical dashed line 1600 than the other. The precise phase imbalance may further be determined based on the position of image pixel 212 within the entire pixel array 210, the orientation of image pixel 212 relative to the scene, and/or the degree of scene defocus. Thus, by detecting a phase or spatial imbalance across the exemplary vertical dashed line 1600 associated with the image pixel 212 of fig. 16, the magnitude and position of the phase imbalance may be used to automatically adjust the focus of the lens 104 (fig. 1) of the camera module 102 (fig. 1).
The same reasoning applies to the quadrants of fig. 16 that are divided by the horizontal dashed line 1602. For example, green photons reaching or crossing router collection area 410 (fig. 4) within quadrants 1604 and 1606 are routed to the photosensitive regions designated for green light below quadrants 1604 and 1606, respectively. Exemplary green photons reaching or crossing router collection area 410 within quadrants 1608 and 1610 are routed to the photosensitive regions designated for green light below quadrants 1608 and 1610, respectively. Focus problems may be detected based on the phase or spatial distribution of the exemplary green photons reaching image pixel 212. For example, based on the routing described with respect to fig. 16, the phase or spatial distribution of green photons arriving at the underlying photosensitive regions may be weighted more toward one side of the horizontal dashed line 1602 than the other. The precise phase imbalance may further be determined based on the position of image pixel 212 within the entire pixel array 210, the orientation of image pixel 212 relative to the scene, and/or the degree of scene defocus. Thus, by detecting a phase or spatial imbalance across the exemplary horizontal dashed line 1602 associated with the image pixel 212 of fig. 16, the magnitude and position of the phase imbalance may be used to automatically adjust the focus of the lens 104 (fig. 1) of the camera module 102 (fig. 1).
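A minimal sketch of the imbalance comparison just described follows, assuming hypothetical per-quadrant sums of the green (PDAF-color) signals; the quadrant labels follow fig. 16, and the normalization is one common choice rather than anything this disclosure specifies.

```python
# Assumed per-quadrant sums of the green (PDAF) signals, in arbitrary units.
green = {1604: 52.0, 1606: 47.0, 1608: 51.0, 1610: 46.0}

def imbalance(a: float, b: float) -> float:
    """Signed, normalized phase imbalance; 0.0 suggests focus on this axis."""
    return (a - b) / (a + b)

# Compare the two sides of the vertical dashed line 1600 ...
left, right = green[1604] + green[1608], green[1606] + green[1610]
# ... and the two sides of the horizontal dashed line 1602.
top, bottom = green[1604] + green[1606], green[1608] + green[1610]

print(f"vertical-line imbalance:   {imbalance(left, right):+.3f}")
print(f"horizontal-line imbalance: {imbalance(top, bottom):+.3f}")
# The sign and magnitude could then drive the adjustment of lens 104.
```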
In the exemplary PDAF considerations discussed with respect to fig. 16, green is the exemplary PDAF color. Because the exemplary image pixel 212 has RGGB photosensitivity, the same number of discrete green photosensitive regions lies below each quadrant of the color router 400. Alternatively, for RGGB photosensitivity, red or blue light may be selected as the PDAF color. With red light as the designated PDAF color, the exemplary image pixel 212 has an equal number of discrete red photosensitive regions on each side of horizontal line 1602. That is, the upper subsection of the exemplary image pixel 212, including quadrants 1604 and 1606, has the same number of discrete red photosensitive regions as the lower subsection, including quadrants 1608 and 1610. Thus, the PDAF color need not have an equal number of discrete photosensitive regions in all four quadrants.
In some examples, an image pixel may be associated with a collimator or other device that reduces the angle of incidence of incident photons. As described above, the ability of color router 400 to route based on wavelength and/or location may be compromised when photons contact the router collection area at high angles of incidence. Thus, reducing the angle of incidence of incident photons to the color router may improve routing efficiency.
Fig. 17 shows a cross-sectional view of an exemplary image pixel 212. Specifically, a cross-sectional view of image pixel 212 depicts four photosensitive regions 1700, 1702, 1704, and 1706. Photosensitive regions 1700, 1702, 1704, and 1706 can be any combination of color sensitivity, such as those discussed above. Located above photosensitive areas 1700, 1702, 1704, and 1706 is color router 400. The exemplary image pixel 212 also defines a collimator 1710 disposed above the color router 400. In the example shown, collimator 1710 abuts color router 400, but in other cases, one or more additional layers, such as an oxide layer, may be located between collimator 1710 and color router 400.
As the name suggests, collimator 1710 is designed and constructed to at least partially collimate photons collected by collimator 1710 before they are incident on router collection area 410. In other words, collimator 1710 is designed and constructed to modify the angle of incidence of at least some photons before they are incident on router collection area 410. Collimator 1710 may take any suitable form, such as a set of parallel walls forming a grid pattern. The collimator may also be designed and constructed from a plurality of three-dimensional structures (e.g., rectangular parallelepipeds) of materials having different refractive indices and/or different dimensions. For example, the three-dimensional structures of the collimator layer may be selectively made of silicon dioxide and silicon nitride to modify the angle of incidence.
Fig. 18 shows an exemplary collimator 1710 in the form of microlenses. Fig. 18 shows the same underlying assembly as fig. 17; however, in fig. 18, collimator 1710 is schematically shown as microlenses 1800 and 1802. That is, exemplary microlens 1800 is disposed above color router 400 and directly above photosensitive regions 1700 and 1702, and exemplary microlens 1802 is disposed above color router 400 and directly above photosensitive regions 1704 and 1706. Although the example of fig. 18 shows each microlens associated with at least two photosensitive regions, a given microlens may be associated with any non-zero number of photosensitive regions. For example, a single microlens may span one photosensitive region, one image pixel 212, or multiple image pixels. Furthermore, the distribution of microlenses across the pixel array 210 (fig. 2), which may include hundreds to thousands of image pixels 212, may be non-uniform. For example, the lens 104 (fig. 1) may tend to collimate photons incident on the center of the pixel array 210 better than photons incident on the periphery of the pixel array. In this case, the center region of the pixel array 210 may have fewer, or even no, collimators 1710 in the form of microlenses, while the density of microlenses per unit area of the pixel array 210 may increase with increasing distance from the center of the pixel array 210. The increase in density may be linear, exponential, or otherwise.
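A density ramp of this kind could be expressed along the following lines; the linear profile, the 64x64 grid of cells, and the probabilistic placement are all assumptions for illustration, since the disclosure also permits exponential or other profiles.

```python
import math
import random

def place_microlens(x: int, y: int, nx: int, ny: int, rng: random.Random) -> bool:
    """Place a microlens over cell (x, y) with probability rising linearly
    from 0 at the array center to 1 at the corners (an assumed profile)."""
    cx, cy = (nx - 1) / 2, (ny - 1) / 2
    r = math.hypot(x - cx, y - cy)
    r_max = math.hypot(cx, cy)
    return rng.random() < (r / r_max)

rng = random.Random(0)  # fixed seed so the layout is reproducible
layout = [[place_microlens(x, y, 64, 64, rng) for x in range(64)]
          for y in range(64)]
print(sum(map(sum, layout)), "of 4096 cells receive microlenses")
```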
For the design considerations of color router 400, as described above, color router 400 may take any suitable form. In many cases, color router 400 is made up of several tiers or layers, with each layer designed and constructed to perform at least a portion of the routing. Each layer may be designed and constructed using multiple three-dimensional structures of materials having different refractive indices and/or different dimensions. For example, the three-dimensional structure may be a cuboid or other form. The material may include a dielectric material and/or a metallic material. To reduce the complexity of construction, constraints may be imposed on the size of the elements that make up each layer of the color router.
Fig. 19 shows an exploded side view of an exemplary color router 400. In particular, the exemplary color router 400 includes three levels or tiers 1900, 1902, and 1904. Although three representative layers are shown, color router 400 may include two or more layers, and thus the three layers shown in fig. 19 should not be construed as limiting. Layer 1900 is a first layer for receiving incident photons, and thus router collection area 410 may be defined by an upper surface of layer 1900. An example layer 1902 is located between layers 1900 and 1904. Finally, example layer 1904 defines the lowest layer, and the bottom surface of layer 1904 outputs the routed photons to a collection area (not shown in fig. 19) of the photosensitive region of the image pixel.
In an exemplary case, each layer is composed of cuboids of a particular size or volume. For example, each cuboid of layer 1900 defines a first dimension or first volume. Each cuboid of layer 1902 defines a second dimension or second volume, the second dimension being greater than the first dimension and the second volume being greater than the first volume. Each cuboid of layer 1904 defines a third dimension or third volume, the third dimension being greater than the second dimension and the third volume being greater than the second volume. Even though the examples of figs. 19-20 show a linear change in cuboid dimensions from layer to layer, the dimensional change across layers may follow any order. Although only three illustrative layers are shown in the example of fig. 19, each additional, lower layer in color router 400 may be composed of cuboids larger in dimension or volume than those of the layers above. The size or volume used in each layer may be set based on design constraints, which may reduce the complexity of constructing the color router during semiconductor fabrication or processing. In other examples, different combinations of cuboid dimensions or volumes within a given layer may be used.
In some cases, the cuboids of each example layer are made of the same material (e.g., silicon oxide, silicon nitride, or titanium oxide) embedded in a lower-refractive-index material (e.g., silicon oxide and/or air). However, in other cases, the cuboids of each layer may be made of a material different from that of the layers above and/or below. For example, a particular layer may be made of an oxide, an adjoining layer may be made of a nitride, and another layer may be made of a metal (e.g., gold or silver). Maintaining the same material for the cuboids in every layer may simplify the design and/or construction of color router 400. In other examples, different combinations of materials within a given layer may be used.
Fig. 20 shows an exploded side view of another exemplary color router 400. In particular, the exemplary color router 400 of FIG. 20 includes three levels or tiers 2000, 2002, and 2004. Although three representative layers are shown, color router 400 may include two or more layers, and thus the three layers shown in fig. 20 should not be construed as limiting. Layer 2000 is the first layer to receive incident photons, so router collection area 410 may be defined by the upper surface of layer 2000. Example layer 2002 is located between example layers 2000 and 2004. Finally, the example layer 2004 defines a lowermost layer, and the bottom surface of the layer 2004 outputs the routed photons to a collection area (not shown in fig. 20) of the photosensitive region of the image pixel.
As previously described, each layer in fig. 20 is composed of cuboids of a particular size or volume, but in a size pattern opposite to that of fig. 19. For example, each cuboid of layer 2000 defines a first dimension or first volume. Each cuboid of layer 2002 defines a second dimension or second volume, the second dimension being smaller than the first dimension and the second volume being smaller than the first volume. Each cuboid of layer 2004 defines a third dimension or third volume, the third dimension being smaller than the second dimension and the third volume being smaller than the second volume. Even though the example of fig. 20 shows a linear change in cuboid dimensions from layer to layer, the dimensional change across layers may follow any order. Although only three illustrative layers are shown in the example of fig. 20, each additional, lower layer in color router 400 may be composed of cuboids smaller in dimension or volume than those of the layers above. The size or volume used in each layer may be set based on design constraints, which may reduce the complexity of constructing the color router during semiconductor fabrication or processing. In other examples, different combinations of cuboid dimensions or volumes within a given layer may be used.
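The monotonic size patterns that figs. 19 and 20 illustrate can be expressed as a small check, sketched below; the nanometer values are placeholders, not dimensions from this disclosure.

```python
# Placeholder per-layer cuboid sizes in nanometers, listed top to bottom.
fig19_layers = [80.0, 120.0, 160.0]   # growing toward the bottom (fig. 19)
fig20_layers = [160.0, 120.0, 80.0]   # shrinking toward the bottom (fig. 20)

def monotonic(sizes: list[float], increasing: bool) -> bool:
    """True if sizes strictly grow (or strictly shrink) layer by layer."""
    pairs = list(zip(sizes, sizes[1:]))
    return all(a < b for a, b in pairs) if increasing else all(a > b for a, b in pairs)

assert monotonic(fig19_layers, increasing=True)
assert monotonic(fig20_layers, increasing=False)
```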
The above discussion is meant to be illustrative of the principles and various embodiments of the present invention. Numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.

Claims (12)

1. An image sensor, the image sensor comprising:
a plurality of image pixels, each image pixel comprising:
a color router defining a router collection area on an upper surface;
a first photosensitive area located below the color router;
a second photosensitive area located below the color router;
a third photosensitive area located below the color router; and
the color router is configured to: routing photons of a first wavelength received at the router collection region to the first photosensitive region, routing photons of a second wavelength received at the router collection region to the second photosensitive region, and routing photons of a third wavelength received at the router collection region to the third photosensitive region.
2. The image sensor of claim 1, wherein:
when the color router routes photons of a first wavelength, the color router is further configured to route photons having a wavelength corresponding to red light to the first photosensitive region;
when the color router routes photons of a second wavelength, the color router is further configured to route photons having a wavelength corresponding to yellow light to the second photosensitive region; and
when the color router routes photons of a third wavelength, the color router is further configured to route photons having a wavelength corresponding to blue light to the third photosensitive region.
3. The image sensor of claim 1, wherein:
the first photosensitive region defines a first collection area;
the second photosensitive region defines a second collection area smaller than the first collection area;
the third photosensitive region defines a third collection area smaller than the second collection area; and
wherein the first wavelength is longer than the second wavelength and the second wavelength is longer than the third wavelength.
4. The image sensor of claim 1, wherein each image pixel defines a long dimension measured parallel to the router collection area, and each image pixel further comprises:
the first photosensitive region defines a collection area defining a first shape;
the second photosensitive region defining a collection area defining a second shape;
The third photosensitive region defines a collection area defining a third shape;
wherein the first shape, the second shape, and the third shape are configured such that a longest horizontal distance that photons are routed through the color router is half of the long dimension.
5. An imaging system, the imaging system comprising:
an imaging controller;
a camera module, comprising:
a lens system coupled to the imaging controller; and
a plurality of image pixels in operative relationship with the lens system and communicatively coupled to the imaging controller, each image pixel comprising:
a color router defining a router collection area on an upper surface;
a first photosensitive area located below the color router;
a second photosensitive area located below the color router;
a third photosensitive area located below the color router; and
the color router is configured to: the method includes routing photons of a first wavelength received at the router collection region to the first photosensitive region, routing photons of a second wavelength received at the router collection region to the second photosensitive region, and routing photons of a third wavelength received at the router collection region to the third photosensitive region.
6. The imaging system of claim 5, wherein:
when the color router routes photons of a first wavelength, the color router is further configured to route photons having a wavelength corresponding to red light to the first photosensitive region;
when the color router routes photons of a second wavelength, the color router is further configured to route photons having a wavelength corresponding to yellow light to the second photosensitive region; and
wherein when the color router routes photons of a third wavelength, the color router is further configured to route photons having a wavelength corresponding to green light to the third photosensitive region.
7. The imaging system of claim 5, wherein:
the first photosensitive region defines a first collection area;
the second photosensitive region defines a second collection area smaller than the first collection area;
the third photosensitive region defines a third collection area smaller than the second collection area; and
wherein the first wavelength is longer than the second wavelength and the second wavelength is longer than the third wavelength.
8. The imaging system of claim 5, wherein each image pixel defines a long dimension measured parallel to the router collection area, and each image pixel further comprises:
The first photosensitive region defines a collection area defining a first shape;
the second photosensitive region defining a collection area defining a second shape;
the third photosensitive region defines a collection area defining a third shape;
wherein the first shape, the second shape, and the third shape are configured such that a longest horizontal distance that photons are routed through the color router is half of the long dimension.
9. A method of operating an image sensor, the method comprising:
directing photons from the scene into a color router located over a plurality of photosensitive regions;
routing, by the color router, photons of a first wavelength to a first photosensitive region located below the color router;
routing photons of a second wavelength to a second photosensitive region located below the color router; and
photons of a third wavelength are routed to a third photosensitive region located below the color router.
10. The method of claim 9, wherein the first wavelength corresponds to red light, the second wavelength corresponds to yellow light, and the third wavelength corresponds to green light.
11. The method according to claim 9, wherein:
The first photosensitive region defines a first collection area;
the second photosensitive region defines a second collection area smaller than the first collection area;
the third photosensitive region defines a third collection area smaller than the second collection area; and
wherein the first wavelength is longer than the second wavelength and the second wavelength is longer than the third wavelength.
12. The method of claim 9, wherein each image pixel defines a long dimension measured parallel to a router collection area of the color router, and wherein:
routing photons of the first wavelength further includes horizontally routing photons of the first wavelength no more than three-fourths of the long dimension to reach the first photosensitive region;
routing photons of the second wavelength further includes horizontally routing photons of the second wavelength no more than three-fourths of the long dimension to reach the second photosensitive region; and
routing photons of the third wavelength further includes horizontally routing photons of the third wavelength no more than three-fourths of the long dimension to reach the third photosensitive region.
CN202310073700.2A 2022-01-14 2023-01-13 Image sensor, imaging system and method of operating an image sensor Pending CN116456178A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US63/266,804 2022-01-14
US18/065,992 US20230230987A1 (en) 2022-01-14 2022-12-14 Imaging systems, and image pixels and related methods
US18/065,992 2022-12-14

Publications (1)

Publication Number Publication Date
CN116456178A (en)

Family

ID=87126233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310073700.2A Pending CN116456178A (en) 2022-01-14 2023-01-13 Image sensor, imaging system and method of operating an image sensor

Country Status (1)

Country Link
CN (1) CN116456178A (en)

Similar Documents

Publication Publication Date Title
US10297629B2 (en) Image sensors with in-pixel lens arrays
CN109981939B (en) Imaging system
US10015416B2 (en) Imaging systems with high dynamic range and phase detection pixels
US9945718B2 (en) Image sensors with multi-functional pixel clusters
US8223234B2 (en) Solid-state imaging device and imaging apparatus having pixels portions over a substrate
CN211577919U (en) Fingerprint identification device and electronic equipment
US10158843B2 (en) Imaging pixels with depth sensing capabilities
US20170295327A1 (en) Multi-spectral imaging with diffractive optics
CN211404505U (en) Image sensor with a plurality of pixels
US20180301484A1 (en) Image sensors with high dynamic range and autofocusing hexagonal pixels
CN211404504U (en) Image sensor with a plurality of pixels
US20080080028A1 (en) Imaging method, apparatus and system having extended depth of field
KR20110069889A (en) Image sensor having multiple sensing layers and its method of operation and fabrication
WO2011053711A1 (en) Systems and methods for color binning
US10608029B2 (en) Image sensors with color filter variations
US20210195072A1 (en) Imaging device and image sensing method
CN212696099U (en) Imaging system, image sensor packaging device and image sensor
CN116456178A (en) Image sensor, imaging system and method of operating an image sensor
CN110890390B (en) Image sensor including a plurality of imaging pixels
CN208538861U (en) Imaging sensor and imaging pixel including multiple imaging pixels
US20230230987A1 (en) Imaging systems, and image pixels and related methods
WO2021077374A1 (en) Image sensor, imaging apparatus, and mobile platform
KR20210056754A (en) Image Sensor
US20230230989A1 (en) Image pixels having ir sensors with reduced exposure to visible light
CN114827451A (en) Imaging system with electronic shutter

Legal Events

Date Code Title Description
PB01 Publication