CN112235494A - Image sensor, control method, imaging apparatus, terminal, and readable storage medium - Google Patents

Info

Publication number: CN112235494A (application CN202011105845.9A)
Granted publication: CN112235494B
Authority: CN (China)
Other languages: Chinese (zh)
Inventor: 杨鑫
Assignee (original and current): Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority: CN202011105845.9A; related divisional application CN202210323572.8A (published as CN114845015A)
Legal status: Granted, Active
Prior art keywords: pixel, color, panchromatic, phase detection

Classifications

    • H — ELECTRICITY → H04 — ELECTRIC COMMUNICATION TECHNIQUE → H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; control thereof → H04N23/50 Constructional details → H04N23/55 Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N23/00 → H04N23/60 Control of cameras or camera modules → H04N23/67 Focus control based on electronic image sensor signals → H04N23/672 Focus control based on the phase difference signals
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; control thereof → H04N25/70 SSIS architectures; circuits associated therewith → H04N25/703 SSIS architectures incorporating pixels for producing signals other than image signals → H04N25/704 Pixels specially adapted for focusing, e.g. phase difference pixel sets

Abstract

The application discloses an image sensor, a control method, an imaging device, a terminal, and a computer-readable storage medium. The image sensor includes a pixel array and a lens array. The pixel array includes color pixels, panchromatic pixels, at least one color phase detection pixel pair, and at least one panchromatic phase detection pixel pair. A panchromatic phase detection pixel pair includes two adjacent panchromatic pixels, whose phase information is used for focus detection. A color phase detection pixel pair includes two adjacent color pixels of the same color, whose phase information is likewise used for focus detection. The lens array includes a plurality of first lenses; each panchromatic phase detection pixel pair is covered by a single first lens, and each color phase detection pixel pair is covered by a single first lens. By providing both panchromatic and color phase detection pixel pairs, accurate focusing can be achieved in scenes of different ambient brightness.

Description

Image sensor, control method, imaging apparatus, terminal, and readable storage medium
Technical Field
The present disclosure relates to the field of imaging technologies, and in particular, to an image sensor, a control method, an imaging device, a terminal, and a computer-readable storage medium.
Background
With the development of electronic technology, terminals with camera functions have become ubiquitous in daily life. The focusing methods currently used in mobile phone photography are mainly contrast focusing and phase focusing. Contrast focusing is accurate but slow. Phase focusing is fast, but the phase focusing available on the market is implemented on color sensors, whose focusing performance is poor in dark environments.
Disclosure of Invention
The embodiment of the application provides an image sensor, a control method, an imaging device, a terminal and a computer readable storage medium.
The embodiment of the application provides an image sensor. The image sensor includes a pixel array and a lens array. The pixel array includes a plurality of color pixels, a plurality of panchromatic pixels, at least one color phase detection pixel pair, and at least one panchromatic phase detection pixel pair. The panchromatic phase detection pixel pair includes two adjacent panchromatic pixels, whose phase information is used for focus detection. The color phase detection pixel pair includes two adjacent color pixels of the same color, the color pixels having a narrower spectral response than the panchromatic pixels; the phase information of these color pixels is used for focus detection. The lens array includes a plurality of first lenses, each panchromatic phase detection pixel pair being covered by the same first lens and each color phase detection pixel pair being covered by the same first lens.
The embodiment of the application provides a control method for an image sensor. The image sensor includes a pixel array and a lens array. The pixel array includes a plurality of color pixels, a plurality of panchromatic pixels, at least one color phase detection pixel pair, and at least one panchromatic phase detection pixel pair. The panchromatic phase detection pixel pair includes two adjacent panchromatic pixels. The color phase detection pixel pair includes two adjacent color pixels of the same color, the color pixels having a narrower spectral response than the panchromatic pixels. The lens array includes a plurality of first lenses, each panchromatic phase detection pixel pair being covered by the same first lens and each color phase detection pixel pair being covered by the same first lens. The control method includes: when the ambient brightness is less than a first threshold, performing focus detection for focusing according to the phase information of the panchromatic pixels in the panchromatic phase detection pixel pair; and when the ambient brightness is greater than a second threshold, performing focus detection for focusing according to the phase information of the color pixels in the color phase detection pixel pair, where the second threshold is greater than the first threshold.
The embodiment of the application provides an imaging device. The imaging device includes a lens and an image sensor; the image sensor receives light passing through the lens. The image sensor includes a pixel array and a lens array. The pixel array includes a plurality of color pixels, a plurality of panchromatic pixels, at least one color phase detection pixel pair, and at least one panchromatic phase detection pixel pair. The panchromatic phase detection pixel pair includes two adjacent panchromatic pixels, whose phase information is used for focus detection. The color phase detection pixel pair includes two adjacent color pixels of the same color, the color pixels having a narrower spectral response than the panchromatic pixels; the phase information of these color pixels is used for focus detection. The lens array includes a plurality of first lenses, each panchromatic phase detection pixel pair being covered by the same first lens and each color phase detection pixel pair being covered by the same first lens.
The embodiment of the application provides a terminal. The terminal includes a housing and an imaging device coupled to the housing. The imaging device includes a lens and an image sensor; the image sensor receives light passing through the lens. The image sensor includes a pixel array and a lens array. The pixel array includes a plurality of color pixels, a plurality of panchromatic pixels, at least one color phase detection pixel pair, and at least one panchromatic phase detection pixel pair. The panchromatic phase detection pixel pair includes two adjacent panchromatic pixels, whose phase information is used for focus detection. The color phase detection pixel pair includes two adjacent color pixels of the same color, the color pixels having a narrower spectral response than the panchromatic pixels; the phase information of these color pixels is used for focus detection. The lens array includes a plurality of first lenses, each panchromatic phase detection pixel pair being covered by the same first lens and each color phase detection pixel pair being covered by the same first lens.
The embodiment of the application provides a non-transitory computer-readable storage medium containing a computer program. When executed by a processor, the computer program causes the processor to perform a control method for an image sensor. The image sensor includes a pixel array and a lens array. The pixel array includes a plurality of color pixels, a plurality of panchromatic pixels, at least one color phase detection pixel pair, and at least one panchromatic phase detection pixel pair. The panchromatic phase detection pixel pair includes two adjacent panchromatic pixels. The color phase detection pixel pair includes two adjacent color pixels of the same color, the color pixels having a narrower spectral response than the panchromatic pixels. The lens array includes a plurality of first lenses, each panchromatic phase detection pixel pair being covered by the same first lens and each color phase detection pixel pair being covered by the same first lens. The control method includes: when the ambient brightness is less than a first threshold, performing focus detection for focusing according to the phase information of the panchromatic pixels in the panchromatic phase detection pixel pair; and when the ambient brightness is greater than a second threshold, performing focus detection for focusing according to the phase information of the color pixels in the color phase detection pixel pair, where the second threshold is greater than the first threshold.
According to the image sensor, the control method, the imaging device, the terminal, and the computer-readable storage medium of the embodiments, color phase detection pixel pairs and panchromatic phase detection pixel pairs are both arranged in the image sensor, and focus detection is performed according to the phase information of the panchromatic pixels in the panchromatic phase detection pixel pairs and/or the phase information of the color pixels in the color phase detection pixel pairs. Accurate focusing can therefore be achieved in scenes of different ambient brightness, especially in dark environments, improving the scene adaptability of the image sensor.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic diagram of an image sensor of certain embodiments of the present application;
FIG. 2 is a schematic diagram of a pixel circuit according to some embodiments of the present application;
FIG. 3 is a schematic illustration of different color channel exposure saturation times;
FIGS. 4-13 are schematic diagrams of pixel arrangements and lens coverage of the minimal repeating unit according to some embodiments of the present disclosure;
FIGS. 14-19 are schematic partial cross-sectional views of image sensors of certain embodiments of the present application;
FIG. 20 is a schematic flow chart diagram of a control method according to certain embodiments of the present application;
FIG. 21 is a schematic structural view of an imaging device according to certain embodiments of the present application;
FIGS. 22-25 are schematic flow charts of control methods according to certain embodiments of the present application;
FIG. 26 is a schematic diagram of a mobile terminal of some embodiments of the present application;
FIG. 27 is a diagrammatic representation of the interaction of a non-volatile computer readable storage medium and a processor in accordance with certain embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the embodiments of the present application.
Referring to fig. 1 and 4, an image sensor 10 is provided in the embodiments of the present application. The image sensor 10 includes a pixel array 11 and a lens array 16. The pixel array 11 includes a plurality of color pixels, a plurality of panchromatic pixels W, at least one color phase detection pixel pair 17, and at least one panchromatic phase detection pixel pair 18. The panchromatic phase detection pixel pair 18 includes two adjacent panchromatic pixels W, whose phase information is used for focus detection; the color phase detection pixel pair 17 includes two adjacent color pixels of the same color, the color pixels having a narrower spectral response than the panchromatic pixels W, and the phase information of these color pixels is used for focus detection. The lens array 16 includes a plurality of first lenses 161; each panchromatic phase detection pixel pair 18 is covered by the same first lens 161, and each color phase detection pixel pair 17 is covered by the same first lens 161.
The image sensor 10 of the embodiments of the present application is provided with color phase detection pixel pairs 17 and panchromatic phase detection pixel pairs 18, and performs focus detection according to the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pairs 18 and/or the phase information of the color pixels in the color phase detection pixel pairs 17, so that accurate focusing can be achieved in scenes of different ambient brightness, especially in dark environments, improving the scene adaptability of the image sensor 10.
The basic structure of the image sensor 10 will be described first. Referring to fig. 1, fig. 1 is a schematic structural diagram of an image sensor 10 according to an embodiment of the present disclosure. The image sensor 10 includes a pixel array 11 and a lens array 16; the lens array 16 and the pixel array 11 are arranged in this order along the light receiving direction of the image sensor 10.
The image sensor 10 may employ a Complementary Metal Oxide Semiconductor (CMOS) photosensitive element or a Charge-coupled Device (CCD) photosensitive element.
The pixel array 11 includes a plurality of pixels 1101 arranged two-dimensionally in an array, each pixel 1101 including a photoelectric conversion element 117 (shown in fig. 2 or 14) and a filter 118 (shown in fig. 14). Each pixel 1101 converts light into electric charge according to the intensity of the light incident on it, and the filter 118 passes light in certain wavelength bands while filtering out the rest. The pixel array 11 includes color pixels and panchromatic pixels W, the color pixels having a narrower spectral response than the panchromatic pixels. Referring to fig. 4, the pixel array 11 includes at least one color phase detection pixel pair 17 and at least one panchromatic phase detection pixel pair 18; the color phase detection pixel pair 17 includes two adjacently disposed color pixels of the same color, and the panchromatic phase detection pixel pair 18 includes two adjacently disposed panchromatic pixels W.
The lens array 16 includes a plurality of first lenses 161 and a plurality of second lenses 162; each second lens 162 covers a smaller area than each first lens 161. Each panchromatic phase detection pixel pair 18 is covered by the same first lens 161 (as shown in fig. 16 or 19), and each color phase detection pixel pair 17 is covered by the same first lens 161 (as shown in fig. 15 or 18). Because two adjacent pixels 1101 share the same first lens 161, focus detection for focusing can be performed from the phase information obtained by these two pixels 1101; that is, the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pair 18 is used for focus detection, and the phase information of the color pixels in the color phase detection pixel pair 17 is used for focus detection. In the pixel array 11, every pixel 1101 other than those in the color phase detection pixel pairs 17 and the panchromatic phase detection pixel pairs 18 is covered by one second lens 162, so that light incident on the image sensor 10 is converged by the second lens 162 onto the pixel 1101 below it.
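To make the phase detection principle concrete, the following Python sketch (an editor's illustration, not taken from the patent; all function and variable names are hypothetical) estimates the shift between the two half-pupil views produced by the left and right pixels of the pairs under shared first lenses; the shift indicates the direction and rough amount of defocus:

    import numpy as np

    def phase_shift(left, right, max_shift=8):
        """Estimate the displacement between the profiles gathered from the
        left and right pixels of many phase detection pairs. Zero shift
        means in focus; the sign tells which way to drive the lens."""
        left = (left - left.mean()) / (left.std() + 1e-9)
        right = (right - right.mean()) / (right.std() + 1e-9)
        best, best_score = 0, -np.inf
        for s in range(-max_shift, max_shift + 1):
            # correlate over a window that excludes the wrapped-around edges
            score = np.sum(left[max_shift:-max_shift] *
                           np.roll(right, s)[max_shift:-max_shift])
            if score > best_score:
                best, best_score = s, score
        return best

    # Synthetic test: an edge seen 3 pixels apart by the two half-pupil
    # views, as happens when the scene is out of focus.
    x = np.linspace(0, 1, 64)
    edge = 1 / (1 + np.exp(-40 * (x - 0.5)))
    print(phase_shift(edge, np.roll(edge, 3)))  # prints -3: the shift that realigns the views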
Fig. 2 is a schematic diagram of a pixel circuit 110 according to an embodiment of the present disclosure. The operation principle of the pixel circuit 110 will be described with reference to fig. 1 and 2.
As shown in fig. 1 and 2, the pixel circuit 110 includes a photoelectric conversion element 117 (e.g., a photodiode PD), an exposure control circuit 116 (e.g., a transfer transistor 112), a reset circuit (e.g., a reset transistor 113), an amplification circuit (e.g., an amplification transistor 114), and a selection circuit (e.g., a selection transistor 115). In the embodiment of the present application, the transfer transistor 112, the reset transistor 113, the amplifying transistor 114, and the selection transistor 115 are, for example, MOS transistors, but are not limited thereto.
For example, referring to fig. 2, the gate TG of the transfer transistor 112 is connected to a vertical driving unit (not shown) of the image sensor 10 through an exposure control line (not shown); the gate RG of the reset transistor 113 is connected to the vertical driving unit through a reset control line (not shown); and the gate SEL of the selection transistor 115 is connected to the vertical driving unit through a selection line (not shown). The exposure control circuit 116 (e.g., the transfer transistor 112) in each pixel circuit 110 is electrically connected to the photoelectric conversion element 117 and transfers the charge accumulated by the photoelectric conversion element 117 after illumination. For example, the photoelectric conversion element 117 includes a photodiode PD whose anode is connected to, for example, ground. The photodiode PD converts received light into electric charge. The cathode of the photodiode PD is connected to the floating diffusion unit FD via the exposure control circuit 116 (e.g., the transfer transistor 112). The floating diffusion unit FD is connected to the gate of the amplification transistor 114 and the source of the reset transistor 113.
For example, the exposure control circuit 116 is the transfer transistor 112, and the control terminal TG of the exposure control circuit 116 is the gate of the transfer transistor 112. The transfer transistor 112 is turned on when a pulse of an effective level (for example, the VPIX level) is transmitted to the gate of the transfer transistor 112 through the exposure control line. The transfer transistor 112 transfers the charge photoelectrically converted by the photodiode PD to the floating diffusion unit FD.
For example, the drain of the reset transistor 113 is connected to the pixel power supply VPIX. A source of the reset transistor 113 is connected to the floating diffusion FD. Before the electric charges are transferred from the photodiode PD to the floating diffusion unit FD, a pulse of an active reset level is transmitted to the gate of the reset transistor 113 via the reset line, and the reset transistor 113 is turned on. The reset transistor 113 resets the floating diffusion unit FD to the pixel power supply VPIX.
For example, the gate of the amplification transistor 114 is connected to the floating diffusion FD. The drain of the amplifying transistor 114 is connected to the pixel power supply VPIX. After the floating diffusion FD is reset by the reset transistor 113, the amplification transistor 114 outputs a reset level through the output terminal OUT via the selection transistor 115. After the charge of the photodiode PD is transferred by the transfer transistor 112, the amplification transistor 114 outputs a signal level through the output terminal OUT via the selection transistor 115.
For example, the drain of the selection transistor 115 is connected to the source of the amplification transistor 114. The source of the selection transistor 115 is connected to a column processing unit (not shown in the figure) in the image sensor 10 through an output terminal OUT. When a pulse of an effective level is transmitted to the gate of the selection transistor 115 through the selection line, the selection transistor 115 is turned on. The signal output from the amplifying transistor 114 is transmitted to the column processing unit through the selection transistor 115.
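As a concrete illustration of the readout sequence just described (reset, sample the reset level, transfer, sample the signal level), here is a minimal Python model; the voltages and the conversion gain are assumptions for illustration only, not values from the patent:

    # Toy model of one 4T pixel readout. The steps mirror the text: RST
    # resets the floating diffusion (FD) near VPIX, the reset level is read
    # out through the amplifier and selection transistors, TG transfers the
    # photodiode charge onto FD, and the signal level is read out.
    VPIX = 3.3               # pixel supply voltage (illustrative)
    CONVERSION_GAIN = 0.05   # FD voltage swing per unit of charge (assumed)

    def read_pixel(photo_charge, reset_noise=0.02):
        fd = VPIX - reset_noise               # after reset, FD sits near VPIX
        reset_level = fd                      # first sample
        fd -= CONVERSION_GAIN * photo_charge  # TG on: charge lowers FD voltage
        signal_level = fd                     # second sample
        # Correlated double sampling: the difference cancels the reset
        # offset, leaving a value proportional to the collected light.
        return reset_level - signal_level

    print(read_pixel(photo_charge=20.0))  # 1.0, independent of reset_noise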
It should be noted that the pixel structure of the pixel circuit 110 in the embodiments of the present application is not limited to the structure shown in fig. 2. For example, the pixel circuit 110 may have a three-transistor pixel structure in which one transistor performs the functions of both the amplifying transistor 114 and the selection transistor 115. Likewise, the exposure control circuit 116 is not limited to a single transfer transistor 112; other electronic devices or structures whose conduction is controlled by a control terminal can serve as the exposure control circuit in the embodiments of the present application, although the single transfer transistor 112 is simple to implement, low in cost, and easy to control.
In an image sensor including pixels of a plurality of colors, pixels of different colors receive different amounts of exposure per unit time. When some colors are saturated, others have not yet been exposed to the ideal state. RGBW (red, green, blue, panchromatic) is used as an example in fig. 3. Referring to fig. 3, the horizontal axis represents exposure time, the vertical axis represents exposure amount, Q represents the saturated exposure amount, LW represents the exposure curve of the panchromatic pixel W, LG the exposure curve of the green pixel G, LR the exposure curve of the red pixel R, and LB the exposure curve of the blue pixel B.
As can be seen from fig. 3, the slope of the exposure curve LW of the panchromatic pixel W is the largest, i.e., the panchromatic pixel W obtains more exposure per unit time and reaches saturation at time t1. The slope of the exposure curve LG of the green pixel G is the second largest; the green pixel saturates at time t2. The slope of the exposure curve LR of the red pixel R is the third largest; the red pixel saturates at time t3. The slope of the exposure curve LB of the blue pixel B is the smallest; the blue pixel saturates at time t4. The exposure amount received by the panchromatic pixel W per unit time is thus larger than that received by a color pixel, that is, the sensitivity of the panchromatic pixel W is higher than that of the color pixels.
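The saturation ordering follows from simple arithmetic: if a channel accumulates exposure linearly with a slope proportional to its sensitivity, it saturates at t = Q / slope. The slopes below are illustrative values chosen only to reproduce the ordering of fig. 3:

    # Saturation time per channel under linear accumulation: t = Q / slope.
    Q = 1.0  # saturated exposure amount
    slopes = {"W": 4.0, "G": 2.0, "R": 1.5, "B": 1.0}  # assumed sensitivities

    for name, slope in slopes.items():
        print(f"{name}: saturates at t = {Q / slope:.2f}")
    # W: 0.25 (t1) < G: 0.50 (t2) < R: 0.67 (t3) < B: 1.00 (t4)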
If phase focusing were implemented using only the color phase detection pixel pairs 17, then in a bright environment the color pixels in the color phase detection pixel pairs 17 would receive plenty of light and output pixel information with a high signal-to-noise ratio, giving accurate phase focusing; in a dim environment, however, those color pixels receive little light, the signal-to-noise ratio of the output pixel information is low, and the accuracy of phase focusing drops. Conversely, if phase focusing were implemented using only the panchromatic phase detection pixel pairs 18, then in a dim environment the panchromatic pixels W in the panchromatic phase detection pixel pairs 18 would receive enough light to output pixel information with a high signal-to-noise ratio, giving accurate phase focusing; in a bright environment, however, the panchromatic pixels W easily saturate, and the resulting overexposure corrupts the phase information and lowers the accuracy of phase focusing.
For the above reasons, the image sensor 10 of the embodiments of the present application arranges both color phase detection pixel pairs 17 and panchromatic phase detection pixel pairs 18 in the pixel array 11; the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pairs 18 and the phase information of the color pixels in the color phase detection pixel pairs 17 are both used for focus detection. In this way, the image sensor 10 can focus accurately in scenes of different ambient brightness, improving its scene adaptability.
Note that the spectral response of each pixel 1101 (i.e., the color of light that the pixel 1101 can receive) is determined by the color of the filter 118 corresponding to that pixel 1101. Throughout this application color pixels and panchromatic pixels refer to pixels 1101 that are capable of responding to light of the same color as the corresponding filter 118.
In some example embodiments, the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pairs 18 is used for focus detection in a first brightness mode, and the phase information of the color pixels in the color phase detection pixel pairs 17 is used for focus detection in a second brightness mode. The brightness of the second brightness mode is greater than a first threshold, and the brightness of the first brightness mode is less than a second threshold. That is, when the brightness is greater than the first threshold, the imaging apparatus 100 (shown in fig. 21) is in the second brightness mode and performs focus detection based on the phase information of the color pixels in the color phase detection pixel pairs 17. Because the brightness exceeds the first threshold, those color pixels receive enough light to output pixel information with a high signal-to-noise ratio, so focusing based on their phase information is accurate. When the brightness is less than the second threshold, the imaging apparatus 100 is in the first brightness mode and performs focus detection based on the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pairs 18. Because the brightness is low, the panchromatic pixels W are unlikely to saturate, so focusing based on their phase information is likewise accurate.
In some embodiments, the first threshold is less than the second threshold. When the brightness is greater than the first threshold and less than the second threshold, the imaging apparatus 100 (shown in fig. 21) may be in either the first or the second brightness mode; focus detection may then be performed based only on the phase information of the color pixels in the color phase detection pixel pairs 17, based only on the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pairs 18, or based on both, which is not limited herein.
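The threshold logic of the two brightness modes can be summarized in a short sketch (an editor's illustration with hypothetical names; the patent does not prescribe code):

    def select_phase_source(brightness, t1, t2):
        """Choose which phase detection pairs drive focusing, following the
        thresholds described above (t1 < t2)."""
        assert t1 < t2
        if brightness < t1:
            return ["panchromatic"]        # dim scene: W pixels give higher SNR
        if brightness > t2:
            return ["color"]               # bright scene: W pixels may saturate
        # between the thresholds either source, or both, may be used
        return ["panchromatic", "color"]

    print(select_phase_source(brightness=5, t1=10, t2=100))    # ['panchromatic']
    print(select_phase_source(brightness=500, t1=10, t2=100))  # ['color']
    print(select_phase_source(brightness=50, t1=10, t2=100))   # both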
It should be noted that, in some embodiments, the color pixels in a color phase detection pixel pair 17 may be disposed in the same row (as shown in fig. 4) or in the same column (as shown in fig. 9); likewise, the panchromatic pixels W in a panchromatic phase detection pixel pair 18 may be disposed in the same row (as shown in fig. 4) or in the same column (as shown in fig. 9), without limitation.
In some embodiments, the number of panchromatic phase detection pixel pairs 18 in the pixel array 11 equals the number of color phase detection pixel pairs 17, i.e., their ratio is 1:1. If there are too many panchromatic phase detection pixel pairs 18 in the pixel array 11, focusing is accurate at low brightness, but at high brightness the panchromatic pixels W in those pairs saturate easily, causing overexposure that degrades the quality of the final image; if there are too many color phase detection pixel pairs 17, focusing is accurate at high brightness but inaccurate at low brightness. Because the numbers of the two kinds of pairs are equal in the embodiments of the present application, highly accurate focus detection results can be obtained under different brightness conditions. Of course, in some embodiments the number of panchromatic phase detection pixel pairs 18 in the pixel array 11 may be greater or smaller than the number of color phase detection pixel pairs 17, as practical requirements dictate; this is not limited herein.
Fig. 4 to 13 show examples of the arrangement of the pixels 1101 (shown in fig. 1) in various image sensors 10. Referring to fig. 4 to 13, the plurality of pixels 1101 in the pixel array 11 may include a plurality of panchromatic pixels W and a plurality of color pixels (e.g., a plurality of first color pixels A, a plurality of second color pixels B, and a plurality of third color pixels C). Color pixels and panchromatic pixels are distinguished by the wavelength band of the light that can pass through the filter 118 (shown in fig. 14) covering them: the color pixels have a narrower spectral response than the panchromatic pixels, the first color pixels A and the third color pixels C each have a narrower spectral response than the second color pixels B, and the response spectrum of a color pixel is, for example, a portion of the response spectrum of the panchromatic pixel W. The pixel array 11 is composed of a plurality of minimal repeating units (fig. 4 to 13 show examples for various image sensors 10), which are duplicated and arranged in rows and columns.
It should be noted that the first color pixel A may be a red pixel R, the second color pixel B a green pixel G, and the third color pixel C a blue pixel Bu; alternatively, the first color pixel A may be a red pixel R, the second color pixel B a yellow pixel Y, and the third color pixel C a blue pixel Bu; alternatively, the first color pixel A may be a magenta pixel M, the second color pixel B a cyan pixel Cy, and the third color pixel C a yellow pixel Y, which is not limited herein. For convenience of description, the following embodiments take the first color pixel A as a red pixel R, the second color pixel B as a green pixel G, and the third color pixel C as a blue pixel Bu.
Referring to fig. 4 to fig. 9, in some embodiments the minimal repeating unit includes a first type pixel unit UA, at least one second type pixel unit UB, and at least one third type pixel unit UC. The first type pixel unit UA includes neither a color phase detection pixel pair 17 nor a panchromatic phase detection pixel pair 18; the second type pixel unit UB includes at least one color phase detection pixel pair 17; and the third type pixel unit UC includes at least one panchromatic phase detection pixel pair 18. For example, the second type pixel unit UB may include one or more color phase detection pixel pairs 17, and the third type pixel unit UC may include one or more panchromatic phase detection pixel pairs 18, in any combination, which is not limited herein. "Plurality" here means two or more.
For example, fig. 4 to 9 are schematic diagrams illustrating the arrangement of the pixels 1101 and the coverage of the lens array 16 in some minimal repeating units in the embodiments of the present application. The minimal repeating unit is 8 rows, 8 columns, and 64 pixels, and each pixel unit (two first type pixel units UA, one second type pixel unit UB, and one third type pixel unit UC) is 4 rows, 4 columns, and 16 pixels. The second type pixel unit UB includes one color phase detection pixel pair 17, and the third type pixel unit UC includes one panchromatic phase detection pixel pair 18. Each panchromatic phase detection pixel pair 18 is covered by the same first lens 161, each color phase detection pixel pair 17 is covered by the same first lens 161, and every pixel 1101 in the pixel array 11 other than those in the color phase detection pixel pairs 17 and the panchromatic phase detection pixel pairs 18 is covered by one second lens 162.
In one example, as shown in fig. 4, the two first type pixel units UA are disposed along a first diagonal D1 of the minimal repeating unit, and the second type pixel unit UB and the third type pixel unit UC are disposed along a second diagonal D2 of the minimal repeating unit. The first diagonal direction D1 is different from the second diagonal direction D2; for example, the first diagonal D1 is perpendicular to the second diagonal D2, although in other embodiments the two diagonals may instead form another preset included angle. Because the two pixels 1101 in a color phase detection pixel pair 17 or a panchromatic phase detection pixel pair 18 share one first lens 161, neither pixel obtains complete information about the incident light, so during subsequent image processing the pixels 1101 in these pairs are treated as dead pixels. Disposing the first type pixel units UA along the first diagonal D1 and the second and third type pixel units UB and UC along the second diagonal D2 is therefore beneficial to dead pixel compensation in subsequent image processing.
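The fig. 4 arrangement can be pictured at the pixel-unit level with the short sketch below (which diagonal holds which units is drawn one of the two equivalent ways; each label stands for a whole 4 x 4 pixel unit):

    import numpy as np

    # Unit-level view of the minimal repeating unit of fig. 4: the two UA
    # units sit on one diagonal (D1), UB and UC on the other (D2).
    unit_layout = np.array([["UA", "UB"],
                            ["UC", "UA"]])
    # The minimal repeating unit is duplicated in rows and columns to build
    # the pixel array, so every UB/UC (whose phase pairs are later treated
    # as dead pixels) is surrounded by untouched UA units.
    print(np.tile(unit_layout, (2, 2)))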
In another example, as shown in fig. 5, the second type pixel unit UB and the third type pixel unit UC are disposed adjacent to each other and interlaced with the first type pixel units UA. That is, the second type pixel unit UB and the third type pixel unit UC are disposed in the same row of the minimal repeating unit, and the first type pixel units UA are disposed in different rows from them. Since the rows occupied by the first type pixel units UA contain no color phase detection pixel pair 17 or panchromatic phase detection pixel pair 18, no dead pixel compensation is needed for those rows of pixels 1101 during subsequent image processing. Similarly, in another example, as shown in fig. 6, the second type pixel unit UB and the third type pixel unit UC are disposed adjacent to each other in the same column of the minimal repeating unit, and the first type pixel units UA are disposed in different columns from them. Since the columns occupied by the first type pixel units UA contain no color phase detection pixel pair 17 or panchromatic phase detection pixel pair 18, no dead pixel compensation is needed for those columns of pixels 1101 during subsequent image processing.
In some embodiments, referring to fig. 4 to 6, the position of the color phase detection pixel pair 17 within the second type pixel unit UB corresponds to the position of the panchromatic phase detection pixel pair 18 within the third type pixel unit UC. Illustratively, the minimal repeating unit includes two first type pixel units UA, one second type pixel unit UB, and one third type pixel unit UC, where the second type pixel unit UB includes one color phase detection pixel pair 17 and the third type pixel unit UC includes one panchromatic phase detection pixel pair 18; the row of the color phase detection pixel pair 17 within the second type pixel unit UB is the same as the row of the panchromatic phase detection pixel pair 18 within the third type pixel unit UC, and likewise for the column. This distributes the color phase detection pixel pairs 17 and the panchromatic phase detection pixel pairs 18 uniformly across the pixel array 11, which improves the accuracy of focus detection and facilitates subsequent image processing, yielding higher quality images.
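Because the pixels of the phase detection pairs are later treated as dead pixels, their regular, corresponding placement makes compensation simple. A minimal sketch, assuming a plain same-channel neighbour average (the patent does not specify the compensation algorithm):

    import numpy as np

    def compensate_dead_pixel(img, r, c, step=2):
        """Replace a phase detection pixel's value with the mean of the
        nearest neighbours of the same channel (a distance of `step`
        keeps us on pixels of the same color in a regular mosaic)."""
        neighbours = []
        for dr, dc in [(-step, 0), (step, 0), (0, -step), (0, step)]:
            rr, cc = r + dr, c + dc
            if 0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]:
                neighbours.append(img[rr, cc])
        img[r, c] = np.mean(neighbours)
        return img

    img = np.arange(64, dtype=float).reshape(8, 8)
    compensate_dead_pixel(img, 3, 3)  # img[3, 3] <- mean of 4 same-channel neighbours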
In some embodiments, referring to fig. 4 to fig. 9, the first type pixel units UA are all identical, and a second type pixel unit UB can be formed from a first type pixel unit UA by the following transformation: at least one panchromatic pixel W in the first type pixel unit UA is replaced with a color pixel, which forms a color phase detection pixel pair 17 with an adjacent color pixel of the same color. A third type pixel unit UC can be formed from a first type pixel unit UA by the following transformation: at least one color pixel in the first type pixel unit UA is replaced with a panchromatic pixel W, which forms a panchromatic phase detection pixel pair 18 with an adjacent panchromatic pixel W. For example, as shown in fig. 4, a second type pixel unit UB is formed by replacing one panchromatic pixel W in the first type pixel unit UA with a second color pixel B, which forms the color phase detection pixel pair 17 with the adjacent second color pixel B. In this way, apart from that one second color pixel B of the color phase detection pixel pair 17, the arrangement of the pixels 1101 in the second type pixel unit UB is identical to that of the first type pixel unit UA, which is beneficial to dead pixel compensation for the second color pixel B of the color phase detection pixel pair 17 in subsequent image processing. In addition, compared with forming the color phase detection pixel pair 17 from two adjacent first color pixels A or two adjacent third color pixels C, forming it from two adjacent second color pixels B allows more incident light to be received, improving the accuracy of phase focusing. Similarly, a third type pixel unit UC is formed by replacing one second color pixel B in the first type pixel unit UA with a panchromatic pixel W, which forms the panchromatic phase detection pixel pair 18 with the adjacent panchromatic pixel W. Apart from that one panchromatic pixel W of the panchromatic phase detection pixel pair 18, the pixels 1101 in the third type pixel unit UC are arranged identically to the first type pixel unit UA, which is beneficial to dead pixel compensation for the panchromatic pixel W of the panchromatic phase detection pixel pair 18 in subsequent image processing.
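The sketch below illustrates this transformation using an assumed 4 x 4 RGBW-style pattern for UA (the patent drawings fix the actual pattern; this one is only for illustration, with 'A', 'B', 'C' the first/second/third color pixels and 'W' panchromatic):

    import numpy as np

    UA = np.array([["B", "W", "B", "W"],
                   ["W", "A", "W", "A"],
                   ["B", "W", "B", "W"],
                   ["W", "A", "W", "A"]])

    # UB: replace one W with a second color pixel B, so it forms a color
    # phase detection pixel pair with the horizontally adjacent B.
    UB = UA.copy()
    UB[0, 1] = "B"   # the pair is UB[0, 0] and UB[0, 1]

    # UC: replace one B with a panchromatic pixel W, so it forms a
    # panchromatic phase detection pixel pair with the adjacent W.
    UC = UA.copy()
    UC[0, 0] = "W"   # the pair is UC[0, 0] and UC[0, 1]

    print(UB, UC, sep="\n\n")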
It should be noted that, referring to fig. 7, in some embodiments the second type pixel unit UB can also be formed from the first type pixel unit UA by replacing one panchromatic pixel W with a third color pixel C, which forms a color phase detection pixel pair 17 with the adjacent third color pixel C; the third type pixel unit UC is then formed by replacing one third color pixel C in the first type pixel unit UA with a panchromatic pixel W, which forms a panchromatic phase detection pixel pair 18 with the adjacent panchromatic pixel W. Likewise, in some embodiments, as shown in fig. 8, the second type pixel unit UB can be formed by replacing one panchromatic pixel W in the first type pixel unit UA with a first color pixel A, which forms a color phase detection pixel pair 17 with the adjacent first color pixel A, and the third type pixel unit UC by replacing one first color pixel A with a panchromatic pixel W, which forms a panchromatic phase detection pixel pair 18 with the adjacent panchromatic pixel W, which is not limited herein.
Referring to fig. 10 to 12, in some embodiments the color pixels within a given first type pixel unit UA are all of the same single color, as are the color pixels within a given second type pixel unit UB and within a given third type pixel unit UC. Within a first type pixel unit UA, the color pixels are arranged along its first diagonal direction D1 and the panchromatic pixels W along its second diagonal direction D2. Within a second type pixel unit UB, the panchromatic pixel W is disposed along its second diagonal direction D2, and the color pixels other than the color phase detection pixel pair 17 are disposed along its first diagonal direction D1. Within a third type pixel unit UC, the color pixels are arranged along its first diagonal direction D1, and the panchromatic pixels W other than the panchromatic phase detection pixel pair 18 are arranged along its second diagonal direction D2.
It should be noted that the first diagonal directions D1 of the first type pixel unit UA, the second type pixel unit UB, and the third type pixel unit UC are all the same, as are their second diagonal directions D2. The first diagonal direction D1 is different from the second diagonal direction D2; for example, the two diagonals may be perpendicular or form other angles. The first diagonal direction D1 and the second diagonal direction D2 are not limited to the diagonals themselves but include directions parallel to them. "Direction" here does not denote a single orientation; it should be understood as a line of arrangement, extending toward both ends of that line.
For example, fig. 10 and 11 are schematic diagrams illustrating the arrangement of the pixels 1101 and the coverage of the lens array 16 in some minimal repeating units according to the embodiments of the present application. The minimal repeating unit is 4 rows, 4 columns, and 16 pixels, and each pixel unit (two first type pixel units UA, one second type pixel unit UB, and one third type pixel unit UC) is 2 rows, 2 columns, and 4 pixels. The second type pixel unit UB includes one color phase detection pixel pair 17, and the third type pixel unit UC includes one panchromatic phase detection pixel pair 18. Each panchromatic phase detection pixel pair 18 is covered by the same first lens 161, each color phase detection pixel pair 17 is covered by the same first lens 161, and every pixel 1101 in the pixel array 11 other than those in the phase detection pixel pairs is covered by one second lens 162.
In one example, the single-color pixels within the two first type pixel units UA are of different colors. As shown in fig. 10, one first type pixel unit UA includes color pixels and panchromatic pixels W, where the color pixels are first color pixels A arranged along the first diagonal direction D1 and the panchromatic pixels W are arranged along the second diagonal direction D2. The other first type pixel unit UA includes color pixels and panchromatic pixels W, where the color pixels are third color pixels C arranged along the first diagonal direction D1 and the panchromatic pixels W are arranged along the second diagonal direction D2. The second type pixel unit UB includes a color pixel, a panchromatic pixel W, and a color phase detection pixel pair 17; its color pixels are all second color pixels B, and the color phase detection pixel pair 17 includes two adjacent second color pixels B. Within the second type pixel unit UB, the panchromatic pixel W is disposed along the second diagonal direction D2, and the second color pixels B other than the color phase detection pixel pair 17 are disposed along the first diagonal direction D1. The third type pixel unit UC includes a color pixel, a panchromatic pixel W, and a panchromatic phase detection pixel pair 18; its color pixels are all second color pixels B. Within the third type pixel unit UC, the second color pixels B are arranged along the first diagonal direction D1, and the panchromatic pixels W other than the panchromatic phase detection pixel pair 18 are arranged along the second diagonal direction D2.
In another example, the single-color pixels in the two first type pixel units UA are of the same color. As shown in fig. 11, the two first type pixel units UA each include color pixels and panchromatic pixels W, where the color pixels are second color pixels B arranged along the first diagonal direction D1 and the panchromatic pixels W are arranged along the second diagonal direction D2. The second type pixel unit UB includes a color pixel, a panchromatic pixel W, and a color phase detection pixel pair 17; its color pixels are all first color pixels A, and the color phase detection pixel pair 17 includes two adjacent first color pixels A. Within the second type pixel unit UB, the panchromatic pixel W is disposed along the second diagonal direction D2, and the first color pixels A other than the color phase detection pixel pair 17 are disposed along the first diagonal direction D1. The third type pixel unit UC includes a color pixel, a panchromatic pixel W, and a panchromatic phase detection pixel pair 18; its color pixels are all third color pixels C. Within the third type pixel unit UC, the third color pixels C are disposed along the first diagonal direction D1, and the panchromatic pixels W other than the panchromatic phase detection pixel pair 18 are disposed along the second diagonal direction D2.
Of course, in some embodiments the arrangement may be mirrored: within a first type pixel unit UA the color pixels are arranged along its second diagonal direction D2 and the panchromatic pixels W along its first diagonal direction D1; within a second type pixel unit UB the panchromatic pixel W is disposed along its first diagonal direction D1 and the color pixels other than the color phase detection pixel pair 17 along its second diagonal direction D2; and within a third type pixel unit UC the color pixels are disposed along its second diagonal direction D2 and the panchromatic pixels W other than the panchromatic phase detection pixel pair 18 along its first diagonal direction D1, which is not limited herein.
Fig. 12 is a schematic diagram illustrating the arrangement of the pixels 1101 and the coverage of the lens array 16 in another minimal repeating unit in the embodiments of the present application. The minimal repeating unit is 8 rows, 8 columns, and 64 pixels, and each pixel unit (two first type pixel units UA, one second type pixel unit UB, and one third type pixel unit UC) is 4 rows, 4 columns, and 16 pixels. The second type pixel unit UB includes one color phase detection pixel pair 17, and the third type pixel unit UC includes one panchromatic phase detection pixel pair 18. Each panchromatic phase detection pixel pair 18 is covered by the same first lens 161, each color phase detection pixel pair 17 is covered by the same first lens 161, and every pixel 1101 in the pixel array 11 other than those in the phase detection pixel pairs is covered by one second lens 162.
Compared with the arrangements of fig. 10 and 11, in which one color phase detection pixel pair 17 and one panchromatic phase detection pixel pair 18 are disposed in a minimal repeating unit of 4 rows, 4 columns, and 16 pixels, disposing one pair of each kind in a minimal repeating unit of 8 rows, 8 columns, and 64 pixels gives a lower density of phase detection pixel pairs on the pixel array 11 of the image sensor 10 (one pair per 32 pixels instead of one pair per 8 pixels), which benefits subsequent image processing and thus the image quality of the final image.
Referring to fig. 13, in some embodiments, the minimum repeating unit includes a first type of pixel unit UA and at least one second type of pixel unit UB, wherein the first type of pixel unit UA does not include the color phase detection pixel pair 17 and the panchromatic phase detection pixel pair 18, and the second type of pixel unit UB includes at least one color phase detection pixel pair 17 and at least one panchromatic phase detection pixel pair 18. For example, the second type of pixel unit UB includes a color phase detection pixel pair 17 and a full-color phase detection pixel pair 18; alternatively, the second type of pixel unit UB includes a color phase detection pixel pair 17 and a plurality of panchromatic phase detection pixel pairs 18; alternatively, the second type of pixel unit UB includes a plurality of color phase detection pixel pairs 17 and a panchromatic phase detection pixel pair 18; alternatively, the second type of pixel unit UB includes a plurality of color phase detection pixel pairs 17 and a plurality of full-color phase detection pixel pairs 18. It should be noted that, in a minimum repeating unit, the number of the first type pixel units UA may be equal to the number of the second type pixel units UB; the number of the first type pixel units UA can be larger than the number of the second type pixel units UB; the number of the first type pixel units UA may also be smaller than the number of the second type pixel units UB, which is not limited herein.
Fig. 13 is a schematic diagram illustrating the arrangement of the pixels 1101 and the coverage of the lens array 16 in another minimal repeating unit in the embodiments of the present application. The minimal repeating unit is 8 rows, 8 columns, and 64 pixels, and each pixel unit (three first type pixel units UA and one second type pixel unit UB) is 4 rows, 4 columns, and 16 pixels. The second type pixel unit UB includes one color phase detection pixel pair 17 and one panchromatic phase detection pixel pair 18. Each panchromatic phase detection pixel pair 18 is covered by the same first lens 161, each color phase detection pixel pair 17 is covered by the same first lens 161, and every pixel 1101 in the pixel array 11 other than those in the phase detection pixel pairs is covered by one second lens 162. Because the color phase detection pixel pair 17 and the panchromatic phase detection pixel pair 18 are in the same pixel unit, the difficulty of manufacturing the image sensor 10 can be reduced.
In some embodiments, referring to fig. 13, the plurality of first-type pixel units UA are identical, and a second-type pixel unit UB can be formed by performing the following transformation on the basis of a first-type pixel unit UA: at least one panchromatic pixel W in the first-type pixel unit UA is replaced with one color pixel and forms a color phase detection pixel pair 17 with the adjacent color pixel of the same color, and at least one color pixel in the first-type pixel unit UA is replaced with one panchromatic pixel W and forms a panchromatic phase detection pixel pair 18 with the adjacent panchromatic pixel W. For example, as shown in fig. 13, the second-type pixel unit UB can be formed by performing the following transformation on the basis of the first-type pixel unit UA: at least one panchromatic pixel W in the first-type pixel unit UA is replaced with one second-color pixel B and forms a color phase detection pixel pair 17 with the adjacent second-color pixel B, and at least one second-color pixel B in the first-type pixel unit UA is replaced with one panchromatic pixel W and forms a panchromatic phase detection pixel pair 18 with the adjacent panchromatic pixel W. In this way, the arrangement of the pixels 1101 in the second-type pixel unit UB, except for the one second-color pixel B in the color phase detection pixel pair 17 and the one panchromatic pixel W in the panchromatic phase detection pixel pair 18, is exactly the same as that of the first-type pixel unit UA, which is beneficial to dead-pixel compensation for the second-color pixel B in the color phase detection pixel pair 17 during subsequent image processing. In addition, compared with forming the color phase detection pixel pair 17 from two adjacent first-color pixels A or two adjacent third-color pixels C, forming it from two adjacent second-color pixels B allows more incident light to be received and improves the accuracy of phase focusing.
It should be noted that, in some embodiments, the second-type pixel unit UB can also be formed by performing the following transformation on the basis of the first-type pixel unit UA: one panchromatic pixel W in the first-type pixel unit UA is replaced with one third-color pixel C and forms a color phase detection pixel pair 17 with the adjacent third-color pixel C, and one third-color pixel C in the first-type pixel unit UA is replaced with one panchromatic pixel W and forms a panchromatic phase detection pixel pair 18 with the adjacent panchromatic pixel W. Of course, in some embodiments, the second-type pixel unit UB can also be formed by performing the following transformation on the basis of the first-type pixel unit UA: one panchromatic pixel W in the first-type pixel unit UA is replaced with one first-color pixel A and forms a color phase detection pixel pair 17 with the adjacent first-color pixel A, and one first-color pixel A in the first-type pixel unit UA is replaced with one panchromatic pixel W and forms a panchromatic phase detection pixel pair 18 with the adjacent panchromatic pixel W, which is not limited herein.
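As an illustration of the transformations just described, the following minimal sketch builds a second-type pixel unit UB from a hypothetical 4-row, 4-column first-type pixel unit UA in which the second-color pixels B lie along one diagonal direction and the panchromatic pixels W along the other. The concrete layout, the Python representation, and the chosen replacement positions are illustrative assumptions and are not taken from the figures of the present application.

UA = [
    ["W", "B", "W", "B"],
    ["B", "W", "B", "W"],
    ["W", "B", "W", "B"],
    ["B", "W", "B", "W"],
]

def make_second_type_unit(ua):
    """Form a second-type pixel unit UB from a first-type pixel unit UA."""
    ub = [row[:] for row in ua]
    # Replace one panchromatic pixel W with a second-color pixel B; with the
    # adjacent B at (0, 1) it forms a color phase detection pixel pair 17.
    ub[0][0] = "B"
    # Replace one second-color pixel B with a panchromatic pixel W; with the
    # adjacent W at (2, 2) it forms a panchromatic phase detection pixel pair 18.
    ub[2][3] = "W"
    return ub

# UB differs from UA at exactly the two replaced positions, which is what makes
# dead-pixel compensation for the pair pixels straightforward.
UB = make_second_type_unit(UA)
assert sum(a != b for ra, rb in zip(UA, UB) for a, b in zip(ra, rb)) == 2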
Figs. 14 to 19 show schematic cross-sectional views of the pixel array 11 of any one of the embodiments described in figs. 4 to 13, taken along the light receiving direction of the image sensor 10. Each panchromatic pixel W and each color pixel includes a filter 118 and a photoelectric conversion element 117. The lens array 16, the filter 118, and the photoelectric conversion element 117 are disposed in this order along the light receiving direction of the image sensor 10. The photoelectric conversion element 117 converts received light into electric charge; specifically, the photoelectric conversion element 117 includes a substrate 1171 and an n-well layer 1172 formed inside the substrate 1171, and the n-well layer 1172 realizes the conversion of light into electric charge. The filter 118 is disposed on a surface of the n-well layer 1172 away from the substrate 1171 and passes light of a particular wavelength band. The microlens 1181 is disposed on a side of the filter 118 away from the n-well layer 1172 and converges incident light so that more of it is guided to the photoelectric conversion element 117.
The saturation exposure amount Q of a pixel is related to the full-well capacity of the photoelectric conversion element 117: the larger the full-well capacity, the larger the saturation exposure amount Q. The full-well capacity is related to the doping concentration and the volume of the n-well layer 1172 of the photoelectric conversion element 117. When the doping concentration of the n-well layer 1172 is the same, the full-well capacity is related to the volume of the n-well layer 1172, and the larger the volume, the larger the full-well capacity. The volume of the n-well layer 1172 is related to its cross section and depth; with the cross section fixed, the volume can be increased by increasing the depth. When the volume of the n-well layer 1172 is the same, the full-well capacity is related to the doping concentration of the n-well layer 1172, and the larger the doping concentration, the larger the full-well capacity.
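To make these dependencies concrete, the following minimal sketch assumes a simplified proportional model in which the full-well capacity scales with the doping concentration times the n-well volume, and the volume is the cross-sectional area times the depth; the proportionality constant k and all numeric values are illustrative assumptions rather than figures from the present application.

def full_well_capacity(doping_concentration, cross_section_area, depth, k=1.0):
    """Relative full-well capacity under the proportional model."""
    volume = cross_section_area * depth  # volume = cross section x depth
    return k * doping_concentration * volume

# Equal cross sections and depths (the case of figs. 14 to 16): a higher doping
# concentration C1 gives the panchromatic pixel W the larger full-well capacity.
assert full_well_capacity(2.0, 1.0, 1.0) > full_well_capacity(1.0, 1.0, 1.0)

# Equal doping concentrations (the case of figs. 17 to 19): a greater depth H1
# gives the panchromatic pixel W the larger full-well capacity.
assert full_well_capacity(1.0, 1.0, 2.0) > full_well_capacity(1.0, 1.0, 1.0)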
For example, fig. 14 to 16 are schematic cross-sectional views of the pixel array 11 of any one of the above embodiments taken along the light receiving direction. In the light receiving direction, the sizes of a plurality of cross sections of the n-well layer 1172 of each pixel (the same pixel) are equal; the size of the cross section of the n-well layer 1172 of the panchromatic pixel W is equal to the size of the cross section of the n-well layer 1172 of the color pixel; the depth H1 of the n-well layer 1172 of the panchromatic pixel W is equal to the depth H2 of the n-well layer 1172 of the color pixel, and the doping concentration C1 of the n-well layer 1172 of the panchromatic pixel W is greater than the doping concentration C2 of the n-well layer 1172 of the color pixel, so that the panchromatic pixel W has a larger full-well capacity than the color pixel.
Note that, in the light receiving direction, the fact that the sizes of the plurality of cross sections of the n-well layer 1172 of the same pixel are all equal means that: the plurality of cross sections have the same area, and the corresponding side lengths in the plurality of cross sections are all equal. The cross section may be a polygon such as a rectangle, a square, a parallelogram, a rhombus, a pentagon, a hexagon, etc., and of course, the sizes of the plurality of cross sections of the n-well layer 1172 of each pixel (the same pixel) may not be equal to each other along the light receiving direction, which is not limited herein.
Since the doping concentration C1 of the n-well layer 1172 of the panchromatic pixel W is greater than the doping concentration C2 of the n-well layer 1172 of the color pixel, the panchromatic pixel W has a larger full-well capacity than the color pixel. The saturation exposure amount Q of the panchromatic pixel W is thus increased, the problem of premature saturation of the panchromatic pixel W can be avoided, the exposures of the panchromatic pixel W and the color pixel can be balanced, and the image capturing quality is improved.
For example, figs. 17 to 19 are schematic cross-sectional views of the pixel array 11 of any one of the above embodiments taken along the light receiving direction. The doping concentration C1 of the n-well layer 1172 of the panchromatic pixel W is equal to the doping concentration C2 of the n-well layer 1172 of the color pixel. In the light receiving direction, the sizes of the plurality of cross sections of the n-well layer 1172 of the same pixel are equal, and the size of the cross section of the n-well layer 1172 of the panchromatic pixel W is equal to that of the color pixel; the depth H1 of the n-well layer 1172 of the panchromatic pixel W is greater than the depth H2 of the n-well layer 1172 of the color pixel. In this way, the volume of the n-well layer 1172 of the panchromatic pixel W is greater than the volume of the n-well layer 1172 of the color pixel, so that the panchromatic pixel W has a larger full-well capacity than the color pixel.
Since the volume of the n-well layer 1172 of the panchromatic pixel W is larger than that of the n-well layer 1172 of the color pixel, the panchromatic pixel W has a larger full-well capacity than the color pixel. The saturation exposure amount Q of the panchromatic pixel W is thus increased, the problem of premature saturation of the panchromatic pixel W can be avoided, the exposures of the panchromatic pixel W and the color pixel can be balanced, and the image capturing quality is improved.
In some embodiments, to give the panchromatic pixel a larger full-well capacity than the color pixels, the doping concentration C1 of the n-well layer 1172 of the panchromatic pixel W may be set larger than the doping concentration C2 of the n-well layer 1172 of the color pixel while the depth H1 of the n-well layer 1172 of the panchromatic pixel is set larger than the depth H2 of the n-well layer 1172 of the color pixel (i.e., the volume of the n-well layer 1172 of the panchromatic pixel W is set larger than the volume of the n-well layer 1172 of the color pixel). In this way, the saturation exposure amount Q of the panchromatic pixel W is increased, the problem of premature saturation of the panchromatic pixel W is avoided, and the image capturing quality is improved.
Referring to fig. 4 and 20, the present application further provides a control method for the image sensor 10 according to any one of the above embodiments. The control method comprises the following steps:
01: when the ambient brightness is less than the first threshold value, focus detection is performed for focusing according to the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pair 18;
02: when the ambient brightness is greater than a second threshold value, focus detection is performed for focusing on the basis of the phase information of the color pixels in the color phase detection pixel pair 17, the second threshold value being greater than the first threshold value.
Referring to fig. 21, the present application further provides an imaging device 100. The imaging device 100 includes the image sensor 10 according to any of the above embodiments and a lens 20, and the image sensor 10 can receive light passing through the lens 20. In some embodiments, the imaging device 100 further includes a processor 30, and steps 01 and 02 of the above control method can be implemented by the processor 30. That is, the processor 30 is configured to perform focus detection for focusing according to the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pair 18 when the ambient brightness is less than the first threshold value, and to perform focus detection for focusing according to the phase information of the color pixels in the color phase detection pixel pair 17 when the ambient brightness is greater than a second threshold value, the second threshold value being greater than the first threshold value.
In the control method and the imaging device 100 of the embodiments of the present application, the color phase detection pixel pair 17 and the panchromatic phase detection pixel pair 18 are arranged in the image sensor 10. In an environment with high brightness, the color pixels in the color phase detection pixel pair 17 can receive sufficient light, and focus detection is performed for focusing according to their phase information; in an environment with low brightness, the panchromatic pixels W in the panchromatic phase detection pixel pair 18 are not easily saturated, and focus detection is performed for focusing according to their phase information. Accurate focusing can thereby be realized in scenes with different ambient brightness, improving the scene adaptability of the image sensor 10.
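The threshold logic of steps 01 and 02 (together with the intermediate-brightness variants of steps 03 to 05 described below) can be sketched as follows. This is a minimal sketch in Python; the function name and the returned labels are illustrative assumptions, and the present application does not fix concrete threshold values.

def select_phase_detection_source(ambient_brightness, first_threshold, second_threshold):
    """Choose which phase detection pixel pairs drive focus detection."""
    assert first_threshold < second_threshold
    if ambient_brightness < first_threshold:
        # Low brightness: the color pixels receive little light, so use the
        # panchromatic phase detection pixel pair 18.
        return "panchromatic"
    if ambient_brightness > second_threshold:
        # High brightness: the panchromatic pixels W saturate easily, so use
        # the color phase detection pixel pair 17.
        return "color"
    # Intermediate brightness: either pair alone, or both jointly (steps 03 to 05).
    return "panchromatic and/or color"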
Referring to figs. 4, 20 and 21, when the ambient brightness is less than the first threshold value, the brightness is low and the color pixels can receive only little light, so focus detection is performed for focusing according to the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pair 18. Illustratively, the panchromatic phase detection pixel pair 18 includes a first panchromatic pixel W disposed on a first side and a second panchromatic pixel W disposed on a second side. When the ambient brightness is less than the first threshold value, the processor 30 acquires the phase information of the first panchromatic pixel W and the phase information of the second panchromatic pixel W, and then calculates the phase difference between them to perform focusing.
When the ambient brightness is greater than the second threshold value, the brightness is high and the panchromatic pixels W are easily saturated, so focus detection is performed for focusing according to the phase information of the color pixels in the color phase detection pixel pair 17. Illustratively, the color phase detection pixel pair 17 includes a first color pixel disposed on the first side and a second color pixel disposed on the second side. When the ambient brightness is greater than the second threshold value, the processor 30 acquires the phase information of the first color pixel and the phase information of the second color pixel, and then calculates the phase difference between them to perform focusing.
In some embodiments, when the ambient brightness is less than the first threshold value, the processor 30 calculates the phase difference between the phase information of the first panchromatic pixel W and the phase information of the second panchromatic pixel W. If the phase difference is zero, focusing has already succeeded and no further operation is needed. If the phase difference is not zero, the image sensor 10 is moved in a first direction; if the phase difference then decreases, the image sensor 10 continues to move in the first direction until it reaches the position at which the phase difference is zero, and focusing is completed; if the phase difference instead increases, the image sensor 10 is moved in a second direction until it reaches the position at which the phase difference is zero, and focusing is completed. Likewise, when the ambient brightness is greater than the second threshold value, the processor 30 calculates the phase difference between the phase information of the first color pixel and the phase information of the second color pixel, and the image sensor 10 is moved in the same manner until the phase difference is zero and focusing is completed. Note that the second direction is opposite to the first direction. In some embodiments, focusing may also be achieved by moving the lens 20 instead of the image sensor 10.
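The search just described can be sketched as follows. This is a minimal sketch; the measure_phase_difference callable, the step size, the tolerance, and the step limit are illustrative assumptions standing in for the sensor readout and the actuator that moves the image sensor 10 (or, equivalently, the lens 20).

def focus_by_phase(measure_phase_difference, position=0.0, step=1.0,
                   tolerance=1e-3, max_steps=100):
    """Move toward the position at which the phase difference is (near) zero."""
    diff = measure_phase_difference(position)
    direction = 1.0  # start by trying the first direction
    for _ in range(max_steps):
        if abs(diff) <= tolerance:
            break  # phase difference is zero: focusing is complete
        trial = measure_phase_difference(position + direction * step)
        if abs(trial) > abs(diff):
            direction = -direction  # the difference grew: move in the second direction
        else:
            position += direction * step  # the difference shrank: keep moving
            diff = trial
    return position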
Since the panchromatic pixels W in the panchromatic phase detection pixel pair 18 are not easily saturated when the brightness is less than the first threshold value, performing focus detection with the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pair 18 yields high focusing accuracy. When the brightness is greater than the second threshold value, the color pixels in the color phase detection pixel pair 17 can receive more light and output pixel information with a high signal-to-noise ratio, so performing focus detection with the phase information of the color pixels in the color phase detection pixel pair 17 also yields high focusing accuracy.
Note that, as shown in fig. 4, when the two panchromatic pixels W in the panchromatic phase detection pixel pair 18 are disposed in the same row of the pixel array 11, the first side of the panchromatic phase detection pixel pair 18 is the left side and the second side is the right side; when the two panchromatic pixels W in the panchromatic phase detection pixel pair 18 are disposed in the same column of the pixel array 11, as shown in fig. 9, the first side is the upper side and the second side is the lower side. Likewise, as shown in fig. 4, when the two color pixels in the color phase detection pixel pair 17 are disposed in the same row of the pixel array 11, the first side of the color phase detection pixel pair 17 is the left side and the second side is the right side; when the two color pixels in the color phase detection pixel pair 17 are disposed in the same column of the pixel array 11, as shown in fig. 9, the first side is the upper side and the second side is the lower side. The terms "left", "right", "upper" and "lower" here only indicate the directional positions shown in the drawings and do not limit the embodiments of the present application.
Referring to fig. 4 and 22, in some embodiments, the control method further includes:
03: when the ambient brightness is greater than the first threshold value and less than the second threshold value, focus detection is performed for focusing according to the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pair 18.
Referring to fig. 21, step 03 can be implemented by the processor 30. That is, the processor 30 is also configured to perform focus detection for focusing according to the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pair 18 when the ambient brightness is greater than the first threshold value and less than the second threshold value.
When the ambient brightness is greater than the first threshold value and less than the second threshold value, the brightness is not too high and the panchromatic pixels W are not easily saturated, so focus detection may be performed for focusing according to the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pair 18. The specific implementation is the same as that of performing focus detection according to the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pair 18 described in the above embodiments, and is not repeated here.
Referring to fig. 4 and 23, in some embodiments, the control method further includes:
04: when the ambient brightness is greater than the first threshold value and less than the second threshold value, the focus detection is performed for focusing on the basis of the phase information of the color pixels in the color phase detection pixel pair 17.
Referring to fig. 21, step 04 may be implemented by processor 30. That is, the processor 30 is also configured to perform focus detection for focusing according to the phase information of the color pixels in the color phase detection pixel pair 17 when the ambient brightness is greater than the first threshold and less than the second threshold.
When the ambient brightness is greater than the first threshold value and less than the second threshold value, the brightness is not too low and the color pixels can receive sufficient incident light, so focus detection may be performed for focusing according to the phase information of the color pixels in the color phase detection pixel pair 17. The specific implementation is the same as that of performing focus detection according to the phase information of the color pixels in the color phase detection pixel pair 17 described in the above embodiments, and is not repeated here.
Referring to fig. 4 and 24, in some embodiments, the control method further includes:
05: when the ambient brightness is greater than the first threshold value and less than the second threshold value, focus detection is performed for focusing on the basis of the phase information of the panchromatic pixels in the pair of panchromatic phase detection pixels 18 and the phase information of the color pixels in the pair of color phase detection pixels 17.
Referring to fig. 21, step 05 may be implemented by processor 30. That is, the processor 30 is also configured to perform focus detection for focusing based on the phase information of the panchromatic pixels in the panchromatic phase detection pixel pair 18 and the phase information of the color pixels in the color phase detection pixel pair 17 when the ambient brightness is greater than the first threshold value and less than the second threshold value.
When the ambient brightness is greater than the first threshold value and less than the second threshold value, the brightness is not too high, so the panchromatic pixels W are not easily saturated, and not too low, so the color pixels can receive sufficient incident light. Focus detection can therefore be performed for focusing according to the phase information of the panchromatic pixels in the panchromatic phase detection pixel pair 18 and the phase information of the color pixels in the color phase detection pixel pair 17. Using both kinds of phase information jointly for focus detection can reduce errors, improve the focusing accuracy, and improve the image quality of the finally obtained image.
For example, referring to fig. 4, 24 and 25, step 05 further includes:
051: acquiring phase information of all pixels in the panchromatic phase detection pixel pair 18 and the color phase detection pixel pair 17;
052: obtaining first phase information from phase information of a first panchromatic pixel W which is a panchromatic pixel W disposed on a first side of the pair of panchromatic phase detecting pixels 18 and phase information of a first color pixel which is a color pixel disposed on a first side of the pair of color phase detecting pixels 17;
053: obtaining second phase information from phase information of a second panchromatic pixel W which is a panchromatic pixel W disposed on a second side of the pair of panchromatic phase detection pixels 18 and phase information of a second color pixel which is a color pixel disposed on a second side of the pair of color phase detection pixels 17, the second side being disposed opposite to the first side;
054: and calculating the phase difference according to the first phase information and the second phase information to focus.
Referring to fig. 21, steps 051 to 054 may be implemented by the processor 30. That is, the processor 30 is also configured to acquire the phase information of all the pixels in the panchromatic phase detection pixel pair 18 and the color phase detection pixel pair 17; obtain first phase information from the phase information of the first panchromatic pixel W disposed on the first side of the panchromatic phase detection pixel pair 18 and the phase information of the first color pixel disposed on the first side of the color phase detection pixel pair 17; obtain second phase information from the phase information of the second panchromatic pixel W disposed on the second side of the panchromatic phase detection pixel pair 18 and the phase information of the second color pixel disposed on the second side of the color phase detection pixel pair 17, the second side being opposite to the first side; and calculate the phase difference from the first phase information and the second phase information to perform focusing.
Specifically, the panchromatic phase detection pixel pair 18 includes a first panchromatic pixel W disposed on the first side and a second panchromatic pixel W disposed on the second side, and the color phase detection pixel pair 17 includes a first color pixel disposed on the first side and a second color pixel disposed on the second side. When the ambient brightness is greater than the first threshold value and less than the second threshold value, the processor 30 acquires the phase information of the first panchromatic pixel W, the second panchromatic pixel W, the first color pixel, and the second color pixel. First phase information is obtained from the phase information of the first panchromatic pixel W and the phase information of the first color pixel, and second phase information is obtained from the phase information of the second panchromatic pixel W and the phase information of the second color pixel. The processor 30 then calculates the phase difference from the first phase information and the second phase information to perform focusing.
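Steps 051 to 054 can be sketched as follows. Combining the two first-side (and the two second-side) phase values by plain averaging is an illustrative assumption; the present application does not prescribe how the first and second phase information are obtained from the individual pixel values.

def phase_difference_joint(first_panchromatic, first_color,
                           second_panchromatic, second_color):
    """Phase difference from jointly using the pairs 18 and 17 (steps 051-054)."""
    first_phase = (first_panchromatic + first_color) / 2.0     # step 052
    second_phase = (second_panchromatic + second_color) / 2.0  # step 053
    return first_phase - second_phase                          # step 054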
Referring to fig. 26, the present application further provides a mobile terminal 1000. The terminal 1000 may be a mobile phone, a tablet computer, a notebook computer, a smart wearable device (such as a smart watch, a smart band, smart glasses, or a smart helmet), a head-mounted display device, a virtual reality device, or the like, which is not limited herein. The terminal 1000 includes a housing 200 and the imaging device 100, the housing 200 being combined with the imaging device 100. It should be noted that the processor 30 can also be disposed in the terminal 1000.
In the terminal 1000 of the embodiments of the present application, the color phase detection pixel pair 17 and the panchromatic phase detection pixel pair 18 are arranged in the image sensor 10. In an environment with high brightness, the color pixels in the color phase detection pixel pair 17 can receive sufficient light, and focus detection is performed for focusing according to their phase information; in an environment with low brightness, the panchromatic pixels W in the panchromatic phase detection pixel pair 18 are not easily saturated, and focus detection is performed for focusing according to their phase information. Accurate focusing can thereby be realized in scenes with different ambient brightness, improving the scene adaptability of the image sensor 10.
Referring to fig. 27, the present application also provides a non-transitory computer-readable storage medium 600 on which a computer program 610 is stored. When executed by the processor 300, the computer program 610 causes the processor 300 to perform the control method of any of the above embodiments.
For example, referring to figs. 20 and 27, when executed by the processor 300, the computer program 610 causes the processor 300 to perform the steps of:
01: when the ambient brightness is less than the first threshold value, focus detection is performed for focusing according to the phase information of the panchromatic pixels W in the panchromatic phase detection pixel pair 18;
02: when the ambient brightness is greater than a second threshold value, focus detection is performed for focusing on the basis of the phase information of the color pixels in the color phase detection pixel pair 17, the second threshold value being greater than the first threshold value.
The computer-readable storage medium 600 may be disposed in the terminal 1000, or may be disposed in a cloud server, in which case the terminal 1000 communicates with the cloud server to obtain the corresponding computer program 610.
It will be appreciated that the computer program 610 comprises computer program code. The computer program code may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable storage medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), a software distribution medium, and the like.
The processor 300 may be referred to as a driver board. The driver board may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, or the like. It should be noted that the processor 300 may be the same processor as the processor 30 provided in the imaging apparatus 100, or may be a different processor, which is not limited herein.
In the description herein, reference to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (16)

1. An image sensor comprising a pixel array and a lens array, the pixel array comprising a plurality of color pixels, a plurality of panchromatic pixels, at least one color phase detection pixel pair, and at least one panchromatic phase detection pixel pair;
the panchromatic phase detection pixel pair comprises two adjacent panchromatic pixels, and phase information of the panchromatic pixels in the panchromatic phase detection pixel pair is used for focus detection;
the color phase detection pixel pair comprises two adjacent color pixels of the same color, the color pixels having a narrower spectral response than the panchromatic pixels, and phase information of the color pixels in the color phase detection pixel pair is used for focus detection;
the lens array includes a plurality of first lenses, each of the panchromatic phase detection pixel pairs being covered by the same one of the first lenses, each of the color phase detection pixel pairs being covered by the same one of the first lenses.
2. The image sensor according to claim 1, wherein phase information of the panchromatic pixels in the panchromatic phase detection pixel pair is used for focus detection in a first brightness mode, and phase information of the color pixels in the color phase detection pixel pair is used for focus detection in a second brightness mode; the brightness of the second brightness mode is greater than a first threshold, the brightness of the first brightness mode is less than a second threshold, and the first threshold is less than the second threshold.
3. The image sensor of claim 1, wherein the number of panchromatic phase detection pixel pairs is equal to the number of color phase detection pixel pairs.
4. The image sensor of claim 1, wherein the pixel array comprises a minimal repeating unit comprising a first type of pixel unit, at least one second type of pixel unit, and at least one third type of pixel unit, wherein the second type of pixel unit comprises at least one of the color phase detection pixel pairs, and wherein the third type of pixel unit comprises at least one of the panchromatic phase detection pixel pairs.
5. The image sensor of claim 4, wherein the locations of the color phase detection pixel pairs in the second-type pixel unit correspond to the locations of the panchromatic phase detection pixel pairs in the third-type pixel unit.
6. The image sensor of claim 4,
the second type of pixel unit can be formed by performing the following transformation on the basis of the first type of pixel unit:
at least one of the panchromatic pixels in the first-type pixel unit is replaced by one of the color pixels, which forms the color phase detection pixel pair with the adjacent color pixel of the same color; and/or
The third type of pixel unit can be formed by performing the following transformation on the basis of the first type of pixel unit:
at least one of the color pixels in the first-type pixel unit is replaced by one of the panchromatic pixels, which forms the panchromatic phase detection pixel pair with the adjacent panchromatic pixel.
7. The image sensor of claim 4, wherein the color pixels in the same first-type pixel unit are all of the same single color, the color pixels in the same second-type pixel unit are all of the same single color, and the color pixels in the same third-type pixel unit are all of the same single color;
in the same first-type pixel unit, the color pixels are arranged in a first diagonal direction of the first-type pixel unit, and the panchromatic pixels are arranged in a second diagonal direction of the first-type pixel unit;
in the same second-type pixel unit, the panchromatic pixels are arranged in a second diagonal direction of the second-type pixel unit, and the color pixels other than those in the color phase detection pixel pair are arranged in a first diagonal direction of the second-type pixel unit;
in the same third-type pixel unit, the color pixels are arranged in a first diagonal direction of the third-type pixel unit, and the panchromatic pixels other than those in the panchromatic phase detection pixel pair are arranged in a second diagonal direction of the third-type pixel unit.
8. The image sensor of claim 1, wherein the pixel array comprises a minimal repeating unit comprising a first-type pixel unit and at least one second-type pixel unit, the second-type pixel unit comprising at least one of the color phase detection pixel pairs and at least one of the panchromatic phase detection pixel pairs.
9. The image sensor of claim 8,
the second type of pixel unit can be formed by changing the following steps on the basis of the first type of pixel unit:
at least one of the panchromatic pixels in the first-type pixel unit is replaced with one of the color pixels and forms the color phase detection pixel pair with the adjacent color pixel of the same color, and at least one of the color pixels in the first-type pixel unit is replaced with one of the panchromatic pixels and forms the panchromatic phase detection pixel pair with the adjacent panchromatic pixel.
10. The image sensor of claim 1, wherein the doping concentration of the n-well layer of the panchromatic pixel is greater than the doping concentration of the n-well layer of the color pixel such that the panchromatic pixel has a greater full-well capacity than the color pixel; and/or
The depth of the n-well layer of the panchromatic pixel is greater than the depth of the n-well layer of the color pixel such that the panchromatic pixel has a greater full-well capacity than the color pixel.
11. A control method for an image sensor comprising a pixel array and a lens array, the pixel array comprising a plurality of color pixels, a plurality of panchromatic pixels, at least one color phase detection pixel pair, and at least one panchromatic phase detection pixel pair; the panchromatic phase detection pixel pair comprises two adjacent panchromatic pixels; the color phase detection pixel pair comprises two adjacent color pixels of the same color, the color pixels having a narrower spectral response than the panchromatic pixels; the lens array includes a plurality of first lenses, each of the panchromatic phase detection pixel pairs is covered by the same one of the first lenses, each of the color phase detection pixel pairs is covered by the same one of the first lenses, the control method includes:
performing focus detection for focusing according to phase information of the panchromatic pixels in the panchromatic phase detection pixel pair when ambient brightness is less than a first threshold; and
performing focus detection for focusing according to the phase information of the color pixels in the color phase detection pixel pair when the ambient brightness is greater than a second threshold, wherein the second threshold is greater than the first threshold.
12. The control method according to claim 11, characterized by further comprising:
performing focus detection for focusing according to phase information of the panchromatic pixels in the panchromatic phase detection pixel pair when ambient brightness is greater than the first threshold and less than the second threshold; or
performing focus detection for focusing according to the phase information of the color pixels in the color phase detection pixel pair when the ambient brightness is greater than the first threshold and less than the second threshold; or
performing focus detection for focusing according to the phase information of the panchromatic pixels in the panchromatic phase detection pixel pair and the phase information of the color pixels in the color phase detection pixel pair when the ambient brightness is greater than the first threshold and less than the second threshold.
13. The control method according to claim 12, wherein performing focus detection for focusing according to the phase information of the panchromatic pixels in the panchromatic phase detection pixel pair and the phase information of the color pixels in the color phase detection pixel pair comprises:
acquiring phase information of all pixels in the panchromatic phase detection pixel pair and the color phase detection pixel pair;
obtaining first phase information from phase information of a first panchromatic pixel which is the panchromatic pixel disposed on a first side of the pair of panchromatic phase detection pixels and phase information of a first color pixel which is the color pixel disposed on a first side of the pair of color phase detection pixels;
obtaining second phase information from phase information of a second panchromatic pixel and phase information of a second color pixel, wherein the second panchromatic pixel is the panchromatic pixel disposed on a second side of the panchromatic phase detection pixel pair, the second color pixel is the color pixel disposed on a second side of the color phase detection pixel pair, and the second side is disposed opposite to the first side; and
and calculating a phase difference according to the first phase information and the second phase information to focus.
14. An image forming apparatus, comprising:
a lens; and
the image sensor of any of claims 1-10, said image sensor capable of receiving light passing through said lens.
15. A terminal, comprising:
a housing; and
the imaging device of claim 14, in combination with the housing.
16. A non-transitory computer-readable storage medium containing a computer program, wherein the computer program, when executed by a processor, causes the processor to execute the control method of any one of claims 11 to 13.
CN202011105845.9A 2020-10-15 2020-10-15 Image sensor, control method, imaging device, terminal, and readable storage medium Active CN112235494B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011105845.9A CN112235494B (en) 2020-10-15 2020-10-15 Image sensor, control method, imaging device, terminal, and readable storage medium
CN202210323572.8A CN114845015A (en) 2020-10-15 2020-10-15 Image sensor, control method, imaging apparatus, terminal, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011105845.9A CN112235494B (en) 2020-10-15 2020-10-15 Image sensor, control method, imaging device, terminal, and readable storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210323572.8A Division CN114845015A (en) 2020-10-15 2020-10-15 Image sensor, control method, imaging apparatus, terminal, and readable storage medium

Publications (2)

Publication Number Publication Date
CN112235494A true CN112235494A (en) 2021-01-15
CN112235494B CN112235494B (en) 2022-05-20

Family

ID=74118715

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210323572.8A Pending CN114845015A (en) 2020-10-15 2020-10-15 Image sensor, control method, imaging apparatus, terminal, and readable storage medium
CN202011105845.9A Active CN112235494B (en) 2020-10-15 2020-10-15 Image sensor, control method, imaging device, terminal, and readable storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202210323572.8A Pending CN114845015A (en) 2020-10-15 2020-10-15 Image sensor, control method, imaging apparatus, terminal, and readable storage medium

Country Status (1)

Country Link
CN (2) CN114845015A (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117706847A (en) * 2024-01-31 2024-03-15 荣耀终端有限公司 Focusing method, electronic equipment and medium


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070046807A1 (en) * 2005-08-23 2007-03-01 Eastman Kodak Company Capturing images under varying lighting conditions
US7855740B2 (en) * 2007-07-20 2010-12-21 Eastman Kodak Company Multiple component readout of image sensor
US9749556B2 (en) * 2015-03-24 2017-08-29 Semiconductor Components Industries, Llc Imaging systems having image sensor pixel arrays with phase detection capabilities
CN111756974A (en) * 2020-05-15 2020-10-09 深圳市汇顶科技股份有限公司 Image sensor and electronic device
CN111464733B (en) * 2020-05-22 2021-10-01 Oppo广东移动通信有限公司 Control method, camera assembly and mobile terminal

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104380168A (en) * 2012-06-19 2015-02-25 富士胶片株式会社 Imaging device and automatic focus adjustment method
CN106449674A (en) * 2015-08-05 2017-02-22 豪威科技股份有限公司 Image sensor with symmetric multi-pixel phase-difference detectors, and associated methods
CN110087065A (en) * 2019-04-30 2019-08-02 德淮半导体有限公司 Semiconductor device and its manufacturing method
CN111263129A (en) * 2020-02-11 2020-06-09 Oppo广东移动通信有限公司 Image sensor, camera assembly and mobile terminal
CN111586323A (en) * 2020-05-07 2020-08-25 Oppo广东移动通信有限公司 Image sensor, control method, camera assembly and mobile terminal

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113419252A (en) * 2021-06-10 2021-09-21 Oppo广东移动通信有限公司 Time-of-flight module, terminal and depth detection method
CN113660415A (en) * 2021-08-09 2021-11-16 Oppo广东移动通信有限公司 Focus control method, device, imaging apparatus, electronic apparatus, and computer-readable storage medium
WO2023016144A1 (en) * 2021-08-09 2023-02-16 Oppo广东移动通信有限公司 Focusing control method and apparatus, imaging device, electronic device, and computer readable storage medium
CN113676617A (en) * 2021-08-12 2021-11-19 Oppo广东移动通信有限公司 Motion detection method, motion detection device, electronic equipment and computer-readable storage medium
WO2023016183A1 (en) * 2021-08-12 2023-02-16 Oppo广东移动通信有限公司 Motion detection method and apparatus, electronic device, and computer-readable storage medium
CN113676617B (en) * 2021-08-12 2023-08-18 Oppo广东移动通信有限公司 Motion detection method, motion detection device, electronic device and computer-readable storage medium
CN113747022A (en) * 2021-09-09 2021-12-03 Oppo广东移动通信有限公司 Image sensor, camera assembly and mobile terminal
CN113747022B (en) * 2021-09-09 2023-05-05 Oppo广东移动通信有限公司 Image sensor, camera assembly and mobile terminal
CN113891006A (en) * 2021-11-22 2022-01-04 Oppo广东移动通信有限公司 Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium

Also Published As

Publication number Publication date
CN114845015A (en) 2022-08-02
CN112235494B (en) 2022-05-20

Similar Documents

Publication Publication Date Title
CN112235494B (en) Image sensor, control method, imaging device, terminal, and readable storage medium
CN111405204B (en) Image acquisition method, imaging device, electronic device, and readable storage medium
CN111757006B (en) Image acquisition method, camera assembly and mobile terminal
WO2021223590A1 (en) Image sensor, control method, camera assembly, and mobile terminal
CN111479071B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN111314592B (en) Image processing method, camera assembly and mobile terminal
CN111385543B (en) Image sensor, camera assembly, mobile terminal and image acquisition method
CN110649057B (en) Image sensor, camera assembly and mobile terminal
US20230086743A1 (en) Control method, camera assembly, and mobile terminal
CN111447380B (en) Control method, camera assembly and mobile terminal
CN110784634B (en) Image sensor, control method, camera assembly and mobile terminal
CN110740272A (en) Image acquisition method, camera assembly and mobile terminal
CN111899178B (en) Image processing method, image processing system, electronic device, and readable storage medium
WO2021159944A1 (en) Image sensor, camera assembly, and mobile terminal
CN111741221B (en) Image acquisition method, camera assembly and mobile terminal
US20220336508A1 (en) Image sensor, camera assembly and mobile terminal
CN112738493B (en) Image processing method, image processing apparatus, electronic device, and readable storage medium
CN114008782A (en) Image sensor, camera assembly and mobile terminal
CN111835971B (en) Image processing method, image processing system, electronic device, and readable storage medium
CN111970459B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN111031297B (en) Image sensor, control method, camera assembly and mobile terminal
WO2021046691A1 (en) Image collection method, camera assembly and mobile terminal
WO2021062662A1 (en) Image sensor, camera assembly, and mobile terminal
US20220279108A1 (en) Image sensor and mobile terminal
CN112235485B (en) Image sensor, image processing method, imaging device, terminal, and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant