CN114222047A - Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium - Google Patents

Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium

Info

Publication number
CN114222047A
Authority
CN
China
Prior art keywords
target, pixel, phase information, pixels, array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111617501.0A
Other languages
Chinese (zh)
Inventor
杨鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202111617501.0A priority Critical patent/CN114222047A/en
Publication of CN114222047A publication Critical patent/CN114222047A/en
Priority to PCT/CN2022/132425 priority patent/WO2023124611A1/en
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50: Constructional details
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/667: Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N 23/67: Focus control based on electronic image sensor signals
    • H04N 23/672: Focus control based on the phase difference signals

Abstract

The application relates to a focus control method and apparatus, an image sensor, an electronic device, and a computer-readable storage medium, applied to the image sensor. The method includes: determining, according to the light intensity of the current shooting scene, a phase information output mode matched to that light intensity, where the phase arrays output under different phase information output modes have different sizes; outputting a phase array corresponding to the pixel array according to the phase information output mode, where the phase array includes phase information corresponding to target pixels in the pixel array; and calculating the phase difference of the pixel array based on the phase array and performing focus control according to the phase difference. In this way, the accuracy of the phase information output under different light intensities can be improved, and the accuracy of focus control improved in turn.

Description

Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a focus control method and apparatus, an image sensor, an electronic device, and a computer-readable storage medium.
Background
With the development of electronic devices, more and more users take images through the electronic devices. In order to ensure that a shot image is clear, a camera module of the electronic device generally needs to be focused, that is, a distance between a lens and an image sensor is adjusted to enable a shot object to be imaged on a focal plane. The conventional focusing method includes Phase Detection Auto Focus (PDAF).
Traditional phase detection autofocus mainly calculates a phase difference based on an RGB pixel array and then controls a motor based on that phase difference; the motor in turn drives the lens to a suitable position so that the subject is imaged on the focal plane.
However, since an RGB pixel array has different sensitivities under different light intensities, the phase difference calculated from it is inaccurate at some light intensities, which greatly reduces focusing accuracy.
Disclosure of Invention
The embodiment of the application provides a focusing control method and device, electronic equipment, an image sensor and a computer readable storage medium, which can improve the focusing accuracy.
In one aspect, an image sensor is provided that includes a microlens array, a pixel array, and a filter array, the filter array including a minimal repeating unit, the minimal repeating unit including a plurality of filter sets, the filter sets including a color filter and a panchromatic filter; the color filter has a narrower spectral response than the panchromatic filter, and the color filter and the panchromatic filter each comprise 9 sub-filters arranged in an array;
wherein the pixel array comprises a plurality of pixel groups, the pixel groups are panchromatic pixel groups or color pixel groups, each panchromatic pixel group corresponds to the panchromatic filter, and each color pixel group corresponds to the color filter; the panchromatic pixel group and the color pixel group respectively comprise 9 pixels, the pixels of the pixel array are arranged corresponding to the sub-filters of the filter array, and each pixel corresponds to one photosensitive element;
wherein the microlens array comprises a plurality of microlens sets, each microlens set corresponds to the panchromatic pixel set or the color pixel set, the microlens sets comprise a plurality of microlenses, and at least one microlens in the plurality of microlenses corresponds to at least two pixels.
In another aspect, there is provided a focus control method applied to the image sensor as described above, the method including:
determining, according to the light intensity of the current shooting scene, a phase information output mode adapted to that light intensity; wherein the phase arrays output under different phase information output modes have different sizes;
outputting a phase array corresponding to the pixel array according to the phase information output mode; the phase array comprises phase information corresponding to a target pixel in the pixel array;
and calculating the phase difference of the pixel array based on the phase array, and performing focusing control according to the phase difference.
In another aspect, there is provided a focus control apparatus applied to the image sensor as described above, the apparatus including:
the phase information output mode determining module is used for determining, according to the light intensity of the current shooting scene, a phase information output mode adapted to that light intensity; wherein the phase arrays output under different phase information output modes have different sizes;
the phase array output module is used for outputting a phase array corresponding to the pixel array according to the phase information output mode; the phase array comprises phase information corresponding to a target pixel in the pixel array;
and the focusing control module is used for calculating the phase difference of the pixel array based on the phase array and carrying out focusing control according to the phase difference.
In another aspect, an electronic device is provided, which includes a memory and a processor, wherein the memory stores a computer program, and the computer program, when executed by the processor, causes the processor to execute the steps of the focusing control method.
In another aspect, a computer-readable storage medium is provided, having stored thereon a computer program which, when being executed by a processor, carries out the steps of the method as set forth above.
In another aspect, a computer program product is provided, comprising computer program/instructions, characterized in that the computer program/instructions, when executed by a processor, implement the steps of the focus control method as described above.
According to the focusing control method, the focusing control device, the image sensor, the electronic equipment and the computer readable storage medium, the phase information output mode adaptive to the light intensity of the current shooting scene is determined according to the light intensity of the current shooting scene; wherein, under different phase information output modes, the output phase arrays have different sizes. Outputting a phase array corresponding to the pixel array according to the phase information output mode; the phase array comprises phase information corresponding to a target pixel in the pixel array. And calculating the phase difference of the pixel array based on the phase array, and performing focusing control according to the phase difference.
Under different light intensities of the current shooting scene, the accuracy of the raw phase information that can be collected differs. Therefore, different phase information output modes can be adopted for the same pixel array according to the light intensity of the current shooting scene, and phase arrays of different sizes can be output based on the raw phase information. Because phase arrays of different sizes have different signal-to-noise ratios, the accuracy of the phase information output under different light intensities can be improved, and the accuracy of focus control improved in turn.
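As a hedged illustration of this mode selection, the following sketch picks one of three output modes from the scene's light intensity. The mode names, thresholds, and function name are invented for illustration and are not specified by the patent:

```python
from enum import Enum

class PhaseOutputMode(Enum):
    FULL_SIZE = 1    # one phase sample per pixel (bright scenes, highest resolution)
    FIRST_SIZE = 2   # partial merging (medium light)
    SECOND_SIZE = 3  # full merging (dim scenes, highest signal-to-noise ratio)

def select_phase_output_mode(light_intensity, low=50.0, high=500.0):
    """Pick a phase information output mode for the current scene.

    Brighter scenes can afford per-pixel phase samples; dimmer scenes
    merge neighbouring samples to raise the signal-to-noise ratio at the
    cost of a smaller phase array. Thresholds are illustrative.
    """
    if light_intensity >= high:
        return PhaseOutputMode.FULL_SIZE
    if light_intensity >= low:
        return PhaseOutputMode.FIRST_SIZE
    return PhaseOutputMode.SECOND_SIZE
```

The three modes correspond to the differently sized phase arrays discussed above; the phase difference is then computed from whichever array is output.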
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from them without creative effort.
FIG. 1 is a schematic diagram of an electronic device in one embodiment;
FIG. 2 is a schematic diagram of the phase detection autofocus;
fig. 3 is a schematic diagram of arranging phase detection pixel points in pairs among pixel points included in an image sensor;
FIG. 4 is an exploded view of an image sensor in one embodiment;
FIG. 5 is a schematic diagram of the connection of a pixel array and readout circuitry in one embodiment;
FIG. 6 is a diagram illustrating an exemplary layout of a minimal repeating unit of a pixel array;
FIG. 7 is a diagram illustrating an arrangement of minimum repeating units of a pixel array according to another embodiment;
FIG. 8 is a schematic diagram showing an arrangement of minimum repeating units of a microlens array in one embodiment;
FIG. 9 is a schematic view showing an arrangement of minimum repeating units of a microlens array in another embodiment;
FIG. 10 is a flow chart of a focus control method in one embodiment;
FIG. 11 is a flow chart illustrating a method for outputting a phased array corresponding to a pixel array according to a phase information output mode in one embodiment;
FIG. 12 is a flowchart illustrating a method for determining a phase information output mode adapted to the light intensity of the current shooting scene according to the light intensity of the current shooting scene in FIG. 11;
FIG. 13 is a flowchart illustrating a method for determining a phase information output mode adapted to the light intensity of the current scene according to the target light intensity range in FIG. 12;
FIG. 14 is a schematic diagram of generating a full-scale phased array in one embodiment;
FIG. 15 is a flow diagram of a method for generating a first size phased array in one embodiment;
FIG. 16 is a schematic diagram of one embodiment of generating a second size phased array;
FIG. 17 is a block diagram showing the structure of a focus control apparatus according to an embodiment;
FIG. 18 is a block diagram of the phase array output module of FIG. 17;
fig. 19 is a schematic diagram of an internal structure of an electronic device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," "third," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first dimension may be referred to as a second dimension, and similarly, a second dimension may be referred to as a first dimension, without departing from the scope of the present application. The first size and the second size are both sizes, but they are not the same size. The first preset threshold may be referred to as a second preset threshold, and similarly, the second preset threshold may be referred to as a first preset threshold. Both the first preset threshold and the second preset threshold are preset thresholds, but they are not the same preset threshold.
FIG. 1 is a diagram illustrating an application environment of a focus control method according to an embodiment. As shown in fig. 1, the application environment includes an electronic device 100. The electronic device 100 includes an image sensor including a pixel array, and determines a phase information output mode adapted to the light intensity of the current shooting scene according to the light intensity of the current shooting scene; wherein, under different phase information output modes, the output phase arrays have different sizes; outputting a phase array corresponding to the pixel array according to the phase information output mode; the phase array comprises phase information corresponding to the pixel array; and calculating the phase difference of the pixel array based on the phase array, and performing focusing control according to the phase difference. The electronic device may be any terminal device having an image processing function, such as a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a wearable device (smart band, smart watch, smart glasses, smart gloves, smart socks, smart belt, etc.), a VR (virtual reality) device, a smart home, and an unmanned vehicle.
The electronic device 100 includes, among other things, a camera 20, a processor 30, and a housing 40. The camera 20 and the processor 30 are disposed in the housing 40, and the housing 40 can also be used to mount functional modules of the terminal 100, such as a power supply device and a communication device, so that the housing 40 provides protection for the functional modules, such as dust prevention, drop prevention, and water prevention. The camera 20 may be a front camera, a rear camera, a side camera, an off-screen camera, etc., without limitation. The camera 20 includes a lens and an image sensor 21, when the camera 20 takes an image, light passes through the lens and reaches the image sensor 21, and the image sensor 21 is used for converting an optical signal irradiated onto the image sensor 21 into an electrical signal.
Fig. 2 is a schematic diagram of a Phase Detection Auto Focus (PDAF) principle. As shown in fig. 2, M1 is the position of the image sensor when the imaging device is in the in-focus state, where the in-focus state refers to a successfully focused state. When the image sensor is located at the position M1, the imaging light rays g reflected by the object W in different directions toward the Lens converge on the image sensor, that is, the imaging light rays g reflected by the object W in different directions toward the Lens are imaged at the same position on the image sensor, and at this time, the image sensor is imaged clearly.
M2 and M3 indicate positions where the image sensor may be located when the imaging device is not in the in-focus state. As shown in fig. 2, when the image sensor is located at position M2 or M3, the imaging light rays g reflected by the object W toward the Lens in different directions are imaged at different positions: at position M2 they are imaged at position A and position B respectively, and at position M3 they are imaged at position C and position D respectively. In these cases the image formed on the sensor is blurred.
In the PDAF technique, the difference between the positions at which imaging light rays entering the lens from different directions are imaged on the image sensor can be obtained, for example, as shown in fig. 2, the difference between position A and position B, or between position C and position D. From this difference and the geometric relationship between the lens and the image sensor in the camera, the defocus distance can be obtained, i.e., the distance between the current position of the image sensor and the position it should occupy in the in-focus state. The imaging device can then focus according to the obtained defocus distance.
It follows that the calculated PD value is 0 when in focus; the larger the calculated value, the farther the sensor is from the in-focus position, and the smaller the value, the closer it is. When PDAF is used, the PD value is calculated, the defocus distance is obtained from the calibrated correspondence between PD value and defocus distance, and the lens is then driven to the in-focus position according to that distance, thereby achieving focus.
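The calibrated PD-to-defocus mapping described above can be sketched as follows. A linear relation with a per-module `slope` and `offset` is a common simplification and an assumption here, as are the function names:

```python
def defocus_from_pd(pd_value, slope, offset=0.0):
    """Map a phase difference (PD) to a defocus distance using a
    calibrated linear relation: defocus = slope * pd + offset.
    slope and offset would come from per-module calibration."""
    return slope * pd_value + offset

def lens_target_position(current_position, pd_value, slope, offset=0.0):
    """Move the lens by the estimated defocus distance.
    At PD == 0 (and offset == 0) the lens stays put: already in focus."""
    return current_position + defocus_from_pd(pd_value, slope, offset)
```

Real calibrations may use a lookup table or piecewise fit rather than a single slope; the linear form only illustrates the PD-to-distance correspondence.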
In the related art, some phase detection pixels may be provided in pairs among the pixels of the image sensor; as shown in fig. 3, a phase detection pixel pair (hereinafter, pixel pair) A, a pixel pair B, and a pixel pair C may be provided in the image sensor. In each pixel pair, one phase detection pixel is shielded on its left side (Left Shield) and the other is shielded on its right side (Right Shield).
For a phase detection pixel shielded on the left, only the right-hand part of the imaging beam directed at it can form an image on its photosensitive (unshielded) part; for a phase detection pixel shielded on the right, only the left-hand part of the imaging beam can form an image on its photosensitive part. The imaging beam is thus split into left and right parts, and the phase difference can be obtained by comparing the images formed by these two parts.
In one embodiment, the image sensor further includes a pixel array and a filter array, the filter array including a minimal repeating unit, the minimal repeating unit including a plurality of filter sets, each filter set including color filters and panchromatic filters, the color filters being disposed in a first diagonal direction and the panchromatic filters in a second diagonal direction, the first diagonal direction being different from the second; the color filter has a narrower spectral response than the panchromatic filter, and the color filter and the panchromatic filter each comprise 9 sub-filters arranged in an array;
the pixel array comprises a plurality of panchromatic pixel groups and a plurality of color pixel groups, wherein each panchromatic pixel group corresponds to a panchromatic filter, and each color pixel group corresponds to a color filter; the panchromatic pixel group and the color pixel group respectively comprise 9 pixels, the pixels of the pixel array are arranged corresponding to the sub-filters of the filter array, and each pixel corresponds to one photosensitive element.
The micro lens array comprises a plurality of micro lens groups, each micro lens group corresponds to a full-color pixel group or a color pixel group, the micro lens groups comprise a plurality of micro lenses, and at least one micro lens in the micro lenses corresponds to at least two pixels.
In the embodiment of the application, at least one microlens in the plurality of microlenses corresponds to at least two pixels. Thus, the phase difference of the pixel array can be calculated based on the phase information of the at least two pixels.
As shown in fig. 4, the image sensor 21 includes a microlens array 22, a filter array 23, and a pixel array 24.
The microlens array 22 includes a plurality of minimal repeating units 221, the minimal repeating unit 221 includes a plurality of microlens sets 222, and the microlens sets 222 include a plurality of microlenses 2221. The sub-filters in the filter array 23 and the pixels in the pixel array 24 are arranged in a one-to-one correspondence, and at least one of the plurality of microlenses 2221 corresponds to at least two pixels. The microlenses 2221 are configured to collect incident light, and the collected light passes through the corresponding sub-filters, is projected onto the pixels, and is received by the corresponding pixels, and then the pixels convert the received light into electrical signals.
The filter array 23 includes a plurality of minimal repeating units 231. A minimal repeating unit 231 may include a plurality of filter sets 232. Each filter set 232 includes a panchromatic filter 233 and a color filter 234, the color filter 234 having a narrower spectral response than the panchromatic filter 233. Each panchromatic filter 233 includes 9 sub-filters 2331, and each color filter 234 includes 9 sub-filters 2341. Different filter sets may include color filters 234 of different colors.
The colors corresponding to the wavelength bands of light transmitted by the color filters 234 of the filter sets 232 in the minimal repeating unit 231 include color a, color b, and/or color c; that is, the transmitted colors may be a, b, and c together, any one of them alone, or any two of them (a and b, b and c, or a and c). For example, color a may be red, color b green, and color c blue; or color a magenta, color b cyan, and color c yellow; no limitation is imposed here.
In one embodiment, the width of the wavelength band of the light transmitted by the color filter 234 is smaller than that of the light transmitted by the panchromatic filter 233, for example, the wavelength band of the light transmitted by the color filter 234 may correspond to the wavelength band of red light, the wavelength band of green light, or the wavelength band of blue light, and the wavelength band of the light transmitted by the panchromatic filter 233 is all the wavelength band of visible light, that is, the color filter 234 only allows a specific color light to pass through, while the panchromatic filter 233 can pass all the color light. Of course, the wavelength band of the transmitted light of the color filter 234 may correspond to other wavelength bands of color lights, such as magenta light, purple light, cyan light, yellow light, etc., and is not limited herein.
In one embodiment, the ratio of the number of color filters 234 to the number of panchromatic filters 233 in a filter set 232 may be 1:3, 1:1, or 3:1. If the ratio is 1:3, there is 1 color filter 234 and 3 panchromatic filters 233; with this many panchromatic filters, more phase information can be acquired in dim light than in the conventional case of color filters alone, so focusing quality is better. If the ratio is 1:1, there are 2 color filters 234 and 2 panchromatic filters 233; better color reproduction is obtained while still acquiring more phase information in dim light, so focusing quality also improves. If the ratio is 3:1, there are 3 color filters 234 and 1 panchromatic filter 233; this gives better color reproduction and likewise improves focusing quality in dim light.
The pixel array 24 includes a plurality of pixels, and the pixels of the pixel array 24 are disposed corresponding to the sub-filters of the filter array 23. The pixel array 24 is configured to receive light rays passing through the filter array 23 to generate electrical signals.
Saying that the pixel array 24 is configured to receive light passing through the filter array 23 to generate electrical signals means that the pixel array 24 performs photoelectric conversion on light from the subject's scene that has passed through the filter array 23. The light from the subject's scene is used to generate image data. For example, if the subject is a building, the scene refers to the setting in which the building is located and may include other objects.
In one embodiment, the pixel array 24 may be an RGBW pixel array including a plurality of minimal repeating units 241, the minimal repeating units 241 including a plurality of pixel groups 242, the plurality of pixel groups 242 including a panchromatic pixel group 243 and a color pixel group 244. Each panchromatic pixel group 243 includes 9 panchromatic pixels 2431 and each color pixel group 244 includes 9 color pixels 2441. Each panchromatic pixel 2431 corresponds to one sub-filter 2331 in the panchromatic filter 233, and the panchromatic pixel 2431 receives light passing through the corresponding sub-filter 2331 to generate an electrical signal. Each color pixel 2441 corresponds to one sub-filter 2341 of the color filter 234, and the color pixel 2441 receives the light passing through the corresponding sub-filter 2341 to generate an electrical signal. Wherein each pixel corresponds to a photosensitive element. I.e., one photosensitive element for each full-color pixel 2431; each color pixel 2441 corresponds to a light-sensing element.
The image sensor 21 in this embodiment includes the filter array 23 and the pixel array 24. The filter array 23 includes the minimal repeating unit 231, which includes a plurality of filter sets 232; each filter set includes a panchromatic filter 233 and a color filter 234, the color filter 234 having a narrower spectral response than the panchromatic filter 233. More light can therefore be captured during shooting, so shooting parameters need not be adjusted; focusing quality in dim light is improved without affecting shooting stability, and both stability and quality can be achieved when focusing in dim light. Moreover, each panchromatic filter 233 includes 9 sub-filters 2331 and each color filter 234 includes 9 sub-filters 2341; the pixel array 24 includes a plurality of panchromatic pixels 2431 and color pixels 2441, each corresponding to one sub-filter of the panchromatic filter 233 or the color filter 234 respectively, and each pixel receives the light passing through its sub-filter to generate an electrical signal. When focusing in dim light, the phase information of the pixels corresponding to the 9 sub-filters can be merged and output, yielding phase information with a high signal-to-noise ratio; in scenes with sufficient light, the phase information of the pixel under each sub-filter can be output separately, yielding phase information with both high resolution and high signal-to-noise ratio. Different application scenes can thus be accommodated, and focusing quality improved in each.
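The merge-and-output step described above amounts to binning the per-pixel phase samples. A minimal sketch, assuming simple 3 x 3 averaging (the patent does not fix the exact combination rule, and the function name is illustrative):

```python
def bin_phase_array(phase, factor=3):
    """Merge each factor x factor block of per-pixel phase samples into
    one averaged sample. Averaging N uncorrelated-noise samples raises
    the SNR roughly by sqrt(N), at the cost of resolution."""
    rows, cols = len(phase), len(phase[0])
    out = []
    for r in range(0, rows, factor):
        row_out = []
        for c in range(0, cols, factor):
            block = [phase[rr][cc]
                     for rr in range(r, min(r + factor, rows))
                     for cc in range(c, min(c + factor, cols))]
            row_out.append(sum(block) / len(block))
        out.append(row_out)
    return out
```

With `factor=3` a 12 x 12 array of samples becomes 4 x 4; with `factor=1` the per-pixel samples pass through unchanged, matching the full-resolution output used in bright scenes.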
In one embodiment, as shown in fig. 4, the minimum repeating unit 231 in the filter array 23 includes 4 filter sets 232, and the 4 filter sets 232 are arranged in a matrix. Each filter set 232 includes a panchromatic filter 233 and a color filter 234, each panchromatic filter 233 and each color filter 234 have 9 sub-filters, and the filter set 232 includes 36 sub-filters.
Similarly, the pixel array 24 includes a plurality of minimal repeating units 241, corresponding to the plurality of minimal repeating units 231. Each minimum repeating unit 241 includes 4 pixel groups 242, and the 4 pixel groups 242 are arranged in a matrix. One filter set 232 for each pixel group 242.
As shown in fig. 5, the readout circuit 25 is electrically connected to the pixel array 24 and is used to control the exposure of the pixel array 24 and the reading and output of pixel values. The readout circuit 25 includes a vertical driving unit 251, a control unit 252, a column processing unit 253, and a horizontal driving unit 254. The vertical driving unit 251 includes a shift register and an address decoder and provides readout scanning and reset scanning functions. The control unit 252 configures timing signals according to the operation mode and uses them to control the vertical driving unit 251, the column processing unit 253, and the horizontal driving unit 254 to operate cooperatively. The column processing unit 253 may have an analog-to-digital (A/D) conversion function for converting analog pixel signals into digital form. The horizontal driving unit 254 includes a shift register and an address decoder and scans the pixel array 24 column by column.
In one embodiment, as shown in fig. 6, each filter set 232 includes color filters 234 and panchromatic filters 233; the panchromatic filters 233 in a filter set 232 are disposed in the direction of a first diagonal D1, and the color filters 234 in a filter set 232 are disposed in the direction of a second diagonal D2. Since the direction of the first diagonal D1 differs from that of the second diagonal D2, both color reproduction and dim-light focusing quality can be taken into account.
Here, "different" means, for example, that the direction of the first diagonal D1 is not parallel to the direction of the second diagonal D2, or that the direction of the first diagonal D1 is perpendicular to the direction of the second diagonal D2, and the like.
In other embodiments, one color filter 234 and one panchromatic filter 233 may be located on the first diagonal D1, and the other color filter 234 and the other panchromatic filter 233 on the second diagonal D2.
In one embodiment, each pixel corresponds to a photosensitive element, i.e., an element capable of converting an optical signal into an electrical signal; for example, the photosensitive element may be a photodiode. As shown in fig. 6, each panchromatic pixel 2431 includes 1 photodiode (PD), and each color pixel 2441 includes 1 photodiode (PD).
In one embodiment, as shown in fig. 6, the minimum repeating unit 231 in the filter array 23 includes 4 filter sets 232, and the 4 filter sets 232 are arranged in a matrix. Each filter set 232 includes 2 panchromatic filters 233 and 2 color filters 234. The panchromatic filter 233 includes 9 sub-filters 2331, and the color filter 234 includes 9 sub-filters 2341, so that the minimum repeating unit 231 is 144 sub-filters in 12 rows and 12 columns, and the arrangement is as follows:
(Sub-filter arrangement matrix of the minimal repeating unit 231; rendered as an image in the original publication and not reproduced here.)
where w denotes the panchromatic sub-filter 2331, and a, b, and c each denote a color sub-filter 2341. The panchromatic sub-filter 2331 is a sub-filter that filters out only light outside the visible band, and the color sub-filters 2341 include red, green, blue, magenta, cyan, and yellow sub-filters. Each color sub-filter passes only light of its own color and filters out all other light; for example, the red sub-filter filters out all light except red light, and likewise for the green, blue, magenta, cyan, and yellow sub-filters.
Each of a, b, and c may be a red, green, blue, magenta, cyan, or yellow sub-filter. For example, b may be a red sub-filter, a a green sub-filter, and c a blue sub-filter; or c a red sub-filter, a a green sub-filter, and b a blue sub-filter; or a a red sub-filter, b a blue sub-filter, and c a green sub-filter; or b a magenta sub-filter, a a cyan sub-filter, and c a yellow sub-filter; the disclosure is not limited herein. In other embodiments, the color filter may further include sub-filters of other colors, such as an orange sub-filter or a violet sub-filter, which is not limited herein.
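The 12-row-by-12-column arrangement described above can be sketched in code. This is an illustrative reconstruction, not the patent's figure: placing the panchromatic filters on the first diagonal of each filter set follows fig. 6, while the assignment of a, b, and c to the four filter sets is an assumed Bayer-like choice (the patent allows any of the listed colors):

```python
import numpy as np

def build_minimal_repeating_unit():
    """Construct the 12x12 sub-filter arrangement of the minimal repeating
    unit: 4 filter sets in a 2x2 matrix, each set holding 2 panchromatic
    filters ('w') on its first diagonal and 2 color filters on its second
    diagonal; every filter spans 3x3 sub-filters."""
    # color carried by each of the 4 filter sets (row-major); this
    # Bayer-like b/a/a/c assignment is an assumption, not from the patent
    set_colors = [['b', 'a'], ['a', 'c']]
    unit = np.empty((12, 12), dtype='<U1')
    for sr in range(2):              # filter-set row
        for sc in range(2):          # filter-set column
            color = set_colors[sr][sc]
            for fr in range(2):      # filter row inside the set
                for fc in range(2):  # filter column inside the set
                    # first diagonal of the set (fr == fc) -> panchromatic
                    label = 'w' if fr == fc else color
                    r0, c0 = sr * 6 + fr * 3, sc * 6 + fc * 3
                    unit[r0:r0 + 3, c0:c0 + 3] = label
    return unit

unit = build_minimal_repeating_unit()
print(unit)   # 12x12 grid of 'w', 'a', 'b', 'c' labels
```

Half of the 144 sub-filters come out panchromatic, matching the 2-of-4 filter split per set.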
In one embodiment, as shown in fig. 7, the minimum repeating unit 231 in the filter array 23 includes 4 filter sets 232, and the 4 filter sets 232 are arranged in a matrix. Each filter set 232 includes color filters 234 and panchromatic filters 233, the respective color filters 234 in the filter set 232 being disposed in a first diagonal D1 direction, and the respective panchromatic filters 233 in the filter set 232 being disposed in a second diagonal D2 direction. And the pixels of the pixel array (not shown in fig. 7, refer to fig. 6) are disposed corresponding to the sub-filters of the filter array, and each pixel corresponds to a photosensitive element.
In one embodiment, each filter set 232 includes 2 panchromatic filters 233 and 2 color filters 234; each panchromatic filter 233 includes 9 sub-filters 2331 and each color filter 234 includes 9 sub-filters 2341, so the minimal repeating unit 231 comprises 144 sub-filters in 12 rows and 12 columns, arranged as follows (see fig. 7):
(Sub-filter arrangement matrix of the minimal repeating unit 231; rendered as an image in the original publication and not reproduced here.)
where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
The 144 sub-filters in 12 rows and 12 columns combine the advantages of both the quad and RGBW arrangements. The advantage of quad is that local 2×2 or 3×3 pixel merging (binning) can be used to obtain images with different resolutions and a high signal-to-noise ratio, while quad full-size output retains all pixels and yields a full-resolution image with higher definition. The advantage of RGBW is that the W (panchromatic) pixels increase the amount of light admitted for the whole image, improving the image signal-to-noise ratio.
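The binning advantage can be illustrated numerically: averaging 3×3 pixel blocks shrinks a 12×12 patch to 4×4 while reducing independent noise roughly threefold. A minimal numpy sketch, not the sensor's actual readout path:

```python
import numpy as np

def bin_pixels(img, factor):
    """Average `factor` x `factor` blocks (binning). Resolution drops by
    `factor` in each direction; averaging n = factor**2 independent
    samples reduces noise by sqrt(n), raising the signal-to-noise ratio."""
    h, w = img.shape
    assert h % factor == 0 and w % factor == 0
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

rng = np.random.default_rng(0)
signal = np.full((12, 12), 100.0)
noisy = signal + rng.normal(0, 10, size=(12, 12))
binned = bin_pixels(noisy, 3)        # 12x12 -> 4x4, as in 3x3 quad binning
print(noisy.std(), binned.std())     # binned noise is roughly a third
```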
In one embodiment, as shown in fig. 8 (a), the microlens array 22 includes a plurality of minimal repeating units 221, each minimal repeating unit 221 includes a plurality of microlens groups 222, and each microlens group 222 includes a plurality of microlenses 2211. Each microlens group 222 corresponds to either a panchromatic pixel group 243 or a color pixel group 244. For example, the microlens group 222 includes 5 microlenses 2211: 4 first microlenses 2211a and 1 second microlens 2211b, where each first microlens 2211a corresponds to 2 pixels and the second microlens 2211b corresponds to 1 pixel. The 4 first microlenses comprise 2 first microlenses in a first direction, arranged in the microlens group along the first direction, and 2 first microlenses in a second direction, arranged in the microlens group along the second direction. The second microlens 2211b is located at the center of the microlens group 222 and corresponds to the central pixel of the pixel group corresponding to that microlens group.
As shown in fig. 8 (a), when the second microlens 2211b is located at the center of the microlens group, the 2 first microlenses in the first direction (filled with gray in fig. 8 (a)) and the 2 first microlenses in the second direction (filled with diagonal lines in fig. 8 (a)) are disposed around the second microlens 2211b. The 2 first microlenses in the first direction are arranged centrally symmetric with respect to the second microlens 2211b along the first diagonal direction, and the 2 first microlenses in the second direction are arranged centrally symmetric with respect to the second microlens 2211b along the second diagonal direction.
As shown in fig. 8 (b), in another embodiment, the second microlens 2211b is located at one of the four corners of the microlens group and corresponds to the pixel at the same corner of the corresponding pixel group. For example, the second microlens 2211b may be located at the lower right corner of the microlens group 222; it may equally be located at the lower left, upper left, or upper right corner, which is not limited in this application.
In one embodiment, one microlens of the plurality of microlenses 2211 corresponds to at least four pixels.
In one embodiment, as shown in fig. 9, the microlens group 222 includes 4 microlenses 2211: 1 third microlens 2211c, 2 fourth microlenses 2211d, and 1 fifth microlens 2211e. The third microlens 2211c corresponds to 4 pixels, each fourth microlens 2211d corresponds to 2 pixels, and the fifth microlens 2211e corresponds to 1 pixel. The fourth microlens 2211d may be the same kind of microlens as the first microlens 2211a, and the fifth microlens 2211e may be similar to the second microlens 2211b, but the application is not limited thereto.
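The microlens-group layouts above are easiest to see as a map from each microlens to the pixel coordinates it covers. The sketch below encodes the fig. 8 (a) layout for one 3×3 pixel group; the exact coordinates assigned to the two-pixel lenses are assumptions consistent with the central symmetry described above, not taken from the figure itself:

```python
# Hypothetical coordinate map for the fig. 8 (a) microlens group over one
# 3x3 pixel group; coordinates are (row, col), row-major.
MICROLENS_GROUP = {
    "first_h1": [(0, 0), (0, 1)],  # first microlens, first (horizontal) direction
    "first_h2": [(2, 1), (2, 2)],  # centrally symmetric to first_h1
    "first_v1": [(1, 0), (2, 0)],  # first microlens, second (vertical) direction
    "first_v2": [(0, 2), (1, 2)],  # centrally symmetric to first_v1
    "second":   [(1, 1)],          # one-pixel second microlens over the center
}

# every pixel of the 3x3 group is covered exactly once: 4 x 2 + 1 = 9
covered = sorted(p for pixels in MICROLENS_GROUP.values() for p in pixels)
assert covered == [(r, c) for r in range(3) for c in range(3)]
```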
In one embodiment, as shown in fig. 10, there is provided a focus control method applied to an image sensor as in the above embodiments, the image sensor includes a pixel array and a filter array, and the method includes:
step 1020, determining a phase information output mode adapted to the light intensity of the current shooting scene according to the light intensity of the current shooting scene; wherein, under different phase information output modes, the output phase arrays have different sizes.
The light intensity of the current shooting scene differs between shooting scenes and between moments, and because the sensitivity of the RGB pixel array varies with light intensity, the phase difference calculated from the RGB pixel array is less accurate at some light intensities, which greatly reduces focusing accuracy. Light intensity, also called illumination intensity or illuminance, is a physical quantity describing the luminous flux of visible light received per unit area, measured in lux (lx). It indicates how strongly the surface of an object is illuminated. The following table shows typical illumination intensity values for different weather conditions and locations:
TABLE 1-1

Weather and location                              Illumination intensity value
Direct sunlight on the ground on a sunny day      100000 lx
Indoors (room center) on a sunny day              200 lx
Outdoors on an overcast day                       50-500 lx
Indoors on an overcast day                        5-50 lx
Moonlight (full moon)                             0.2 lx
Clear moonlit night                               0.02 lx
Dark night                                        0.0011 lx
As can be seen from Table 1-1, the light intensity of the current shooting scene differs greatly between scenes and between moments.
To address the problem that the phase difference calculated from the RGB pixel array is inaccurate at some light intensities, which greatly reduces focusing accuracy, a phase information output mode adapted to the light intensity of the current shooting scene is determined for each light intensity, and the phase information of the pixel array is then output in that mode. A phase information output mode is a mode in which the raw phase information of the pixel array is processed to generate the phase information of the pixel array that is finally output.
Under different phase information output modes, the output phase arrays have different sizes; that is, the phase array output by the same pixel array differs in size with the light intensity of the current shooting scene. In other words, depending on the light intensity, the phase information corresponding to the same pixel array is either output directly as the phase array of the pixel array or merged to some degree to generate that phase array. For example, if the light intensity of the current shooting scene is high, the phase information can be output directly as the phase array, whose size then equals the size of the pixel array. If the light intensity is low, the phase information can be merged to some degree to generate the phase array, whose size is then smaller than the size of the pixel array.
Because the signal-to-noise ratios of the phase arrays with different sizes are different, the accuracy of the phase information output under different light intensities can be improved, and the focusing accuracy is further improved.
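As a minimal sketch of step 1020, the mode decision can be a threshold on the measured illuminance. The thresholds and mode names below are illustrative assumptions; the patent only requires that dimmer scenes use more merging (smaller phase array, higher SNR) and bright scenes output per-pixel phase (full resolution):

```python
def select_phase_output_mode(lux, low=50.0, high=500.0):
    """Pick a phase-information output mode from the scene light intensity.
    Thresholds (50 lx, 500 lx) and mode names are illustrative assumptions."""
    if lux < low:
        return "merge_9"   # dim light: combine the 9 sub-pixel phases per filter
    if lux < high:
        return "merge_4"   # intermediate merging (assumed middle mode)
    return "full_res"      # bright light: output each pixel's phase individually

assert select_phase_output_mode(5) == "merge_9"        # overcast indoors
assert select_phase_output_mode(200) == "merge_4"      # sunny-day interior
assert select_phase_output_mode(100000) == "full_res"  # direct sunlight
```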
Step 1040, outputting a phase array corresponding to the pixel array according to the phase information output mode; the phase array comprises phase information corresponding to a target pixel in the pixel array.
After the phase information output mode adapted to the light intensity of the current shooting scene is determined according to the light intensity of the current shooting scene, the phase information corresponding to the pixel array can be output according to the phase information output mode. Specifically, when phase information corresponding to the pixel array is output, the phase information may be output in the form of a phase array. The phase array comprises phase information corresponding to the pixel array.
Specifically, under different light intensities of a current shooting scene, according to a phase information output mode adapted to the light intensity, phase information corresponding to the same pixel array is directly output as a phase array corresponding to the pixel array or combined to a certain extent, so as to generate a phase array corresponding to the pixel array, which is not limited in this application.
Step 1060, calculating a phase difference of the pixel array based on the phase array, and performing focus control according to the phase difference.
After the phase array corresponding to the pixel array is output in the phase information output mode, the phase difference of the pixel array can be calculated from the phase information in the phase array. If a phase array of the pixel array in the second direction is obtained, a phase difference is calculated from each two adjacent pieces of phase information in the second direction, finally giving the phase difference of the entire pixel array in the second direction. Likewise, if a phase array in the first direction is obtained, a phase difference is calculated from each two adjacent pieces of phase information in the first direction, giving the phase difference of the entire pixel array in the first direction; the second direction is different from the first direction. The second direction may be the vertical direction of the pixel array and the first direction the horizontal direction, the two being perpendicular. Of course, the phase differences in the second and first directions may be obtained at the same time, and phase differences of the pixel array in other directions, such as the diagonal directions (a first diagonal direction and a second diagonal direction perpendicular to it), may also be calculated, which is not limited in this application.
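A minimal 1-D sketch of the phase-difference calculation: treating the phase information from the two pixels under each target microlens along one direction as two sequences, the defocus shift can be estimated by minimising a matching cost over candidate shifts. This is a generic sketch, not the patent's specific algorithm:

```python
import numpy as np

def phase_difference(left, right, max_shift=8):
    """Estimate the shift between two 1-D phase sequences (e.g. the two
    half-pixels under each first microlens along one direction) by
    minimising the mean absolute difference over candidate shifts."""
    left, right = np.asarray(left, float), np.asarray(right, float)
    best_shift, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        a = left[max(0, s):len(left) + min(0, s)]
        b = right[max(0, -s):len(right) + min(0, -s)]
        cost = np.abs(a - b).mean()
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

sig = np.sin(np.linspace(0, 4 * np.pi, 64))
assert phase_difference(sig, np.roll(sig, -3)) == 3   # known 3-sample shift recovered
```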
When focus control is performed based on the calculated phase difference, note that for a texture feature running in a certain direction on the preview image of the current shooting scene, the phase difference acquired parallel to that direction is almost 0, so focusing obviously cannot be performed from the phase difference parallel to that direction. Therefore, if the preview image of the current shooting scene includes a texture feature in the first direction, the phase difference of the pixel array in the second direction is calculated from the phase array in the second direction, and focus control is performed according to the phase difference in the second direction.
For example, assume the second direction is the vertical direction of the pixel array and the first direction is the horizontal direction, the two being perpendicular. A texture feature in the first direction means the preview image contains horizontal stripes, for example pure-color horizontal stripes. In that case, if the preview image of the current shooting scene includes texture features in the horizontal direction, focus control is performed based on the phase difference in the vertical direction.
And if the preview image corresponding to the current shooting scene comprises the texture features in the second direction, performing focusing control based on the phase difference in the first direction. And if the preview image corresponding to the current shooting scene comprises the texture features in the first diagonal direction, performing focusing control based on the phase difference in the second diagonal direction, and vice versa. Therefore, the phase difference can be accurately acquired according to the texture features in different directions, and then focusing is accurately performed.
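The texture-direction rule above can be sketched with simple gradients: horizontal stripes vary down the columns, so focusing should then use the vertical phase difference. A minimal illustration with hypothetical names, not the patent's detection method:

```python
import numpy as np

def dominant_texture_direction(img):
    """Classify a patch as carrying horizontal or vertical texture by
    comparing mean absolute gradients. Horizontal stripes vary along the
    vertical axis (large column-wise gradient), so the vertical phase
    difference should be used for focusing, per the rule above."""
    gy = np.abs(np.diff(img.astype(float), axis=0)).mean()  # variation down columns
    gx = np.abs(np.diff(img.astype(float), axis=1)).mean()  # variation along rows
    return "horizontal_texture" if gy > gx else "vertical_texture"

stripes = np.tile([[0.0], [1.0]], (8, 16))  # 16x16 patch of horizontal stripes
assert dominant_texture_direction(stripes) == "horizontal_texture"
```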
In the embodiment of the application, a phase information output mode adaptive to the light intensity of the current shooting scene is determined according to the light intensity of the current shooting scene; wherein, under different phase information output modes, the output phase arrays have different sizes. Outputting a phase array corresponding to the pixel array according to the phase information output mode; the phase array comprises phase information corresponding to the pixel array. And calculating the phase difference of the pixel array based on the phase array, and performing focusing control according to the phase difference.
Under different light intensities of the current shooting scene, the accuracy of the original phase information which can be collected is different. Therefore, different phase information output modes can be adopted for the same pixel array according to the light intensity of the current shooting scene, and phase arrays with different sizes can be output based on the original phase information. Because the signal-to-noise ratios of the phase arrays with different sizes are different, the accuracy of the phase information output under different light intensities can be improved, and the accuracy of focusing control is further improved.
Based on the foregoing embodiments, as shown in fig. 11, step 1040, outputting a phase array corresponding to the pixel array according to the phase information output mode, includes:
step 1042, according to the phase information output mode, determining a target pixel group from the color pixel group and the panchromatic pixel group in the pixel array, determining a target microlens corresponding to the target pixel group, and taking at least two pixels corresponding to the target microlens as target pixels.
Firstly, according to the light intensity of the current shooting scene, a phase information output mode matched with the light intensity of the current shooting scene is determined. That is, the determined phase information output modes are different under different light intensities, and the size of the output phase array is different under different phase information output modes.
Next, a target pixel group is determined from the color pixel group and the panchromatic pixel group in the pixel array in accordance with the phase information output mode. Since the phase information output mode adapted to the light intensity of the current shooting scene is determined based on the light intensity of the current shooting scene, and the sensitivity of the color pixels and the panchromatic pixels is different under different light intensities, the target pixel group can be determined from the color pixel group and the panchromatic pixel group in the pixel array under different phase information output modes.
Specifically, in different phase information output modes, all or part of the color pixel groups can be selected from the color pixel groups and the panchromatic pixel groups in the pixel array as target pixel groups. All or part of the panchromatic pixel groups can be screened out from the color pixel groups and the panchromatic pixel groups in the pixel array to be used as the target pixel group. All or part of the color pixel groups and all or part of the panchromatic pixel groups can be selected from the color pixel groups and the panchromatic pixel groups in the pixel array to serve as target pixel groups, and the application is not limited to this.
And finally, determining a target microlens corresponding to the target pixel group, and taking at least two pixels corresponding to the target microlens as target pixels.
The target pixel group corresponds to a plurality of microlenses, some of which correspond to at least two pixels and some to one pixel. The target microlens therefore needs to be determined from among the microlenses corresponding to at least two pixels: all such microlenses may be determined as target microlenses, or only some of them may be screened out as target microlenses, which is not limited in this application.
And after the target micro lens is determined, at least two pixels corresponding to the target micro lens are taken as target pixels. That is, at least two pixels corresponding to the target microlens are used as the target pixels, or all the pixels corresponding to the target microlens are used as the target pixels, which is not limited in the present application.
Step 1044, acquiring phase information of the target pixels for each target pixel group.
Step 1046, generating a phase array corresponding to the pixel array according to the phase information of the target pixel.
After the target pixel is determined from the target microlens, phase information of the target pixel is acquired for each target pixel group. A phase array corresponding to the pixel array may be generated directly from the phase information of the target pixel. The phase information of the target pixel may also be combined to a certain extent to generate a phase array corresponding to the pixel array, which is not limited in this application.
In the embodiment of the present application, different phase information output modes are corresponding to different light intensities, and a target pixel group can be determined from a color pixel group and a panchromatic pixel group in a pixel array in different phase information output modes. Thus, the target pixel group can be determined based on at least one of the color pixel group and the panchromatic pixel group. And determining a target pixel from the target pixel group, and generating a phase array corresponding to the pixel array according to the phase information of the target pixel. And generating a phase array corresponding to the pixel array from two dimensions of determining the target pixel group and determining the target pixel. Ultimately, the accuracy of the generated phased array is improved.
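Steps 1042–1046 can be summarised in a small sketch for one target pixel group; the mode names and the averaging rule used for merging are illustrative assumptions, not the patent's exact procedure:

```python
import numpy as np

def build_phase_array(pixel_phases, mode):
    """Sketch of steps 1042-1046 for one target pixel group:
    `pixel_phases` holds one phase value per target pixel. In 'full_res'
    mode each value is output individually (phase array size equals the
    target-pixel count); in 'merged' mode the group's values are combined
    into one averaged entry (smaller array, higher SNR)."""
    phases = np.asarray(pixel_phases, float)
    if mode == "full_res":
        return phases                  # one phase entry per target pixel
    return np.array([phases.mean()])   # merged: one entry per pixel group

group = [0.8, 1.2, 1.0, 0.9, 1.1, 1.0]
assert build_phase_array(group, "full_res").shape == (6,)
assert np.isclose(build_phase_array(group, "merged")[0], 1.0)
```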
In the previous embodiment, step 1042, determining a target microlens corresponding to the target pixel group, and regarding at least two pixels corresponding to the target microlens as a target pixel, includes:
determining a target microlens of a first direction corresponding to a target pixel group; the target micro-lenses in the first direction correspond to at least 2 adjacent target pixels which are arranged along the first direction;
and taking at least two pixels corresponding to the target micro-lens in the first direction as target pixels.
After a target pixel group is determined from among a color pixel group and a panchromatic pixel group in a pixel array in accordance with a phase information output mode, a target microlens corresponding to the target pixel group is determined. Specifically, assuming that a panchromatic pixel group in the pixel array is determined as a target pixel group, a target microlens is determined from microlenses corresponding to the panchromatic pixel group. Since the target microlens corresponds to at least two pixels, as shown in fig. 8, the microlens corresponding to the full-color pixel group includes 4 first microlenses 2211a and 1 second microlens 2211b, wherein the first microlenses 2211a correspond to 2 pixels, respectively, and the second microlenses 2211b correspond to 1 pixel.
Therefore, the target microlens of the first direction is further determined from the 4 first microlenses 2211 a. The target micro-lenses in the first direction correspond to at least 2 adjacent target pixels which are arranged along the first direction. Assuming that the first direction is a horizontal direction, as can be seen from fig. 8, the 4 first microlenses 2211a include first microlenses 2211a in the first direction (filled with gray color in the figure) and first microlenses 2211a in the second direction (filled with oblique lines in the figure). Then, the target microlens of the first direction corresponding to the target pixel group is determined as the first microlens 2211a of the first direction (gray filling in the figure). Of course, the first direction and the second direction may be interchanged, and the present application does not limit this.
At least two pixels corresponding to the first microlenses 2211a in the first direction (filled with gray in the figure) are set as target pixels. Then, for each target pixel group, phase information of the target pixel in the second direction can be acquired. And generating a phase array corresponding to the pixel array according to the phase information of the target pixel.
As shown in fig. 9, the full-color pixel group includes 4 microlenses 2211, including 1 third microlens 2211c, 2 fourth microlenses 2211d, and 1 fifth microlens 2211 e; the third microlenses 2211c correspond to 4 pixels, the fourth microlenses 2211d correspond to 2 pixels, and the fifth microlenses 2211e correspond to 1 pixel.
Therefore, the target microlens in the first direction is further determined from the 1 third microlens 2211c and the 2 fourth microlenses 2211d. Since the third microlens 2211c covers 4 pixels arranged in a 2×2 array, it can be regarded as belonging to either the first direction or the second direction. The target microlenses in the first direction corresponding to the target pixel group are then determined to be the fourth microlens 2211d in the first direction (filled with gray in the figure) and the third microlens 2211c.
At least two pixels corresponding to the fourth microlens 2211d in the first direction (filled with gray in the figure) and the four pixels corresponding to the third microlens 2211c are taken as target pixels. Then, for each target pixel group, the phase information of the target pixels in the second direction can be acquired, and the phase array corresponding to the pixel array is generated from the phase information of the target pixels.
In the embodiment of the present application, after a target pixel group is determined from a color pixel group and a panchromatic pixel group in a pixel array in accordance with a phase information output mode, a target microlens corresponding to the target pixel group is determined. Specifically, the microlens in the first direction may be determined as the target microlens from the microlenses corresponding to the target pixel group. Then, at least two pixels corresponding to the first microlens in the first direction may be used as the target pixel. Then, phase information of the target pixel in the second direction is acquired. And generating a phase array corresponding to the pixel array according to the phase information of the target pixel. Finally, for each target pixel group, a phase array corresponding to the pixel array in the second direction can be acquired.
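The selection of first-direction target microlenses can be sketched over a hypothetical microlens-to-pixel map: keep only the multi-pixel lenses whose covered pixels lie in one row (the first, horizontal direction), then take the pixels under them as target pixels. The map literals and the same-row test are illustrative assumptions:

```python
def first_direction_targets(microlens_map):
    """Return the lenses (and their pixels) oriented along the first
    (horizontal) direction: multi-pixel lenses whose first two covered
    pixels share a row. One-pixel lenses are never target microlenses."""
    return {name: pixels
            for name, pixels in microlens_map.items()
            if len(pixels) >= 2 and pixels[0][0] == pixels[1][0]}

lens_map = {
    "h1": [(0, 0), (0, 1)],  # horizontal pair (first direction)
    "h2": [(2, 1), (2, 2)],
    "v1": [(1, 0), (2, 0)],  # vertical pair (second direction)
    "v2": [(0, 2), (1, 2)],
    "center": [(1, 1)],      # one-pixel lens, never a target
}
targets = first_direction_targets(lens_map)
assert set(targets) == {"h1", "h2"}
```

Swapping the same-row test for a same-column test gives the second-direction selection of the following embodiment.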
Optionally, in step 1042, determining a target microlens corresponding to the target pixel group, and regarding at least two pixels corresponding to the target microlens as a target pixel, further including:
determining a target microlens of a second direction corresponding to the target pixel group; the target micro-lenses in the second direction correspond to at least 2 adjacent target pixels which are arranged along the second direction; the second direction is perpendicular to the first direction;
and taking at least two pixels corresponding to the target micro-lens in the second direction as target pixels.
After a target pixel group is determined from among a color pixel group and a panchromatic pixel group in a pixel array in accordance with a phase information output mode, a target microlens corresponding to the target pixel group is determined. Specifically, assuming that a panchromatic pixel group in the pixel array is determined as a target pixel group, a target microlens is determined from microlenses corresponding to the panchromatic pixel group. Since the target microlens corresponds to at least two pixels, as shown in fig. 8, the microlens corresponding to the full-color pixel group includes 4 first microlenses 2211a and 1 second microlens 2211b, wherein the first microlenses 2211a correspond to 2 pixels, respectively, and the second microlenses 2211b correspond to 1 pixel.
Therefore, the target microlens in the second direction is further determined from the 4 first microlenses 2211a. The target microlenses in the second direction correspond to at least 2 adjacent target pixels arranged along the second direction. The second direction is perpendicular to the first direction; if the first direction is the horizontal direction, the second direction is the vertical direction. As can be seen from fig. 8, the 4 first microlenses 2211a include first microlenses 2211a in the first direction (filled with gray in the figure) and first microlenses 2211a in the second direction (filled with oblique lines in the figure). The target microlens in the second direction corresponding to the target pixel group is then determined to be the first microlens 2211a in the second direction (filled with oblique lines in the figure). Of course, the first direction and the second direction may be interchanged, which is not limited in this application.
At least two pixels corresponding to the first microlenses 2211a in the second direction (filled with oblique lines in the figure) are taken as target pixels. Then, for each target pixel group, the phase information of the target pixels in the first direction can be acquired, and the phase array corresponding to the pixel array is generated from the phase information of the target pixels.
As shown in fig. 9, the full-color pixel group includes 4 microlenses 2211, including 1 third microlens 2211c, 2 fourth microlenses 2211d, and 1 fifth microlens 2211 e; the third microlenses 2211c correspond to 4 pixels, the fourth microlenses 2211d correspond to 2 pixels, and the fifth microlenses 2211e correspond to 1 pixel.
Therefore, the target microlens in the second direction is further determined from the 1 third microlens 2211c and the 2 fourth microlenses 2211d. Since the third microlens 2211c covers 4 pixels arranged in a 2×2 array, it can be regarded as belonging to either the first direction or the second direction. The target microlenses in the second direction corresponding to the target pixel group are then determined to be the fourth microlens 2211d in the second direction (filled with oblique lines in the figure) and the third microlens 2211c.
At least two pixels corresponding to the fourth microlens 2211d in the second direction (filled with oblique lines in the figure) and the four pixels corresponding to the third microlens 2211c are taken as target pixels. Then, for each target pixel group, the phase information of the target pixels in the second direction can be acquired, and a phase array corresponding to the pixel array is generated from the phase information of the target pixels.
In the embodiment of the present application, after a target pixel group is determined from the color pixel groups and the panchromatic pixel groups in the pixel array in accordance with the phase information output mode, the target microlenses corresponding to the target pixel group are determined. Specifically, the microlenses in the second direction may be determined as the target microlenses from among the microlenses corresponding to the target pixel group. At least two pixels corresponding to each target microlens in the second direction are then taken as target pixels, and the phase information of the target pixels in the second direction is acquired. A phase array corresponding to the pixel array is generated from the phase information of the target pixels. Finally, for each target pixel group, a phase array corresponding to the pixel array in the second direction can be acquired.
Optionally, step 1042 of determining a target microlens corresponding to the target pixel group and taking at least two pixels corresponding to the target microlens as target pixels includes:
determining a target micro-lens in a first direction and a target micro-lens in a second direction corresponding to the target pixel group; the target micro-lenses in the first direction correspond to at least 2 adjacent target pixels which are arranged along the first direction; the target micro-lenses in the second direction correspond to at least 2 adjacent target pixels which are arranged along the second direction; the second direction is perpendicular to the first direction;
and taking at least two pixels corresponding to the target micro-lens in the first direction and at least two pixels corresponding to the target micro-lens in the second direction as target pixels.
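As an illustration of this selection rule, the following Python sketch classifies microlenses by the layout of the pixels they cover and collects the corresponding target pixels. The data layout, function names, and coordinate convention are assumptions for illustration, not the patent's implementation.

```python
# Hypothetical sketch: a microlens is modeled as the list of (row, col)
# pixel coordinates it covers; its direction follows from that layout.

def microlens_direction(pixel_coords):
    """Classify a microlens: 'first' for a horizontal pair, 'second' for a
    vertical pair, 'both' for a 2x2 microlens (belongs to either direction),
    None for a single-pixel microlens, which yields no phase pair."""
    if len(pixel_coords) == 1:
        return None
    rows = {r for r, _ in pixel_coords}
    cols = {c for _, c in pixel_coords}
    if len(rows) > 1 and len(cols) > 1:
        return 'both'
    return 'first' if len(cols) > 1 else 'second'

def select_target_pixels(microlenses, wanted_directions):
    """Collect the target pixels covered by microlenses whose direction is
    among wanted_directions; a 2x2 ('both') microlens matches either one."""
    targets = []
    for lens in microlenses:
        direction = microlens_direction(lens)
        if direction == 'both' or direction in wanted_directions:
            targets.extend(lens)
    return targets

# Panchromatic pixel group of fig. 8 (3x3 pixels): four two-pixel
# microlenses plus one single-pixel microlens over the center pixel.
group = [
    [(0, 0), (0, 1)],  # first direction (horizontal pair)
    [(2, 1), (2, 2)],  # first direction
    [(0, 2), (1, 2)],  # second direction (vertical pair)
    [(1, 0), (2, 0)],  # second direction
    [(1, 1)],          # center pixel: no phase pair
]
```

Selecting only the first direction or only the second direction yields 4 target pixels each; selecting both yields all 8 paired pixels of the group.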
After a target pixel group is determined from among a color pixel group and a panchromatic pixel group in a pixel array in accordance with a phase information output mode, a target microlens corresponding to the target pixel group is determined. Specifically, assuming that a panchromatic pixel group in the pixel array is determined as a target pixel group, a target microlens is determined from microlenses corresponding to the panchromatic pixel group. Since the target microlens corresponds to at least two pixels, as shown in fig. 8, the microlens corresponding to the full-color pixel group includes 4 first microlenses 2211a and 1 second microlens 2211b, wherein the first microlenses 2211a correspond to 2 pixels, respectively, and the second microlenses 2211b correspond to 1 pixel.
Therefore, the target microlenses in the first direction and the target microlenses in the second direction are further determined from the 4 first microlenses 2211a. The target microlenses in the first direction correspond to at least 2 adjacent target pixels arranged along the first direction, and the target microlenses in the second direction correspond to at least 2 adjacent target pixels arranged along the second direction. Assuming that the first direction is the horizontal direction, as can be seen from fig. 8, the 4 first microlenses 2211a include first microlenses 2211a in the first direction (filled with gray in the figure) and first microlenses 2211a in the second direction (filled with oblique lines in the figure). The target microlenses in the first direction corresponding to the target pixel group are then determined to be the first microlenses 2211a in the first direction (filled with gray in the figure), and the target microlenses in the second direction are determined to be the first microlenses 2211a in the second direction (filled with oblique lines in the figure). Of course, the first direction and the second direction may be interchanged, which is not limited in this application.
At least two pixels corresponding to the first microlenses 2211a in the first direction (filled with gray in the figure) and the first microlenses 2211a in the second direction (filled with diagonal lines in the figure) are set as target pixels. Then, for each target pixel group, phase information of the target pixel in the first direction and phase information of the target pixel in the second direction can be acquired. And generating a phase array corresponding to the pixel array according to the phase information of the target pixel.
As shown in fig. 9, the full-color pixel group includes 4 microlenses 2211, including 1 third microlens 2211c, 2 fourth microlenses 2211d, and 1 fifth microlens 2211 e; the third microlenses 2211c correspond to 4 pixels, the fourth microlenses 2211d correspond to 2 pixels, and the fifth microlenses 2211e correspond to 1 pixel.
Therefore, the target microlenses in the first direction and the target microlenses in the second direction are further determined from the 1 third microlens 2211c and the 2 fourth microlenses 2211d. Since the third microlens 2211c covers 4 pixels arranged in a 2 × 2 array, it can be regarded as a microlens belonging to either the first direction or the second direction. The target microlenses corresponding to the target pixel group are then determined to be the fourth microlens 2211d in the first direction (filled with gray in the figure), the fourth microlens 2211d in the second direction (filled with oblique lines in the figure), and the third microlens 2211c.
At least two pixels corresponding to the fourth microlens 2211d in the first direction (filled with gray in the figure), at least two pixels corresponding to the fourth microlens 2211d in the second direction (filled with oblique lines in the figure), and the four pixels corresponding to the third microlens 2211c are taken as target pixels. Then, for each target pixel group, the phase information of the target pixels in the first direction and the second direction can be acquired, and a phase array corresponding to the pixel array is generated from the phase information of the target pixels.
In the embodiment of the present application, after a target pixel group is determined from the color pixel groups and the panchromatic pixel groups in the pixel array in accordance with the phase information output mode, the target microlenses corresponding to the target pixel group are determined. Specifically, the microlenses in the first direction and the microlenses in the second direction may be determined as the target microlenses from among the microlenses corresponding to the target pixel group. At least two pixels corresponding to each target microlens in the first direction are taken as target pixels, and their phase information in the first direction is acquired; likewise, at least two pixels corresponding to each target microlens in the second direction are taken as target pixels, and their phase information in the second direction is acquired. A phase array corresponding to the pixel array is generated from the phase information of the target pixels. Finally, for each target pixel group, phase arrays corresponding to the pixel array in both the first direction and the second direction can be acquired. Therefore, the phase difference can be calculated in two directions, and focusing can be realized in both directions.
The previous embodiments described how different phase information output modes are adopted for the same pixel array according to the light intensity of the current shooting scene: a phase array corresponding to the pixel array is output, the phase difference of the pixel array is calculated based on the phase array, and focus control is performed according to the phase difference. In this embodiment, as shown in fig. 12, step 1020 is described in detail; the specific implementation of determining the phase information output mode adapted to the light intensity of the current shooting scene includes:
step 1022, determining a target light intensity range to which the light intensity of the current shooting scene belongs; wherein, different light intensity ranges correspond to different phase information output modes;
and step 1024, determining a phase information output mode adaptive to the light intensity of the current shooting scene according to the target light intensity range.
Specifically, the light intensity may be divided into different light intensity ranges, ordered by magnitude, by different preset light intensity thresholds. The preset thresholds may be determined according to the exposure parameters and the size of the pixels in the pixel array. The exposure parameters include shutter speed, lens aperture size, and sensitivity (ISO).
Then, different phase information output modes are set for the different light intensity ranges. Specifically, the size of the phase array output by the corresponding phase information output mode decreases as the light intensity range decreases.
The light intensity range into which the light intensity of the current shooting scene falls is determined and taken as the target light intensity range. The phase information output mode corresponding to the target light intensity range is then taken as the phase information output mode adapted to the light intensity of the current shooting scene.
In the embodiment of the application, when the phase information output mode adapted to the light intensity of the current shooting scene is determined according to the light intensity of the current shooting scene, different light intensity ranges correspond to different phase information output modes, so that the target light intensity range to which the light intensity of the current shooting scene belongs is determined first. And determining a phase information output mode matched with the light intensity of the current shooting scene according to the target light intensity range. Different phase information output modes are respectively set for different light intensity ranges in advance, and the size of the phase array output in each phase information output mode is different. Therefore, the phase information can be calculated more finely for the pixel array based on the light intensity of the current shooting scene, so as to realize more accurate focusing.
In the previous embodiment, it is further described that the phase information output mode includes a full-size output mode and a first-size output mode, and the size of the phase array in the full-size output mode is larger than that in the first-size output mode. Then, as shown in fig. 13, step 1024, determining a phase information output mode adapted to the light intensity of the current shooting scene according to the target light intensity range, includes:
step 1024a, if the light intensity of the current shooting scene is larger than a first preset threshold, determining that the phase information output mode adaptive to the light intensity of the current shooting scene is a full-size output mode;
step 1024b, if the light intensity of the current shooting scene is greater than the second preset threshold and less than or equal to the first preset threshold, determining that the phase information output mode adapted to the light intensity of the current shooting scene is the first size output mode.
Specifically, if one of the light intensity ranges is greater than the first preset threshold, the phase information output mode corresponding to the light intensity range is the full-scale output mode. If the light intensity of the current shooting scene is judged to be larger than the first preset threshold value, the light intensity of the current shooting scene falls into the light intensity range. Namely, the phase information output mode adapted to the light intensity of the current shooting scene is determined to be the full-size output mode. The phase array is output in a full-size output mode, that is, the original phase information of the pixel array is completely output to generate the phase array of the pixel array.
If one of the light intensity ranges is greater than the second preset threshold and less than or equal to the first preset threshold, the phase information output mode corresponding to the light intensity range is the first size output mode. If the light intensity of the current shooting scene is judged to be larger than the second preset threshold value and smaller than or equal to the first preset threshold value, the light intensity of the current shooting scene falls into the light intensity range. Namely, the phase information output mode adapted to the light intensity of the current shooting scene is determined to be the first size output mode. The phase array is output by adopting a first size output mode, namely, the original phase information of the pixel array is combined and output to generate the phase array of the pixel array.
In the embodiment of the application, since the size of the phase array in the full-size output mode is larger than that in the first size output mode, if the light intensity of the current shooting scene is greater than the first preset threshold, the phase information output mode adapted to the light intensity of the current shooting scene is determined to be the full-size output mode. If the light intensity of the current shooting scene is greater than the second preset threshold and less than or equal to the first preset threshold, the phase information output mode adapted to the light intensity is determined to be the first size output mode. That is, when the light intensity of the current shooting scene is high, a phase array of the same size as the pixel array is output in the full-size output mode; when the light intensity is lower, a phase array smaller than the pixel array is output in the first size output mode. In other words, when the light intensity of the current shooting scene is weaker, the signal-to-noise ratio of the phase information is improved by shrinking the phase array.
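The threshold logic of steps 1024a and 1024b can be sketched as follows. The default thresholds (2000 lux and 500 lux) are only the example figures quoted elsewhere in this application; in practice they depend on the exposure parameters and pixel size.

```python
# Illustrative mapping from scene light intensity to phase information
# output mode. Threshold defaults are the application's example values,
# not fixed constants.

def select_output_mode(light_lux, first_threshold=2000, second_threshold=500):
    """Return the phase information output mode adapted to the light intensity."""
    if light_lux > first_threshold:
        return 'full_size'    # bright scene: output all original phase samples
    if light_lux > second_threshold:
        return 'first_size'   # dimmer scene: merge phase samples to raise SNR
    return 'other'            # below both thresholds: further modes may apply
```

For example, 2500 lux selects the full-size output mode, 1000 lux the first size output mode.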
In the previous embodiment, it is described that the pixel array may be an RGBW pixel array, including a plurality of minimal repeating units 241, the minimal repeating unit 241 including a plurality of pixel groups 242, the plurality of pixel groups 242 including a panchromatic pixel group 243 and a color pixel group 244. Each panchromatic pixel group 243 includes 9 panchromatic pixels 2431 and each color pixel group 244 includes 9 color pixels 2441. Each panchromatic pixel 2431 includes 2 subpixels arranged in an array, and each color pixel 2441 includes 2 subpixels arranged in an array.
In this embodiment, when the pixel array is an RGBW pixel array and the phase information output mode is the full-size output mode, taking a color pixel group and a panchromatic pixel group in the pixel array as the target pixel group includes:
taking a color pixel group in the pixel array as a target pixel group; or a color pixel group and a panchromatic pixel group in the pixel array are taken as a target pixel group; wherein one of the target pixel groups corresponds to one pixel group;
correspondingly,
generating a phase array corresponding to the pixel array according to the phase information of the target pixel, comprising:
and generating a full-size phase array corresponding to the pixel array according to the phase information of the target pixel.
This embodiment is a specific implementation of outputting the phase array corresponding to the pixel array in the full-size output mode when the light intensity of the current shooting scene is greater than the first preset threshold. The first preset threshold may be, for example, 2000 lux, which is not limited in this application; that is, the full-size output mode applies in an environment with a light intensity of more than 2000 lux. As shown in connection with fig. 11, a color pixel group is first determined from the pixel array as a target pixel group for calculating phase information. When the light intensity of the current shooting scene is greater than the first preset threshold, that is, in a scene with sufficient light, the panchromatic pixels saturate easily because of their high sensitivity, and correct phase information cannot be obtained after saturation; therefore, the phase information of the color pixel groups can be used to realize phase detection autofocus (PDAF). Of course, the color pixel groups and the panchromatic pixel groups in the pixel array may also both be taken as target pixel groups.
Specifically, if phase information of a color pixel group is used to realize phase focusing (PDAF), phase information of a part of the color pixel groups in the pixel array may be used to realize phase focusing, for example, phase information of at least one of a red pixel group, a green pixel group, and a blue pixel group is used to realize phase focusing. The phase focusing may also be achieved by using a part of pixels in a part of color pixel groups, for example, the phase focusing is achieved by using phase information of a part of pixels in a red pixel group, which is not limited in this application. Since phase focusing is performed using only the phase information of the color pixel group at this time, the data amount of the phase information to be output is reduced, and the efficiency of phase focusing is improved.
And secondly, determining a target micro lens from the micro lenses corresponding to the target pixel group, and taking at least two pixels corresponding to the target micro lens as target pixels.
Specifically, a target microlens of a first direction corresponding to the target pixel group may be determined; the target micro-lenses in the first direction correspond to at least 2 adjacent target pixels arranged along the first direction. And taking at least two pixels corresponding to the target micro-lens in the first direction as target pixels. A target microlens of a second direction corresponding to the target pixel group may also be determined; the target micro-lenses in the second direction correspond to at least 2 adjacent target pixels which are arranged along the second direction; the second direction is perpendicular to the first direction. And taking at least two pixels corresponding to the target micro-lens in the second direction as target pixels. A target microlens in a first direction and a target microlens in a second direction corresponding to the target pixel group can also be determined; the target micro-lenses in the first direction correspond to at least 2 adjacent target pixels which are arranged along the first direction; the target micro-lenses in the second direction correspond to at least 2 adjacent target pixels arranged along the second direction. And taking at least two pixels corresponding to the target micro-lens in the first direction and at least two pixels corresponding to the target micro-lens in the second direction as target pixels.
And thirdly, acquiring the phase information of each target pixel in the target pixel group aiming at each target pixel group. Wherein each pixel corresponds to a photosensitive element. In the embodiment of the present application, it is assumed that all color pixel groups in the pixel array are set as target pixel groups, and for each color pixel group, phase information of each target pixel in the color pixel group is acquired. Finally, the phase information of all the target pixels is output as a full-size phase array corresponding to the pixel array.
As shown in connection with fig. 14, a 12 × 12 pixel array may include 2 red pixel groups, 4 green pixel groups, 2 blue pixel groups, and 8 panchromatic pixel groups. Here, as shown in fig. 7 and 8, it is assumed that a represents green, b represents red, c represents blue, and w represents full color. Assuming that all color pixel groups in the pixel array are set as target pixel groups, the phase information of each pixel group is calculated in turn for the 2 red pixel groups, 4 green pixel groups, and 2 blue pixel groups included in the pixel array. For example, the phase information of the red pixel group 244 is calculated; the group includes 9 red pixels arranged in a 3 × 3 array, numbered in order as red pixel 1 through red pixel 9. Each pixel corresponds to a photosensitive element. The phase information of the red pixel 1 is L1, and the phase information of the red pixel 2 is R1; the phase information of the red pixel 8 is L2, and the phase information of the red pixel 9 is R2; the phase information of the red pixel 3 is U1, and the phase information of the red pixel 6 is D1; the phase information of the red pixel 4 is U2, and the phase information of the red pixel 7 is D2. The pixel 5 located at the center of the red pixel group corresponds to one microlens by itself, and therefore no phase information is obtained from it.
Therefore, if the target microlens in the first direction corresponding to the target pixel group is determined, at least two pixels corresponding to the target microlens in the first direction are used as the target pixels. Assuming that the first direction is the horizontal direction, L1, R1, L2, and R2 are sequentially output to obtain phase information of the red pixel group.
Thus, the above-described processing is sequentially performed on each of the other 1 red pixel group and 4 green pixel groups and 2 blue pixel groups included in the pixel array, and phase information L3, R3, L4, R4, L5, R5, L6, R6, L7, R7, L8, R8, L9, R9, L10, R10, L11, R11, L12, R12, L13, R13, L14, R14, L15, R15, L16, and R16 of the pixel array are obtained.
Finally, the phase information of all the target pixels is output as the full-size phase array corresponding to the pixel array. That is, L1, R1, L2, R2, L3, R3, L4, R4, L5, R5, L6, R6, L7, R7, L8, R8, L9, R9, L10, R10, L11, R11, L12, R12, L13, R13, L14, R14, L15, R15, L16, and R16 are sequentially arranged to generate the full-size phase array. The size of the full-size phase array is equivalent to that of 4 × 8 pixels arranged in an array. Here, the size of a pixel refers to the area of one pixel, which is determined by the length and width of the pixel.
A pixel is the smallest photosensitive unit on a digital camera's photosensitive device (a CCD or CMOS sensor), where CCD stands for charge-coupled device and CMOS for complementary metal-oxide-semiconductor. Typically, pixels do not have a fixed size; their size is related to the size and resolution of the display screen. For example, if the display screen is 4.5 inches with a resolution of 1280 × 720, a length of 99.6 mm, and a width of 56 mm, then the length of one pixel is 99.6 mm / 1280 ≈ 0.0778 mm and the width is 56 mm / 720 ≈ 0.0778 mm. In this example, the size of 4 × 8 pixels arranged in an array is: length 4 × 0.0778 mm, width 8 × 0.0778 mm. Of course, this is not limited in this application. The full-size phase array then has a length of 4 × 0.0778 mm and a width of 8 × 0.0778 mm. In other embodiments, the pixels may not have equal length and width, and may have other anisotropic structures, which is not limited in this application.
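The pixel-pitch arithmetic above can be checked directly:

```python
# Checking the worked example: a 4.5-inch, 1280 x 720 display that is
# 99.6 mm long and 56 mm wide gives a pixel pitch of about 0.0778 mm
# in each dimension.
pixel_length_mm = 99.6 / 1280   # ~0.0778 mm
pixel_width_mm = 56.0 / 720     # ~0.0778 mm

# Size of a 4 x 8 pixel region, as used for the full-size phase array:
region_length_mm = 4 * pixel_length_mm
region_width_mm = 8 * pixel_width_mm
```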
Similarly, if the target microlenses in the second direction corresponding to the target pixel group are determined, at least two pixels corresponding to each target microlens in the second direction are used as the target pixels. Assuming that the second direction is the vertical direction, U1, D1, U2, and D2 are sequentially output, resulting in the phase information of the red pixel group.
Thus, the above-described processing is sequentially performed on each of the other 1 red pixel group and 4 green pixel groups and 2 blue pixel groups included in the pixel array, and phase information U3, D3, U4, D4, U5, D5, U6, D6, U7, D7, U8, D8, U9, D9, U10, D10, U11, D11, U12, D12, U13, D13, U14, D14, U15, D15, U16, and D16 of the pixel array is obtained.
Finally, the phase information of all the target pixels is output as the full-size phase array corresponding to the pixel array. That is, U1, D1, U2, D2, U3, D3, U4, D4, U5, D5, U6, D6, U7, D7, U8, D8, U9, D9, U10, D10, U11, D11, U12, D12, U13, D13, U14, D14, U15, D15, U16, and D16 are sequentially arranged to generate the full-size phase array. The size of the full-size phase array is equivalent to that of 4 × 8 pixels arranged in an array.
Similarly, if the target microlenses in the first direction and the target microlenses in the second direction corresponding to the target pixel group are determined, at least two pixels corresponding to each of these target microlenses are used as target pixels. Assuming that the first direction is the horizontal direction and the second direction is the vertical direction, L1, R1, L2, R2, U1, D1, U2, and D2 are sequentially output to obtain the phase information of the red pixel group. The above processing is then sequentially performed on the other 1 red pixel group, the 4 green pixel groups, and the 2 blue pixel groups included in the pixel array to obtain the phase information of the pixel array. Finally, the phase information of all the target pixels is output as the full-size phase array corresponding to the pixel array. The size of the full-size phase array is equivalent to that of 8 × 8 pixels arranged in an array.
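The read-out order walked through above can be sketched as follows, assuming the fig. 14 numbering of each 3 × 3 group (pixels 1/2 and 8/9 form horizontal pairs, 3/6 and 4/7 vertical pairs, pixel 5 has no pair); the dictionary representation and function names are illustrative, not the patent's implementation.

```python
# Hypothetical sketch of assembling the full-size phase array from the
# target color pixel groups, following the L1,R1,L2,R2 / U1,D1,U2,D2
# read-out order described above. `samples` maps each pixel number (1-9)
# of a 3x3 group to its raw phase sample; pixel 5 sits under a
# single-pixel microlens and contributes nothing.

def group_phase_samples(samples, first_direction, second_direction):
    out = []
    if first_direction:                      # horizontal pairs: 1/2 and 8/9
        out += [samples[1], samples[2], samples[8], samples[9]]
    if second_direction:                     # vertical pairs: 3/6 and 4/7
        out += [samples[3], samples[6], samples[4], samples[7]]
    return out

def full_size_phase_array(groups, first_direction=True, second_direction=False):
    """Concatenate the phase samples of every target pixel group in order."""
    array = []
    for samples in groups:
        array += group_phase_samples(samples, first_direction, second_direction)
    return array
```

With the 8 color pixel groups of fig. 14 and only the first direction selected, this yields 8 × 4 = 32 samples, matching the 4 × 8-pixel full-size array; with both directions it yields 64 samples, matching the 8 × 8-pixel array.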
In this case, the phase array may be input to an ISP (Image Signal Processor), and the ISP calculates the phase difference of the pixel array based on the phase array. A defocus distance is then calculated based on the phase difference, and the DAC code value corresponding to the defocus distance is calculated. Finally, the driver IC of the voice coil motor (VCM) converts the code value into a driving current, and the motor drives the lens to the in-focus position. Focus control is thus realized according to the phase difference.
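The chain from phase array to lens movement can be sketched as follows. The conversion gains (`defocus_gain`, `dac_per_mm`, `ma_per_code`) are hypothetical calibration constants, and the linear conversions are simplifications of what a real ISP and VCM driver IC do.

```python
# Simplified sketch of the focus-control chain just described:
# phase array -> phase difference -> defocus distance -> DAC code
# -> VCM drive current. All gains are illustrative assumptions.

def phase_difference(left_samples, right_samples):
    """Mean signed offset between matched left and right phase samples."""
    pairs = list(zip(left_samples, right_samples))
    return sum(l - r for l, r in pairs) / len(pairs)

def focus_control(left_samples, right_samples,
                  defocus_gain=0.05, dac_per_mm=120.0, ma_per_code=0.4):
    pd = phase_difference(left_samples, right_samples)
    defocus_mm = defocus_gain * pd               # defocus distance from phase difference
    dac_code = round(dac_per_mm * defocus_mm)    # code value for the VCM driver IC
    drive_ma = ma_per_code * dac_code            # driving current that moves the lens
    return dac_code, drive_ma
```

Matched left/right samples give a zero phase difference, so the lens is not moved; a constant offset produces a proportional DAC code and drive current.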
In the embodiment of the application, under the condition that the light intensity of the current shooting scene is greater than a first preset threshold value, the phase array corresponding to the pixel array is output according to a full-size output mode. The color pixel group in the pixel array is used as a target pixel group, and phase information of each pixel in the target pixel group is acquired aiming at each target pixel group. And the phase information of the target pixel can be acquired from at least one of the microlenses in the first direction and the microlenses in the second direction, and therefore, a variety of phase information can be acquired. And finally, generating a full-size phase array corresponding to the pixel array according to the phase information of the target pixel. Since phase focusing is performed using only the phase information of the color pixel group at this time, the data amount of the phase information to be output is reduced, and the efficiency of phase focusing is improved.
In one embodiment, if the phase information output mode is the first size output mode, taking a color pixel group and a panchromatic pixel group in the pixel array as the target pixel group includes:
at least one of a color pixel group and a panchromatic pixel group in the pixel array is taken as a target pixel group; wherein one target pixel group corresponds to one pixel group;
correspondingly,
generating a phase array corresponding to the pixel array according to the phase information of the target pixel, comprising:
combining the phase information of the target pixels in the target pixel group to generate a plurality of groups of intermediate combined phase information;
and generating a first size phase array corresponding to the pixel array according to the plurality of sets of intermediate combined phase information.
This embodiment is a specific implementation of outputting the phase array corresponding to the pixel array in the first size output mode when the light intensity of the current shooting scene is greater than the second preset threshold and less than or equal to the first preset threshold. The second preset threshold may be, for example, 500 lux, which is not limited in this application; that is, the first size output mode applies in an environment with a light intensity of more than 500 lux and less than or equal to 2000 lux. As shown in fig. 15, at least one of a color pixel group and a panchromatic pixel group is first determined from the pixel array as a target pixel group for calculating phase information. Because the panchromatic pixels are not easily saturated in a scene with weak light, the phase information of the panchromatic pixel groups can be used to realize phase detection autofocus (PDAF) in this case. The color pixels can also acquire accurate phase information in such a scene, so the phase information of the color pixel groups can likewise be used to realize PDAF. Therefore, when outputting the phase array corresponding to the pixel array in the first size output mode, the color pixel groups may be selected as the target pixel groups, the panchromatic pixel groups may be selected as the target pixel groups, or both may be selected as the target pixel groups, which is not limited in this application.
Specifically, for the case that the target pixel group is a color pixel group, phase focusing may be implemented by using phase information of a part of color pixel groups in the pixel array, or phase focusing may be implemented by using a part of color pixels in a part of color pixel groups, which is not limited in this application. Similarly, for the case that the target pixel group is a panchromatic pixel group, phase focusing may be achieved by using phase information of a part of the panchromatic pixel groups in the pixel array, and phase focusing may also be achieved by using a part of the panchromatic pixels in the part of the panchromatic pixel groups, which is not limited in this application. Similarly, for the case that the target pixel group is a color pixel group and a panchromatic pixel group, phase focusing may be achieved by using phase information of a part of the panchromatic pixel group and a part of the color pixel group in the pixel array, or phase focusing may be achieved by using a part of the panchromatic pixels in the part of the panchromatic pixel group and a part of the color pixels in the part of the color pixel group, which is not limited in this application.
At this time, phase focusing can be performed by using only the phase information of the partial pixel group, or by using only the phase information of the partial pixels in the partial pixel group, so that the data amount of the output phase information is reduced, and the phase focusing efficiency is improved.
And secondly, determining a target micro lens from the micro lenses corresponding to the target pixel group, and taking at least two pixels corresponding to the target micro lens as target pixels.
Specifically, a target microlens of a first direction corresponding to the target pixel group may be determined; the target micro-lenses in the first direction correspond to at least 2 adjacent target pixels arranged along the first direction. And taking at least two pixels corresponding to the target micro-lens in the first direction as target pixels. A target microlens of a second direction corresponding to the target pixel group may also be determined; the target micro-lenses in the second direction correspond to at least 2 adjacent target pixels which are arranged along the second direction; the second direction is perpendicular to the first direction. And taking at least two pixels corresponding to the target micro-lens in the second direction as target pixels. A target microlens in a first direction and a target microlens in a second direction corresponding to the target pixel group can also be determined; the target micro-lenses in the first direction correspond to at least 2 adjacent target pixels which are arranged along the first direction; the target micro-lenses in the second direction correspond to at least 2 adjacent target pixels arranged along the second direction. And taking at least two pixels corresponding to the target micro-lens in the first direction and at least two pixels corresponding to the target micro-lens in the second direction as target pixels.
And thirdly, acquiring the phase information of each target pixel in the target pixel group aiming at each target pixel group. Wherein each pixel corresponds to a photosensitive element. In the embodiment of the present application, on the one hand, it is assumed that all the panchromatic pixel groups in the pixel array are set as target pixel groups, and for each of the panchromatic pixel groups, phase information of each target pixel in the panchromatic pixel group is acquired. Combining the phase information of the target pixel to generate a plurality of groups of intermediate combined phase information; and finally, generating a first size phase array corresponding to the pixel array according to the multiple groups of intermediate combined phase information.
As shown in connection with fig. 15, one pixel array may include 2 red pixel groups, 4 green pixel groups, 2 blue pixel groups, and 8 panchromatic pixel groups. Assuming that all the panchromatic pixel groups in the pixel array are set as target pixel groups, the phase information of each pixel group is calculated in turn for the 8 panchromatic pixel groups included in the pixel array. For example, consider a panchromatic pixel group containing 9 panchromatic pixels arranged in a 3 × 3 array, numbered in order as panchromatic pixel 1 through panchromatic pixel 9, where each pixel corresponds to a photosensitive element. The phase information of panchromatic pixel 1 is L1, and that of panchromatic pixel 2 is R1; the phase information of panchromatic pixel 8 is L2, and that of panchromatic pixel 9 is R2; the phase information of panchromatic pixel 3 is U1, and that of panchromatic pixel 6 is D1; the phase information of panchromatic pixel 4 is U2, and that of panchromatic pixel 7 is D2. Panchromatic pixel 5, located at the center of the panchromatic pixel group, corresponds to a microlens of its own and therefore yields no phase information.
Therefore, if the target microlens in the first direction corresponding to the target pixel group is determined, at least two pixels corresponding to the target microlens in the first direction are used as the target pixels. Assuming that the first direction is the horizontal direction, L1, R1, L2, and R2 are sequentially output, and the phase information of the target pixel is combined to generate a plurality of sets of intermediate combined phase information. For example, L1, L2 are combined to generate first intermediate combined phase information L; r1 and R2 are combined to generate first intermediate combined phase information R.
Thus, the above-described processing is sequentially performed on all the other full-color pixel groups included in the pixel array, and phase information L3, R3, L4, R4, L5, R5, L6, R6, L7, R7, L8, R8, L9, R9, L10, R10, L11, R11, L12, R12, L13, R13, L14, R14, L15, R15, L16, and R16 of the pixel array are obtained. Then, combining the L3 and the L4 to generate first intermediate combined phase information L; r3 and R4 are combined to generate first intermediate combined phase information R. Combining the L5 and the L6 to generate first intermediate combined phase information L; r5 and R6 are combined to generate first intermediate combined phase information R. Combining the L7 and the L8 to generate first intermediate combined phase information L; r7 and R8 are combined to generate first intermediate combined phase information R. Combining the L9 and the L10 to generate first intermediate combined phase information L; r9 and R10 are combined to generate first intermediate combined phase information R. Combining the L11 and the L12 to generate first intermediate combined phase information L; r11 and R12 are combined to generate first intermediate combined phase information R. Combining the L13 and the L14 to generate first intermediate combined phase information L; r13 and R14 are combined to generate first intermediate combined phase information R. Combining the L15 and the L16 to generate first intermediate combined phase information L; r15 and R16 are combined to generate first intermediate combined phase information R.
And finally, a first-size phase array corresponding to the pixel array is generated according to the multiple sets of intermediate combined phase information. In this way, the first-size phase array is generated by sequentially arranging the 8 sets of first intermediate combined phase information L and first intermediate combined phase information R. The size of the first-size phase array is equivalent to that of 4 × 4 pixels arranged in an array.
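The per-group merging described above can be sketched as follows. Averaging as the merge operation and the sample values are assumptions for illustration; the text only states that the samples are "combined" without fixing the formula:

```python
def merge_group_phases(group_phases):
    """Merge the phase samples of one target pixel group.

    group_phases: one (L, R) tuple per horizontal target microlens in
    the group, e.g. [(L1, R1), (L2, R2)]. Averaging is an assumed
    merge operation.
    """
    left = sum(l for l, _ in group_phases) / len(group_phases)
    right = sum(r for _, r in group_phases) / len(group_phases)
    return left, right  # first intermediate combined phase information (L, R)

def build_first_size_phase_array(groups):
    """groups: the 8 panchromatic target pixel groups, each a list of
    (L, R) samples. Returns the 8 merged (L, R) pairs that are arranged
    in sequence to form the first-size phase array."""
    return [merge_group_phases(g) for g in groups]

# Illustrative values for the 8 groups, 2 horizontal target microlenses each.
groups = [[(1.0 * k, 1.1 * k), (1.2 * k, 1.3 * k)] for k in range(1, 9)]
phase_array = build_first_size_phase_array(groups)
```

Merging pairs of samples in this way halves the amount of phase data per group while averaging out per-pixel noise, which is consistent with the signal-to-noise benefit described below.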
Of course, the 8 sets of first intermediate combined phase information L and first intermediate combined phase information R may also be combined or converted to generate the first-size phase array corresponding to the pixel array. The conversion process may be a correction of the 8 sets of first intermediate combined phase information L and R, and the present application is not limited thereto. When the 8 sets are combined, a phase array corresponding to a size of 4 × 4 pixels may be merged into a phase array corresponding to 4 × 2 pixels. Of course, the present application does not limit the specific size of the combined phase array. Here, the size of a pixel refers to the area of one pixel, which is related to the length and width of the pixel.
A pixel is the smallest photosensitive unit on a digital camera's photosensitive device (CCD or CMOS). Pixels do not have a fixed size; their size is related to the size and resolution of the display screen. For example, if the display screen is 4.5 inches with a resolution of 1280 × 720, a length of 99.6 mm, and a width of 56 mm, then the length of one pixel is 99.6 mm / 1280 ≈ 0.0778 mm and the width is 56 mm / 720 ≈ 0.0778 mm. In this example, the size of 4 × 4 pixels arranged in an array is 4 × 0.0778 mm in length and 4 × 0.0778 mm in width, so the first-size phase array has a size of 4 × 0.0778 mm by 4 × 0.0778 mm. Of course, this is not limited in this application. In other embodiments, the pixels need not be rectangles of equal length and width and may have other anisotropic structures, which is not limited in this application.
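The pixel-size arithmetic above can be checked directly, using the screen dimensions given in the text:

```python
# 4.5-inch display, 1280 x 720 resolution: length 99.6 mm, width 56 mm
# (values from the example in the text).
length_mm, width_mm = 99.6, 56.0
res_long, res_short = 1280, 720

pixel_len = length_mm / res_long   # 99.6 mm / 1280 ~= 0.0778 mm
pixel_wid = width_mm / res_short   # 56 mm / 720  ~= 0.0778 mm

# Size of 4 x 4 pixels arranged in an array, i.e. the size of the
# first-size phase array in this example:
array_len = 4 * pixel_len
array_wid = 4 * pixel_wid
```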
Similarly, if the target microlens in the second direction corresponding to the target pixel group is determined, and at least two pixels corresponding to the target microlens in the second direction are used as the target pixels. Assuming that the second direction is a vertical direction, U1, D1, U2 and D2 are sequentially output, and then the phase information of the target pixel is combined to generate a plurality of sets of intermediate combined phase information. For example, U1, U2 are combined to generate first intermediate combined phase information U; d1 and D2 are combined to generate first intermediate combined phase information D.
Thus, the above-described processing is sequentially performed on all the other panchromatic pixel groups included in the pixel array, and phase information U3, D3, U4, D4, U5, D5, U6, D6, U7, D7, U8, D8, U9, D9, U10, D10, U11, D11, U12, D12, U13, D13, U14, D14, U15, D15, U16, and D16 of the pixel array is obtained. Then, the U3 and the U4 are combined to generate first intermediate combined phase information U; d3 and D4 are combined to generate first intermediate combined phase information D. Merging U5 and U6 to generate first intermediate merged phase information U; d5 and D6 are combined to generate first intermediate combined phase information D. Merging U7 and U8 to generate first intermediate merged phase information U; d7 and D8 are combined to generate first intermediate combined phase information D. Merging U9 and U10 to generate first intermediate merged phase information U; d9 and D10 are combined to generate first intermediate combined phase information D. Merging U11 and U12 to generate first intermediate merged phase information U; d11 and D12 are combined to generate first intermediate combined phase information D. Merging U13 and U14 to generate first intermediate merged phase information U; d13 and D14 are combined to generate first intermediate combined phase information D. Merging U15 and U16 to generate first intermediate merged phase information U; d15 and D16 are combined to generate first intermediate combined phase information D.
And finally, a first-size phase array corresponding to the pixel array is generated according to the multiple sets of intermediate combined phase information. In this way, the first-size phase array is generated by sequentially arranging the 8 sets of first intermediate combined phase information U and first intermediate combined phase information D. The size of the first-size phase array is equivalent to that of 4 × 4 pixels arranged in an array.
Similarly, if the target microlens in the first direction and the target microlens in the second direction corresponding to the target pixel group are determined, at least two pixels corresponding to the target microlens in the first direction and the target microlens in the second direction are used as target pixels. Assuming that the first direction is a horizontal direction and the second direction is a vertical direction, L1, R1, L2, R2, U1, D1, U2, and D2 are sequentially output, and the phase information of the target pixel is combined to generate a plurality of sets of intermediate combined phase information. For example, L1, L2 are combined to generate first intermediate combined phase information L; r1 and R2 are combined to generate first intermediate combined phase information R. Merging U1 and U2 to generate first intermediate merged phase information U; d1 and D2 are combined to generate first intermediate combined phase information D.
Then, the above-mentioned processing is sequentially performed on all the other panchromatic pixel groups included in the pixel array to obtain the phase information of the pixel array. Finally, the phase information of the target pixels is combined to generate a plurality of sets of intermediate combined phase information, and the plurality of sets of intermediate combined phase information are taken as the first-size phase array corresponding to the pixel array. The size of the first-size phase array is equivalent to that of 8 × 8 pixels arranged in an array.
On the other hand, assuming that all the color pixel groups in the pixel array are set as target pixel groups, for each color pixel group, the phase information of each target pixel in the color pixel group is acquired. The phase information of the target pixels is combined to generate a plurality of sets of intermediate combined phase information, and finally a first-size phase array corresponding to the pixel array is generated according to the multiple sets of intermediate combined phase information. The specific calculation process is the same as that described above for the case in which all the panchromatic pixel groups in the pixel array are set as target pixel groups, and is not repeated here.
On the other hand, assuming that all of the panchromatic pixel groups and the color pixel groups in the pixel array are set as target pixel groups, the phase information of each target pixel is acquired for each panchromatic pixel group and each color pixel group. The phase information of the target pixels is combined to generate a plurality of sets of intermediate combined phase information, and finally a first-size phase array corresponding to the pixel array is generated according to the multiple sets of intermediate combined phase information. The specific calculation process is the same as that described above for the case in which all the panchromatic pixel groups in the pixel array are set as target pixel groups, and is not repeated here.
At this time, the phase array may be input to the ISP, and the ISP calculates the phase difference of the pixel array based on the phase array. A defocus distance is then calculated based on the phase difference, and the DAC code value corresponding to the defocus distance is calculated. Finally, the driver IC of the motor (VCM) converts the code value into a driving current, and the motor drives the lens to the in-focus position. Focus control is thus realized according to the phase difference.
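The control chain just described (phase difference → defocus distance → DAC code → driving current) can be sketched as follows. The linear conversion constants, the 10-bit DAC range, and the mid-travel code of 512 are hypothetical calibration values, not taken from this application:

```python
def focus_control(left_phases, right_phases, slope=0.05, dac_per_um=0.8):
    """Illustrative PDAF control chain, as an ISP might run it.

    slope: assumed phase difference per micrometer of defocus.
    dac_per_um: assumed DAC codes per micrometer of lens travel.
    Both constants are hypothetical calibration values.
    """
    # 1. Phase difference between the left and right phase information.
    n = len(left_phases)
    phase_diff = sum(r - l for l, r in zip(left_phases, right_phases)) / n
    # 2. Defocus distance, assumed proportional to the phase difference.
    defocus_um = phase_diff / slope
    # 3. DAC code for the VCM driver IC, clamped to a 10-bit range
    #    around the assumed mid-travel code 512.
    code = int(round(512 + dac_per_um * defocus_um))
    return max(0, min(1023, code))
```

The driver IC then converts the returned code value into a driving current, and the motor moves the lens accordingly; a zero phase difference leaves the lens at the mid-travel code.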
In the embodiment of the present application, when the light intensity of the current shooting scene is greater than the second preset threshold and less than or equal to the first preset threshold, the light is somewhat weak, so the phase information acquired by the color pixel groups or the panchromatic pixel groups may not be very accurate, and some color pixel groups or panchromatic pixel groups may fail to acquire phase information at all. Therefore, at least one of the color pixel groups and the panchromatic pixel groups in the pixel array is set as a target pixel group, and for each target pixel group the phase information of each pixel in the group is acquired. Moreover, the phase information of the target pixels can be acquired from at least one of the microlenses in the first direction and the microlenses in the second direction, so that a variety of phase information can be obtained.
And combining the phase information of the target pixel to generate a plurality of groups of intermediate combined phase information. And generating a first size phase array corresponding to the pixel array according to the plurality of sets of intermediate combined phase information. The phase information of the target pixel is merged to a certain degree by adopting a first size output mode, so that the accuracy of the output phase information is improved, and the signal-to-noise ratio of the phase information is improved. Finally, the focusing accuracy can be improved by carrying out phase focusing based on the first-size phase array corresponding to the pixel array.
In the previous embodiment, if the target pixel is at least two pixels corresponding to the target microlens in the first direction, merging the phase information of the target pixel to generate a plurality of sets of intermediate merged phase information includes:
acquiring at least two target microlenses in a first direction in a target pixel group, and determining target pixels at the same position from pixels corresponding to the at least two target microlenses in the first direction; the co-located target pixels are in the same orientation in the first direction in the corresponding target microlens;
and combining the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information.
Referring to fig. 15, if the target microlens in the first direction corresponding to the target pixel group is determined, at least two pixels corresponding to the target microlens in the first direction are used as the target pixels. Assuming that the first direction is the horizontal direction, L1, R1, L2, and R2 are sequentially output, and the phase information of the target pixel is combined to generate a plurality of sets of intermediate combined phase information.
Specifically, as shown in fig. 8 (a), at least two target microlenses in the first direction in the target pixel group are obtained (gray filling in the figure), and the target pixels at the same position are determined from the pixels corresponding to the at least two target microlenses in the first direction. Wherein the target pixels at the same position are at the same orientation in the first direction in the corresponding target microlens. For example, the target pixel at the same position is determined from among the pixels 1, 2, 8, and 9 corresponding to the target microlens 2211a (gray fill in the figure) in the first direction. That is, the pixel on the left side of the target microlens 2211a (gray fill in the figure) in the first direction is determined as the target pixel, and the pixel on the right side of the target microlens 2211a (gray fill in the figure) in the first direction is determined as the target pixel. Combining L1 and L2 to generate first intermediate combined phase information L; r1 and R2 are combined to generate first intermediate combined phase information R.
And generating a plurality of groups of first combined phase information based on the plurality of groups of first intermediate combined phase information L and the first intermediate combined phase information R.
In the embodiment of the application, at least two target microlenses in the first direction in a target pixel group are obtained, and target pixels at the same position are determined from pixels corresponding to the at least two target microlenses in the first direction. And combining the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information. The phase information of the pixels is merged to a certain degree by adopting a first size output mode, so that the accuracy of the output phase information is improved, and the signal-to-noise ratio of the phase information is improved. Finally, the focusing accuracy can be improved by carrying out phase focusing based on the first-size phase array corresponding to the pixel array.
In connection with the previous embodiment, the vertical direction is now taken as the second direction. If the target pixel is at least two pixels corresponding to the target microlens in the second direction, combining the phase information of the target pixels in the target pixel group to generate a plurality of sets of intermediate combined phase information includes:
acquiring at least two target microlenses in the second direction in the target pixel group, and determining target pixels at the same position from pixels corresponding to the target microlenses in the at least two second directions; the target pixels at the same position are at the same orientation in the second direction in the corresponding target microlens;
and combining the phase information of the target pixels at the same position to generate a plurality of groups of second combined phase information.
Referring to fig. 15, if the target microlens in the second direction corresponding to the target pixel group is determined, at least two pixels corresponding to the target microlens in the second direction are used as the target pixels. Assuming that the second direction is a vertical direction, U1, D1, U2 and D2 are sequentially output, and then the phase information of the target pixel is combined to generate a plurality of sets of intermediate combined phase information.
Specifically, as shown in fig. 8 (a), at least two target microlenses in the second direction in the target pixel group are obtained (the target microlenses are filled with oblique lines in the drawing), and the target pixels at the same position are determined from the pixels corresponding to the target microlenses in the at least two second directions. Wherein the target pixels at the same position are at the same orientation in the second direction in the corresponding target microlens. For example, the target pixel at the same position is specified from among the pixels 3, 6, 4, and 7 corresponding to the target microlens 2211a in the second direction (filled with diagonal lines in the drawing). That is, the pixel above the target microlens 2211a (diagonally filled in the drawing) in the second direction is determined as the target pixel, and the pixel below the target microlens 2211a (diagonally filled in the drawing) in the second direction is determined as the target pixel. Combining U1 and U2 to generate second intermediate combined phase information U; d1 and D2 are combined to generate second intermediate combined phase information D.
And generating a plurality of groups of second combined phase information based on the plurality of groups of second intermediate combined phase information U and the second intermediate combined phase information D.
In the embodiment of the application, at least two target microlenses in the second direction in the target pixel group are obtained, and the target pixel at the same position is determined from the pixels corresponding to the at least two target microlenses in the second direction. And combining the phase information of the target pixels at the same position to generate a plurality of groups of second combined phase information. The phase information of the pixels is merged to a certain degree by adopting a first size output mode, so that the accuracy of the output phase information is improved, and the signal-to-noise ratio of the phase information is improved. Finally, the focusing accuracy can be improved by carrying out phase focusing based on the first-size phase array corresponding to the pixel array.
As shown in the previous embodiment, if the target pixel is at least two pixels corresponding to the target microlens in the first direction and at least two pixels corresponding to the target microlens in the second direction, the merging the phase information of the target pixels in the target pixel group to generate a plurality of sets of intermediate merged phase information includes:
acquiring at least two target microlenses in a first direction in a target pixel group, and determining target pixels at the same position from pixels corresponding to the at least two target microlenses in the first direction; the co-located target pixels are in the same orientation in the first direction in the corresponding target microlens;
combining the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information;
acquiring at least two target microlenses in the second direction in the target pixel group, and determining target pixels at the same position from pixels corresponding to the target microlenses in the at least two second directions; the target pixels at the same position are at the same orientation in the second direction in the corresponding target microlens;
combining the phase information of the target pixels at the same position to generate a plurality of groups of second combined phase information;
and generating a plurality of groups of intermediate combined phase information based on the plurality of groups of first combined phase information and the plurality of groups of second combined phase information.
Specifically, as shown in fig. 15, if the target microlens in the first direction corresponding to the target pixel group is determined, at least two pixels corresponding to the target microlens in the first direction are used as the target pixels. Assuming that the first direction is the horizontal direction, L1, R1, L2, and R2 are sequentially output, and the phase information of the target pixel is combined to generate a plurality of sets of intermediate combined phase information.
Specifically, as shown in fig. 8 (a), at least two target microlenses in the first direction in the target pixel group are obtained (gray filling in the figure), and the target pixels at the same position are determined from the pixels corresponding to the at least two target microlenses in the first direction. Wherein the target pixels at the same position are at the same orientation in the first direction in the corresponding target microlens. For example, the target pixel at the same position is determined from among the pixels 1, 2, 8, and 9 corresponding to the target microlens 2211a (gray fill in the figure) in the first direction. That is, the pixel on the left side of the target microlens 2211a (gray fill in the figure) in the first direction is determined as the target pixel, and the pixel on the right side of the target microlens 2211a (gray fill in the figure) in the first direction is determined as the target pixel. Combining L1 and L2 to generate first intermediate combined phase information L; r1 and R2 are combined to generate first intermediate combined phase information R.
And generating a plurality of groups of first combined phase information based on the plurality of groups of first intermediate combined phase information L and the first intermediate combined phase information R.
And if the target micro lens in the second direction corresponding to the target pixel group is determined, and at least two pixels corresponding to the target micro lens in the second direction are used as target pixels. Assuming that the second direction is a vertical direction, U1, D1, U2 and D2 are sequentially output, and then the phase information of the target pixel is combined to generate a plurality of sets of intermediate combined phase information.
Specifically, at least two target microlenses in the second direction in the target pixel group are obtained (filled with oblique lines in the figure), and the target pixel at the same position is determined from the pixels corresponding to the at least two target microlenses in the second direction. Wherein the co-located target pixels are co-located in the second direction in the corresponding target microlens. For example, the target pixel at the same position is specified from among the pixels 3, 6, 4, and 7 corresponding to the target microlens 2211a in the second direction (filled with diagonal lines in the drawing). That is, the pixel above the target microlens 2211a (diagonally filled in the drawing) in the second direction is determined as the target pixel, and the pixel below the target microlens 2211a (diagonally filled in the drawing) in the second direction is determined as the target pixel. Combining U1 and U2 to generate second intermediate combined phase information U; d1 and D2 are combined to generate second intermediate combined phase information D.
And generating a plurality of groups of second combined phase information based on the plurality of groups of second intermediate combined phase information U and the second intermediate combined phase information D.
And finally, generating a plurality of groups of intermediate combined phase information based on the plurality of groups of first combined phase information and the plurality of groups of second combined phase information.
In the embodiment of the application, at least two target microlenses in a first direction and at least two target microlenses in a second direction in a target pixel group are obtained, and target pixels at the same position are determined from pixels corresponding to the target microlenses. And combining the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information and second combined phase information. And finally, generating a plurality of groups of intermediate combined phase information based on the plurality of groups of first combined phase information and the plurality of groups of second combined phase information. The phase information of the pixels is merged to a certain degree by adopting a first size output mode, so that the accuracy of the output phase information is improved, and the signal-to-noise ratio of the phase information is improved. Finally, the focusing accuracy can be improved by carrying out phase focusing based on the first-size phase array corresponding to the pixel array.
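The two-direction merging summarized above can be sketched as follows. Averaging as the merge operation and the sample values are assumptions for illustration:

```python
def merge_both_directions(h_samples, v_samples):
    """Merge co-located target pixels of one target pixel group.

    h_samples: one (L, R) tuple per target microlens in the first
    (horizontal) direction; v_samples: one (U, D) tuple per target
    microlens in the second (vertical) direction.
    Returns (L, R, U, D): the first and second combined phase
    information for the group. Averaging is an assumed merge operation.
    """
    n_h, n_v = len(h_samples), len(v_samples)
    L = sum(l for l, _ in h_samples) / n_h  # merge co-located left pixels
    R = sum(r for _, r in h_samples) / n_h  # merge co-located right pixels
    U = sum(u for u, _ in v_samples) / n_v  # merge co-located upper pixels
    D = sum(d for _, d in v_samples) / n_v  # merge co-located lower pixels
    return L, R, U, D

# One target pixel group: (L1, R1), (L2, R2) and (U1, D1), (U2, D2).
merged = merge_both_directions([(0.2, 0.4), (0.4, 0.6)],
                               [(0.1, 0.3), (0.3, 0.5)])
```

Repeating this for every target pixel group yields the sets of first and second combined phase information from which the intermediate combined phase information, and ultimately the first-size phase array, is assembled.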
In another embodiment, if the target pixel group includes a color pixel group and a panchromatic pixel group, generating a first-size phased array of the pixel array in the first direction according to the target phase information includes:
determining a first phase weight corresponding to the color pixel group and a second phase weight corresponding to the panchromatic pixel group according to the light intensity of the current shooting scene; the first phase weights corresponding to the color pixel groups under different light intensities are different, and the second phase weights corresponding to the panchromatic pixel groups under different light intensities are different;
a first size phase array of the pixel array in a first direction is generated based on the target phase information and the first phase weight of the color pixel group and the target phase information and the second phase weight of the panchromatic pixel group.
Specifically, in a scene where the light intensity of the current shooting scene is greater than the second preset threshold and less than or equal to the first preset threshold, and the determined target pixel group includes both a color pixel group and a panchromatic pixel group, the weights of the different pixel groups may be taken into account when generating the first-size phase array from their phase information. The first phase weight corresponding to the color pixel group and the second phase weight corresponding to the panchromatic pixel group may be determined according to the light intensity of the current shooting scene. The closer the light intensity is to the second preset threshold, the smaller the first phase weight of the color pixel group and the larger the second phase weight of the panchromatic pixel group: since the light is weaker near the second preset threshold, giving the panchromatic pixel group a larger weight makes the acquired phase information more accurate. As the light intensity increases toward the first preset threshold, the first phase weight of the color pixel group grows and the second phase weight of the panchromatic pixel group shrinks: with stronger light, a larger weight on the color pixel group makes the acquired phase information more comprehensive and more accurate.
The first phase weight corresponding to the color pixel group differs under different light intensities, as does the second phase weight corresponding to the panchromatic pixel group. For example, when the light intensity of the current shooting scene is 2000 lux, the first phase weight corresponding to the color pixel group may be determined to be 40%, of which the green pixel groups account for 20%, the red pixel groups for 10%, and the blue pixel groups for 10%; the second phase weight corresponding to the panchromatic pixel group is then determined to be 60%. This is not limited in this application.
Then, a first-size phase array of the pixel array in the first direction may be generated based on the target phase information and first phase weights of the color pixel groups and the target phase information and second phase weights of the panchromatic pixel groups. For example, for the pixel array, the phase information of the pixel array in the first direction is calculated by weighted summation: the target phase information of the first red pixel group at a phase weight of 10%, the second red pixel group at 10%, the first blue pixel group at 10%, the second blue pixel group at 10%, each green pixel group at 20%, and each panchromatic pixel group at 60%, thereby obtaining the first-size phase array.
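The weighted summation above can be sketched as follows. This is an illustrative sketch, not code from the patent: the function and variable names are invented, and the weights follow the 2000 lux example (green groups 20% in total, red 10%, blue 10%, panchromatic 60%).

```python
# Illustrative sketch of the weighted summation; all names and values are
# assumptions, not part of the patent.

def first_size_phase(target_phase, weights):
    """Weighted sum of per-group target phase information.

    target_phase: dict mapping pixel-group name -> target phase information
    weights: dict mapping pixel-group name -> phase weight (fractions summing to 1)
    """
    return sum(weights[g] * target_phase[g] for g in target_phase)

# Example weights for one repeating unit at 2000 lux (see text above).
weights = {
    "green_1": 0.10, "green_2": 0.10,   # green pixel groups: 20% in total
    "red": 0.10, "blue": 0.10,          # red and blue pixel groups: 10% each
    "panchromatic": 0.60,               # panchromatic pixel group: 60%
}
phase = {"green_1": 1.2, "green_2": 1.0, "red": 0.8, "blue": 0.9, "panchromatic": 1.1}
combined = first_size_phase(phase, weights)  # one entry of the first-size phase array
```

At lower light intensities the panchromatic weight would be raised and the color weights lowered, as the text describes.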
In the embodiment of the present application, when performing phase focusing, if it is determined that the target pixel group includes a color pixel group and a panchromatic pixel group, the first-size phase array of the pixel array may be generated based on target phase information of the color pixel group and a first phase weight thereof, and target phase information of the panchromatic pixel group and a second phase weight thereof. In this way, the first-size phase array of the pixel array is generated based on the target phase information of the color pixel group and the panchromatic pixel group, and the comprehensiveness of the phase information can be improved. Meanwhile, the phase weights of the target phase information of the color pixel group and the panchromatic pixel group are different under different light intensities, so that the accuracy of the phase information can be improved by adjusting the weight under different light intensities.
In the foregoing embodiment, it is described that the pixel array may be an RGBW pixel array, including a plurality of minimal repeating units 241, the minimal repeating unit 241 including a plurality of pixel groups 242, the plurality of pixel groups 242 including a panchromatic pixel group 243 and a color pixel group 244. Each panchromatic pixel group 243 includes 9 panchromatic pixels 2431 and each color pixel group 244 includes 9 color pixels 2441.
In this embodiment, in the case where the pixel array is an RGBW pixel array, the phase information output mode further includes a second size output mode, and the size of the phase array in the first size output mode is larger than the size of the phase array in the second size output mode.
referring to fig. 13, step 1024, determining a phase information output mode adapted to the light intensity of the current shooting scene according to the target light intensity range, includes:
step 1024c, if the light intensity of the current shooting scene is smaller than a second preset threshold, determining that the phase information output mode adaptive to the light intensity of the current shooting scene is a second size output mode.
Specifically, if one of the light intensity ranges is smaller than the second preset threshold, the phase information output mode corresponding to that light intensity range is the second size output mode. If the light intensity of the current shooting scene is judged to be smaller than the second preset threshold, the light intensity of the current shooting scene falls into this light intensity range; that is, the phase information output mode adapted to the light intensity of the current shooting scene is determined to be the second size output mode. The second preset threshold may be 500 lux, which is not limited in this application; this corresponds to a dim environment such as dusk, where the light intensity is less than 500 lux.
Outputting the phase array in the second size output mode means merging the original phase information of the pixel array to generate the phase array of the pixel array. In other words, the size of the phase array is smaller than the size of the pixel array. For example, if the size of the pixel array is 12 × 12, the size of the phase array of each target pixel group in the pixel array may be 4 × 2; the size of the phase array is not limited in this application.
In the embodiment of the present application, since the size of the phase array in the first size output mode is larger than the size of the phase array in the second size output mode, if the light intensity of the current shooting scene is smaller than the second preset threshold, the phase information output mode adapted to the light intensity of the current shooting scene is determined to be the second size output mode, and a phase array smaller than the pixel array is output. That is, when the light intensity of the current shooting scene is weak, the signal-to-noise ratio of the phase information is improved by reducing the size of the phase array.
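As a rough numerical illustration (an assumption for explanation, not the patent's exact algorithm): merging several noisy phase samples into one entry is what shrinks the phase array and raises the signal-to-noise ratio, since independent noise partially averages out.

```python
import statistics

def bin_samples(samples, factor):
    """Merge every `factor` consecutive phase samples into one by averaging."""
    return [statistics.fmean(samples[i:i + factor])
            for i in range(0, len(samples), factor)]

raw_left = [1.0, 1.2, 0.9, 1.1, 1.0, 1.0, 1.1, 0.9]  # 8 noisy left-phase samples
binned = bin_samples(raw_left, 4)                     # reduced to 2 entries
```

The output array is a quarter of the input size, and each entry is steadier than any single raw sample.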
In one embodiment, if the phase information output mode is the second size output mode, the selecting, as the target pixel group, a color pixel group and a panchromatic pixel group in the pixel array includes:
two color pixel groups adjacent along a first diagonal direction in the pixel array are used as a target pixel group, and two panchromatic pixel groups adjacent along a second diagonal direction in the pixel array are used as a target pixel group;
generating a phase array corresponding to the pixel array according to the phase information of the target pixel, comprising:
combining phase information of target pixels of two panchromatic pixel groups in the target pixel group, and combining the phase information of the target pixels of two color pixel groups in the target pixel group to generate a plurality of groups of target combined phase information;
and generating a second-size phase array corresponding to the pixel array according to the multiple groups of target combined phase information.
This embodiment is a specific implementation of outputting the phase array corresponding to the pixel array in the second size output mode when the light intensity of the current shooting scene is smaller than the second preset threshold, i.e., in a dim environment such as dusk where the light intensity is less than 500 lux. Two color pixel groups adjacent along the first diagonal direction in the pixel array are taken as one target pixel group, and two panchromatic pixel groups adjacent along the second diagonal direction are taken as another target pixel group. Since panchromatic pixels can capture more light information in an extremely dark scene, both kinds of groups can serve as target pixel groups even below the second preset threshold. The phase information of the target pixels of the two panchromatic pixel groups in a target pixel group is merged, and the phase information of the target pixels of the two color pixel groups in a target pixel group is merged, to generate multiple sets of target merged phase information; a second-size phase array corresponding to the pixel array is then generated from the multiple sets of target merged phase information. For the process of merging the phase information of the target pixels of the two panchromatic pixel groups, reference may be made to the next embodiment.
The merging of the phase information of the target pixels of the two color pixel groups in the target pixel group may likewise refer to the process of merging the phase information of the target pixels of the two panchromatic pixel groups in the next embodiment, and is not described in detail here.
In one embodiment, if the phase information output mode is the second size output mode, the selecting, as a target pixel group, the group of color pixels and the group of panchromatic pixels in the pixel array includes:
taking the two panchromatic pixel groups adjacent in the second diagonal direction as a target pixel group;
generating a phase array corresponding to the pixel array according to the phase information of the target pixel, including:
combining the phase information of the target pixels of two panchromatic pixel groups in the target pixel group to generate a plurality of groups of target combined phase information;
and generating a second-size phase array corresponding to the pixel array according to the multiple groups of target combined phase information.
This embodiment is a specific implementation of outputting the phase array corresponding to the pixel array in the second size output mode when the light intensity of the current shooting scene is smaller than the second preset threshold, i.e., in a dim environment such as dusk where the light intensity is less than 500 lux. Referring to fig. 16, two panchromatic pixel groups adjacent along the second diagonal direction in the pixel array are taken as a target pixel group. Since panchromatic pixels can capture more light information in an extremely dark scene, the phase information of the panchromatic pixel groups can be used to realize phase detection autofocus (PDAF) when the light intensity is below the second preset threshold. Therefore, when outputting the phase array corresponding to the pixel array in the second size output mode, the panchromatic pixel groups alone may be selected as the target pixel groups, or the color pixel groups and the panchromatic pixel groups may both be selected as target pixel groups.
Specifically, for the case that the target pixel group is a panchromatic pixel group, phase focusing may be realized by using phase information of a part of the panchromatic pixel groups in the pixel array, and phase focusing may also be realized by using a part of the panchromatic pixels in the part of the panchromatic pixel groups, which is not limited in this application. For the case that the target pixel group is a color pixel group and a panchromatic pixel group, phase focusing may be achieved by using phase information of a part of the color pixel group and a part of the panchromatic pixel group in the pixel array, or phase focusing may be achieved by using a part of the color pixels of the part of the color pixel group and a part of the panchromatic pixels of the part of the panchromatic pixel group, which is not limited in this application.
At this time, phase focusing can be performed using only the phase information of some of the pixel groups, or only the phase information of some pixels within those groups, which reduces the amount of output phase data and improves phase focusing efficiency.
Second, a target microlens is determined from the microlenses corresponding to the target pixel group, and at least two pixels corresponding to the target microlens are taken as target pixels.
Specifically, a target microlens in the first direction corresponding to the target pixel group may be determined, where such a microlens covers at least 2 adjacent target pixels arranged along the first direction; the at least two pixels corresponding to it are taken as target pixels. Alternatively, a target microlens in the second direction may be determined, where such a microlens covers at least 2 adjacent target pixels arranged along the second direction, the second direction being perpendicular to the first direction; the at least two pixels corresponding to it are taken as target pixels. Or target microlenses in both the first direction and the second direction may be determined, and the at least two pixels corresponding to each of them are taken together as target pixels.
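The three selection options above can be sketched against an assumed 3 × 3 pixel-group layout (coordinates and names are illustrative, chosen to match the microlens arrangement described later: two horizontal-pair microlenses, two vertical-pair microlenses, and a single-pixel microlens at the center):

```python
# Assumed microlens layout of one 3x3 pixel group: each entry maps a microlens
# to the (row, col) pixels it covers. The center pixel (1, 1) has its own
# microlens and contributes no phase information.
MICROLENSES = {
    "h_top":    [(0, 0), (0, 1)],   # first direction (horizontal pair)
    "h_bottom": [(2, 1), (2, 2)],
    "v_right":  [(0, 2), (1, 2)],   # second direction (vertical pair)
    "v_left":   [(1, 0), (2, 0)],
}

def target_pixels(direction):
    """Collect target pixels for the 'first', 'second', or 'both' directions."""
    keys = {"first": ("h_top", "h_bottom"),
            "second": ("v_left", "v_right"),
            "both": tuple(MICROLENSES)}[direction]
    return [p for k in keys for p in MICROLENSES[k]]
```

Selecting `"first"` yields the four pixels under the two horizontal-pair microlenses; `"both"` yields all eight phase-bearing pixels of the group.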
Third, for each target pixel group, the phase information of each target pixel in the group is acquired, each pixel corresponding to a photosensitive element. For example, assuming that two panchromatic pixel groups adjacent along the second diagonal direction in the pixel array are both taken as target pixel groups, the phase information of each target pixel in those panchromatic pixel groups is acquired. The phase information of the target pixels of the two panchromatic pixel groups in a target pixel group is merged (and, where color pixel groups are also selected, the phase information of the target pixels of the two color pixel groups is merged) to generate multiple sets of target merged phase information; finally, a second-size phase array corresponding to the pixel array is generated from the multiple sets of target merged phase information.
As shown in connection with fig. 16, one pixel array may include 2 red pixel groups, 4 green pixel groups, 2 blue pixel groups, and 8 panchromatic pixel groups. Assuming that both of two panchromatic pixel groups adjacent in the second diagonal direction in the pixel array are set as target pixel groups, the phase information of each target pixel group is calculated in turn for the 8 panchromatic pixel groups included in the pixel array, that is, 4 pairs of panchromatic pixel groups. For example, for each panchromatic pixel group in each target pixel group, the phase information of the panchromatic pixel group is calculated. Specifically, the first panchromatic pixel group in the target pixel group comprises 9 panchromatic pixels arranged in a 3 × 3 array, numbered in sequence as panchromatic pixel 1 through panchromatic pixel 9, where each pixel corresponds to a photosensitive element. The phase information of panchromatic pixel 1 is L1, and that of panchromatic pixel 2 is R1; the phase information of panchromatic pixel 8 is L2, and that of panchromatic pixel 9 is R2; the phase information of panchromatic pixel 3 is U1, and that of panchromatic pixel 6 is D1; the phase information of panchromatic pixel 4 is U2, and that of panchromatic pixel 7 is D2. The pixel 5 located at the center of the panchromatic pixel group corresponds to a single microlens by itself, so no phase information is obtained from it.
Similarly, the second panchromatic pixel group in the target pixel group comprises 9 panchromatic pixels arranged in a 3 × 3 array, numbered in sequence as panchromatic pixel 1 through panchromatic pixel 9, where each pixel corresponds to a photosensitive element. The phase information of panchromatic pixel 1 is L1, and that of panchromatic pixel 2 is R1; the phase information of panchromatic pixel 8 is L2, and that of panchromatic pixel 9 is R2; the phase information of panchromatic pixel 3 is U1, and that of panchromatic pixel 6 is D1; the phase information of panchromatic pixel 4 is U2, and that of panchromatic pixel 7 is D2. The pixel 5 located at the center of the panchromatic pixel group corresponds to a single microlens by itself, so no phase information is obtained from it.
Therefore, if the target microlens in the first direction corresponding to the target pixel group is determined, at least two pixels corresponding to the target microlens in the first direction are used as the target pixels. Assuming that the first direction is the horizontal direction, L1, R1, L2, and R2 corresponding to the first full-color pixel group are sequentially output, and L1, R1, L2, and R2 corresponding to the second full-color pixel group are sequentially output. And then combining the phase information of the target pixels to generate a plurality of groups of target combined phase information. For example, L1, L2 corresponding to the first panchromatic pixel group and L1, L2 corresponding to the second panchromatic pixel group are merged to generate first target merged phase information L; r1, R2 corresponding to the first panchromatic pixel group and R1, R2 corresponding to the second panchromatic pixel group are combined to generate first target combined phase information R.
In this way, the above processing is performed in sequence on the other panchromatic pixel groups included in the pixel array: L3 and L4 corresponding to the first panchromatic pixel group are merged with L3 and L4 corresponding to the second panchromatic pixel group to generate another set of target merged phase information L, and R3, R4 of the first group are merged with R3, R4 of the second group to generate another set of target merged phase information R; L5, L6 are merged with L5, L6, and R5, R6 with R5, R6; likewise L7, L8 are merged with L7, L8, and R7, R8 with R7, R8, generating the remaining sets of target merged phase information L and R.
Finally, a second-size phase array corresponding to the pixel array is generated from the multiple sets of target merged phase information. In this way, the 4 sets of target merged phase information L and target merged phase information R are arranged in sequence to generate the second-size phase array, whose size is equivalent to 4 × 2 pixels arranged in an array.
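Under the numbering above, the merge can be sketched as follows (a hypothetical averaging merge; the patent does not fix the exact merge operation): each panchromatic pixel group contributes its L1, L2, R1, R2 samples, and the two diagonally adjacent groups of a target pixel group are reduced to one (L, R) entry of the second-size phase array.

```python
# Hypothetical merge of one pair of diagonally adjacent panchromatic pixel
# groups; averaging is an assumed merge operation.

def merge_pair(group_a, group_b):
    """group_a, group_b: dicts with keys 'L1', 'L2', 'R1', 'R2' (phase samples)."""
    left = (group_a["L1"] + group_a["L2"] + group_b["L1"] + group_b["L2"]) / 4
    right = (group_a["R1"] + group_a["R2"] + group_b["R1"] + group_b["R2"]) / 4
    return left, right

first_group = {"L1": 1.0, "L2": 1.2, "R1": 0.8, "R2": 0.9}
second_group = {"L1": 1.1, "L2": 1.1, "R1": 0.9, "R2": 0.8}
L, R = merge_pair(first_group, second_group)  # one (L, R) entry of the phase array
```

Repeating this for the 4 pairs of panchromatic pixel groups yields the 4 (L, R) entries that form the 4 × 2-equivalent second-size phase array.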
Of course, the 4 sets of target merged phase information L and target merged phase information R may also be subjected to merging processing or conversion processing to generate the second-size phase array corresponding to the pixel array. The conversion processing may be a correction of the 4 sets of target merged phase information L and target merged phase information R, which is not limited in this application. When the 4 sets are further merged, the phase array corresponding to a size of 4 × 2 pixels may be merged into a phase array corresponding to a size of 2 × 2 pixels; of course, the present application does not limit the specific size of the merged phase array. Here, the size of a pixel refers to the area of one pixel, which is related to the length and width of the pixel.
A pixel is the smallest photosensitive unit on a digital camera's photosensitive device (CCD or CMOS). Typically, pixels do not have a fixed physical size; the size is related to the size and resolution of the display screen. For example, if the display screen is 4.5 inches with a resolution of 1280 × 720, and the screen is 99.6 mm long and 56 mm wide, then the length of one pixel is 99.6 mm / 1280 ≈ 0.0778 mm, and the width is 56 mm / 720 ≈ 0.0778 mm. In this example, the size of 4 × 2 pixels arranged in an array is: length 4 × 0.0778 mm, width 2 × 0.0778 mm. Of course, this is not limited in this application. The second-size phase array then has a length of 4 × 0.0778 mm and a width of 2 × 0.0778 mm. In other embodiments, the pixels need not be rectangles of equal length and width, and may have other shapes, which is not limited in this application.
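The display-screen arithmetic above can be checked directly (the 4.5-inch screen figures are the text's own example):

```python
# Worked version of the pixel-size example: 1280 x 720 display,
# 99.6 mm long and 56 mm wide.
length_mm, width_mm = 99.6, 56.0
cols, rows = 1280, 720

px_len = length_mm / cols   # ~0.0778 mm per pixel (length)
px_wid = width_mm / rows    # ~0.0778 mm per pixel (width)

area_len = 4 * px_len       # length of a 4 x 2 pixel area
area_wid = 2 * px_wid       # width of a 4 x 2 pixel area
```

Both dimensions come out to roughly 0.0778 mm per pixel, matching the text.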
Similarly, if the target microlens in the second direction corresponding to the target pixel group is determined, and at least two pixels corresponding to the target microlens in the second direction are used as the target pixels. Assuming that the second direction is the vertical direction, U1, D1, U2, D2 corresponding to the first full-color pixel group are sequentially output, and U1, D1, U2, D2 corresponding to the second full-color pixel group are sequentially output. And then combining the phase information of the target pixels to generate a plurality of groups of target combined phase information. For example, U1, U2 corresponding to the first panchromatic pixel group and U1, U2 corresponding to the second panchromatic pixel group are merged to generate first target merged phase information U; d1, D2 corresponding to the first panchromatic pixel group and D1, D2 corresponding to the second panchromatic pixel group are combined to generate the first target combined phase information D.
In this way, the above processing is performed in sequence on the other panchromatic pixel groups included in the pixel array: U3 and U4 corresponding to the first panchromatic pixel group are merged with U3 and U4 corresponding to the second panchromatic pixel group to generate another set of target merged phase information U, and D3, D4 of the first group are merged with D3, D4 of the second group to generate another set of target merged phase information D; U5, U6 are merged with U5, U6, and D5, D6 with D5, D6; likewise U7, U8 are merged with U7, U8, and D7, D8 with D7, D8, generating the remaining sets of target merged phase information U and D.
Finally, a second-size phase array corresponding to the pixel array is generated from the multiple sets of target merged phase information. In this way, the 4 sets of target merged phase information U and target merged phase information D are arranged in sequence to generate the second-size phase array, whose size is equivalent to 4 × 2 pixels arranged in an array.
Similarly, if the target microlens in the first direction and the target microlens in the second direction corresponding to the target pixel group are determined, at least two pixels corresponding to the target microlens in the first direction and the target microlens in the second direction are used as target pixels. Assuming that the first direction is a horizontal direction and the second direction is a vertical direction, L1, R1, L2, R2, U1, D1, U2, and D2 are sequentially output, and the phase information of the target pixels is combined to generate a plurality of sets of target combined phase information. For example, L1, L2 corresponding to the first panchromatic pixel group and L1, L2 corresponding to the second panchromatic pixel group are merged to generate first target merged phase information L; r1, R2 corresponding to the first panchromatic pixel group and R1, R2 corresponding to the second panchromatic pixel group are combined to generate first target combined phase information R. Merging U1, U2 corresponding to the first panchromatic pixel group with U1, U2 corresponding to the second panchromatic pixel group to generate first target merged phase information U; d1, D2 corresponding to the first panchromatic pixel group and D1, D2 corresponding to the second panchromatic pixel group are combined to generate the first target combined phase information D.
Then, the above processing is performed in sequence on all the other panchromatic pixel groups included in the pixel array to obtain the phase information of the pixel array. Finally, the phase information of the target pixels is merged to generate multiple sets of target merged phase information, and the multiple sets of target merged phase information are taken as the second-size phase array corresponding to the pixel array, whose size is equivalent to 4 × 4 pixels arranged in an array.
On the other hand, assuming that panchromatic pixel groups and color pixel groups in the pixel array are all set as target pixel groups, the phase information of each target pixel in the panchromatic pixel groups and the color pixel groups is acquired for each panchromatic pixel group and each color pixel group. The phase information of the target pixels is merged to generate multiple sets of target merged phase information, and finally a second-size phase array corresponding to the pixel array is generated from the multiple sets of target merged phase information. The specific calculation process may refer to the case in which the panchromatic pixel groups in the pixel array are taken as the target pixel groups, and is not repeated here.
At this time, the phase array may be input to the ISP, and the ISP calculates the phase difference of the pixel array based on the phase array. A defocus distance is then calculated from the phase difference, and the DAC code value corresponding to the defocus distance is determined. Finally, the driver IC of the voice coil motor (VCM) converts the code value into a driving current, and the motor drives the lens to the in-focus position. Focus control is thus realized according to the phase difference.
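The focusing chain just described (phase difference, then defocus distance, then DAC code, then VCM drive) can be sketched as below. The conversion gains are illustrative placeholders, not values from the patent; a real module calibrates them per lens and motor.

```python
# Hedged sketch of the focus-control chain; gains are invented placeholders.

def phase_difference(left, right):
    """Phase difference between merged left and right phase information."""
    return left - right

def defocus_to_dac(phase_diff, gain_um_per_phase=50.0, um_per_code=0.5):
    """Convert a phase difference to a VCM DAC code via the defocus distance."""
    defocus_um = phase_diff * gain_um_per_phase   # defocus distance in microns
    return round(defocus_um / um_per_code)        # DAC code for the driver IC

pd = phase_difference(1.10, 0.85)   # from one second-size phase-array entry
code = defocus_to_dac(pd)           # code handed to the VCM driver IC
```

The driver IC then turns the code into a driving current that moves the lens toward the in-focus position, closing the loop.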
In the embodiment of the present application, when the light intensity of the current shooting scene is smaller than the second preset threshold, the light intensity is weak, so the phase information acquired through the color pixel groups is not very accurate, and some color pixel groups may fail to acquire phase information at all. Therefore, two color pixel groups adjacent in the first diagonal direction and two panchromatic pixel groups adjacent in the second diagonal direction in the pixel array are set as target pixel groups, or only two panchromatic pixel groups adjacent in the second diagonal direction are set as target pixel groups, and for each target pixel group, the phase information of each pixel in the target pixel group is acquired. The phase information of the target pixels can be acquired from at least one of the microlenses in the first direction and the microlenses in the second direction, so a variety of phase information can be acquired.
The phase information of the target pixels is then merged to generate multiple sets of target merged phase information, and a second-size phase array corresponding to the pixel array is generated from these sets. Merging the phase information of the target pixels to a large extent in the second size output mode improves the accuracy of the output phase information and its signal-to-noise ratio. Finally, performing phase focusing based on the second-size phase array corresponding to the pixel array improves focusing accuracy.
In another embodiment, if at least two pixels corresponding to the target microlens in the first direction are used as the target pixel, the combining the phase information of the target pixels to generate a plurality of sets of target combined phase information includes:
acquiring at least four target microlenses in the first direction from the two panchromatic pixel groups in the target pixel group, and determining target pixels at the same position from the pixels corresponding to the at least four target microlenses in the first direction; the target pixels at the same position are at the same orientation in the first direction in their corresponding target microlenses;
and combining the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information.
Referring to fig. 16, the target microlenses in the first direction corresponding to the target pixel group are determined, and at least two pixels corresponding to the target microlenses in the first direction in the first panchromatic pixel group and the second panchromatic pixel group are taken as target pixels. The target pixels at the same position are at the same orientation in the first direction in their corresponding target microlenses. Assuming that the first direction is the horizontal direction, L1, R1, L2 and R2 corresponding to the first panchromatic pixel group are sequentially output, L1, R1, L2 and R2 corresponding to the second panchromatic pixel group are sequentially output, and the phase information of the target pixels is merged to generate multiple sets of target merged phase information.
Specifically, at least four target microlenses in the first direction (filled with gray in the figure) are obtained from the first panchromatic pixel group and the second panchromatic pixel group in the target pixel group, and target pixels at the same position are determined from the pixels corresponding to them. For example, among the pixels 1, 2, 8 and 9 corresponding to the first-direction target microlens 2211a in the first panchromatic pixel group, and the pixels 1, 2, 8 and 9 corresponding to the first-direction target microlens 2211a in the second panchromatic pixel group, the pixel on the left side of each target microlens is determined as a target pixel, as is the pixel on the right side. That is, L1 and L2 in the first panchromatic pixel group are merged with L1 and L2 in the second panchromatic pixel group to generate first target merged phase information L; R1 and R2 in the first panchromatic pixel group are merged with R1 and R2 in the second panchromatic pixel group to generate first target merged phase information R.
And generating multiple sets of first combined phase information based on the multiple sets of first target combined phase information L and first target combined phase information R.
In the embodiment of the application, at least four target microlenses in the first direction in the target pixel group are obtained, and target pixels at the same position are determined from the pixels corresponding to those microlenses. The phase information of the target pixels at the same position is combined to generate multiple sets of first combined phase information. Because the second-size output mode combines the phase information of a larger number of pixels, the accuracy of the output phase information is improved and its signal-to-noise ratio is raised. Finally, performing phase focusing based on the second-size phase array corresponding to the pixel array improves focusing accuracy.
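The first-direction merge described above can be sketched as follows. This is an illustrative sketch only: the function name, the dictionary labels L1/R1/L2/R2 (mirroring fig. 16), and the use of a simple average as the "combining" step are assumptions, since the embodiment does not prescribe the arithmetic.

```python
def merge_first_direction(pan_group_1, pan_group_2):
    """Combine left/right phase samples of two panchromatic pixel groups.

    Each group is a dict mapping labels such as 'L1' or 'R1' to phase
    values read from target pixels under first-direction (horizontal)
    target microlenses.
    """
    # Target pixels at the same position share the same orientation in
    # their microlenses, so their phase information can be combined.
    left = [pan_group_1['L1'], pan_group_1['L2'],
            pan_group_2['L1'], pan_group_2['L2']]
    right = [pan_group_1['R1'], pan_group_1['R2'],
             pan_group_2['R1'], pan_group_2['R2']]
    # "Combining" is modeled here as an average; a real sensor might sum
    # charges or digital values instead.
    merged_L = sum(left) / len(left)
    merged_R = sum(right) / len(right)
    return merged_L, merged_R
```

The pair (merged_L, merged_R) corresponds to one set of first target combined phase information L and R; repeating this over the target pixel groups yields the multiple sets of first combined phase information.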
In this embodiment, the vertical direction is named the second direction. If at least two pixels corresponding to a target microlens in the second direction are used as target pixels, combining the phase information of the target pixels in the target pixel group to generate multiple sets of target combined phase information includes:
acquiring at least four target microlenses in the second direction in the target pixel group, and determining target pixels at the same position from pixels corresponding to the at least four target microlenses in the second direction; the target pixels at the same position are at the same orientation in the second direction in the corresponding target microlens;
and combining the phase information of the target pixels at the same position to generate a plurality of groups of second combined phase information.
Referring to fig. 16, a target microlens in the second direction corresponding to the target pixel group is determined, and at least two pixels corresponding to that target microlens in the first panchromatic pixel group and the second panchromatic pixel group are used as target pixels. Target pixels at the same position lie at the same orientation in the second direction within their corresponding target microlenses. Assuming the second direction is the vertical direction, U1, D1, U2, and D2 of the first panchromatic pixel group are output in sequence, U1, D1, U2, and D2 of the second panchromatic pixel group are output in sequence, and the phase information of the target pixels is combined to generate multiple sets of target combined phase information.
Specifically, at least four target microlenses in the second direction (diagonal fill in the figure) are obtained from the first panchromatic pixel group and the second panchromatic pixel group in the target pixel group, and target pixels at the same position are determined from the pixels corresponding to those microlenses. For example, target pixels at the same position are determined from pixels 3, 5, 4, and 7 corresponding to the target microlens 2211a (diagonal fill in the figure) in the second direction in the first panchromatic pixel group, and from pixels 3, 5, 4, and 7 corresponding to the target microlens 2211a (diagonal fill in the figure) in the second direction in the second panchromatic pixel group. That is, the pixel above each second-direction target microlens 2211a is determined as a target pixel, and the pixel below it is likewise determined as a target pixel. Then U1 and U2 in the first panchromatic pixel group and U1 and U2 in the second panchromatic pixel group are combined to generate second target combined phase information U; D1 and D2 in the first panchromatic pixel group and D1 and D2 in the second panchromatic pixel group are combined to generate second target combined phase information D.
And generating multiple sets of second combined phase information based on the multiple sets of second target combined phase information U and second target combined phase information D.
In the embodiment of the application, at least four target microlenses in the second direction in the target pixel group are obtained, and target pixels at the same position are determined from the pixels corresponding to those microlenses. The phase information of the target pixels at the same position is combined to generate multiple sets of second combined phase information. Because the second-size output mode combines the phase information of a larger number of pixels, the accuracy of the output phase information is improved and its signal-to-noise ratio is raised. Finally, performing phase focusing based on the second-size phase array corresponding to the pixel array improves focusing accuracy.
In the above embodiment, if at least two pixels corresponding to the target microlens in the first direction and at least two pixels corresponding to the target microlens in the second direction are used as target pixels, combining the phase information of the target pixels of the two panchromatic pixel groups in the target pixel group to generate multiple sets of target combined phase information includes:
acquiring at least four target microlenses in the first direction of the two panchromatic pixel groups in the target pixel group, and determining target pixels at the same position from pixels corresponding to the at least four target microlenses in the first direction,
combining the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information;
acquiring at least four target microlenses in the second direction of the two panchromatic pixel groups in the target pixel group, and determining target pixels at the same position from pixels corresponding to the at least four target microlenses in the second direction;
and combining the phase information of the target pixels at the same position to generate a plurality of groups of second combined phase information.
Specifically, as shown in fig. 16, a target microlens in the first direction corresponding to the target pixel group is determined, and at least two pixels corresponding to that target microlens in the first panchromatic pixel group and the second panchromatic pixel group are used as target pixels. Assuming the first direction is the horizontal direction, L1, R1, L2, and R2 of the first panchromatic pixel group are output in sequence, L1, R1, L2, and R2 of the second panchromatic pixel group are output in sequence, and the phase information of the target pixels is combined to generate multiple sets of target combined phase information.
Specifically, at least four target microlenses in the first direction (gray fill in the figure) are obtained from the first panchromatic pixel group and the second panchromatic pixel group in the target pixel group, and target pixels at the same position are determined from the pixels corresponding to those microlenses. For example, target pixels at the same position are determined from pixels 1, 2, 8, and 9 corresponding to the target microlens 2211a (gray fill in the figure) in the first direction in the first panchromatic pixel group, and from pixels 1, 2, 8, and 9 corresponding to the target microlens 2211a (gray fill in the figure) in the first direction in the second panchromatic pixel group. That is, the pixel on the left side of each first-direction target microlens 2211a is determined as a target pixel, and the pixel on the right side is likewise determined as a target pixel. Then L1 and L2 in the first panchromatic pixel group and L1 and L2 in the second panchromatic pixel group are combined to generate first target combined phase information L; R1 and R2 in the first panchromatic pixel group and R1 and R2 in the second panchromatic pixel group are combined to generate first target combined phase information R.
And generating multiple sets of first combined phase information based on the multiple sets of first target combined phase information L and first target combined phase information R.
As shown in fig. 16, the target microlens in the first direction and the target microlens in the second direction corresponding to the target pixel group are determined; at least two pixels corresponding to the target microlens in the first direction in the first panchromatic pixel group and the second panchromatic pixel group are used as target pixels, and at least two pixels corresponding to the target microlens in the second direction in the two groups are likewise used as target pixels. Assuming the first direction is horizontal and the second direction is vertical, L1, R1, L2, and R2 of the first panchromatic pixel group are output in sequence, L1, R1, L2, and R2 of the second panchromatic pixel group are output in sequence, U1, D1, U2, and D2 of the first panchromatic pixel group are output in sequence, U1, D1, U2, and D2 of the second panchromatic pixel group are output in sequence, and the phase information of the target pixels is combined to generate multiple sets of target combined phase information.
Specifically, at least four target microlenses in the first direction (gray fill in the figure) are obtained from the first panchromatic pixel group and the second panchromatic pixel group in the target pixel group, and target pixels at the same position are determined from the pixels corresponding to those microlenses. For example, target pixels at the same position are determined from pixels 1, 2, 8, and 9 corresponding to the target microlens 2211a (gray fill in the figure) in the first direction in the first panchromatic pixel group, and from pixels 1, 2, 8, and 9 corresponding to the target microlens 2211a (gray fill in the figure) in the first direction in the second panchromatic pixel group. That is, the pixel on the left side of each first-direction target microlens 2211a is determined as a target pixel, and the pixel on the right side is likewise determined as a target pixel. Then L1 and L2 in the first panchromatic pixel group and L1 and L2 in the second panchromatic pixel group are combined to generate first target combined phase information L; R1 and R2 in the first panchromatic pixel group and R1 and R2 in the second panchromatic pixel group are combined to generate first target combined phase information R.
Further, at least four target microlenses in the second direction (diagonal fill in the figure) are obtained from the first panchromatic pixel group and the second panchromatic pixel group in the target pixel group, and target pixels at the same position are determined from the pixels corresponding to those microlenses. For example, target pixels at the same position are determined from pixels 3, 5, 4, and 7 corresponding to the target microlens 2211a (diagonal fill in the figure) in the second direction in the first panchromatic pixel group, and from pixels 3, 5, 4, and 7 corresponding to the target microlens 2211a (diagonal fill in the figure) in the second direction in the second panchromatic pixel group. That is, the pixel above each second-direction target microlens 2211a is determined as a target pixel, and the pixel below it is likewise determined as a target pixel. Then U1 and U2 in the first panchromatic pixel group and U1 and U2 in the second panchromatic pixel group are combined to generate second target combined phase information U; D1 and D2 in the first panchromatic pixel group and D1 and D2 in the second panchromatic pixel group are combined to generate second target combined phase information D.
And generating multiple sets of target combined phase information based on the multiple sets of first target combined phase information L and first target combined phase information R, together with the multiple sets of second target combined phase information U and second target combined phase information D.
In the embodiment of the application, at least four target microlenses in the first direction and at least four target microlenses in the second direction of the two panchromatic pixel groups in the target pixel group are obtained, and target pixels at the same position are determined from the pixels corresponding to those microlenses. The phase information of the target pixels at the same position is combined to generate multiple sets of first and second combined phase information. Phase information can thus be acquired from the target microlenses in the first direction and the second direction at the same time, and the second-size output mode combines the phase information of a larger number of pixels, improving the accuracy of the output phase information and raising its signal-to-noise ratio. Finally, performing phase focusing based on the second-size phase array corresponding to the pixel array improves focusing accuracy.
In one embodiment, if the target pixel group is two color pixel groups adjacent to each other along a first diagonal direction and two panchromatic pixel groups adjacent to each other along a second diagonal direction in the pixel array, generating a second-size phase array corresponding to the pixel array according to the multiple groups of target merged phase information includes:
determining a third phase weight corresponding to the color pixel group and a fourth phase weight corresponding to the panchromatic pixel group according to the light intensity of the current shooting scene; the third phase weights corresponding to the color pixel groups under different light intensities are different, and the fourth phase weights corresponding to the panchromatic pixel groups under different light intensities are different;
and generating a second-size phase array of the pixel array based on the target combined phase information and the third phase weight of the color pixel groups, and the target combined phase information and the fourth phase weight of the panchromatic pixel groups.
Specifically, when the light intensity of the current shooting scene is smaller than the second preset threshold and the determined target pixel group consists of two color pixel groups adjacent along the first diagonal direction and two panchromatic pixel groups adjacent along the second diagonal direction of the pixel array, the weights of the different pixel groups may be taken into account when generating the second-size phase array from their phase information. The third phase weight of the color pixel groups and the fourth phase weight of the panchromatic pixel groups can be determined according to the light intensity of the current shooting scene. The farther the light intensity falls below the second preset threshold, the smaller the third phase weight of the color pixel groups and the larger the fourth phase weight of the panchromatic pixel groups: at lower light intensities, a larger panchromatic weight yields more accurate phase information. Conversely, as the light intensity rises toward the second preset threshold, the third phase weight of the color pixel groups grows and the fourth phase weight of the panchromatic pixel groups shrinks: near the top of this range, a larger color weight yields more comprehensive and accurate phase information.
The third phase weights of the color pixel groups differ under different light intensities, as do the fourth phase weights of the panchromatic pixel groups. For example, when the light intensity of the current shooting scene is 450 lux, the third phase weight of the color pixel groups may be determined to be 40%, with the green pixel groups weighted 20%, the red pixel groups 10%, and the blue pixel groups 10%, and the fourth phase weight of the panchromatic pixel groups determined to be 60%; the present application is not limited to these values.
A second-size phase array of the pixel array may then be generated based on the sets of target combined phase information and third phase weights of the color pixel groups, and the sets of target combined phase information and fourth phase weights of the panchromatic pixel groups. For example, the phase information of the pixel array in the first direction may be calculated as a weighted sum: the target combined phase information of the first red pixel group at a 10% weight, the second red pixel group at 10%, the first blue pixel group at 10%, the second blue pixel group at 10%, each green pixel group at 20%, and each panchromatic pixel group at 60%, thereby obtaining the second-size phase array.
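The weighted summation in this example can be sketched as follows. The function name, the group labels, and the use of one weight value per group type are illustrative assumptions; the weights follow the 450 lux example above.

```python
def weighted_phase(merged_phase, weights):
    """Weighted sum of per-group target combined phase information.

    merged_phase: dict mapping a pixel-group label to its merged phase value
    weights:      dict mapping the same labels to phase weights
    """
    return sum(merged_phase[g] * weights[g] for g in merged_phase)

# Example weights from the text: green 20%, red 10%, blue 10%,
# panchromatic 60% (one value per group type, as in the 450 lux example).
example_weights = {'red': 0.10, 'blue': 0.10, 'green': 0.20,
                   'panchromatic': 0.60}
```

At a different light intensity the same function would simply be called with a different weight dictionary, which is how the weight adjustment described above enters the computation.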
In the embodiment of the present application, when performing phase focusing, if the determined target pixel group consists of two color pixel groups adjacent along the first diagonal direction and two panchromatic pixel groups adjacent along the second diagonal direction of the pixel array, the second-size phase array of the pixel array may be generated based on the multiple sets of target combined phase information of the color pixel groups and their third phase weights, and the multiple sets of target combined phase information of the panchromatic pixel groups and their fourth phase weights. Generating the second-size phase array from both the color pixel groups and the panchromatic pixel groups improves the comprehensiveness of the phase information. Meanwhile, because the phase weights of the two kinds of pixel groups differ under different light intensities, adjusting the weights for each light intensity improves the accuracy of the phase information.
In one embodiment, before outputting the phase array corresponding to the pixel array according to the phase information output mode, the method further includes:
determining a target pixel array from a plurality of pixel arrays in an image sensor according to a preset extraction ratio and a preset extraction position of the pixel array for focus control;
outputting a phase array corresponding to the pixel array according to a phase information output mode, including:
and outputting a phase array corresponding to the target pixel array according to the phase information output mode.
Specifically, the image sensor has a large area and contains tens of thousands of minimum-unit pixel arrays. If all phase information were extracted from the image sensor for phase focusing, the amount of phase data, and hence the amount of computation, would be excessive, wasting system resources and reducing the image processing speed.
To save system resources and improve the image processing speed, pixel arrays for focus control may be extracted in advance from the plurality of pixel arrays in the image sensor according to a preset extraction ratio and preset extraction positions. For example, extraction may use a preset ratio of about 3%, i.e., one pixel array is extracted from every 32 pixel arrays as a pixel array for focus control. The extracted pixel arrays may be arranged at the vertices of a hexagon, i.e., the extracted pixel arrays form a hexagon, so that the phase information is acquired uniformly. Of course, the present application does not limit the preset extraction ratio or the preset extraction positions.
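The sparse extraction described above can be sketched as follows. The 1-in-32 ratio follows the example in the text; modeling the preset extraction positions as evenly spaced indices is an assumption (the hexagonal placement depends on the sensor layout, which the code does not reproduce).

```python
def select_focus_arrays(num_arrays, ratio=1 / 32):
    """Pick pixel-array indices for focus control at a preset extraction ratio.

    Evenly spaced indices stand in for the preset extraction positions;
    only these arrays will have their phase arrays output and used for
    phase focusing.
    """
    step = round(1 / ratio)
    return list(range(0, num_arrays, step))

# e.g. with 320 minimum-unit pixel arrays, every 32nd one is selected
indices = select_focus_arrays(320)
```

Only the phase information of the selected arrays is then processed, which is what reduces the data volume compared to reading phase information from every pixel array in the sensor.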
Then, the phase information output mode adapted to the light intensity of the current shooting scene is determined from that light intensity. For each pixel array used for focus control, the phase array corresponding to the pixel array is output according to the phase information output mode; the phase array contains the phase information corresponding to the target pixels in the pixel array. Finally, the phase difference of the pixel array is calculated based on the phase array, and focus control is performed according to the phase difference.
In the embodiment of the application, the target pixel arrays are determined from the plurality of pixel arrays in the image sensor according to the preset extraction ratio and the preset extraction positions of the pixel arrays used for focus control. Focusing therefore does not require all the phase information in the image sensor; only the phase information corresponding to the target pixel arrays is used, which greatly reduces the data volume and improves the image processing speed. Meanwhile, determining the target pixel arrays from the plurality of pixel arrays according to the preset extraction positions allows the phase information to be acquired more uniformly. Finally, the accuracy of phase focusing is improved.
In one embodiment, there is provided a focus control method, further comprising:
and determining a first preset threshold and a second preset threshold of the light intensity according to the exposure parameters and the size of the pixel.
Specifically, the thresholds of the light intensity may be determined according to the exposure parameters and the pixel size. The exposure parameters include the shutter speed, the lens aperture size, and the sensitivity (ISO).
In the embodiment of the application, the first preset threshold and the second preset threshold of the light intensity are determined according to the exposure parameters and the pixel size, dividing the light intensity range into three ranges. Each range then uses the phase information output mode corresponding to it, so that the phase information is calculated more finely.
In one embodiment, as shown in fig. 17, there is provided a focus control apparatus 1700 applied to an image sensor, the apparatus including:
the phase information output mode determining module 1720 is configured to determine, according to the light intensity of the current shooting scene, a phase information output mode adapted to the light intensity of the current shooting scene; wherein, under different phase information output modes, the output phase arrays have different sizes;
a phase array output module 1740 configured to output a phase array corresponding to the pixel array according to the phase information output mode; the phase array comprises phase information corresponding to a target pixel in the pixel array;
and a focus control module 1760, configured to calculate the phase difference of the pixel array based on the phase array and perform focus control according to the phase difference.
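As an illustrative sketch of what a phase-difference calculation like that of the focus control module 1760 might look like: the sum-of-absolute-differences shift search below is a common phase-detection approach, assumed here for illustration; the patent does not specify the actual method, and the function name and parameters are hypothetical.

```python
def phase_difference(left, right, max_shift=4):
    """Return the shift (in pixels) that best aligns two phase signals.

    left/right are sequences of phase information from opposite-side
    target pixels (e.g. the merged L and R values along one direction).
    The cost is the mean absolute difference over the overlapping span.
    """
    best_shift, best_cost = 0, float('inf')
    for s in range(-max_shift, max_shift + 1):
        total, n = 0.0, 0
        for i, lv in enumerate(left):
            j = i + s
            if 0 <= j < len(right):
                total += abs(lv - right[j])
                n += 1
        if n == 0:
            continue
        cost = total / n
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift
```

The returned shift plays the role of the phase difference: its sign and magnitude indicate the direction and distance the lens should move for focus control.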
In one embodiment, as shown in fig. 18, the phase array output module 1740 includes:
a target pixel determining unit 1742 configured to determine a target pixel group from the color pixel group and the panchromatic pixel group in the pixel array according to the phase information output mode, determine a target microlens corresponding to the target pixel group, and set at least two pixels corresponding to the target microlens as a target pixel;
a phase information generating unit 1744 configured to acquire phase information of the target pixel for each target pixel group;
a phase array generating unit 1746 configured to generate a phase array corresponding to the pixel array according to the phase information of the target pixel.
In one embodiment, the target pixel determining unit 1742 is further configured to determine a target microlens of a first direction corresponding to the target pixel group; the target micro-lenses in the first direction correspond to at least 2 adjacent target pixels which are arranged along the first direction; and taking at least two pixels corresponding to the target micro-lens in the first direction as target pixels.
In one embodiment, the target pixel determining unit 1742 is further configured to determine a target microlens of a second direction corresponding to the target pixel group; the target micro-lenses in the second direction correspond to at least 2 adjacent target pixels which are arranged along the second direction; and taking at least two pixels corresponding to the target micro-lens in the second direction as target pixels.
In one embodiment, the target pixel determining unit 1742 is further configured to determine a target microlens in a first direction and a target microlens in a second direction corresponding to the target pixel group; the target micro-lenses in the first direction correspond to at least 2 adjacent target pixels which are arranged along the first direction; the target micro-lenses in the second direction correspond to at least 2 adjacent target pixels which are arranged along the second direction; the second direction is perpendicular to the first direction; and taking at least two pixels corresponding to the target micro-lens in the first direction and at least two pixels corresponding to the target micro-lens in the second direction as target pixels.
In one embodiment, the phase information output mode determining module 1720 is further configured to determine a target light intensity range to which the light intensity of the current shooting scene belongs; wherein, different light intensity ranges correspond to different phase information output modes; and determining a phase information output mode adaptive to the light intensity of the current shooting scene according to the target light intensity range.
In one embodiment, the phase information output mode comprises a full-size output mode and a first-size output mode, and the size of the phase array in the full-size output mode is larger than or equal to that in the first-size output mode;
phase information output mode determination module 1720, comprising:
the full-size output mode determining unit is used for determining that the phase information output mode adaptive to the light intensity of the current shooting scene is the full-size output mode if the light intensity of the current shooting scene is larger than a first preset threshold;
the first size output mode determining unit is used for determining that the phase information output mode adaptive to the light intensity of the current shooting scene is the first size output mode if the light intensity of the current shooting scene is greater than the second preset threshold and is less than or equal to the first preset threshold; the first preset threshold is greater than the second preset threshold.
In one embodiment, if the phase information output mode is the full-size output mode, the target pixel determining unit 1742 is further configured to use the color pixel set in the pixel array as the target pixel set; or a color pixel group and a panchromatic pixel group in the pixel array are taken as a target pixel group; wherein one of the target pixel groups corresponds to one pixel group;
Accordingly,
the phase array generating unit 1746 is further configured to generate a full-scale phase array corresponding to the pixel array according to the phase information of the target pixel.
In one embodiment, if the phase information output mode is the first size output mode, the target pixel determination unit 1742 is further configured to use at least one of a color pixel group and a panchromatic pixel group in the pixel array as the target pixel group; the target pixel group corresponds to a pixel group;
Accordingly,
a phase array generating unit 1746, configured to combine the phase information of the target pixels in the target pixel group to generate multiple sets of intermediate combined phase information; and generating a first size phase array corresponding to the pixel array according to the plurality of sets of intermediate combined phase information.
In an embodiment, if the target pixel is at least two pixels corresponding to the target microlens in the first direction, the phase array generating unit 1746 is further configured to obtain at least two target microlenses in the first direction in the target pixel group, and determine the target pixel at the same position from the pixels corresponding to the at least two target microlenses in the first direction; the co-located target pixels are in the same orientation in the first direction in the corresponding target microlens; and combining the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information.
In an embodiment, if the target pixel is at least two pixels corresponding to the target microlens in the second direction, the phase array generating unit 1746 is further configured to obtain at least two target microlenses in the second direction in the target pixel group, and determine the target pixel at the same position from the pixels corresponding to the at least two target microlenses in the second direction; the target pixels at the same position are at the same orientation in the second direction in the corresponding target microlens; and combining the phase information of the target pixels at the same position to generate a plurality of groups of second combined phase information.
In an embodiment, if the target pixel is at least two pixels corresponding to the target microlens in the first direction and at least two pixels corresponding to the target microlens in the second direction, the phase array generating unit 1746 is further configured to obtain at least two target microlenses in the first direction in the target pixel group, and determine the target pixels at the same position from the pixels corresponding to the at least two target microlenses in the first direction; the co-located target pixels are at the same orientation in the first direction in the corresponding target microlens; combine the phase information of the target pixels at the same position to generate multiple sets of first combined phase information; obtain at least two target microlenses in the second direction in the target pixel group, and determine the target pixels at the same position from the pixels corresponding to the at least two target microlenses in the second direction; the target pixels at the same position are at the same orientation in the second direction in the corresponding target microlens; combine the phase information of the target pixels at the same position to generate multiple sets of second combined phase information; and generate multiple sets of intermediate combined phase information based on the multiple sets of first combined phase information and the multiple sets of second combined phase information.
In one embodiment, if the target pixel group includes a color pixel group and a panchromatic pixel group, the phase array generating unit 1746 is further configured to determine a first phase weight corresponding to the color pixel group and a second phase weight corresponding to the panchromatic pixel group according to the light intensity of the current shooting scene; the first phase weights corresponding to the color pixel groups under different light intensities are different, and the second phase weights corresponding to the panchromatic pixel groups under different light intensities are different; and generating a first size phase array of the pixel array based on the multiple groups of intermediate combined phase information and first phase weights corresponding to the color pixel groups and the multiple groups of intermediate combined phase information and second phase weights corresponding to the panchromatic pixel groups.
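The light-intensity-dependent weighting described above can be sketched as follows; the linear ramp between two assumed intensity bounds is illustrative only, the embodiment requiring merely that the weights differ under different light intensities:

```python
import numpy as np

def phase_weights(light_intensity, lo=50.0, hi=500.0):
    """Return (first_weight, second_weight) for the color and
    panchromatic pixel groups. The bounds lo/hi and the linear ramp
    are assumptions: brighter scenes favor the color groups, dimmer
    scenes favor the more light-sensitive panchromatic groups."""
    w_color = float(np.clip((light_intensity - lo) / (hi - lo), 0.0, 1.0))
    return w_color, 1.0 - w_color

def first_size_phase_array(color_phase, pan_phase, light_intensity):
    """Weighted combination of the intermediate combined phase
    information of the color and panchromatic pixel groups."""
    w1, w2 = phase_weights(light_intensity)
    return w1 * np.asarray(color_phase, dtype=float) + \
           w2 * np.asarray(pan_phase, dtype=float)
```

At the midpoint intensity both groups contribute equally; at the extremes one group's phase information dominates.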
In one embodiment, the phase information output mode further comprises a second size output mode; the size of the phase array in the first size output mode is larger than or equal to the size of the phase array in the second size output mode;
phase information output mode determination module 1720, comprising:
and the second size output mode determining unit is used for determining that the phase information output mode adaptive to the light intensity of the current shooting scene is the second size output mode if the light intensity of the current shooting scene is smaller than a second preset threshold.
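Taken together with the full-size and first size output modes described in the corresponding method embodiments, the threshold logic can be sketched as follows (the numeric thresholds in the test values are invented; only the ordering t1 > t2 comes from the embodiments):

```python
def select_output_mode(light_intensity, t1, t2):
    """Map the light intensity of the current shooting scene to a
    phase information output mode, assuming t1 > t2 (the first and
    second preset thresholds)."""
    if light_intensity > t1:
        return "full_size"    # bright scene: largest phase array
    if light_intensity > t2:
        return "first_size"   # medium scene: phases binned within a pixel group
    return "second_size"      # dim scene: phases binned across pixel groups
```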
In one embodiment, if the phase information output mode is the second size output mode, the target pixel determining unit 1742 is further configured to use two color pixel groups adjacent along a first diagonal direction as a target pixel group and two panchromatic pixel groups adjacent along a second diagonal direction as a target pixel group in the pixel array;
a phase array generating unit 1746, further configured to combine the phase information of the target pixel of two panchromatic pixel groups in the target pixel group, combine the phase information of the target pixel of two color pixel groups in the target pixel group, and generate multiple sets of target combined phase information; and generating a second-size phase array corresponding to the pixel array according to the multiple groups of target combined phase information.
In one embodiment, if the phase information output mode is the second size output mode, the target pixel determining unit 1742 is further configured to take two panchromatic pixel groups adjacent in the second diagonal direction as the target pixel group;
a phase array generating unit 1746, further configured to combine the phase information of the target pixels of two panchromatic pixel groups in the target pixel group, and generate multiple sets of target combined phase information; and generating a second-size phase array corresponding to the pixel array according to the multiple groups of target combined phase information.
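A minimal sketch of this cross-group binning, under an assumed (p0, p1) pair representation of per-microlens phase information — co-located phase values from the two diagonally adjacent panchromatic pixel groups are summed into a single pair:

```python
def second_size_combined(group_a, group_b):
    """group_a, group_b: lists of (p0, p1) phase pairs from the two
    panchromatic pixel groups adjacent along the second diagonal.
    All co-located values are summed into one pair, so the output
    phase array is smaller than in the first size output mode.
    The pair representation and summation are illustrative assumptions."""
    pairs = group_a + group_b
    return (sum(p[0] for p in pairs), sum(p[1] for p in pairs))
```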
In one embodiment, if at least two pixels corresponding to the target microlens in the first direction are taken as the target pixel, the phase array generating unit 1746 is further configured to obtain at least four target microlenses in the first direction of the two panchromatic pixel groups in the target pixel group, and determine target pixels at the same position from the pixels corresponding to the at least four target microlenses in the first direction; the co-located target pixels are in the same orientation in the first direction in the corresponding target microlens; and combine the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information.
In one embodiment, if at least two pixels corresponding to the target microlens in the second direction are taken as the target pixel, the phase array generating unit 1746 is further configured to obtain at least four target microlenses in the second direction of the two panchromatic pixel groups in the target pixel group, and determine target pixels at the same position from the pixels corresponding to the at least four target microlenses in the second direction; the target pixels at the same position are at the same orientation in the second direction in the corresponding target microlens; and combine the phase information of the target pixels at the same position to generate a plurality of groups of second combined phase information.
In one embodiment, if at least two pixels corresponding to the target microlens in the first direction and at least two pixels corresponding to the target microlens in the second direction are used as the target pixels, the phase array generating unit 1746 is further configured to
acquiring at least four target microlenses in the first direction of the two panchromatic pixel groups in the target pixel group, and determining target pixels at the same position from the pixels corresponding to the at least four target microlenses in the first direction; the co-located target pixels are in the same orientation in the first direction in the corresponding target microlens; and combining the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information;
acquiring at least four target microlenses in the second direction of the two panchromatic pixel groups in the target pixel group, and determining target pixels at the same position from the pixels corresponding to the at least four target microlenses in the second direction; the target pixels at the same position are at the same orientation in the second direction in the corresponding target microlens; and combining the phase information of the target pixels at the same position to generate a plurality of groups of second combined phase information.
In one embodiment, if the target pixel group is two color pixel groups adjacent to each other along a first diagonal direction and two panchromatic pixel groups adjacent to each other along a second diagonal direction in the pixel array, the phase array generating unit 1746 is further configured to determine a third phase weight corresponding to the color pixel group and a fourth phase weight corresponding to the panchromatic pixel group according to the light intensity of the current shooting scene; the third phase weights corresponding to the color pixel groups under different light intensities are different, and the fourth phase weights corresponding to the panchromatic pixel groups under different light intensities are different; and generate a second-size phase array of the pixel array based on the multiple groups of target combined phase information and the third phase weights corresponding to the color pixel groups and the multiple groups of target combined phase information and the fourth phase weights corresponding to the panchromatic pixel groups.
In one embodiment, there is provided a focus control apparatus, further comprising:
the target pixel array determining module is used for determining a target pixel array from a plurality of pixel arrays in the image sensor according to a preset extraction proportion and a preset extraction position of the pixel array for focusing control;
the phase array output module 1740 is further configured to output a phase array corresponding to the target pixel array according to the phase information output mode.
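The extraction step can be sketched as follows; mapping the preset extraction proportion and preset extraction position onto an index stride and offset is an assumption made purely for illustration:

```python
def select_target_arrays(num_arrays, ratio, offset=0):
    """Pick indices of the pixel arrays used for focus control.
    `ratio` is the preset extraction proportion (e.g. 0.25 keeps
    every fourth array) and `offset` is the preset extraction
    position; both parameter meanings are illustrative assumptions."""
    step = max(1, round(1 / ratio))
    return list(range(offset, num_arrays, step))
```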
In one embodiment, there is provided a focus control apparatus, further comprising:
and the threshold value determining module is used for determining a first preset threshold value, a second preset threshold value and a third preset threshold value of the light intensity according to the exposure parameters and the size of the pixel.
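One plausible reading of this dependence, sketched for illustration only (the inverse scaling law and the constants k1, k2 are assumptions; the embodiment states only that the thresholds depend on the exposure parameters and the pixel size):

```python
def light_thresholds(exposure_time_s, iso, pixel_size_um, k1=8.0, k2=2.0):
    """Derive the first and second preset light-intensity thresholds.
    Longer exposure, higher gain, and a larger pixel area all collect
    more signal, so the thresholds for switching to a smaller phase
    array can be placed lower. k1 and k2 are assumed constants."""
    gain = exposure_time_s * (iso / 100.0) * pixel_size_um ** 2
    t1 = k1 / gain  # above t1: full-size output mode
    t2 = k2 / gain  # at or below t2: second size output mode
    return t1, t2
```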
It should be understood that, although the steps in the above flowcharts are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the order of execution of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the above flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different moments, and whose order of execution is not necessarily sequential; they may be performed in turn or in alternation with other steps or with at least some of the sub-steps or stages of other steps.
The division of the modules in the focusing control device is merely for illustration, and in other embodiments, the focusing control device may be divided into different modules as needed to complete all or part of the functions of the focusing control device.
For the specific definition of the focus control apparatus, reference may be made to the definition of the focus control method above, and details are not repeated here. The modules in the focus control apparatus may be implemented in whole or in part by software, hardware, or a combination of the two. The modules may be embedded in, or independent of, a processor in a computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to the modules.
Fig. 19 is a schematic diagram of the internal structure of an electronic device in one embodiment. The electronic device may be any terminal device such as a mobile phone, tablet computer, notebook computer, desktop computer, PDA (Personal Digital Assistant), POS (Point of Sale) terminal, vehicle-mounted computer, or wearable device. The electronic device includes a processor and a memory connected by a system bus. The processor may include one or more processing units and may be a CPU (Central Processing Unit), a DSP (Digital Signal Processor), or the like. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program can be executed by the processor to implement the focus control method provided in the embodiments of the present application. The internal memory provides a cached execution environment for the operating system and the computer program in the non-volatile storage medium.
Each module in the focus control apparatus provided in the embodiments of the present application may be implemented in the form of a computer program. The computer program may run on an electronic device, and the program modules constituting it may be stored in the memory of the electronic device. When the computer program is executed by a processor, the steps of the methods described in the embodiments of the present application are performed.
Embodiments of the present application also provide a computer-readable storage medium: one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the focus control method.
Embodiments of the present application also provide a computer program product containing instructions that, when run on a computer, cause the computer to perform a focus control method.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. The non-volatile memory may include ROM (Read-Only Memory), PROM (Programmable ROM), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), or flash memory. The volatile memory may include RAM (Random Access Memory), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as SRAM (Static RAM), DRAM (Dynamic RAM), SDRAM (Synchronous Dynamic RAM), DDR SDRAM (Double Data Rate Synchronous Dynamic RAM), ESDRAM (Enhanced Synchronous Dynamic RAM), SLDRAM (Synchronous Link Dynamic RAM), RDRAM (Rambus Dynamic RAM), and DRDRAM (Direct Rambus Dynamic RAM).
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be noted that those skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (33)

1. An image sensor comprising a microlens array, a pixel array, and a filter array, the filter array comprising a minimal repeating unit comprising a plurality of filter sets, the filter sets comprising a color filter and a panchromatic filter; the color filter has a narrower spectral response than the panchromatic filter, and the color filter and the panchromatic filter each comprise 9 sub-filters arranged in an array;
wherein the pixel array comprises a plurality of pixel groups, the pixel groups are panchromatic pixel groups or color pixel groups, each panchromatic pixel group corresponds to the panchromatic filter, and each color pixel group corresponds to the color filter; the panchromatic pixel group and the color pixel group respectively comprise 9 pixels, the pixels of the pixel array are arranged corresponding to the sub-filters of the filter array, and each pixel corresponds to one photosensitive element;
the micro lens array comprises a plurality of micro lens groups, each micro lens group corresponds to the pixel group, the micro lens groups comprise a plurality of micro lenses, and at least one micro lens in the micro lenses corresponds to at least two pixels.
2. The image sensor of claim 1, wherein the number of filter sets is 4, and 4 of the filter sets are arranged in a matrix.
3. The image sensor of claim 2, wherein in each of the filter sets, the panchromatic filter is disposed in a first diagonal direction and the color filter is disposed in a second diagonal direction, the first diagonal direction being different from the second diagonal direction.
4. The image sensor of claim 3, wherein the filter set comprises 2 panchromatic filters and 2 color filters, and the minimal repeating unit is 12 rows by 12 columns of 144 sub-filters, arranged as follows:
[12 × 12 sub-filter arrangement matrix, originally formula image FDA0003436988150000011; not reproduced here]
where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
5. The image sensor of claim 3, wherein the filter set comprises 2 panchromatic filters and 2 color filters, and the minimal repeating unit is 12 rows by 12 columns of 144 sub-filters, arranged as follows:
[12 × 12 sub-filter arrangement matrix, originally formula image FDA0003436988150000012; not reproduced here]
where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
6. The image sensor of claim 1, wherein the microlens group comprises 5 microlenses, the 5 microlenses comprising 4 first microlenses and 1 second microlens; each first microlens corresponds to 2 pixels, and the second microlens corresponds to 1 pixel.
7. The image sensor of claim 6, wherein the 4 first microlenses comprise 2 first microlenses in a first direction and 2 first microlenses in a second direction; the first microlenses in the first direction are arranged along the first direction in the microlens group, and the first microlenses in the second direction are arranged along the second direction in the microlens group;
the second microlens is located at the center of the microlens group and corresponds to the central pixel of the pixel group corresponding to the microlens group, or the second microlens is located at one of the four corners of the microlens group and corresponds to the pixel at that corner in the pixel group corresponding to the microlens group.
8. The image sensor of claim 7, wherein, if the second microlens is located at the center of the microlens group, the 2 first microlenses in the first direction and the 2 first microlenses in the second direction are disposed around the second microlens;
the 2 first microlenses in the first direction are arranged centrally symmetrically about the second microlens along the first diagonal direction, and the 2 first microlenses in the second direction are arranged centrally symmetrically about the second microlens along the second diagonal direction.
9. The image sensor of claim 1, wherein one of the plurality of microlenses corresponds to at least four pixels.
10. The image sensor of claim 9, wherein the microlens group comprises 4 microlenses, the 4 microlenses comprising 1 third microlens, 2 fourth microlenses, and 1 fifth microlens; the third microlens corresponds to 4 pixels, each fourth microlens corresponds to 2 pixels, and the fifth microlens corresponds to 1 pixel.
11. A focus control method applied to the image sensor according to any one of claims 1 to 10, the method comprising:
determining a phase information output mode adaptive to the light intensity of the current shooting scene according to the light intensity of the current shooting scene; wherein, under different phase information output modes, the output phase arrays have different sizes;
outputting a phase array corresponding to the pixel array according to the phase information output mode; the phase array comprises phase information corresponding to a target pixel in the pixel array;
and calculating the phase difference of the pixel array based on the phase array, and performing focusing control according to the phase difference.
12. The method of claim 11, wherein outputting a phased array corresponding to the pixel array in the phase information output mode comprises:
according to the phase information output mode, determining a target pixel group from the color pixel group and the panchromatic pixel group in the pixel array, determining a target microlens corresponding to the target pixel group, and taking at least two pixels corresponding to the target microlens as target pixels;
for each target pixel group, acquiring phase information of the target pixel;
and generating a phase array corresponding to the pixel array according to the phase information of the target pixel.
13. The method of claim 12, wherein determining a target microlens corresponding to the target pixel group, and wherein taking at least two pixels corresponding to the target microlens as a target pixel comprises:
determining a target microlens of a first direction corresponding to the target pixel group; the target micro-lenses in the first direction correspond to at least 2 adjacent target pixels which are arranged along the first direction;
and taking at least two pixels corresponding to the target micro-lens in the first direction as target pixels.
14. The method of claim 12, wherein determining a target microlens corresponding to the target pixel group, and wherein taking at least two pixels corresponding to the target microlens as a target pixel comprises:
determining a target microlens in a first direction and a target microlens in a second direction corresponding to the target pixel group; the target micro-lenses in the first direction correspond to at least 2 adjacent target pixels which are arranged along the first direction; the target micro-lenses in the second direction correspond to at least 2 adjacent target pixels which are arranged along the second direction; the second direction is perpendicular to the first direction;
and taking at least two pixels corresponding to the target micro-lens in the first direction and at least two pixels corresponding to the target micro-lens in the second direction as target pixels.
15. The method according to claim 13 or 14, wherein determining the phase information output mode adapted to the light intensity of the current shooting scene according to the light intensity of the current shooting scene comprises:
determining a target light intensity range to which the light intensity of the current shooting scene belongs; wherein, different light intensity ranges correspond to different phase information output modes;
and determining a phase information output mode adaptive to the light intensity of the current shooting scene according to the target light intensity range.
16. The method of claim 15, wherein the phase information output mode comprises a full-size output mode and a first size output mode, and wherein the size of the phase array in the full-size output mode is greater than or equal to the size of the phase array in the first size output mode;
the determining a phase information output mode adapted to the light intensity of the current shooting scene according to the target light intensity range includes:
if the light intensity of the current shooting scene is larger than a first preset threshold value, determining that a phase information output mode adaptive to the light intensity of the current shooting scene is the full-size output mode;
if the light intensity of the current shooting scene is larger than a second preset threshold and smaller than or equal to the first preset threshold, determining that the phase information output mode adaptive to the light intensity of the current shooting scene is the first size output mode; the first preset threshold is greater than the second preset threshold.
17. The method of claim 16, wherein, if the phase information output mode is the full-size output mode, the determining a target pixel group from the color pixel group and the panchromatic pixel group in the pixel array comprises:
taking a color pixel group in the pixel array as a target pixel group, or taking a color pixel group and a panchromatic pixel group in the pixel array as a target pixel group; wherein one of the target pixel groups corresponds to one pixel group;
correspondingly,
generating a phase array corresponding to the pixel array according to the phase information of the target pixel, including:
and generating a full-size phase array corresponding to the pixel array according to the phase information of the target pixel.
18. The method of claim 16, wherein, if the phase information output mode is the first size output mode, the determining a target pixel group from the color pixel group and the panchromatic pixel group in the pixel array comprises:
at least one of a color pixel group and a panchromatic pixel group in the pixel array is taken as a target pixel group; one said target pixel group corresponds to one pixel group;
correspondingly,
generating a phase array corresponding to the pixel array according to the phase information of the target pixel, including:
combining the phase information of the target pixels in the target pixel group to generate a plurality of groups of intermediate combined phase information;
and generating a first size phase array corresponding to the pixel array according to the plurality of sets of intermediate combined phase information.
19. The method of claim 18, wherein if the target pixel is at least two pixels corresponding to the target microlens in the first direction, the combining the phase information of the target pixel to generate a plurality of sets of intermediate combined phase information comprises:
acquiring at least two target microlenses in the first direction in the target pixel group, and determining target pixels at the same position from pixels corresponding to the at least two target microlenses in the first direction; the co-located target pixels are in the same orientation in the first direction in the corresponding target microlens;
and combining the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information.
20. The method of claim 18, wherein if the target pixel is at least two pixels corresponding to the target microlens in the first direction and at least two pixels corresponding to the target microlens in the second direction, the combining the phase information of the target pixels in the target pixel group to generate a plurality of sets of intermediate combined phase information comprises:
acquiring at least two target microlenses in the first direction in the target pixel group, and determining target pixels at the same position from pixels corresponding to the at least two target microlenses in the first direction; the co-located target pixels are in the same orientation in the first direction in the corresponding target microlens;
combining the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information;
acquiring at least two target microlenses in the second direction in the target pixel group, and determining target pixels at the same position from the pixels corresponding to the at least two target microlenses in the second direction; the target pixels at the same position are at the same orientation in the second direction in the corresponding target microlens;
combining the phase information of the target pixels at the same position to generate a plurality of groups of second combined phase information;
and generating a plurality of groups of intermediate combined phase information based on the plurality of groups of first combined phase information and the plurality of groups of second combined phase information.
21. The method of claim 18, wherein, if the target pixel group comprises a color pixel group and a panchromatic pixel group, the generating a first size phase array corresponding to the pixel array based on the plurality of sets of intermediate combined phase information comprises:
determining a first phase weight corresponding to the color pixel group and a second phase weight corresponding to the panchromatic pixel group according to the light intensity of the current shooting scene;
generating a first size phase array of the pixel array based on the sets of intermediate combined phase information and the first phase weights corresponding to the color pixel sets and the sets of intermediate combined phase information and the second phase weights corresponding to the panchromatic pixel sets.
22. The method of claim 16, wherein the phase information output mode further comprises a second size output mode; the size of the phase array in the first size output mode is larger than or equal to the size of the phase array in the second size output mode;
the determining a phase information output mode adapted to the light intensity of the current shooting scene according to the target light intensity range includes:
and if the light intensity of the current shooting scene is smaller than a second preset threshold, determining that the phase information output mode adaptive to the light intensity of the current shooting scene is the second size output mode.
23. The method of claim 22, wherein, if the phase information output mode is the second size output mode, the determining a target pixel group from the color pixel group and the panchromatic pixel group in the pixel array comprises:
two color pixel groups adjacent along a first diagonal direction in the pixel array are used as a target pixel group, and two panchromatic pixel groups adjacent along a second diagonal direction in the pixel array are used as a target pixel group;
generating a phase array corresponding to the pixel array according to the phase information of the target pixel, including:
combining the phase information of the target pixels of the two panchromatic pixel groups in the target pixel group, and combining the phase information of the target pixels of the two color pixel groups in the target pixel group to generate a plurality of groups of target combined phase information;
and generating a second-size phase array corresponding to the pixel array according to the multiple groups of target combined phase information.
24. The method of claim 22, wherein, if the phase information output mode is the second size output mode, the determining a target pixel group from the color pixel group and the panchromatic pixel group in the pixel array comprises:
taking the two panchromatic pixel groups adjacent in the second diagonal direction as a target pixel group;
generating a phase array corresponding to the pixel array according to the phase information of the target pixel, including:
combining the phase information of the target pixels of two panchromatic pixel groups in the target pixel group to generate a plurality of groups of target combined phase information;
and generating a second-size phase array corresponding to the pixel array according to the multiple groups of target combined phase information.
25. The method of claim 24, wherein if at least two pixels corresponding to the target microlens in the first direction are used as target pixels, the combining phase information of the target pixels of two panchromatic pixel groups in the target pixel group to generate a plurality of sets of target combined phase information comprises:
acquiring at least four target microlenses in the first direction of the two panchromatic pixel groups in the target pixel group, and determining target pixels at the same position from the pixels corresponding to the at least four target microlenses in the first direction; the co-located target pixels are in the same orientation in the first direction in the corresponding target microlens;
and combining the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information.
26. The method of claim 24, wherein if at least two pixels corresponding to the target microlens in the first direction and at least two pixels corresponding to the target microlens in the second direction are used as target pixels, the combining phase information of the target pixels of two panchromatic pixel groups in the target pixel group to generate a plurality of sets of target combined phase information comprises:
acquiring at least four target microlenses in the first direction of the two panchromatic pixel groups in the target pixel group, and determining target pixels at the same position from the pixels corresponding to the at least four target microlenses in the first direction; the co-located target pixels are in the same orientation in the first direction in the corresponding target microlens;
combining the phase information of the target pixels at the same position to generate a plurality of groups of first combined phase information;
acquiring at least four target microlenses in the second direction of the two panchromatic pixel groups in the target pixel group, and determining target pixels at the same position from the pixels corresponding to the at least four target microlenses in the second direction; the target pixels at the same position are at the same orientation in the second direction in the corresponding target microlens;
and combining the phase information of the target pixels at the same position to generate a plurality of groups of second combined phase information.
27. The method of claim 23, wherein the generating a second-size phase array corresponding to the pixel array based on the plurality of sets of target combined phase information comprises:
determining a third phase weight corresponding to the color pixel group and a fourth phase weight corresponding to the panchromatic pixel group according to the light intensity of the current shooting scene;
generating a second-size phase array of the pixel array based on the sets of target combined phase information and the third phase weights corresponding to the color pixel groups and the sets of target combined phase information and the fourth phase weights corresponding to the panchromatic pixel groups.
28. The method of claim 11, further comprising, before the outputting of the phase array corresponding to the pixel array according to the phase information output mode:
determining target pixels for focus control from a plurality of pixel arrays in the image sensor according to a preset extraction ratio and a preset extraction position of each pixel array;
wherein the outputting of the phase array corresponding to the pixel array according to the phase information output mode comprises:
outputting the phase array corresponding to the pixel array according to the phase information output mode.
29. The method of claim 16, further comprising:
determining a first preset threshold and a second preset threshold of the light intensity according to exposure parameters and a size of the pixels.
30. A focus control apparatus applied to the image sensor according to any one of claims 1 to 10, the apparatus comprising:
a phase information output mode determining module, configured to determine a phase information output mode adapted to the light intensity of the current shooting scene according to that light intensity, wherein phase arrays output in different phase information output modes have different sizes;
a phase array output module, configured to output a phase array corresponding to the pixel array according to the phase information output mode, wherein the phase array comprises phase information corresponding to target pixels in the pixel array; and
a focus control module, configured to calculate a phase difference of the pixel array based on the phase array and perform focus control according to the phase difference.
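The three modules of claim 30 form a simple pipeline: pick an output mode from the light intensity, emit the corresponding phase array, then derive a phase difference for focusing. A schematic sketch follows; the class and method names, the mode threshold, the binning step, and the averaged left-right difference are all invented placeholders, not the patented computation.

```python
class FocusController:
    """Schematic pipeline mirroring the three claimed modules."""

    def determine_output_mode(self, light_intensity):
        # Phase information output mode determining module: choose a
        # mode (and hence a phase-array size) from the light intensity.
        return "full" if light_intensity > 100.0 else "binned"

    def output_phase_array(self, pixel_phases, mode):
        # Phase array output module: phase info of the target pixels;
        # the binned mode here simply subsamples by 2 for illustration.
        step = 1 if mode == "full" else 2
        return pixel_phases[::step]

    def focus(self, pixel_phases, light_intensity):
        # Focus control module: phase array -> phase difference.
        mode = self.determine_output_mode(light_intensity)
        phases = self.output_phase_array(pixel_phases, mode)
        left, right = phases[0::2], phases[1::2]
        phase_diff = sum(l - r for l, r in zip(left, right)) / len(left)
        return phase_diff  # would drive the lens motor in practice

ctrl = FocusController()
print(ctrl.determine_output_mode(20.0))  # binned
```

The sign and magnitude of the returned phase difference would then determine the direction and distance of the lens movement during focus control.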
31. An electronic device comprising a memory and a processor, the memory having a computer program stored thereon, wherein the computer program, when executed by the processor, causes the processor to perform the steps of the focus control method according to any one of claims 11 to 29.
32. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the focus control method according to any one of claims 11 to 29.
33. A computer program product comprising a computer program or instructions, wherein the computer program or instructions, when executed by a processor, implement the steps of the focus control method according to any one of claims 11 to 29.
CN202111617501.0A 2021-12-27 2021-12-27 Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium Pending CN114222047A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111617501.0A CN114222047A (en) 2021-12-27 2021-12-27 Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium
PCT/CN2022/132425 WO2023124611A1 (en) 2021-12-27 2022-11-17 Focus control method and apparatus, image sensor, electronic device, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111617501.0A CN114222047A (en) 2021-12-27 2021-12-27 Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN114222047A 2022-03-22

Family

ID=80706335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111617501.0A Pending CN114222047A (en) 2021-12-27 2021-12-27 Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN114222047A (en)
WO (1) WO2023124611A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023124611A1 (en) * 2021-12-27 2023-07-06 Oppo广东移动通信有限公司 Focus control method and apparatus, image sensor, electronic device, and computer-readable storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN112118378A (en) * 2020-10-09 2020-12-22 Oppo广东移动通信有限公司 Image acquisition method and device, terminal and computer readable storage medium
CN113660415A (en) * 2021-08-09 2021-11-16 Oppo广东移动通信有限公司 Focus control method, device, imaging apparatus, electronic apparatus, and computer-readable storage medium

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
EP2028843B1 (en) * 2007-08-21 2013-10-23 Ricoh Company, Ltd. Focusing device and imaging apparatus using the same
US9692992B2 (en) * 2013-07-01 2017-06-27 Omnivision Technologies, Inc. Color and infrared filter array patterns to reduce color aliasing
CN213279832U (en) * 2020-10-09 2021-05-25 Oppo广东移动通信有限公司 Image sensor, camera and terminal
CN113286067B (en) * 2021-05-25 2023-05-26 Oppo广东移动通信有限公司 Image sensor, image pickup apparatus, electronic device, and imaging method
CN114222047A (en) * 2021-12-27 2022-03-22 Oppo广东移动通信有限公司 Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium


Also Published As

Publication number Publication date
WO2023124611A1 (en) 2023-07-06

Similar Documents

Publication Publication Date Title
CN102197639B (en) For the formation of method and the digital imaging apparatus of image
CN213279832U (en) Image sensor, camera and terminal
US7812869B2 (en) Configurable pixel array system and method
TWI504257B (en) Exposing pixel groups in producing digital images
US10128284B2 (en) Multi diode aperture simulation
RU2490715C1 (en) Image capturing device
WO2023087908A1 (en) Focusing control method and apparatus, image sensor, electronic device, and computer readable storage medium
CN112118378A (en) Image acquisition method and device, terminal and computer readable storage medium
CN111711755B (en) Image processing method and device, terminal and computer readable storage medium
CN116684752A (en) Image sensor and method of operating the same
CN113660415A (en) Focus control method, device, imaging apparatus, electronic apparatus, and computer-readable storage medium
CN111131798B (en) Image processing method, image processing apparatus, and imaging apparatus
CN113840067B (en) Image sensor, image generation method and device and electronic equipment
CN112866549A (en) Image processing method and device, electronic equipment and computer readable storage medium
US11245878B2 (en) Quad color filter array image sensor with aperture simulation and phase detection
WO2023124611A1 (en) Focus control method and apparatus, image sensor, electronic device, and computer-readable storage medium
US9497427B2 (en) Method and apparatus for image flare mitigation
CN114125318A (en) Image sensor, camera module, electronic equipment, image generation method and device
CN112866655A (en) Image processing method and device, electronic equipment and computer readable storage medium
US8049796B2 (en) Method of correcting sensitivity and imaging apparatus
CN112866554B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN113676617A (en) Motion detection method, motion detection device, electronic equipment and computer-readable storage medium
CN112866548A (en) Phase difference acquisition method and device and electronic equipment
JP2019140696A (en) Solid-state imaging device
CN112866552B (en) Focusing method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination