CN113660415A - Focus control method, device, imaging apparatus, electronic apparatus, and computer-readable storage medium


Info

Publication number: CN113660415A
Authority: CN (China)
Prior art keywords: pixel, pixels, sub, array, phase information
Legal status: Pending
Application number: CN202110909146.8A
Other languages: Chinese (zh)
Inventor: 王文涛
Current Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by: Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority application: CN202110909146.8A
Publication: CN113660415A
Related PCT application: PCT/CN2022/103859 (WO2023016144A1)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/67: Focus control based on electronic image sensor signals
    • H04N23/672: Focus control based on electronic image sensor signals based on the phase difference signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

A focus control method, apparatus, imaging device, electronic device, and computer-readable storage medium. The method is applied to an electronic device whose image sensor includes an RGBW pixel array, and comprises: determining, according to the light intensity of the current shooting scene, a target pixel corresponding to that light intensity from the RGBW pixel array, where the target pixel is either a W pixel or at least one color pixel of the array; acquiring phase information of the target pixel; calculating a phase difference from that phase information; and performing focus control based on the phase difference. Because the target pixel is chosen to match the light intensity of the current shooting scene, the phase difference is calculated from the phase information of whichever pixels are most reliable under that illumination, which ultimately improves the accuracy of phase focusing.

Description

Focus control method, device, imaging apparatus, electronic apparatus, and computer-readable storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a focus control method and apparatus, an imaging device, an electronic device, and a computer-readable storage medium.
Background
With the development of electronic devices, more and more users take images with them. To ensure that a captured image is sharp, the camera module of an electronic device generally needs to focus, that is, adjust the distance between the lens and the image sensor so that the subject lies on the focal plane. A common focusing method is Phase Detection Auto Focus (PDAF).
Conventional phase detection autofocus calculates a phase difference from an RGB pixel array and uses it to control a motor, which drives the lens to a position at which the subject is imaged on the focal plane.
However, because the sensitivity of an RGB pixel array varies with light intensity, the phase difference calculated from it is inaccurate under some light intensities, which greatly reduces focusing accuracy.
Disclosure of Invention
The embodiment of the application provides a focusing control method and device, imaging equipment, electronic equipment and a computer readable storage medium, which can improve the accuracy of focusing control.
A focus control method applied to an electronic device including an image sensor including an RGBW pixel array, the method comprising:
determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene; the target pixel comprises a W pixel or at least one color pixel in the RGBW pixel array;
acquiring phase information of the target pixel, and calculating a phase difference according to the phase information of the target pixel;
and performing focusing control based on the phase difference.
The imaging device comprises a lens, an optical filter and an image sensor, wherein the lens, the optical filter and the image sensor are sequentially positioned on an incident light path;
the image sensor comprises a plurality of RGBW pixel arrays arranged in an array, each RGBW pixel array comprises a plurality of pixel units, each pixel unit comprises a W pixel arranged in a diagonal line and a color pixel arranged in another diagonal line, and each pixel corresponds to one micro lens and a plurality of photosensitive elements; each pixel comprises a plurality of sub-pixels arranged in an array, and each sub-pixel corresponds to one photosensitive element; the color pixels include R pixels, G pixels and B pixels.
A focus control apparatus applied to an electronic device including an image sensor including an RGBW pixel array, the apparatus comprising:
the target pixel determining module is used for determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene; the target pixel comprises a W pixel or at least one color pixel in the RGBW pixel array;
the phase difference calculation module is used for acquiring the phase information of the target pixel and calculating the phase difference according to the phase information of the target pixel;
and the focusing control module is used for carrying out focusing control on the basis of the phase difference.
An electronic device comprising a memory and a processor, the memory having stored therein a computer program, the computer program, when executed by the processor, causing the processor to perform the steps of the focus control method as described above.
A computer-readable storage medium, on which a computer-readable program is stored, which, when being executed by a processor, carries out the steps of the focus control method as described above.
The sensitivity of an RGB pixel array varies with light intensity, so the phase difference calculated from it is inaccurate under some light intensities and focusing accuracy is greatly reduced. In the focusing control method above, by contrast, a target pixel corresponding to the light intensity of the current shooting scene is determined from the W pixels or at least one color pixel of an RGBW pixel array. Under light intensities where a phase difference calculated from the phase information of at least one color pixel would be inaccurate, the phase difference is calculated from the phase information of the W pixels instead; conversely, where the W pixels would yield an inaccurate phase difference, at least one color pixel is used. In both cases the accuracy of phase focusing is ultimately improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic diagram of phase detection autofocus;
fig. 2 is a schematic diagram of arranging phase detection pixels in pairs among pixels included in an image sensor;
FIG. 3 is a schematic diagram of a portion of an RGBW pixel array in one embodiment;
FIG. 4 is a flow chart of a focus control method in one embodiment;
FIG. 5 is a diagram illustrating a focus control method according to an embodiment;
FIG. 6 is a flow diagram of a method for generating a target image after focus control based on phase difference in one embodiment;
FIG. 7 is a diagram illustrating a method of generating a target image after focus control based on a phase difference in one embodiment;
FIG. 8 is a flowchart of a method for obtaining phase information collected by the target pixel and calculating a phase difference according to the phase information of the target pixel in FIG. 4;
FIG. 9 is a diagram illustrating a focus control method according to another embodiment;
FIG. 10 is a flowchart of a method of generating a target image after focus control is performed based on a phase difference in another embodiment;
FIG. 11 is a diagram showing a method of generating a target image after focus control is performed based on a phase difference in another embodiment;
FIG. 12 is a flowchart of a method for obtaining phase information collected by the target pixel and calculating a phase difference according to the phase information of the target pixel in FIG. 4;
FIG. 13 is a diagram illustrating a focus control method according to still another embodiment;
FIG. 14 is a schematic diagram of an RGBW pixel array in yet another embodiment;
FIG. 15 is a schematic diagram of an RGBW pixel array in yet another embodiment;
FIG. 16 is a block diagram showing the structure of a focus control apparatus according to an embodiment;
fig. 17 is a schematic diagram of an internal structure of an electronic device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
Fig. 1 is a schematic diagram of a Phase Detection Auto Focus (PDAF) principle. As shown in fig. 1, M1 is the position of the image sensor when the imaging device is in the in-focus state, where the in-focus state refers to a successfully focused state. When the image sensor is located at the position M1, the imaging light rays g reflected by the object W in different directions toward the Lens converge on the image sensor, that is, the imaging light rays g reflected by the object W in different directions toward the Lens are imaged at the same position on the image sensor, and at this time, the image sensor is imaged clearly.
M2 and M3 indicate positions where the image sensor may be located when the imaging device is not in focus. As shown in fig. 1, when the image sensor is located at the M2 or M3 position, the imaging light rays g reflected by the object W toward the Lens in different directions are imaged at different positions: at position M2 they are imaged at positions A and B respectively, and at position M3 they are imaged at positions C and D respectively. At this time, the image formed on the image sensor is not sharp.
In the PDAF technique, the difference in the position of the image formed by the imaging light rays entering the lens from different directions in the image sensor can be obtained, for example, as shown in fig. 1, the difference between the position a and the position B, or the difference between the position C and the position D can be obtained; after acquiring the difference of the positions of images formed by imaging light rays entering the lens from different directions in the image sensor, obtaining the out-of-focus distance according to the difference and the geometric relationship between the lens and the image sensor in the camera, wherein the out-of-focus distance refers to the distance between the current position of the image sensor and the position where the image sensor is supposed to be in the in-focus state; the imaging device can focus according to the obtained defocus distance.
It follows that the calculated PD (phase difference) value is 0 when in focus, while the larger the calculated value, the farther the sensor is from the in-focus position, and the smaller the value, the closer it is. When PDAF focusing is adopted, the PD value is calculated, the defocus distance is obtained from the calibrated correspondence between PD value and defocus distance, and the lens is then driven to the in-focus position according to the defocus distance, thereby achieving focusing.
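For illustration, a minimal Python sketch of that calibrated mapping follows. The linear model and the calibration samples are assumptions for the sketch; the text only states that the correspondence is obtained by calibration:

```python
import numpy as np

# Hypothetical calibration samples: PD values measured at known defocus
# distances (micrometres). Real values come from module calibration.
calib_pd = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
calib_defocus_um = np.array([-40.0, -20.0, 0.0, 20.0, 40.0])

# Assume a linear PD-to-defocus relationship (an assumption, not stated here).
slope, intercept = np.polyfit(calib_pd, calib_defocus_um, 1)

def pd_to_defocus(pd: float) -> float:
    """Map a measured phase difference to a defocus distance in micrometres."""
    return slope * pd + intercept

# PD of 0 means in focus; larger |PD| means farther from the in-focus position.
print(pd_to_defocus(0.0))   # -> 0.0
print(pd_to_defocus(0.5))   # -> 10.0 under this hypothetical calibration
```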
In the related art, some phase detection pixel points may be provided in pairs among the pixel points included in the image sensor. As shown in fig. 2, a phase detection pixel point pair (hereinafter, pixel point pair) A, a pixel point pair B, and a pixel point pair C may be provided in the image sensor. In each pixel point pair, one phase detection pixel point is shielded on its left side, and the other is shielded on its right side.
For a phase detection pixel point shielded on the left side, only the right-hand part of the imaging light beam directed at it can reach its photosensitive (unshielded) part; for a phase detection pixel point shielded on the right side, only the left-hand part of the imaging light beam directed at it can reach its photosensitive (unshielded) part. The imaging light beam is thus divided into a left part and a right part, and the phase difference can be obtained by comparing the images formed by the two parts.
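To make the comparison of the left and right images concrete, the sketch below estimates the phase difference of two 1-D signals by searching for the integer shift that best aligns them. The SAD (sum of absolute differences) matching cost is one common choice and an assumption here; no particular matching algorithm is prescribed by the text:

```python
import numpy as np

def phase_difference(left: np.ndarray, right: np.ndarray, max_shift: int = 8) -> int:
    """Integer shift (in pixels) that best aligns the left-image and
    right-image signals; this shift is taken as the phase difference."""
    best_shift, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = left[s:], right[:right.size - s]
        else:
            a, b = left[:s], right[-s:]
        cost = float(np.abs(a - b).mean())  # SAD matching cost
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift

# Example: the right signal leads the left signal by 3 samples.
x = np.sin(np.linspace(0, 6, 64))
print(phase_difference(x, np.roll(x, -3)))  # -> 3
```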
The electronic device comprises an image sensor, and the image sensor comprises a plurality of RGBW pixel arrays arranged in an array. Fig. 3 is a schematic diagram of an RGBW pixel array. Compared with a common Bayer pattern, an RGBW pattern passes more light and therefore improves the signal-to-noise ratio of the acquired signal. Each RGBW pixel array includes a plurality of pixel units Z; as shown in fig. 3, each RGBW pixel array includes 4 pixel units Z, namely a red pixel unit, two green pixel units, and a blue pixel unit. Of course, in other embodiments each RGBW pixel array may include 6 or 8 pixel units Z, which is not limited in this application.
Each pixel unit Z includes a W pixel (white pixel) D arranged in a diagonal line and a color pixel D arranged in another diagonal line, and each pixel D corresponds to one microlens and a plurality of photosensitive elements; each pixel comprises a plurality of sub-pixels arranged in an array, and each sub-pixel corresponds to one photosensitive element. The color pixels D include R pixels (red pixels), G pixels (green pixels), and B pixels (blue pixels). Specifically, the red pixel unit comprises 2W pixels arranged in a diagonal line and 2R pixels arranged in another diagonal line; aiming at the green pixel unit, 2W pixels arranged in a diagonal line and 2G pixels arranged in another diagonal line are included; the blue pixel unit comprises 2W pixels arranged in a diagonal line and 2B pixels arranged in another diagonal line.
Each W pixel D includes a plurality of sub-pixels d arranged in an array, each color pixel D includes a plurality of sub-pixels d arranged in an array, and each sub-pixel d corresponds to one photosensitive element. The photosensitive element is an element capable of converting an optical signal into an electrical signal; for example, it may be a photodiode. As shown in fig. 3, each W pixel D includes 4 sub-pixels d (i.e., 4 photodiodes) arranged in an array, and each color pixel D likewise includes 4 sub-pixels d (i.e., 4 photodiodes) arranged in an array. For example, a G pixel D includes 4 photodiodes arranged in an array: an up-left, an up-right, a down-left, and a down-right photodiode.
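The layout just described can be sketched as data. The snippet below builds one such RGBW array at pixel and sub-pixel granularity; the diagonal orientation and the unit order (R, G / G, B) are assumptions read from fig. 3:

```python
import numpy as np

def pixel_unit(color: str) -> np.ndarray:
    """One pixel unit Z: W pixels on one diagonal, a colour on the other."""
    return np.array([["W", color], [color, "W"]])

# Four pixel units per RGBW array: red, green / green, blue (as in fig. 3).
top = np.hstack([pixel_unit("R"), pixel_unit("G")])
bottom = np.hstack([pixel_unit("G"), pixel_unit("B")])
rgbw_array = np.vstack([top, bottom])  # 4 x 4 pixels

# Each pixel splits into 2 x 2 sub-pixels, one photodiode per sub-pixel.
sub_pixel_array = np.repeat(np.repeat(rgbw_array, 2, axis=0), 2, axis=1)

print(rgbw_array)
print(sub_pixel_array.shape)  # (8, 8)
```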
FIG. 4 is a flowchart of a focus control method according to an embodiment. The focusing control method in the embodiment of the present application is described by taking an example of the focusing control method being executed on an electronic device with a shooting function. The electronic device may be any terminal device such as a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a wearable device (smart band, smart watch, smart glasses, smart gloves, smart socks, smart belt, etc.), a VR (virtual reality) device, a smart home, and an unmanned vehicle. The electronic device includes an image sensor including an RGBW pixel array, as shown in fig. 4, the focus control method includes steps 420 to 460.
Step 420, according to the light intensity of the current shooting scene, determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array; the target pixel includes a W pixel or at least one color pixel in an RGBW pixel array.
In different shooting scenes, or at different moments, the light intensity of the current shooting scene differs. Because the sensitivity of an RGB pixel array varies with light intensity, the phase difference calculated from an RGB pixel array is inaccurate under some light intensities, which greatly reduces focusing accuracy. Light intensity, also called illumination intensity or illuminance, is a physical term referring to the luminous flux of visible light received per unit area, in lux (lx). It indicates how strongly the surface of an object is illuminated. The following table lists illumination intensity values for different weather conditions and locations:
TABLE 1-1

Weather and location                   | Illumination intensity
---------------------------------------|-----------------------
Direct sunlight on a sunny day         | 100000 lx
Indoors (room center) on a sunny day   | 200 lx
Outdoors on an overcast day            | 50-500 lx
Indoors on an overcast day             | 5-50 lx
Moonlight (full moon)                  | 2500 lx
Clear moonlit night                    | 0.2 lx
Night                                  | 0.0011 lx
As can be seen from Table 1-1, the light intensity of the current shooting scene differs greatly between shooting scenes and between moments.
In order to solve this problem, the RGB pixel array of the image sensor in the conventional method is replaced with an RGBW pixel array. Compared with an RGB pixel array, an RGBW pixel array adds white areas to the RGB three-color filter, which increases light transmittance. Because W pixels are highly sensitive, an RGBW pixel array can calculate the phase difference more accurately than an RGB pixel array in scenes with weak light, improving focusing accuracy.
Specifically, a target pixel corresponding to the light intensity of the current shooting scene is determined from the W pixels or at least one color pixel of the RGBW pixel array according to that light intensity. First, the light intensity (illuminance) of the current shooting scene is obtained, for example from a sensor on the electronic device. Then, based on how the light intensity of the current shooting scene compares with a preset light intensity threshold, the target pixel is determined from the RGBW pixel array. If the light intensity of the current shooting scene is smaller than the preset threshold, the light is weak, and the W pixel is determined as the target pixel so that more phase information can be acquired through the W pixel. If the light intensity of the current shooting scene is greater than or equal to the preset threshold, at least one of the RGB pixels is determined as the target pixel: accurate phase information can then be acquired through the RGB pixels, whereas the W pixel, although highly sensitive, saturates easily under strong light, which would impair the accuracy of the acquired phase information.
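A minimal sketch of this selection logic follows; the default threshold value is hypothetical here (a later embodiment suggests 50 lx as one possible first threshold):

```python
def select_target_pixels(light_intensity_lx: float,
                         first_threshold_lx: float = 50.0) -> list:
    """Pick which pixels of the RGBW array supply phase information."""
    if light_intensity_lx < first_threshold_lx:
        # Weak light: W pixels are more sensitive, so more phase
        # information can be gathered from them.
        return ["W"]
    # Strong light: W pixels saturate easily, so use colour pixels
    # (any non-empty subset of R, G, B; all three is one option).
    return ["R", "G", "B"]

print(select_target_pixels(5.0))     # -> ['W']
print(select_target_pixels(200.0))   # -> ['R', 'G', 'B']
```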
Step 440, obtaining the phase information of the target pixel, and calculating the phase difference according to the phase information of the target pixel.
After the target pixel is determined, the phase information acquired by the sub-pixels included in the target pixel can be read. Signal differences of the phase signals of the sub-pixels in the target pixel can then be calculated in four directions: the first direction, the second direction, a first diagonal direction, and a second diagonal direction perpendicular to the first diagonal direction, giving phase differences in the four directions. The first direction is the vertical direction of the RGBW pixel array and the second direction is the horizontal direction, the two being perpendicular. Of course, phase differences of the sub-pixels included in the target pixel may also be calculated in other directions, which is not limited in this application.
Step 460, focus control is performed based on the phase difference.
When focus control is performed based on the calculated phase difference, note that for a texture feature running in a given direction on the preview image, the phase difference measured parallel to that direction is almost 0, so focusing obviously cannot be based on it. Therefore, if the preview image corresponding to the current shooting scene includes a texture feature in the second direction, focusing control is performed based on the phase difference in the first direction. For example, assume the first direction is the vertical direction of the RGBW pixel array and the second direction is the horizontal direction, the two being perpendicular. A texture feature in the second direction means the preview image contains horizontal stripes, which may be pure-color stripes. In that case, focus control is performed based on the phase difference in the vertical direction.
Conversely, if the preview image corresponding to the current shooting scene includes a texture feature in the first direction, focusing control is performed based on the phase difference in the second direction. If the preview image includes a texture feature in the first diagonal direction, focusing control is performed based on the phase difference in the second diagonal direction, and vice versa. In this way the phase difference can be acquired accurately for texture features in different directions.
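A sketch of this direction selection, assuming the four phase differences have already been computed and the dominant texture direction detected (the direction names below are illustrative):

```python
def usable_phase_difference(texture_direction: str, pd_by_direction: dict) -> float:
    """Return the phase difference measured perpendicular to the dominant
    texture; the one measured parallel to a stripe is close to 0 and
    therefore unusable for focusing."""
    perpendicular = {
        "horizontal": "vertical",   # horizontal stripes -> vertical PD
        "vertical": "horizontal",   # vertical stripes -> horizontal PD
        "diagonal_1": "diagonal_2",
        "diagonal_2": "diagonal_1",
    }
    return pd_by_direction[perpendicular[texture_direction]]

pd = {"vertical": 1.4, "horizontal": 0.02, "diagonal_1": 0.9, "diagonal_2": 0.7}
print(usable_phase_difference("horizontal", pd))  # -> 1.4
```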
In the focusing control method of the embodiments of the application, the problem addressed is that the sensitivity of an RGB pixel array varies with light intensity, so the phase difference it yields is inaccurate under some light intensities and focusing accuracy is greatly reduced. Here, a target pixel corresponding to the light intensity of the current shooting scene is instead determined from the W pixels or at least one color pixel of an RGBW pixel array. Under light intensities where a phase difference calculated from the phase information of at least one color pixel would be inaccurate, the phase difference is calculated from the phase information of the W pixels; conversely, where the W pixels would yield an inaccurate phase difference, at least one color pixel is used. In both cases the accuracy of phase focusing is ultimately improved.
In one embodiment, the step 420 of determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene includes:
and determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene and a preset threshold value of the light intensity.
The preset threshold of the light intensity is a light intensity threshold. Based on Table 1-1, the value 50 lx (outdoors on an overcast day) can be set as a first preset light intensity threshold (hereinafter, the first preset threshold). Of course, the specific value of the first preset threshold is not limited in this application.
If the light intensity of the current shooting scene is smaller than or equal to the first preset threshold, the light is weak, and the W pixel is determined as the target pixel so that more phase information can be acquired through it. If the light intensity of the current shooting scene is greater than the first preset threshold, at least one of the RGB pixels is determined as the target pixel: accurate phase information can then be acquired through the RGB pixels, whereas the highly sensitive W pixel saturates easily, which would impair the accuracy of the acquired phase information.
In the embodiments of the application, when the light is weak, the W pixel is adopted as the target pixel because of its high sensitivity, so the phase difference can be calculated accurately from the W pixel and focusing control performed. Conversely, when the light is strong, at least one of the RGB pixels is used as the target pixel, so the phase difference can be calculated accurately from at least one of the RGB pixels and focusing control performed. Accurate focusing control can thus be achieved under different light intensities.
In one embodiment, determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene and a preset light intensity threshold comprises:
and if the light intensity of the current shooting scene exceeds a first preset threshold value, taking at least one color pixel in the RGBW pixel array as a target pixel.
In the embodiments of the application, if the light intensity of the current shooting scene exceeds the first preset threshold, at least one of the RGB pixels is determined as the target pixel. Accurate phase information can then be acquired through the RGB pixels, whereas the highly sensitive W pixel saturates easily, which would impair the accuracy of the acquired phase information.
In one embodiment, step 440, acquiring the phase information collected by the target pixel, and calculating the phase difference according to the phase information of the target pixel includes:
if the light intensity of the current shooting scene exceeds a second preset threshold, phase information of sub-pixels of each pixel in the target pixel is obtained; the second preset threshold is greater than the first preset threshold;
for two pixels of the same color in the target pixel, calculating the phase difference of the target pixel according to the phase information of each pair of sub-pixels in the two same-color pixels; the two same-color pixels are adjacent along a diagonal of the pixel array, and the sub-pixels of each pair are located one in each of the two same-color pixels, at the same position within their respective pixels.
Specifically, if the light intensity of the current shooting scene exceeds a first preset threshold, at least one color pixel in the RGBW pixel array is used as the target pixel. And the second preset threshold is larger than the first preset threshold, so that if the light intensity of the current shooting scene exceeds the second preset threshold, the target pixel is also at least one color pixel in the RGBW pixel array.
Therefore, if the light intensity of the current shooting scene exceeds the second preset threshold, the phase information of the sub-pixels of each pixel in the target pixel is acquired. Namely, phase information of sub-pixels in at least one color pixel in the RGBW pixel array is obtained. And determining two pixels with the same color which are adjacent along the diagonal line of the pixel array from the target pixel. Then, for the two pixels of the target pixel having the same color, the phase difference of the target pixel is calculated based on the phase information of each pair of sub-pixels in the two pixels having the same color. Each pair of sub-pixels are respectively located in two pixels with the same color, and the positions of the sub-pixels in each pixel are the same.
Assuming the light intensity of the current shooting scene exceeds the second preset threshold, at least one color pixel in the RGBW pixel array is taken as the target pixel. Any one of the R pixel, the G pixel, and the B pixel may be used as the target pixel, for example the R pixel, the G pixel, or the B pixel. Any two of them may be used as the target pixel, for example the R and G pixels, the R and B pixels, or the G and B pixels. All of the R, G, and B pixels may also be used as target pixels. This is not limited in the present application.
Next, a case where all of the R pixel, the G pixel, and the B pixel are target pixels is described as an example. FIG. 5 is a diagram illustrating focus control in one embodiment. After the phase information of each sub-pixel of the R, G, and B pixels is read, two pixels of the same color, adjacent along a diagonal of the pixel array, are determined from the R pixels. Each pair of sub-pixels is then determined from the two same-color pixels: the sub-pixels of each pair are located one in each pixel, at the same position within their respective pixels. The phase information of each pair of sub-pixels is input to the ISP, and the ISP calculates the phase difference of the R pixels. Here, the RGBW pixel array is divided into a first pixel unit (R pixel unit), a second pixel unit (G pixel unit), a third pixel unit (G pixel unit), and a fourth pixel unit (B pixel unit). For example, the 4 sub-pixels of the R pixel at the upper left corner of the first pixel unit are numbered sub-pixel 1, sub-pixel 2, sub-pixel 3, and sub-pixel 4, from left to right and from top to bottom, and the 4 sub-pixels of the R pixel at the lower right corner of the first pixel unit are numbered sub-pixel 5, sub-pixel 6, sub-pixel 7, and sub-pixel 8 in the same order. A phase difference is then calculated for each pair of sub-pixels at the same position: a first phase difference of the R pixel from the phase information of sub-pixel 1 and sub-pixel 5; a second phase difference from sub-pixel 2 and sub-pixel 6; a third phase difference from sub-pixel 3 and sub-pixel 7; and a fourth phase difference from sub-pixel 4 and sub-pixel 8. Finally, the phase difference of the R pixel is obtained from the first, second, third, and fourth phase differences, for example by calculating a weighted average, which is not limited in this application.
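A sketch of this per-pair computation follows. Each sub-pixel is assumed to contribute a 1-D phase signal gathered along an image row; the SAD shift search and the equal weighting of the four pair-wise phase differences are assumptions, since the text only mentions a weighted average without fixing the weights:

```python
import numpy as np

def best_shift(a: np.ndarray, b: np.ndarray, max_shift: int = 4) -> int:
    """Integer shift minimising the SAD between two phase signals."""
    costs = {s: float(np.abs(a[max(s, 0):a.size + min(s, 0)] -
                             b[max(-s, 0):b.size + min(-s, 0)]).mean())
             for s in range(-max_shift, max_shift + 1)}
    return min(costs, key=costs.get)

def same_color_pair_pd(sub_a: list, sub_b: list) -> float:
    """Phase difference between two same-colour pixels that are diagonal
    neighbours. sub_a and sub_b each hold the four sub-pixel signals of
    one pixel, numbered 1-4 and 5-8 as in the text; pair (1,5) gives the
    first phase difference, (2,6) the second, and so on."""
    pair_pds = [best_shift(a, b) for a, b in zip(sub_a, sub_b)]
    return float(np.mean(pair_pds))  # equal weights: an assumption
```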
Similarly, the above operation is also performed for G pixels adjacent along the diagonal line of the pixel array in each pixel unit, and the phase difference of the G pixels is obtained. The above operation is also performed for B pixels adjacent along the diagonal of the pixel array in each pixel unit, and phase differences of the B pixels are obtained, respectively.
Then, the distance of the lens from the clear (in-focus) position is calculated based on the phase differences of the R, G, and B pixels, a code value for driving the motor is calculated from that distance, and the motor's Driver IC converts the code value into a driving current and drives the lens to the clear position based on that current. This completes the focus control process.
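The chain from phase difference to lens movement can be sketched as follows. The constants pd_to_um and um_per_code are hypothetical stand-ins for the calibrated PD-to-distance mapping and the motor stroke per code step; the Driver IC's conversion of the code into a driving current happens in hardware and is represented only by the returned code:

```python
def motor_code_for(pd: float, pd_to_um: float = 20.0, um_per_code: float = 0.4) -> int:
    """Compute the code value handed to the motor's Driver IC.

    pd_to_um:    defocus distance (um) per unit of phase difference
                 (hypothetical calibration constant).
    um_per_code: lens travel (um) per motor code step (hypothetical).
    """
    defocus_um = pd * pd_to_um            # distance from the clear position
    return round(defocus_um / um_per_code)

print(motor_code_for(0.8))  # -> 40 code steps under these assumed constants
```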
In the embodiment of the application, if the light intensity of the current shooting scene exceeds a second preset threshold, phase information of sub-pixels of each pixel in a target pixel is acquired; the second preset threshold is greater than the first preset threshold. Because the light intensity is relatively high at this time, the phase information acquired by the sub-pixels of the RGB pixels is relatively accurate, and therefore, for the pixels of the same color adjacent to each other along the diagonal of the pixel array in the target pixel, the phase difference of the target pixel is calculated directly according to the phase information of each pair of sub-pixels in the pixels of the same color. Focusing is performed based on the phase difference of the target pixel, and finally the accuracy of phase focusing is improved.
In the previous embodiment, as shown in fig. 6, after performing focus control based on the phase difference, the method further includes:
step 620, controlling exposure of the RGBW pixel array, and acquiring pixel values of all sub-pixels in the RGBW pixel array.
After focusing is achieved by the focusing control method of the previous embodiment, the exposure of the RGBW pixel array is controlled, and the pixel values of all the sub-pixels in the RGBW pixel array are acquired, i.e., the pixel values of the sub-pixels of each R, G, B, and W pixel in the RGBW pixel array. FIG. 7 is a diagram illustrating generation of a target image in one embodiment, in which the pixel values of the sub-pixels in the RGBW pixel array constitute the original RAW image 702.
In step 640, the pixel values of the sub-pixels of the color pixels are obtained from the pixel values of the sub-pixels, and interpolation is performed on the pixel values of the sub-pixels of the color pixels to generate a bayer array image.
The pixel values of the sub-pixels of the R, G, and B pixels are acquired from the original RAW image 702, and a RAW image 704 corresponding to the RGB pixels is generated. The bayer array image 706 is generated by interpolating the pixel values of the sub-pixels of the R, G, and B pixels in the RAW image 704. The bayer array image shown is a 4 × 4 array composed of 8 green, 4 blue, and 4 red pixels; when the grayscale mosaic is converted into a color image, nine operations are performed with a 2 × 2 matrix, finally generating a color image. Specifically, a Remosaic interpolation algorithm may be used: through pixel interchange, or through the relationship between a pixel and its surrounding related pixels, Remosaic calculates weight ratios according to the distances between the pixel and those surrounding pixels, and the values at the surrounding positions are then generated from the weight ratios and the pixel's value.
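A much-simplified stand-in for that interpolation is sketched below: missing samples of one colour plane are filled by inverse-distance weighting of nearby known samples of the same colour. The real Remosaic algorithm is more involved; this only illustrates the distance-based weight ratio described above:

```python
import numpy as np

def idw_fill(plane: np.ndarray, known: np.ndarray, radius: int = 2) -> np.ndarray:
    """Fill positions where known == False using inverse-distance weights
    over known same-colour samples within `radius`."""
    out = plane.astype(float).copy()
    h, w = plane.shape
    for y, x in zip(*np.where(~known)):
        num, den = 0.0, 0.0
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and known[ny, nx]:
                    wgt = 1.0 / np.hypot(dy, dx)   # weight ~ 1 / distance
                    num += wgt * plane[ny, nx]
                    den += wgt
        if den > 0.0:
            out[y, x] = num / den
    return out

# Toy example: a 4x4 green plane sampled at half its positions.
g = np.array([[8, 0, 6, 0], [0, 7, 0, 5], [8, 0, 6, 0], [0, 7, 0, 5]], float)
print(idw_fill(g, g > 0))
```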
Step 660 is to acquire a pixel value of the sub-pixel of the W pixel from the pixel values of the sub-pixels, and perform interpolation operation on the pixel value of the sub-pixel of the W pixel to generate a W pixel image.
The pixel value of a sub-pixel of the W pixel is acquired from the original RAW image 702, and a RAW image 708 corresponding to the W pixel is generated. The W pixel image 710 is generated by interpolating the pixel values of the sub-pixels of the W pixel in the RAW image 708 corresponding to the W pixel.
And step 680, fusing the Bayer array image and the W pixel image to generate a target image.
Finally, the bayer array image 706 and the W pixel image 710 are fused to generate a target image 712. Here, when fusing the bayer array image 706 and the W pixel image 710, the pixel value of each sub-pixel in the bayer array image 706 and the pixel value of each sub-pixel in the W pixel image 710 may be directly merged to generate the pixel value of the sub-pixel at the corresponding position in the target image 712.
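The merge operation itself is left open above ("directly merged"); the sketch below uses a simple per-position weighted sum, with a hypothetical gain on the W plane, purely as a placeholder for that merge:

```python
import numpy as np

def fuse_bayer_and_w(bayer: np.ndarray, w_image: np.ndarray,
                     w_gain: float = 0.5) -> np.ndarray:
    """Per-position merge of the Bayer array image and the W pixel image.
    The weighted sum (and w_gain) is an assumption; the text only states
    that the two values at corresponding positions are merged."""
    assert bayer.shape == w_image.shape
    return bayer + w_gain * w_image
```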
In the embodiment of the application, if the light intensity of the current shooting scene exceeds a second preset threshold, phase information of sub-pixels of each pixel in the target pixel is acquired. Because the light intensity is relatively high at this time, the phase information acquired by the sub-pixels of the RGB pixels is relatively accurate, and therefore, for the adjacent pixels with the same color in the target pixel, the phase difference of the target pixel is directly calculated according to the phase information of the two sub-pixels at the same position in the pixels with the same color. Focusing is performed based on the phase difference of the target pixel, and finally the accuracy of phase focusing is improved.
Then, the exposure of the RGBW pixel array is controlled, and the pixel values of all sub-pixels in the RGBW pixel array are obtained. Since the intensity of the light is relatively high, the signal-to-noise ratio of the pixel value of each sub-pixel is relatively high. Therefore, the pixel values of the sub-pixels of the color pixels are obtained from the pixel values of the sub-pixels, and the interpolation operation is directly performed on the pixel values of the sub-pixels of the color pixels, thereby generating a bayer array image. The pixel value of the sub-pixel of the W pixel is directly obtained from the pixel value of the sub-pixel, and the pixel value of the sub-pixel of the W pixel is interpolated to generate a W pixel image. And fusing the Bayer array image and the W pixel image to generate a target image. Because the light intensity is higher at this moment, the pixel value of each sub-pixel is directly interpolated, so that the resolution of the finally generated target image can be improved while the higher signal-to-noise ratio is ensured.
In one embodiment, as shown in fig. 8, each RGBW pixel array includes a plurality of pixel units, each pixel unit including a plurality of pixels, each pixel including a plurality of sub-pixels; step 440, obtaining the phase information collected by the target pixel, and calculating the phase difference according to the phase information of the target pixel, includes:
in step 820, if the light intensity of the current shooting scene exceeds a first preset threshold and is not greater than a second preset threshold, phase information of sub-pixels of each pixel in the target pixel is obtained.
Specifically, if the light intensity of the current shooting scene exceeds a first preset threshold, at least one color pixel in the RGBW pixel array is used as the target pixel. Therefore, if the light intensity of the current shooting scene exceeds the first preset threshold and is not greater than the second preset threshold, the target pixel is also at least one color pixel in the RGBW pixel array. Then, phase information of sub-pixels of each of the target pixels is acquired. Namely, phase information of sub-pixels in at least one color pixel in the RGBW pixel array is obtained.
Assuming the light intensity of the current shooting scene exceeds the first preset threshold and is not greater than the second preset threshold, at least one color pixel in the RGBW pixel array is taken as the target pixel. Any one of the R pixel, the G pixel, and the B pixel may be used as the target pixel, for example the R pixel, the G pixel, or the B pixel. Any two of them may be used as the target pixel, for example the R and G pixels, the R and B pixels, or the G and B pixels. All of the R, G, and B pixels may also be used as target pixels. This is not limited in the present application.
Step 840, for each pixel unit, combining the phase information of sub-pixels that lie in the same region of a same-color pixel along the first direction, to obtain combined phase information of the same-color pixels of each pixel unit in the first direction, and calculating a phase difference in the first direction according to the combined phase information of each pixel in the first direction; or,
as shown in fig. 3, the RGBW pixel array includes 4 pixel units. For each pixel unit, first, sub-pixels in the same region in the same-color pixel in the first direction among the same-color pixels are determined. The first direction is the vertical direction of the RGBW pixel array, the second direction is the horizontal direction of the RGBW pixel array, and the first direction is perpendicular to the second direction. Of course, the phase difference of the sub-pixels included in the target pixel in other directions may also be calculated, which is not limited in the present application.
The following description will be given taking an example in which the R pixel is a target pixel.
FIG. 9 is a diagram illustrating focus control in one embodiment. For an R pixel unit in the RGBW pixel array 920, the sub-pixels of each R pixel that lie in the same region along the first direction are determined first. For example, the 4 sub-pixels of the R pixel at the upper left corner of the first pixel unit (R pixel unit) are numbered sub-pixel 1, sub-pixel 2, sub-pixel 3, and sub-pixel 4, from left to right and from top to bottom (see fig. 5), and the 4 sub-pixels of the R pixel at the lower right corner are numbered sub-pixel 5, sub-pixel 6, sub-pixel 7, and sub-pixel 8 in the same order (refer to fig. 5). Along the first direction, the sub-pixels of the upper-left R pixel in the same region are sub-pixel 1 and sub-pixel 3 (its left column), and the sub-pixels of the lower-right R pixel in the same region are sub-pixel 6 and sub-pixel 8 (its right column).
Next, the phase information of these sub-pixels of the R pixel unit is combined: the phase information (left signal) of sub-pixel 1 and sub-pixel 3 is combined to generate left phase information, the phase information (right signal) of sub-pixel 6 and sub-pixel 8 is combined to generate right phase information, and finally the left and right phase information are combined to obtain the combined phase information of the R pixels of each R pixel unit in the first direction, generating a combined RGB pixel array 940.
The phase difference in the first direction is calculated from the combined phase information in the first direction for each pixel. For example, for the merged RGB pixel array 940, the phase difference of the R pixels in the first direction is calculated from the merged phase information of the two R pixels in the R pixel unit in the first direction.
Similarly, if the G pixel is set as the target pixel, the above operation is also performed for the G pixel in each pixel unit, and the phase difference of the G pixel in the first direction is calculated from the combined phase information of the two G pixels in each G pixel unit in the first direction.
Similarly, if the B pixel is set as the target pixel, the above operation is also performed for the B pixel in each pixel unit, and the phase difference of the B pixel in the first direction is calculated from the combined phase information of the two B pixels in each B pixel unit in the first direction.
If any two of the R pixel, the G pixel, and the B pixel, or all three, are set as target pixels, the phase difference in the first direction is generated by selecting the corresponding phase differences from among the phase differences of the R, G, and B pixels in the first direction obtained by the above calculation and combining them.
Step 860, for each pixel unit, combining the phase information of sub-pixels that lie in the same region of a same-color pixel along the second direction, to obtain combined phase information of the same-color pixels of each pixel unit in the second direction, and calculating a phase difference in the second direction according to the combined phase information of each pixel in the second direction; the first direction and the second direction are perpendicular to each other.
The following description will be given taking an example in which the R pixel is a target pixel.
For an R pixel unit in the RGBW pixel array, the sub-pixels of each R pixel that lie in the same region along the second direction are determined first. For example, the 4 sub-pixels of the R pixel at the upper left corner of the first pixel unit (R pixel unit) are numbered sub-pixel 1, sub-pixel 2, sub-pixel 3, and sub-pixel 4, from left to right and from top to bottom, and the 4 sub-pixels of the R pixel at the lower right corner are numbered sub-pixel 5, sub-pixel 6, sub-pixel 7, and sub-pixel 8 in the same order. Along the second direction, the sub-pixels of the upper-left R pixel in the same region are sub-pixel 1 and sub-pixel 2 (its top row), and the sub-pixels of the lower-right R pixel in the same region are sub-pixel 7 and sub-pixel 8 (its bottom row).
Next, the phase information of these sub-pixels of the R pixel unit is combined: the phase information of sub-pixel 1 and sub-pixel 2 is combined to generate upper phase information, the phase information of sub-pixel 7 and sub-pixel 8 is combined to generate lower phase information, and finally the upper and lower phase information are combined to obtain the combined phase information of each pixel in the second direction, generating a combined RGB pixel array.
And calculating the phase difference of the second direction according to the combined phase information of each pixel in the second direction. For example, for the merged RGB pixel array 940, the phase difference of the R pixels in the second direction is calculated from the merged phase information of the two R pixels in the R pixel unit in the second direction.
Similarly, if the G pixel is set as the target pixel, the above operation is also performed for the G pixel in each pixel unit, and the phase difference of the G pixel in the second direction is calculated from the combined phase information of the two G pixels in each G pixel unit in the second direction.
Similarly, if the B pixel is taken as the target pixel, the above operation is also performed for the B pixel in each pixel unit, and the phase difference of the B pixel in the second direction is calculated from the combined phase information of the two B pixels in each B pixel unit in the second direction.
If any two of the R pixel, the G pixel, and the B pixel, or all three, are set as target pixels, the phase difference in the second direction is generated by selecting the corresponding phase differences from among the phase differences of the R, G, and B pixels in the second direction obtained by the above calculation and combining them.
In the embodiments of the application, if the light intensity of the current shooting scene exceeds the first preset threshold but does not exceed the second preset threshold, phase information of the sub-pixels of each pixel in the target pixel is acquired. Because the light is somewhat weak at this point, the phase information acquired by the sub-pixels of the RGB pixels is not very accurate, and some RGB pixels may fail to acquire phase information at all. Therefore, for each pixel unit, the phase information of the sub-pixels lying in the same region of a same-color pixel along the first/second direction is combined to obtain combined phase information of the same-color pixels in the first/second direction, and the phase difference in the first/second direction is calculated from the combined phase information of each pixel in that direction. Combining phase information improves both the accuracy and the signal-to-noise ratio of the acquired phase information. Focusing is performed based on the phase differences in the first and second directions, ultimately improving the accuracy of phase focusing.
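A sketch of the per-pixel merging step used in this embodiment: each pixel's 2x2 sub-pixel phase signals are combined column-wise for the first (vertical) direction, leaving a left and a right signal, or row-wise for the second (horizontal) direction, leaving an upper and a lower signal. Summation as the merge operation is an assumption:

```python
import numpy as np

def merge_sub_pixel_phase(pixel: np.ndarray, direction: str) -> np.ndarray:
    """pixel: 2x2 array of sub-pixel phase values (sub-pixels 1-4,
    numbered left to right and top to bottom).

    direction == "first":  merge each column -> [left, right] signals
    direction == "second": merge each row    -> [upper, lower] signals
    """
    axis = 0 if direction == "first" else 1
    return pixel.sum(axis=axis)

p = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(merge_sub_pixel_phase(p, "first"))   # -> [4. 6.]  (columns 1+3, 2+4)
print(merge_sub_pixel_phase(p, "second"))  # -> [3. 7.]  (rows 1+2, 3+4)
```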
In one embodiment, as shown in fig. 10, after performing focus control based on the phase difference, the method further includes:
step 1010, controlling exposure of the RGBW pixel array to obtain pixel values of sub-pixels in the RGBW pixel array.
After focusing is achieved by the focusing control method of the previous embodiment, the RGBW pixel array is controlled to expose, and the pixel values of the sub-pixels in the RGBW pixel array are acquired, i.e., the pixel values of the sub-pixels of each R, G, B, and W pixel in the RGBW pixel array. FIG. 11 is a diagram illustrating generation of a target image in one embodiment, in which the pixel values of the sub-pixels in the RGBW pixel array constitute the original RAW image 1102.
Step 1030, calculating pixel values of the color pixels according to the pixel values of the sub-pixels of each color pixel.
The method comprises the steps of obtaining pixel values of sub-pixels of an R pixel, a G pixel and a B pixel from pixel values of the sub-pixels, combining the pixel values of the sub-pixels of the R pixel to obtain a pixel value of the R pixel, combining the pixel values of the sub-pixels of the G pixel to obtain a pixel value of the G pixel, and combining the pixel values of the sub-pixels of the B pixel to obtain a pixel value of the B pixel.
As shown in fig. 11, the RAW image 1104 corresponding to RGB pixels is generated by acquiring pixel values of sub-pixels of R pixels, G pixels, and B pixels from the original RAW image 1102. The pixel values of the sub-pixels of the R pixels are combined to obtain the pixel value of the R pixel, the pixel values of the sub-pixels of the G pixel are combined to obtain the pixel value of the G pixel, and the pixel values of the sub-pixels of the B pixel are combined to obtain the pixel value of the B pixel. Based on the pixel value of the R pixel, the pixel value of the G pixel, and the pixel value of the B pixel, a combined RAW image 1106 corresponding to the RGB pixels is generated.
Step 1050, performing interpolation operation on the pixel values of the color pixels to generate a bayer array image.
The bayer array image 1108 is generated by performing interpolation on the pixel values of the R, G, and B pixels in the combined RAW image 1106. Specifically, a Remosaic interpolation algorithm may be used: through pixel interchange, or through the relationship between a pixel and its surrounding related pixels, Remosaic calculates weight ratios according to the distances between the pixel and those surrounding pixels, and the values at the surrounding positions are then generated from the weight ratios and the pixel's value.
Step 1070 is to calculate the pixel value of the W pixel from the pixel values of the sub-pixels of the W pixel, and perform interpolation operation on the pixel value of the W pixel to generate a W pixel image.
And acquiring the pixel value of the sub-pixel of the W pixel from the pixel values of the sub-pixels, and combining the pixel values of the sub-pixels of the W pixel to obtain the pixel value of the W pixel.
The RAW image 1110 corresponding to the W pixels is generated by acquiring the pixel values of the sub-pixels of the W pixels from the original RAW image 1102. The pixel values of the sub-pixels of each W pixel are combined to obtain the pixel value of that W pixel, generating a combined W image 1112 corresponding to the W pixels. The W pixel image 1114 is generated by interpolating the pixel values of the W pixels in the combined W image 1112.
Step 1090, fusing the Bayer array image and the W pixel image to generate a target image.
The bayer array image 1108 and the W pixel image 1114 are fused to generate a target image 1116. Here, when fusing the bayer array image 1108 with the W pixel image 1114, the pixel value of each pixel in the bayer array image 1108 and the pixel value of each pixel in the W pixel image 1114 may be directly merged to generate the pixel value of the pixel at the corresponding position in the target image 1116.
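A minimal sketch of this fusion step, assuming the merge is a weighted sum of aligned, equally sized images (the text only states that pixel values at corresponding positions are merged directly):

```python
import numpy as np

def fuse_bayer_and_w(bayer: np.ndarray, w: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Fuse a Bayer array image with a W (panchromatic) image of the same
    size by merging the pixel values at corresponding positions.

    `alpha` weights the W contribution; the weighted sum is an assumed
    form of the merge, since the text does not fix the exact operation.
    """
    assert bayer.shape == w.shape, "the two images must be aligned and equally sized"
    return (1.0 - alpha) * bayer + alpha * w

# Usage: target_1116 = fuse_bayer_and_w(bayer_1108, w_1114, alpha=0.4)
# (variable names echo the figure labels and are purely illustrative)
```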
In the embodiment of the application, if the light intensity of the current shooting scene exceeds the first preset threshold and is not greater than the second preset threshold, focusing is first performed according to the focusing method in the foregoing embodiment; the RGBW pixel array is then controlled to be exposed, and the pixel values of the sub-pixels in the RGBW pixel array are obtained. Because the light intensity in this range is relatively weak, the signal-to-noise ratio of the pixel value of each sub-pixel is low. The pixel values of the sub-pixels of each color pixel are therefore combined to generate the pixel value of that color pixel, which improves the signal-to-noise ratio of the pixel values of the color pixels; an interpolation operation is then performed on the pixel values of the color pixels to generate a Bayer array image. Likewise, the pixel values of the sub-pixels of each W pixel are combined to generate the pixel value of that W pixel, improving its signal-to-noise ratio, and an interpolation operation is performed on the pixel values of the W pixels to generate a W pixel image. The Bayer array image and the W pixel image are fused to generate the target image.
Although the resolution of the finally generated target image is reduced, the signal corresponding to the acquired pixel value is increased, thereby improving the signal-to-noise ratio of the target image.
In one embodiment, the color pixels include R pixels, G pixels, and B pixels; calculating pixel values of the color pixels according to pixel values of sub-pixels of the color pixels, including:
the method comprises the steps of obtaining pixel values of sub-pixels of an R pixel, a G pixel and a B pixel from pixel values of the sub-pixels, combining the pixel values of the sub-pixels of the R pixel to obtain a pixel value of the R pixel, combining the pixel values of the sub-pixels of the G pixel to obtain a pixel value of the G pixel, and combining the pixel values of the sub-pixels of the B pixel to obtain a pixel value of the B pixel.
In the embodiment of the present application, when the pixel values of the sub-pixels of the R pixel are combined to obtain the pixel value of the R pixel, a weighted average of the pixel values of the sub-pixels of the R pixel may be calculated directly to generate the pixel value of the R pixel. The pixel value of the G pixel and the pixel value of the B pixel are calculated in the same way. Combining the pixel values of the sub-pixels of the color pixels to generate the pixel values of the color pixels improves the signal-to-noise ratio of the pixel values of the color pixels.
In one embodiment, calculating the pixel value of the W pixel from the pixel values of the sub-pixels of the W pixel comprises:
and acquiring the pixel value of the sub-pixel of the W pixel from the pixel values of the sub-pixels, and combining the pixel values of the sub-pixels of the W pixel to obtain the pixel value of the W pixel.
In the embodiment of the present application, when the pixel values of the sub-pixels of the W pixel are combined to obtain the pixel value of the W pixel, a weighted average of the pixel values of the sub-pixels of the W pixel may be calculated directly to generate the pixel value of the W pixel. Combining the pixel values of the sub-pixels of the W pixels to generate the pixel values of the W pixels improves the signal-to-noise ratio of the pixel values of the W pixels.
In one embodiment, determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene and a preset threshold of the light intensity comprises:
and if the light intensity of the current shooting scene is smaller than or equal to a first preset threshold value, taking the W pixel in the RGBW pixel array as a target pixel.
In the embodiment of the application, if the light intensity of the current shooting scene is less than or equal to the first preset threshold, the light is weak at this moment. Because the W pixel has higher sensitivity, it is determined as the target pixel, so that more phase information can be acquired through the W pixel.
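A minimal sketch of this selection logic follows; the numeric thresholds and return labels are illustrative placeholders, and only the comparison structure (with the second threshold greater than the first) follows the described method.

```python
def select_target_pixels(light_intensity: float,
                         first_threshold: float = 50.0,
                         second_threshold: float = 200.0) -> str:
    """Decide which pixels of the RGBW array supply phase information.

    The defaults are hypothetical values; the patent fixes only the
    comparisons, with second_threshold > first_threshold.
    """
    if light_intensity <= first_threshold:
        return "W"             # very weak light: use the sensitive W pixels
    if light_intensity <= second_threshold:
        return "color-binned"  # medium light: color pixels, phase info combined
    return "color"             # strong light: color pixels, per-sub-pixel pairs
```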
In one embodiment, each RGBW pixel array includes a plurality of pixel units. As shown in fig. 12, step 440 of obtaining the phase information acquired by the target pixel and calculating the phase difference according to the phase information of the target pixel includes:
Step 1220, for the W pixel, obtaining the phase information of each sub-pixel in the W pixel.
Fig. 13 is a schematic diagram of focus control in one embodiment. If the light intensity of the current scene is less than or equal to the first preset threshold, the light is very weak, so the W pixels in the RGBW pixel array are taken as the target pixels. Further, the phase information of the sub-pixels of the W pixels is acquired, and the W pixel array 1320 is generated.
Step 1240, combining the phase information of the sub-pixels in the same region in the first direction in the W pixel to obtain combined phase information of the W pixel in the first direction, and calculating the phase difference in the first direction according to the combined phase information of the W pixel in the first direction; or,
for a W pixel unit in the W pixel array 1320, first, a sub-pixel in which the W pixel is in the same region in the first direction is determined. For example, the 4 subpixels for the W pixel at the upper right corner in the first pixel unit (R pixel unit) are numbered as subpixel 1, subpixel 2, subpixel 3, and subpixel 4 in the top-to-bottom and left-to-right directions. The 4 subpixels of the W pixel at the lower left corner in the first pixel unit (R pixel unit) are numbered as subpixel 5, subpixel 6, subpixel 7, and subpixel 8 in the top-to-bottom and left-to-right directions. Then, the sub-pixels of the W pixel at the upper right corner in the first direction in the same area are determined as sub-pixel 1 and sub-pixel 3, and the sub-pixels of the W pixel at the lower left corner in the first direction in the same area are determined as sub-pixel 4 and sub-pixel 6.
Next, the phase information of the sub-pixels of each W pixel in the R pixel unit that lie in the same region in the first direction is combined. The phase information (left signals) of sub-pixel 1 and sub-pixel 3 is combined to generate left phase information, and the phase information (right signals) of sub-pixel 2 and sub-pixel 4 is combined to generate right phase information; the W pixel at the lower left corner is processed in the same way. Finally, the left phase information and the right phase information are combined to obtain the combined phase information of each pixel in the first direction, and the combined W pixel array 1340 is generated.
The phase difference in the first direction is calculated from the combined phase information in the first direction for each pixel. For example, for the combined W pixel array 1340, the phase difference of the W pixels in the first direction is calculated from the combined phase information of the two W pixels in the R pixel unit in the first direction.
Alternatively, the combined phase information of the W pixels in the combined W pixel array 1340 may be combined again to generate the W pixel array 1360, and the phase difference of the W pixels in the first direction is calculated from the W pixel array 1360.
Step 1260, combining the phase information of the sub-pixels in the same area in the second direction in the W pixel for each pixel unit to obtain combined phase information of the W pixel in the second direction, and calculating the phase difference in the second direction according to the combined phase information of the W pixel in the second direction; the first direction and the second direction are perpendicular to each other.
For a W pixel unit in the W pixel array 1320, the sub-pixels of each W pixel that lie in the same region in the second direction are determined first. Using the same numbering as above, for the W pixel at the upper right corner the sub-pixels in the same region in the second direction are sub-pixel 1 and sub-pixel 2 in the upper row and sub-pixel 3 and sub-pixel 4 in the lower row; for the W pixel at the lower left corner, they are sub-pixel 5 and sub-pixel 6 in the upper row and sub-pixel 7 and sub-pixel 8 in the lower row.
Next, the phase information of the sub-pixels of each W pixel in the R pixel unit that lie in the same region in the second direction is combined. The phase information (upper signals) of sub-pixel 1 and sub-pixel 2 is combined to generate upper phase information, and the phase information (lower signals) of sub-pixel 3 and sub-pixel 4 is combined to generate lower phase information; the W pixel at the lower left corner is processed in the same way. Finally, the upper phase information and the lower phase information are combined to obtain the combined phase information of each pixel in the second direction, and a combined W pixel array is generated.
And calculating the phase difference of the second direction according to the combined phase information of each pixel in the second direction. For example, for the combined W pixel array 1340, the phase difference of the W pixels in the second direction is calculated from the combined phase information of the two W pixels in the R pixel unit in the second direction.
Alternatively, the combined phase information of the W pixels in the combined W pixel array 1340 may be combined again to generate the W pixel array 1360, and the phase difference of the W pixels in the second direction is calculated from the W pixel array 1360.
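As a sketch of the merging and phase-difference computation just described, the following assumes 2x2 W-pixel blocks and estimates the shift between the merged signal sequences with a simple correlation search; the correlation search is an assumption, since the text does not fix how the phase difference is derived from the combined phase information.

```python
import numpy as np

def combine_phase(w_subpixels: np.ndarray, direction: str):
    """Merge the phase information of one 2x2 W pixel.

    direction='first'  -> column sums, i.e. (left, right) signals;
    direction='second' -> row sums,    i.e. (upper, lower) signals.
    """
    merged = w_subpixels.sum(axis=0) if direction == "first" else w_subpixels.sum(axis=1)
    return float(merged[0]), float(merged[1])

def phase_difference(signal_a: np.ndarray, signal_b: np.ndarray) -> int:
    """Estimate the shift between two 1-D phase signals collected from a
    line of W pixels by maximizing their correlation over candidate shifts.
    """
    n = len(signal_a)
    best_shift, best_score = 0, -np.inf
    for shift in range(-n // 2, n // 2 + 1):
        # np.roll wraps around at the edges; acceptable for a sketch
        score = float(np.dot(signal_a, np.roll(signal_b, shift)))
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift
```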
In the embodiment of the application, if the light intensity of the current shooting scene is less than or equal to the first preset threshold, the light is very weak, so the W pixel in the RGBW pixel array is taken as the target pixel. For each pixel unit, the phase information of the sub-pixels in the same region in the first direction/second direction in the W pixel is combined to obtain the combined phase information of the W pixel in the first direction/second direction, and the phase difference in the first direction/second direction is calculated according to the combined phase information of the W pixel in that direction, the first direction and the second direction being perpendicular to each other. Combining the phase information improves the accuracy of the acquired phase information and its signal-to-noise ratio. Focusing is performed based on the phase difference in the first direction and/or the second direction, which finally improves the accuracy of phase focusing.
In one embodiment, the plurality of photosensitive elements corresponding to the pixels are arranged in a centrosymmetric manner.
Fig. 3 is a schematic structural diagram of a part of an image sensor in one embodiment, showing one RGBW pixel array. The image sensor comprises a plurality of RGBW pixel arrays arranged in an array. Each RGBW pixel array includes a plurality of pixel units Z; as shown in fig. 3, each RGBW pixel array includes 4 pixel units Z, namely a red pixel unit, two green pixel units, and a blue pixel unit.
Each pixel unit Z includes a W pixel D and a color pixel D arranged diagonally, and each pixel D corresponds to one microlens. The color pixels D include R pixels, G pixels and B pixels. Specifically, the red pixel unit comprises 2W pixels and 2R pixels which are arranged in a diagonal manner; aiming at the green pixel unit, 2W pixels and 2G pixels which are arranged in a diagonal line are included; the blue pixel unit comprises 2W pixels and 2B pixels which are arranged in a diagonal manner.
Each W pixel D includes a plurality of sub-pixels D arranged in an array, each color pixel D includes a plurality of sub-pixels D arranged in an array, and each sub-pixel D corresponds to one photosensitive element. Because the plurality of photosensitive elements corresponding to the pixels are arranged in a central symmetry manner, the W pixel, the R pixel, the G pixel and the B pixel comprise a plurality of sub-pixels arranged in a central symmetry manner. That is, the light-sensing elements corresponding to the sub-pixels may be arranged in a central symmetry manner in various arrangement modes or various shapes, and are not limited to the square arrangement shown in fig. 3.
In this embodiment, the photosensitive elements corresponding to the sub-pixels may be arranged in a central symmetry manner in various arrangement modes or in various shapes, and each sub-pixel d corresponds to one photosensitive element. Therefore, the W pixel, the R pixel, the G pixel, and the B pixel include a plurality of sub-pixels arranged in a central symmetrical manner. The method provides diversified arrangement modes for the sub-pixels, so that the sub-pixels can acquire diversified phase information, and the accuracy of subsequent focusing is improved.
In one embodiment, the plurality of photosensitive elements corresponding to the pixels are arranged in a central symmetry manner in a trapezoidal manner.
Fig. 14 is a schematic diagram of an RGBW pixel array. Each RGBW pixel array includes 4 pixel units Z, namely a red pixel unit, two green pixel units, and a blue pixel unit. Each pixel unit Z includes W pixels D and color pixels D arranged diagonally, and each pixel D corresponds to one microlens. The color pixels D include R pixels, G pixels, and B pixels.
Each W pixel includes a plurality of sub-pixels d arranged in an array, and the sub-pixels are arranged in a central symmetry manner in a trapezoidal manner. Similarly, each R pixel includes a plurality of sub-pixels d arranged in an array, and the sub-pixels are arranged in a central symmetry manner in a trapezoidal manner. Each G pixel includes a plurality of sub-pixels d arranged in an array, and the sub-pixels are arranged in a central symmetry manner in a trapezoidal manner. Each B pixel includes a plurality of sub-pixels d arranged in an array, and the sub-pixels are arranged in a central symmetry manner in a trapezoidal manner. And each sub-pixel d corresponds to a light sensing element. The photosensitive element may be a PhotoDiode (PD). As shown in fig. 14, the left PD and the right PD are both in a trapezoidal structure, and are arranged in a central symmetry manner.
Optionally, the W pixel, the R pixel, the G pixel, and the B pixel in the RGBW pixel array may also be combined in a plurality of different arrangement modes, which is not specifically limited in this application.
In this embodiment, the photosensitive elements corresponding to the sub-pixels may be arranged in a central symmetry manner in various arrangement modes or in various shapes, and each sub-pixel d corresponds to one photosensitive element. Therefore, the W pixel, the R pixel, the G pixel, and the B pixel include a plurality of sub-pixels arranged in a center-symmetrical manner in a trapezoidal manner. The method provides diversified arrangement modes for the sub-pixels, so that the sub-pixels can acquire diversified phase information, and the accuracy of subsequent focusing is improved.
In one embodiment, the plurality of photosensitive elements corresponding to the pixels are arranged in a central symmetry manner in an L-shaped manner.
Fig. 15 is a schematic diagram of an RGBW pixel array. Each RGBW pixel array includes 4 pixel units Z, namely a red pixel unit, two green pixel units, and a blue pixel unit. Each pixel unit Z includes W pixels D and color pixels D arranged diagonally, and each pixel D corresponds to one microlens. The color pixels D include R pixels, G pixels, and B pixels.
Each W pixel includes a plurality of sub-pixels d arranged in an array, and the sub-pixels are arranged in a central symmetry manner in an L-shaped manner. Similarly, each R pixel includes a plurality of sub-pixels d arranged in an array, and the sub-pixels are arranged in an L-shaped manner with central symmetry. Each G pixel includes a plurality of sub-pixels d arranged in an array, and the sub-pixels are arranged in a central symmetry manner in an L-shaped manner. Each B pixel includes a plurality of sub-pixels d arranged in an array, and the sub-pixels are arranged in a central symmetry manner in an L-shaped manner. And each sub-pixel d corresponds to a light sensing element. The photosensitive element may be a PhotoDiode (PD). As shown in fig. 15, the left PD and the right PD are both L-shaped, and are arranged in a central symmetry.
Optionally, the W pixel, the R pixel, the G pixel, and the B pixel in the RGBW pixel array may also be combined in a plurality of different arrangement modes, which is not specifically limited in this application.
In this embodiment, the photosensitive elements corresponding to the sub-pixels may be arranged in a central symmetry manner in various arrangement modes or in various shapes, and each sub-pixel d corresponds to one photosensitive element. Therefore, the W pixel, the R pixel, the G pixel, and the B pixel include a plurality of sub-pixels arranged in a central symmetry manner in an L-shape. The method provides diversified arrangement modes for the sub-pixels, so that the sub-pixels can acquire diversified phase information, and the accuracy of subsequent focusing is improved.
In one embodiment, as shown in fig. 16, there is provided a focus control apparatus 1600 applied to an electronic device including an image sensor including an RGBW pixel array, the apparatus including:
the target pixel determining module 1620 is configured to determine, according to the light intensity of the current shooting scene, a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array; the target pixel comprises a W pixel or at least one color pixel in an RGBW pixel array;
the phase difference calculation module 1640 is used for acquiring the phase information of the target pixel and calculating the phase difference according to the phase information of the target pixel;
a focus control module 1660 for performing focus control based on the phase difference.
In an embodiment, the target pixel determining module 1620 is further configured to determine a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene and a preset threshold of the light intensity.
In one embodiment, the target pixel determination module 1620 comprises:
the first target pixel determining unit is used for taking at least one color pixel in the RGBW pixel array as a target pixel if the light intensity of the current shooting scene exceeds a first preset threshold value.
In one embodiment, the phase difference calculation module 1640 is configured to obtain the phase information of the sub-pixels of each pixel in the target pixel if the light intensity of the current shooting scene exceeds a second preset threshold, where the second preset threshold is greater than the first preset threshold; and, for two pixels of the same color in the target pixel, calculate the phase difference of the target pixel according to the phase information of each pair of sub-pixels in the two same-color pixels. The two same-color pixels are adjacent along a diagonal of the pixel array, and the sub-pixels of each pair are located in the two same-color pixels respectively and occupy the same position within their respective pixels.
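As an illustrative sketch of this pairing (the 2x2 sub-pixel layout and the per-pair signed difference are assumptions not fixed by the text):

```python
import numpy as np

def paired_phase_differences(pixel_a: np.ndarray, pixel_b: np.ndarray):
    """Pair sub-pixels at identical positions in two diagonally adjacent
    same-color pixels and return one phase difference per pair.

    pixel_a, pixel_b: 2x2 arrays of sub-pixel phase readouts from the
    two same-color pixels; a simple signed difference per pair stands in
    for whatever comparison the phase-difference computation uses.
    """
    assert pixel_a.shape == pixel_b.shape == (2, 2)
    return (pixel_a - pixel_b).ravel().tolist()
```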
In one embodiment, there is provided a focus control apparatus, the apparatus further comprising:
the first target image generation module is used for controlling exposure of the RGBW pixel array and obtaining pixel values of sub-pixels in the RGBW pixel array; acquiring a pixel value of a sub-pixel of a color pixel from a pixel value of the sub-pixel, and performing interpolation operation on the pixel value of the sub-pixel of the color pixel to generate a Bayer array image; acquiring a pixel value of a sub-pixel of the W pixel from pixel values of the sub-pixels, and performing interpolation operation on the pixel value of the sub-pixel of the W pixel to generate a W pixel image; and fusing the Bayer array image and the W pixel image to generate a target image.
In one embodiment, each RGBW pixel array comprises a plurality of pixel units, each pixel unit comprising a plurality of pixels, each pixel comprising a plurality of sub-pixels; the phase difference calculation module 1640 includes:
a first phase difference calculation unit configured to: if the light intensity of the current shooting scene exceeds a first preset threshold and is not greater than a second preset threshold, phase information of sub-pixels of each pixel in the target pixel is obtained;
for each pixel unit, combining the phase information of the sub-pixels located in the same region of the same-color pixel in the first direction to obtain the combined phase information of the same-color pixels in each pixel unit in the first direction, and calculating the phase difference in the first direction according to the combined phase information of each pixel in the first direction; or,
for each pixel unit, combining the phase information of the sub-pixels located in the same region of the same-color pixel in the second direction to obtain the combined phase information of the same-color pixels in each pixel unit in the second direction, and calculating the phase difference in the second direction according to the combined phase information of each pixel in the second direction; the first direction and the second direction are perpendicular to each other.
In one embodiment, the focusing control module 1660 is further configured to perform focusing control based on the phase difference in the first direction if the preview image corresponding to the currently captured scene includes texture features in the second direction; or if the preview image corresponding to the current shooting scene comprises the texture features in the first direction, performing focusing control based on the phase difference in the second direction.
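A sketch of this direction selection follows, using a gradient-energy comparison as an assumed stand-in for the texture-feature detection (the text does not specify how texture direction is detected):

```python
import numpy as np

def choose_phase_difference(preview: np.ndarray, pd_first: float, pd_second: float) -> float:
    """Pick the phase difference whose detection direction can resolve
    the dominant texture of the preview image.

    Strong horizontal gradients indicate texture varying along the
    second (horizontal) direction, which the first-direction
    (left/right) phase difference can resolve, and vice versa; this
    gradient-energy heuristic is illustrative only.
    """
    grad_y, grad_x = np.gradient(preview.astype(np.float32))
    if np.abs(grad_x).sum() >= np.abs(grad_y).sum():
        return pd_first   # texture features in the second direction
    return pd_second      # texture features in the first direction
```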
In one embodiment, there is provided a focus control apparatus, the apparatus further comprising:
the second target image generation module is used for controlling the RGBW pixel array to be exposed and acquiring the pixel values of sub-pixels in the RGBW pixel array;
calculating the pixel value of the color pixel according to the pixel value of the sub-pixel of each color pixel;
carrying out interpolation operation on the pixel values of the color pixels to generate a Bayer array image;
calculating the pixel value of the W pixel according to the pixel value of the sub-pixel of the W pixel, and performing interpolation operation on the pixel value of the W pixel to generate a W pixel image;
and fusing the Bayer array image and the W pixel image to generate a target image.
In one embodiment, the color pixels include R pixels, G pixels, and B pixels; the second target image generation module is further configured to obtain pixel values of sub-pixels of the R pixel, the G pixel and the B pixel from the pixel values of the sub-pixels, combine the pixel values of the sub-pixels of the R pixel to obtain a pixel value of the R pixel, combine the pixel values of the sub-pixels of the G pixel to obtain a pixel value of the G pixel, and combine the pixel values of the sub-pixels of the B pixel to obtain a pixel value of the B pixel.
In an embodiment, the second target image generation module is further configured to obtain a pixel value of a sub-pixel of the W pixel from pixel values of the sub-pixels, and combine the pixel values of the sub-pixels of the W pixel to obtain the pixel value of the W pixel.
In one embodiment, the target pixel determination module 1620 comprises:
and the second target pixel determination unit is used for taking the W pixel in the RGBW pixel array as the target pixel if the light intensity of the current shooting scene is less than or equal to the first preset threshold.
In one embodiment, the phase difference calculation module 1640 includes:
the second phase difference calculation unit is used for acquiring the phase information of each sub-pixel in the W pixel aiming at the W pixel;
for each pixel unit, combining the phase information of the sub-pixels in the same region in the first direction in the W pixel to obtain combined phase information of the W pixel in the first direction, and calculating the phase difference in the first direction according to the combined phase information of the W pixel in the first direction; or,
for each pixel unit, combining the phase information of the sub-pixels in the same area in the second direction in the W pixel to obtain combined phase information of the W pixel in the second direction, and calculating the phase difference in the second direction according to the combined phase information of the W pixel in the second direction; the first direction and the second direction are perpendicular to each other.
In one embodiment, the image sensor comprises a plurality of RGBW pixel arrays arranged in an array, each RGBW pixel array comprises a plurality of pixel units, each pixel unit comprises a W pixel arranged in a diagonal line and a color pixel arranged in another diagonal line, and each pixel corresponds to one micro lens and a plurality of photosensitive elements; each pixel comprises a plurality of sub-pixels arranged in an array, and each sub-pixel corresponds to one photosensitive element; the color pixels include R pixels, G pixels, and B pixels.
In one embodiment, an imaging device is provided, which comprises a lens, a filter and an image sensor, wherein the lens, the filter and the image sensor are sequentially positioned on an incident light path;
the image sensor comprises a plurality of RGBW pixel arrays which are arranged in an array, each RGBW pixel array comprises a plurality of pixel units, each pixel unit comprises a W pixel which is arranged in a diagonal line and a color pixel which is arranged in another diagonal line, and each pixel corresponds to one micro lens and a plurality of photosensitive elements; each pixel comprises a plurality of sub-pixels arranged in an array, and each sub-pixel corresponds to one photosensitive element; the color pixels include R pixels, G pixels, and B pixels.
In one embodiment, the plurality of photosensitive elements corresponding to the pixels are arranged in a centrosymmetric manner.
It should be understood that, although the steps in the above flowcharts are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the execution order of these steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in the above flowcharts may include multiple sub-steps or stages. These sub-steps or stages are not necessarily completed at the same moment but may be performed at different moments, and they are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
The division of the modules in the focusing control device is merely for illustration, and in other embodiments, the focusing control device may be divided into different modules as needed to complete all or part of the functions of the focusing control device.
For specific definition of the focus control device, reference may be made to the definition of the focus control method above, and details are not repeated here. The modules in the focusing control device can be realized by software, hardware and their combination in whole or in part. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
Fig. 17 is a schematic diagram of the internal structure of an electronic device in one embodiment. The electronic device may be any terminal device such as a mobile phone, a tablet computer, a notebook computer, a desktop computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, or a wearable device. The electronic device includes a processor and a memory connected by a system bus. The processor may include one or more processing units and may be a CPU (Central Processing Unit), a DSP (Digital Signal Processor), or the like. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program can be executed by the processor to implement the focus control method provided in the embodiments. The internal memory provides a cached execution environment for the operating system and the computer program in the non-volatile storage medium.
The implementation of each module in the focus control apparatus provided in the embodiments of the present application may be in the form of a computer program. The computer program may be run on a terminal or a server. Program modules constituted by such computer programs may be stored on the memory of the electronic device. Which when executed by a processor, performs the steps of the method described in the embodiments of the present application.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the focus control method.
Embodiments of the present application also provide a computer program product containing instructions that, when run on a computer, cause the computer to perform a focus control method.
Any reference to memory, storage, a database, or other media used herein may include non-volatile and/or volatile memory. The non-volatile memory may include a ROM (Read-Only Memory), a PROM (Programmable Read-Only Memory), an EPROM (Erasable Programmable Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), or a flash memory. The volatile memory may include RAM (Random Access Memory), which acts as an external cache. By way of illustration and not limitation, RAM is available in many forms, such as SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), SDRAM (Synchronous Dynamic Random Access Memory), DDR SDRAM (Double Data Rate Synchronous Dynamic Random Access Memory), ESDRAM (Enhanced Synchronous Dynamic Random Access Memory), SLDRAM (Synchronous Link Dynamic Random Access Memory), RDRAM (Rambus Dynamic Random Access Memory), and DRDRAM (Direct Rambus Dynamic Random Access Memory).
The above examples only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (18)

1. A focus control method applied to an electronic device including an image sensor including an RGBW pixel array, the method comprising:
determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene; the target pixel comprises a W pixel or at least one color pixel in the RGBW pixel array;
acquiring phase information of the target pixel, and calculating a phase difference according to the phase information of the target pixel;
and performing focusing control based on the phase difference.
2. The method of claim 1, wherein determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene comprises:
and determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene and a preset threshold value of the light intensity.
3. The method of claim 2, wherein the determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene and the preset threshold of the light intensity comprises:
and if the light intensity of the current shooting scene exceeds a first preset threshold value, taking at least one color pixel in the RGBW pixel array as a target pixel.
4. The method of claim 3, wherein the obtaining the phase information collected by the target pixel, and the calculating the phase difference according to the phase information of the target pixel comprises:
if the light intensity of the current shooting scene exceeds a second preset threshold, phase information of sub-pixels of each pixel in the target pixel is obtained; the second preset threshold is greater than the first preset threshold;
for two pixels with the same color in the target pixel, calculating the phase difference of the target pixel according to the phase information of each pair of sub-pixels in the two pixels with the same color; the two pixels with the same color are adjacent along the diagonal line of the pixel array, and each pair of sub-pixels are respectively located in the two pixels with the same color and have the same position in each pixel.
5. The method according to claim 4, wherein after the focus control based on the phase difference, the method further comprises:
controlling exposure of the RGBW pixel array to obtain pixel values of all sub-pixels in the RGBW pixel array;
acquiring the pixel value of the sub-pixel of the color pixel from the pixel value of the sub-pixel, and performing interpolation operation on the pixel value of the sub-pixel of the color pixel to generate a Bayer array image;
acquiring a pixel value of a sub-pixel of the W pixel from the pixel values of the sub-pixels, and performing interpolation operation on the pixel value of the sub-pixel of the W pixel to generate a W pixel image;
and fusing the Bayer array image and the W pixel image to generate a target image.
6. The method of claim 3, wherein each of the RGBW pixel arrays comprises a plurality of pixel cells, each of the pixel cells comprises a plurality of pixels, each of the pixels comprises a plurality of sub-pixels; the acquiring the phase information acquired by the target pixel and calculating the phase difference according to the phase information of the target pixel comprise:
if the light intensity of the current shooting scene exceeds the first preset threshold and is not greater than the second preset threshold, phase information of sub-pixels of each pixel in the target pixel is obtained;
for each pixel unit, combining phase information of sub-pixels located in the same region of the same color pixel in the first direction to obtain combined phase information of the same color pixel in the first direction in each pixel unit, and calculating a phase difference in the first direction according to the combined phase information of each pixel in the first direction; or,
for each pixel unit, combining phase information of sub-pixels, located in the same region in the same pixel with the same color in the second direction, in the pixels with the same color to obtain combined phase information of the pixels with the same color in the second direction in each pixel unit, and calculating a phase difference in the second direction according to the combined phase information of each pixel in the second direction; the first direction and the second direction are perpendicular to each other.
7. The method of claim 6, wherein performing focus control based on the phase difference comprises:
if the preview image corresponding to the current shooting scene comprises the texture features in the second direction, performing focusing control based on the phase difference in the first direction; or,
and if the preview image corresponding to the current shooting scene comprises the texture features in the first direction, performing focusing control based on the phase difference in the second direction.
8. The method of claim 7, wherein after the focus control based on the phase difference, the method further comprises:
controlling exposure of the RGBW pixel array to obtain pixel values of the sub-pixels in the RGBW pixel array;
calculating a pixel value of each color pixel according to a pixel value of a sub-pixel of the color pixel;
carrying out interpolation operation on the pixel values of the color pixels to generate a Bayer array image;
calculating the pixel value of the W pixel according to the pixel value of the sub-pixel of the W pixel, and performing interpolation operation on the pixel value of the W pixel to generate a W pixel image;
and fusing the Bayer array image and the W pixel image to generate a target image.
9. The method of claim 8, wherein the color pixels comprise R pixels, G pixels, and B pixels; the calculating the pixel value of the color pixel according to the pixel value of the sub-pixel of each color pixel comprises:
acquiring pixel values of sub-pixels of an R pixel, a G pixel and a B pixel from pixel values of the sub-pixels, combining the pixel values of the sub-pixels of the R pixel to obtain the pixel value of the R pixel, combining the pixel values of the sub-pixels of the G pixel to obtain the pixel value of the G pixel, and combining the pixel values of the sub-pixels of the B pixel to obtain the pixel value of the B pixel.
10. The method of claim 8, wherein the calculating the pixel value of the W pixel from the pixel values of the sub-pixels of the W pixel comprises:
and acquiring the pixel value of the sub-pixel of the W pixel from the pixel values of the sub-pixels, and combining the pixel values of the sub-pixels of the W pixel to obtain the pixel value of the W pixel.
11. The method of claim 2, wherein the determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene and the preset threshold of the light intensity comprises:
and if the light intensity of the current shooting scene is smaller than or equal to a first preset threshold value, taking the W pixel in the RGBW pixel array as a target pixel.
12. The method of claim 11, wherein each RGBW pixel array comprises a plurality of pixel units, wherein obtaining the phase information collected by the target pixel, and wherein calculating the phase difference from the phase information of the target pixel comprises:
for the W pixel, acquiring phase information of each sub-pixel in the W pixel;
for each pixel unit, combining the phase information of the sub-pixels in the same region in the W pixel in the first direction to obtain combined phase information of the W pixel in the first direction, and calculating the phase difference in the first direction according to the combined phase information of the W pixel in the first direction; or,
for each pixel unit, combining the phase information of the sub-pixels in the same area in the W pixel in the second direction to obtain combined phase information of the W pixel in the second direction, and calculating the phase difference in the second direction according to the combined phase information of the W pixel in the second direction; the first direction and the second direction are perpendicular to each other.
13. The method of claim 1, wherein the image sensor comprises a plurality of RGBW pixel arrays arranged in an array, each RGBW pixel array comprises a plurality of pixel units, each pixel unit comprises a W pixel arranged in a diagonal and a color pixel arranged in another diagonal, and each pixel corresponds to one microlens and a plurality of photosensitive elements; each pixel comprises a plurality of sub-pixels arranged in an array, and each sub-pixel corresponds to one photosensitive element; the color pixels include R pixels, G pixels and B pixels.
14. An imaging device comprises a lens, an optical filter and an image sensor, and is characterized in that the lens, the optical filter and the image sensor are sequentially positioned on an incident light path;
the image sensor comprises a plurality of RGBW pixel arrays arranged in an array, each RGBW pixel array comprises a plurality of pixel units, each pixel unit comprises a W pixel arranged in a diagonal line and a color pixel arranged in another diagonal line, and each pixel corresponds to one micro lens and a plurality of photosensitive elements; each pixel comprises a plurality of sub-pixels arranged in an array, and each sub-pixel corresponds to one photosensitive element; the color pixels include R pixels, G pixels and B pixels.
15. The imaging apparatus of claim 14, wherein the plurality of photosensitive elements corresponding to the pixels are arranged in a centrosymmetric manner.
16. A focus control apparatus applied to an electronic device including an image sensor including an RGBW pixel array, the apparatus comprising:
the target pixel determining module is used for determining a target pixel corresponding to the light intensity of the current shooting scene from the RGBW pixel array according to the light intensity of the current shooting scene; the target pixel comprises a W pixel or at least one color pixel in the RGBW pixel array;
the phase difference calculation module is used for acquiring the phase information of the target pixel and calculating the phase difference according to the phase information of the target pixel;
and the focusing control module is used for carrying out focusing control on the basis of the phase difference.
17. An electronic device comprising a memory and a processor, the memory having a computer program stored thereon, wherein the computer program, when executed by the processor, causes the processor to perform the steps of the focus control method according to any one of claims 1 to 13.
18. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 13.
CN202110909146.8A 2021-08-09 2021-08-09 Focus control method, device, imaging apparatus, electronic apparatus, and computer-readable storage medium Pending CN113660415A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110909146.8A CN113660415A (en) 2021-08-09 2021-08-09 Focus control method, device, imaging apparatus, electronic apparatus, and computer-readable storage medium
PCT/CN2022/103859 WO2023016144A1 (en) 2021-08-09 2022-07-05 Focusing control method and apparatus, imaging device, electronic device, and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110909146.8A CN113660415A (en) 2021-08-09 2021-08-09 Focus control method, device, imaging apparatus, electronic apparatus, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN113660415A true CN113660415A (en) 2021-11-16

Family

ID=78478635

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110909146.8A Pending CN113660415A (en) 2021-08-09 2021-08-09 Focus control method, device, imaging apparatus, electronic apparatus, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN113660415A (en)
WO (1) WO2023016144A1 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113660415A (en) * 2021-08-09 2021-11-16 Oppo广东移动通信有限公司 Focus control method, device, imaging apparatus, electronic apparatus, and computer-readable storage medium
CN113891006A (en) * 2021-11-22 2022-01-04 Oppo广东移动通信有限公司 Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013055623A (en) * 2011-09-06 2013-03-21 Sony Corp Image processing apparatus, image processing method, information recording medium, and program
CN105210369A (en) * 2013-04-17 2015-12-30 法国甫托尼公司 Device for acquiring bimodal images
CN105611125A (en) * 2015-12-18 2016-05-25 广东欧珀移动通信有限公司 Imaging method, imaging device and electronic device
CN110087065A (en) * 2019-04-30 2019-08-02 德淮半导体有限公司 Semiconductor device and its manufacturing method
CN110996077A (en) * 2019-11-25 2020-04-10 Oppo广东移动通信有限公司 Image sensor, camera assembly and mobile terminal
CN111741277A (en) * 2020-07-13 2020-10-02 深圳市汇顶科技股份有限公司 Image processing method and image processing device
CN112235494A (en) * 2020-10-15 2021-01-15 Oppo广东移动通信有限公司 Image sensor, control method, imaging apparatus, terminal, and readable storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023016144A1 (en) * 2021-08-09 2023-02-16 Oppo广东移动通信有限公司 Focusing control method and apparatus, imaging device, electronic device, and computer readable storage medium
CN113891006A (en) * 2021-11-22 2022-01-04 Oppo广东移动通信有限公司 Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium
CN114222047A (en) * 2021-12-27 2022-03-22 Oppo广东移动通信有限公司 Focus control method, apparatus, image sensor, electronic device, and computer-readable storage medium
WO2023124611A1 (en) * 2021-12-27 2023-07-06 Oppo广东移动通信有限公司 Focus control method and apparatus, image sensor, electronic device, and computer-readable storage medium

Also Published As

Publication number Publication date
WO2023016144A1 (en) 2023-02-16

Similar Documents

Publication Publication Date Title
JP6878604B2 (en) Imaging method and electronic device
CN108141571B (en) Maskless phase detection autofocus
CN113660415A (en) Focus control method, device, imaging apparatus, electronic apparatus, and computer-readable storage medium
US20190281226A1 (en) Image sensor including phase detection pixels and image pickup device
US20150098005A1 (en) Image sensor and image capturing system
CN110959285B (en) Imaging system, imaging method, and non-transitory machine-readable storage medium
CN112866549B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112866511B (en) Imaging assembly, focusing method and device and electronic equipment
WO2023087908A1 (en) Focusing control method and apparatus, image sensor, electronic device, and computer readable storage medium
CN112866675B (en) Depth map generation method and device, electronic equipment and computer-readable storage medium
JP2017138199A (en) Image processing device, imaging device, and image processing method
CN112866655B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112866510B (en) Focusing method and device, electronic equipment and computer readable storage medium
WO2023124611A1 (en) Focus control method and apparatus, image sensor, electronic device, and computer-readable storage medium
CN113676617B (en) Motion detection method, motion detection device, electronic device and computer-readable storage medium
CN112866554B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866546B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866548B (en) Phase difference acquisition method and device and electronic equipment
CN112866547B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866552B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112862880A (en) Depth information acquisition method and device, electronic equipment and storage medium
CN112866550B (en) Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium
CN112866674B (en) Depth map acquisition method and device, electronic equipment and computer readable storage medium
CN112866543B (en) Focusing control method and device, electronic equipment and computer readable storage medium
CN117835054A (en) Phase focusing method, device, electronic equipment, storage medium and product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination