CN117835054A - Phase focusing method, device, electronic equipment, storage medium and product

Phase focusing method, device, electronic equipment, storage medium and product

Info

Publication number
CN117835054A
CN117835054A (application CN202211162623.XA)
Authority
CN
China
Prior art keywords
phase
curve
detection image
phase detection
pixels
Legal status
Pending
Application number
CN202211162623.XA
Other languages
Chinese (zh)
Inventor
巫吉辉
Current Assignee
Realme Chongqing Mobile Communications Co Ltd
Original Assignee
Realme Chongqing Mobile Communications Co Ltd
Application filed by Realme Chongqing Mobile Communications Co Ltd
Priority to CN202211162623.XA
Publication of CN117835054A


Abstract

The application relates to a phase focusing method and apparatus, an electronic device, a storage medium and a program product. The phase focusing method includes: acquiring a first phase detection image and a second phase detection image through phase detection pixels in an image sensor; calculating a cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image; calculating a phase difference according to the cross-correlation function curve; and driving the lens to perform phase focusing based on the phase difference. Because a cross-correlation function characterizes the correlation between one signal and time-shifted copies of another signal, the cross-correlation function curve between the pixel values in the first phase detection image and the second phase detection image represents the correlation between those pixel values well. The phase difference can therefore be calculated based on the cross-correlation function curve, which improves the accuracy of the calculated phase difference and, in turn, the focusing accuracy.

Description

Phase focusing method, device, electronic equipment, storage medium and product
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a phase focusing method, a phase focusing device, an electronic device, a storage medium, and a product.
Background
With the development of electronic device technology, more and more users capture images through electronic devices. In order to ensure that the photographed image is sharp, it is generally necessary to focus the image capturing module of the electronic device, that is, to adjust the distance between the lens and the image sensor so that the photographed object lies on the focal plane. A conventional focusing mode is phase detection auto focus (PDAF).
However, the phase difference calculated by the conventional phase detection auto focus method is often inaccurate, and the resulting control of the lens during focusing is therefore poor.
Disclosure of Invention
The embodiments of the application provide a phase focusing method and apparatus, an electronic device, and a computer-readable storage medium, which can improve the phase focusing effect.
In one aspect, a phase focusing method is provided, and is applied to an electronic device, wherein the electronic device comprises an image sensor, and the method comprises the following steps:
acquiring a first phase detection image and a second phase detection image through phase detection pixels in the image sensor;
calculating a cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image;
and calculating a phase difference according to the cross-correlation function curve, and driving the lens to perform phase focusing based on the phase difference.
In one aspect, a phase focusing apparatus is provided, applied to an electronic device, the electronic device including an image sensor, the apparatus including:
a phase detection image acquisition module for acquiring a first phase detection image and a second phase detection image through phase detection pixels in the image sensor;
the cross-correlation function curve calculation module is used for calculating a cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image;
and the phase difference calculating and focusing module is used for calculating a phase difference according to the cross-correlation function curve and driving the lens to perform phase focusing based on the phase difference.
In another aspect, there is provided an electronic device comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the phase focusing method as described above.
In another aspect, a computer readable storage medium is provided, on which a computer program is stored which, when executed by a processor, implements the steps of the phase focusing method as described above.
In another aspect, a computer program product is provided comprising a computer program which, when executed by a processor, implements the steps of the phase focusing method as described above.
With the phase focusing method and apparatus, electronic device, storage medium and product described above, a first phase detection image and a second phase detection image are acquired through the phase detection pixels in the image sensor. A cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image is calculated, a phase difference is calculated according to the cross-correlation function curve, and the lens is driven to perform phase focusing based on the phase difference. Because a cross-correlation function characterizes the correlation between one signal (function) and time-shifted copies of another signal (function), the cross-correlation function curve between the pixel values in the two phase detection images represents the correlation between those pixel values well, so the phase difference can be calculated based on the cross-correlation function curve with improved accuracy. Finally, the phase difference drives the lens to perform phase focusing, which improves the focusing accuracy.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required by the embodiments or by the description of the prior art are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained from these drawings by a person skilled in the art without inventive effort.
FIG. 1 is a schematic diagram of a phase detection autofocus principle;
FIG. 2 is a schematic diagram of phase detection pixels arranged in pairs among the pixels included in an image sensor;
FIG. 3 is a diagram of an application environment of a phase focusing method according to an embodiment;
FIG. 4 is a flow chart of a phase focusing method in one embodiment;
FIG. 5 is a schematic diagram of a portion of an image sensor according to an embodiment;
FIG. 6 is a schematic diagram of a pixel structure in an embodiment;
FIG. 7 is a flowchart of the method of FIG. 4 for calculating a cross-correlation function curve between pixel values in a first phase detection image and a second phase detection image;
FIG. 8 is a schematic illustration of a Bayer array image obtained by the image sensor shown in FIG. 2 in one embodiment;
FIG. 9 is a schematic diagram of a first phase curve and a second phase curve according to an embodiment;
FIG. 10 is a flowchart of a method for calculating a cross-correlation function curve between a first phase curve and a second phase curve according to the first phase curve and the second phase curve in one embodiment;
FIG. 11 is a schematic diagram of a cross-correlation function curve between a first phase curve and a second phase curve in one embodiment;
FIG. 12 is a schematic diagram of a phase focusing method according to one embodiment;
FIG. 13 is a schematic diagram of a phase focusing method according to another embodiment;
FIG. 14 is a block diagram of a phase focusing apparatus according to an embodiment;
FIG. 15 is a schematic diagram of an internal structure of an electronic device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like, as used herein, may be used to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another element. For example, the first phase detection image may be referred to as a second phase detection image, and similarly, the second phase detection image may be referred to as a first phase detection image, without departing from the scope of the present application. Both the first phase detection image and the second phase detection image are phase detection images, but they are not the same phase detection image. As used herein, "a plurality" may be interpreted as two or more unless explicitly stated otherwise.
Fig. 1 is a schematic diagram of phase detection autofocus (phase detection auto focus, PDAF). As shown in fig. 1, M1 is a position where the image sensor is located when the imaging apparatus is in a focusing state, wherein the focusing state refers to a state of successful focusing. When the image sensor is located at the M1 position, the imaging light rays g reflected by the object W in different directions toward the Lens converge on the image sensor, that is, the imaging light rays g reflected by the object W in different directions toward the Lens are imaged at the same position on the image sensor, at which time the imaging of the image sensor is clear.
M2 and M3 are positions where the image sensor may be when the imaging apparatus is not in an in-focus state, and as shown in fig. 1, when the image sensor is located at the M2 position or the M3 position, imaging light rays g reflected by the object W in different directions toward the Lens are imaged at different positions. Referring to fig. 1, when the image sensor is located at the M2 position, the imaging light rays g reflected by the object W in different directions toward the Lens are respectively imaged at the positions a and B, and when the image sensor is located at the M3 position, the imaging light rays g reflected by the object W in different directions toward the Lens are respectively imaged at the positions C and D, and at this time, the imaging of the image sensor is unclear.
In the PDAF technique, the difference in position between the images formed on the image sensor by imaging light entering the lens from different directions can be acquired; for example, as shown in fig. 1, the difference between position A and position B, or the difference between position C and position D, can be acquired. After this positional difference is obtained, the defocus distance can be obtained according to the difference and the geometric relationship between the lens in the camera and the image sensor, where the defocus distance refers to the distance between the current position of the image sensor and the position where the image sensor should be in the in-focus state; the imaging device may then perform focusing according to the obtained defocus distance.
From this it can be seen that the calculated PD value is 0 at the point of focus; the larger the calculated value, the farther the lens is from the in-focus position, and the smaller the value, the nearer it is to the in-focus position. When PDAF is adopted for focusing, the defocus distance can be obtained by calculating the PD value and then applying the calibrated correspondence between PD values and defocus distances, and the lens is then controlled to move to the in-focus point according to the defocus distance, so as to realize focusing.
In the related art, some phase detection pixel points may be provided in pairs among the pixel points included in the image sensor; as shown in fig. 2, a phase detection pixel point pair (hereinafter referred to as a pixel point pair) A, a pixel point pair B and a pixel point pair C may be provided in the image sensor. In each pixel point pair, one phase detection pixel point is shielded on the left side (Left Shield) and the other phase detection pixel point is shielded on the right side (Right Shield).
For a phase detection pixel point shielded on the left side, only the right-side portion of the imaging beam directed at it can form an image on its photosensitive (unshielded) part; for a phase detection pixel point shielded on the right side, only the left-side portion of the imaging beam directed at it can form an image on its photosensitive (unshielded) part. The imaging beam is thus divided into a left part and a right part, and the phase difference can be obtained by comparing the images formed by the two parts.
However, the phase difference calculated by the above method is not accurate, and thus, the effect of controlling the lens to perform focusing is poor.
In order to solve the problem of poor focusing effect in phase detection auto focus, a phase focusing method is provided in the embodiments of the application. Fig. 3 is a schematic diagram of an application environment of a phase focusing method in an embodiment. As shown in fig. 3, the application environment includes an electronic device 320, and the electronic device 320 includes an image sensor and a lens. The electronic device 320 acquires a first phase detection image and a second phase detection image through the phase detection pixels in the image sensor; calculates a cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image; and calculates a phase difference according to the cross-correlation function curve and drives the lens to perform phase focusing based on the phase difference. The electronic device 320 may be, but is not limited to, a personal computer, notebook computer, smart phone, tablet computer, internet of things device or portable wearable device; the internet of things device may be a smart speaker, smart television, smart air conditioner, smart vehicle-mounted device, smart automobile, etc., and the portable wearable device may be a smart watch, smart bracelet, headset, or the like.
FIG. 4 is a flow chart of a phase focusing method in one embodiment. The phase focusing method in this embodiment is described taking the electronic device in fig. 3 as an example, where the electronic device 320 includes an image sensor and a lens. As shown in fig. 4, the phase focusing method includes steps 420 to 460.
In step 420, a first phase detection image and a second phase detection image are acquired by the phase detection pixels in the image sensor.
In one case, as shown in fig. 2, some phase detection pixel points (PD pixel pairs for short) are arranged in pairs among the pixel points of the image sensor of the electronic device. In another case, fig. 5 is a schematic structural diagram of part of an image sensor in one embodiment. The image sensor includes a plurality of pixel point groups Z arranged in an array; each pixel point group Z includes a plurality of pixel points D arranged in an array, and each pixel point D corresponds to one photosensitive unit. The plurality of pixel points comprises M×N pixel points, where M and N are natural numbers greater than or equal to 2. Each pixel point D includes a plurality of sub-pixel points d arranged in an array; that is, each photosensitive unit may be composed of a plurality of photosensitive elements arranged in an array. A photosensitive element is an element capable of converting an optical signal into an electrical signal; in one embodiment, the photosensitive element may be a photodiode. In this embodiment, each pixel point group Z includes 4 pixel points D arranged in a 2×2 array, and each pixel point D may include 4 sub-pixel points d arranged in a 2×2 array. Each pixel point D thus includes 2×2 photodiodes, arranged in correspondence with the 4 sub-pixel points d. Each photodiode receives an optical signal and performs photoelectric conversion, converting the optical signal into an electrical signal for output. The 4 sub-pixel points d included in each pixel point D are covered by the same color filter, so that each pixel point D corresponds to one color channel, such as a red channel R, a green channel G, or a blue channel B.
As shown in fig. 6, taking a pixel point D that includes sub-pixel point 1, sub-pixel point 2, sub-pixel point 3 and sub-pixel point 4 as an example, signals of sub-pixel point 1 and sub-pixel point 2 may be combined and output, and signals of sub-pixel point 3 and sub-pixel point 4 may be combined and output, thereby constructing two PD pixel pairs along the second direction (i.e., the vertical direction); the PD value (phase difference value) of each sub-pixel point in pixel point D along the second direction can be determined from the phase values of these two PD pixel pairs. Likewise, signals of sub-pixel point 1 and sub-pixel point 3 may be combined and output, and signals of sub-pixel point 2 and sub-pixel point 4 may be combined and output, thereby constructing two PD pixel pairs along the first direction (i.e., the horizontal direction); the PD value of each sub-pixel point in pixel point D along the first direction can be determined from the phase values of these two PD pixel pairs.
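The sub-pixel combinations above can be expressed compactly. Below is a minimal numpy sketch, assuming the fig. 6 layout (sub-pixel 1 top-left, 2 top-right, 3 bottom-left, 4 bottom-right within each 2×2 cell) and a raw frame that stores the sub-pixels directly; the actual readout order is sensor-specific, so treat the indexing as an assumption:

```python
import numpy as np

def build_pd_half_images(raw: np.ndarray):
    """Combine 2x2 sub-pixels into PD half-images.

    Assumed layout per pixel point D: sub-pixel 1 top-left, 2 top-right,
    3 bottom-left, 4 bottom-right.
    """
    s1 = raw[0::2, 0::2].astype(np.int64)  # sub-pixel 1 of every pixel D
    s2 = raw[0::2, 1::2].astype(np.int64)  # sub-pixel 2
    s3 = raw[1::2, 0::2].astype(np.int64)  # sub-pixel 3
    s4 = raw[1::2, 1::2].astype(np.int64)  # sub-pixel 4

    left, right = s1 + s3, s2 + s4   # PD pair along the first (horizontal) direction
    top, bottom = s1 + s2, s3 + s4   # PD pair along the second (vertical) direction
    return (left, right), (top, bottom)
```

For horizontal-direction phase detection, the pair (left, right) can then serve as the first and second phase detection images described below.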
In some embodiments, the first phase detection image and the second phase detection image may be acquired using PD pixel pairs as in fig. 2 or PD pixel pairs as in fig. 5. Each phase detection image consists of the pixel values of the PD pixels located at the same orientation within the PD pixel pairs. For example, with PD pixel pairs as in fig. 2, the first phase detection image may be obtained based on the pixel values of the left-shielded PD pixels, and the second phase detection image based on the pixel values of the right-shielded PD pixels. Of course, the first phase detection image may instead be obtained based on the right-shielded PD pixels and the second phase detection image based on the left-shielded PD pixels, which is not limited in this application. Likewise, left-side shielding may be replaced with upper-side shielding and right-side shielding with lower-side shielding, which is also not limited in this application.
Assuming PD pixel pairs as in fig. 5 are employed, the first phase detection image may be obtained based on the pixel values of two PD pixel pairs along the first direction (i.e., the horizontal direction), and the second phase detection image based on the pixel values of the other two PD pixel pairs along the first direction. Of course, the two PD pixel pairs along the first direction may be replaced by two PD pixel pairs along the second direction (i.e., the vertical direction), which is not limited in this application.
Step 440, calculating a cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image.
The cross-correlation function characterizes the correlation between one signal (function) and a time-shifted copy of another signal (function), and the cross-correlation function curve is the curve corresponding to the cross-correlation function. Here, the cross-correlation function is a function of a time-shift variable, and the time-shifted copy of the other signal is obtained by time-shifting that signal according to the time-shift variable. In this embodiment, the cross-correlation function curve can be calculated from the pixel values in the first phase detection image and the pixel values in the second phase detection image. The correlation reflects the similarity between the two signals (functions).
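For reference, one common discrete form of the cross-correlation between two 1-D phase curves $f_1$ and $f_2$ is the following (an assumed formulation for illustration; the embodiment does not fix a specific formula):

$$R_{12}(\tau) = \sum_{x} f_1(x)\, f_2(x + \tau)$$

where $\tau$ is the time-shift (here, pixel-shift) variable. A difference-based score such as $D_{12}(\tau) = \sum_{x} \lvert f_1(x) - f_2(x + \tau) \rvert$ is also common in PDAF practice; its minimum over $\tau$ marks the best alignment, which matches the selection of the smallest cross-correlation results described in step 460 below.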
In some embodiments, the pixel values in the first phase detection image may be converted into a first phase function, and similarly the pixel values in the second phase detection image may be converted into a second phase function; the correlation between the first phase function and the second phase function is then calculated to obtain a cross-correlation function curve, namely the cross-correlation function curve between the pixel values in the first phase detection image and the second phase detection image.
Step 460, calculating a phase difference according to the cross-correlation function curve, and driving the lens to perform phase focusing based on the phase difference.
Since the cross-correlation function curve is a function curve with respect to the time-shift variable, the phase difference can be calculated based on the cross-correlation results at different time-shift values on the curve. In some embodiments, the smaller, or the smallest, target cross-correlation results may be selected from the cross-correlation results under different time-shift values, and the phase difference is calculated based on these target cross-correlation results.
Finally, the defocus distance is calculated according to the phase difference, and the lens is controlled to move to the in-focus position, or into a preset range around it, according to the defocus distance, so as to realize focusing. In some embodiments, the defocus distance can be obtained from the phase difference and the geometric relationship between the lens and the image sensor in the camera, where the defocus distance refers to the distance between the current position of the image sensor and the position where the image sensor should be in the in-focus state; the imaging device may then perform focusing according to the obtained defocus distance.
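As a minimal sketch of this step, assume `scores` samples the cross-correlation function curve at integer pixel `shifts`, with the smallest score marking the best alignment as described above; the three-point parabolic refinement and the linear PD-to-defocus model are added illustrative assumptions, not prescriptions from the text:

```python
import numpy as np

def phase_difference(shifts: np.ndarray, scores: np.ndarray) -> float:
    """Shift at the extreme of the matching curve, with sub-pixel refinement."""
    i = int(np.argmin(scores))               # smallest target cross-correlation result
    offset = 0.0
    if 0 < i < len(scores) - 1:              # parabolic fit through 3 points
        y0, y1, y2 = scores[i - 1], scores[i], scores[i + 1]
        denom = y0 - 2.0 * y1 + y2
        if denom != 0.0:
            offset = 0.5 * (y0 - y2) / denom
    return float(shifts[i]) + offset

def defocus_distance(pd: float, calibrated_slope: float) -> float:
    """Hypothetical linear calibration: defocus distance proportional to PD."""
    return calibrated_slope * pd
```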
From this it can be seen that the calculated PD value is 0 at the point of focus; the larger the calculated value, the farther the lens is from the in-focus position, and the smaller the value, the nearer it is. When PDAF is adopted for focusing, the defocus distance can be obtained by calculating the PD value and then applying the calibrated correspondence between PD values and defocus distances, and the lens is then controlled to move to the in-focus point according to the defocus distance, so as to realize focusing.
In the process of controlling the lens to focus to the in-focus position or into its preset range, other auto-focus methods may be combined with phase detection auto focus (PDAF). For example, one or more of time-of-flight auto focus (TOFAF), contrast auto focus (CAF), laser focusing, and the like may be used together with phase focusing for hybrid focusing. In some embodiments, TOF may be used first for coarse focusing and then PDAF for fine focusing; or PDAF may be used first for coarse focusing and then CAF for fine focusing, and so on. The scheme of coarse focusing with PDAF followed by fine focusing with CAF combines the speed of phase auto focus with the accuracy of contrast auto focus: phase auto focus first adjusts the lens position quickly, at which point the photographed subject is already nearly sharp; contrast focusing then performs fine adjustment, and because the focus position has been pre-adjusted, the maximum contrast can be found in less time, enabling more accurate focusing.
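The coarse-to-fine sequence can be sketched as control flow. Every callable below is a hypothetical stand-in for a platform camera-driver hook (none of these names come from the source):

```python
def hybrid_focus(measure_pd, move_lens_by, contrast_at, slope, sweep_offsets):
    """PDAF coarse step, then a CAF fine sweep around the PDAF estimate.

    measure_pd(): phase difference from the sensor's PD pixels.
    move_lens_by(d): shift the lens by distance d.
    contrast_at(o): image contrast with the lens offset by o from here.
    slope: calibrated PD-to-defocus factor; sweep_offsets: small CAF range.
    """
    move_lens_by(slope * measure_pd())             # coarse: jump near focus fast
    best = max(sweep_offsets, key=contrast_at)     # fine: peak-contrast search
    move_lens_by(best)
```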
In the embodiment of the application, the first phase detection image and the second phase detection image are acquired through the phase detection pixels in the image sensor. A cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image is calculated, a phase difference is calculated according to the cross-correlation function curve, and the lens is driven to perform phase focusing based on the phase difference. Because a cross-correlation function characterizes the correlation between one signal (function) and time-shifted copies of another signal (function), the cross-correlation function curve between the pixel values in the two phase detection images represents the correlation between those pixel values well, so the phase difference can be calculated based on the cross-correlation function curve with improved accuracy. Finally, the phase difference drives the lens to perform phase focusing, which improves the focusing accuracy.
In the above embodiments, a phase focusing method was described in which the phase difference is obtained by calculating the cross-correlation function curve between the pixel values in the first phase detection image and the second phase detection image. In this embodiment, as shown in fig. 7, the specific implementation of step 440, calculating a cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image, is further described, including:
In step 442, a first phase curve corresponding to the first phase detection image is calculated based on the pixel values in the first phase detection image.
Fig. 8 is a schematic diagram of a Bayer array image obtained by the image sensor shown in fig. 2 in one embodiment. As can be seen, the Bayer array image includes RGB pixel values and the pixel values corresponding to the shielded phase detection pixels. Here, the pixel value corresponding to a shielded pixel may be an RGB pixel value or a gray value, which is not limited in this application.
In some embodiments, the first phase detection image and the second phase detection image are separated from the Bayer array image. If the phase detection pixel pairs of the image sensor shown in fig. 2 consist of left-/right-shielded phase detection pixels, a left phase detection image (left PD image for short), namely the first phase detection image, is obtained based on the pixel values corresponding to the left-shielded phase detection pixel points in the Bayer array image, and a right phase detection image (right PD image for short), namely the second phase detection image, is obtained based on the pixel values corresponding to the right-shielded phase detection pixel points in the Bayer array image.
Then, a first phase curve corresponding to the first phase detection image is calculated from pixel values in the first phase detection image. Here, the abscissa of the first phase curve may be each pixel point in the first phase detection image, and the ordinate of the first phase curve may be the pixel value of each pixel point. Of course, the abscissa of the first phase curve may be a result obtained by merging or screening each pixel point in the first phase detection image, and the ordinate of the first phase curve may be a result obtained by processing the pixel value of each pixel point, which is not limited in this application.
Step 444, calculating a second phase curve corresponding to the second phase detection image according to the pixel values in the second phase detection image.
Similarly, a second phase curve corresponding to the second phase detection image is calculated from the pixel values in the second phase detection image. Here, the abscissa of the second phase curve may be each pixel point in the second phase detection image, and the ordinate of the second phase curve may be the pixel value of each pixel point. Of course, the abscissa of the second phase curve may be a result obtained by merging or screening each pixel point in the second phase detection image, and the ordinate of the second phase curve may be a result obtained by processing the pixel value of each pixel point, which is not limited in this application.
Step 446, calculating a cross-correlation function curve between the first phase curve and the second phase curve according to the first phase curve and the second phase curve, and using the cross-correlation function curve as the cross-correlation function curve between the pixel values in the first phase detection image and the second phase detection image.
Since the cross-correlation function can well characterize the correlation between one signal (function) and time-shifted copies of another signal (function), the second phase curve is first time-shifted based on the time-shift variable to generate a third phase curve. Here, the time-shift variable takes one or more specific time-shift values, and each time-shift value corresponds to one third phase curve; time-shifting the second phase curve based on the time-shift variable therefore generates one or more third phase curves.
Next, for each third phase curve, the correlation result between the first phase curve and that third phase curve is calculated; this correlation result is the cross-correlation result corresponding to the time-shift value of that third phase curve. From the relation between the time-shift values and their corresponding cross-correlation results, the cross-correlation function curve between the first phase curve and the second phase curve is obtained. Finally, since the first phase curve is obtained from the pixel values in the first phase detection image and the second phase curve from the pixel values in the second phase detection image, this curve can be used as the cross-correlation function curve between the pixel values in the first phase detection image and the second phase detection image.
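A minimal sketch of this procedure, assuming both phase curves are equal-length 1-D arrays. The per-shift score here is a mean absolute difference over the overlapping samples; this is an assumption, since the text only requires one cross-correlation result per time-shift value whose extreme marks the best alignment:

```python
import numpy as np

def cross_correlation_curve(first: np.ndarray, second: np.ndarray,
                            max_shift: int = 48):
    """Score the first phase curve against shifted copies of the second.

    Returns (shifts, scores); scores[k] is the cross-correlation result for
    translating the second curve by shifts[k] pixels.
    """
    n = len(first)
    shifts = np.arange(-max_shift, max_shift + 1)
    scores = np.empty(len(shifts))
    for k, s in enumerate(shifts):
        if s >= 0:                        # second curve moved right by s
            a, b = first[s:], second[:n - s]
        else:                             # second curve moved left by |s|
            a, b = first[:n + s], second[-s:]
        scores[k] = np.abs(a - b).mean()  # mean over the overlap only
    return shifts, scores
```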
In the embodiment of the present application, when the cross-correlation function curve is calculated, a first phase curve corresponding to the first phase detection image is first calculated according to the pixel values in the first phase detection image. The first phase detection image is a two-dimensional image while the first phase curve is a one-dimensional curve, so calculating the first phase curve converts the two-dimensional image into a one-dimensional curve, achieving dimension reduction and decreasing the amount of computation in the subsequent phase difference calculation. Next, a second phase curve corresponding to the second phase detection image is calculated according to the pixel values in the second phase detection image; by the same reasoning, the two-dimensional image is converted into a one-dimensional curve, achieving dimension reduction and decreasing the subsequent computation. Finally, the cross-correlation function curve between the first phase curve and the second phase curve is calculated from the two curves and taken as the cross-correlation function curve between the pixel values in the first phase detection image and the second phase detection image.
The first phase curve is obtained from the pixel values in the first phase detection image, the second phase curve from the pixel values in the second phase detection image, and the cross-correlation function well characterizes the correlation between one signal (function) and time-shifted copies of another signal (function). Therefore, the cross-correlation function curve between the first phase curve and the second phase curve can accurately represent the correlation between the pixel values in the first phase detection image and the second phase detection image, and the phase difference can then be accurately calculated based on this curve.
In the above-described embodiments, the process of calculating the cross-correlation function curve between the pixel values in the first phase detection image and the second phase detection image from the first phase curve and the second phase curve is described. In some embodiments, further describing step 442, calculating a first phase curve corresponding to the first phase detection image from pixel values in the first phase detection image, comprising:
summing pixel values of each group of first type pixels in the first phase detection image to generate a first summation result corresponding to each group of first type pixels; each group of first-class pixels includes a plurality of pixels having the same coordinate value in a first direction (horizontal axis);
for each group of first-class pixels, taking a first summation result corresponding to the first-class pixels as a second direction coordinate value, and taking the first direction coordinate value of the first-class pixels as a first direction coordinate value to generate a first phase curve corresponding to a first phase detection image; wherein, the first direction and the second direction form a preset included angle.
As shown in fig. 8, a left phase detection image (left PD image), namely the first phase detection image, and a right phase detection image (right PD image), namely the second phase detection image, are separated from the Bayer array image. Because the left PD image consists of the pixel values corresponding to a plurality of left-shielded phase detection pixel points, these phase detection pixel points can be divided into a plurality of groups of first-class pixels, where each group includes a plurality of pixels having the same coordinate value in the first direction (horizontal axis). The first direction and the second direction form a preset included angle; for example, when the preset included angle is 90 degrees, the first direction is the horizontal direction, the second direction is the vertical direction, the first direction coordinate value is the horizontal-axis coordinate value, and the second direction coordinate value is the vertical-axis coordinate value. Of course, the present application is not limited thereto. The pixels with the same horizontal-axis coordinate value in the left PD image are then divided into one group of first-class pixels; that is, along the vertical direction, the pixels in the same column form one group of first-class pixels.
Then, the pixel values of each group of first-class pixels in the first phase detection image are summed to generate the first summation result corresponding to each group. That is, along the vertical direction, the pixel values of the pixels in the same column of the left PD image are summed, generating the first summation result corresponding to that column of pixels (that group of first-class pixels). Here, the pixel values of all the pixels in each group may be summed to generate the first summation result, or the pixel values of only some of the pixels in each group may be summed; the present application is not limited thereto.
Finally, for each group of first-class pixels, the first summation result corresponding to that group is taken as the second direction coordinate value, and the first direction coordinate value of that group is taken as the first direction coordinate value, so as to generate the first phase curve corresponding to the first phase detection image. For example, as shown in fig. 8, the image size of the left PD image is 3×3; then, for the first group of first-class pixels (the first column of pixels), the pixel values of the first column are summed to generate the first summation result y11 corresponding to that column; similarly, for the second group (the second column of pixels), the pixel values are summed to generate the first summation result y12; and for the third group (the third column of pixels), the pixel values are summed to generate the first summation result y13.
Assuming that the first direction coordinate value of the first column of pixels is x11, a first target point (x11, y11) on the first phase curve corresponding to the first phase detection image is formed with x11 as the first direction coordinate value and the first summation result y11 corresponding to the first column as the second direction coordinate value. Similarly, the first target points (x12, y12) and (x13, y13) are formed. A new coordinate system is established, and the first target points (x11, y11), (x12, y12) and (x13, y13) are plotted in it, generating the first phase curve corresponding to the first phase detection image. Fig. 9 is a schematic diagram of a first phase curve and a second phase curve according to an embodiment; part (a) of fig. 9 shows the first phase curve, where the first direction coordinates represent the first direction coordinates of each group of first-class pixels and the second direction coordinates represent the first summation results corresponding to each group.
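The column-wise summation above reduces to one line of numpy. A toy sketch on a 3×3 first phase detection image (the pixel values are invented for illustration):

```python
import numpy as np

pd_img = np.array([[10, 12, 11],     # toy 3x3 left PD image
                   [ 9, 13, 10],
                   [11, 12,  9]])

x = np.arange(pd_img.shape[1])       # first direction coordinates x11, x12, x13
y = pd_img.sum(axis=0)               # first summation results y11, y12, y13
first_phase_curve = list(zip(x, y))  # first target points (x1i, y1i)
```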
In this embodiment, when the first phase curve is calculated from the pixel values in the first phase detection image, the first phase detection image is a two-dimensional image of considerable size. Summing the pixel values of each group of first-class pixels produces the first summation result for each group, and for each group the first summation result is taken as the second direction coordinate value and the group's first direction coordinate value as the first direction coordinate value, generating the first phase curve corresponding to the first phase detection image. The two-dimensional image is thus converted into a one-dimensional curve, achieving dimension reduction and decreasing the amount of computation in the subsequent phase difference calculation.
In the above embodiment, the process of calculating the first phase curve corresponding to the first phase detection image from the pixel values in the first phase detection image was described. This embodiment further describes another specific implementation of summing the pixel values of each group of first-class pixels in the first phase detection image to generate the first summation result corresponding to each group, including:
extracting first type target pixels from among the first type pixels along a second direction for each group of first type pixels in the first phase detection image;
the pixel values of the first class target pixels are summed to generate a first summed result corresponding to each group of first class pixels.
In some embodiments, the pixels in the first phase detection image are grouped by identical first-direction coordinate values, yielding a plurality of groups of first-class pixels, where each group includes a plurality of pixels having the same first direction coordinate value. When summing the pixel values of each group of first-class pixels in the first phase detection image, the pixel values of only some of the pixels in each group may be summed to generate the first summation result corresponding to that group.
First, for each group of first-class pixels in the first phase detection image, some pixels are extracted from the group. Optionally, the first-class target pixels may be extracted at intervals from the group along the second direction, where interval extraction refers to extracting one first-class target pixel every preset number of first-class pixels. The preset number is not limited here; it may be 1 first-class pixel, 2 first-class pixels, or 5 first-class pixels, which is not limited in this application.
Assuming that the first direction is the horizontal direction, the second direction is the vertical direction, and the first-class target pixels are extracted at an interval of 1 first-class pixel along the second direction, then for each group of first-class pixels in the first phase detection image, the first-class target pixels are extracted along the vertical direction. For example, from the first group of first-class pixels (the first column of pixels) in the first phase detection image, the pixels of the first and third rows are extracted as first-class target pixels.
Then, the pixel values of the first class target pixels are summed to generate a first summation result corresponding to each group of first class pixels. That is, for each group of first-type pixels in the first phase detection image, the pixel values of the first-type target pixels in each group of first-type pixels are summed, and a first summation result corresponding to the group of first-type pixels is generated.
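Reusing the toy pd_img from the sketch above, interval sampling at an interval of 1 first-class pixel keeps the first and third rows of each column before summing; the stride would be a tunable parameter in practice:

```python
stride = 2                       # interval of 1 pixel between kept samples
sampled = pd_img[::stride, :]    # first-class target pixels (rows 1 and 3)
y_sampled = sampled.sum(axis=0)  # first summation results from sampled pixels
```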
In this embodiment of the application, when summing the pixel values of each group of first-class pixels in the first phase detection image, the pixel values of only some of the pixels in each group may be summed to generate the first summation result corresponding to that group. In some embodiments, the first-class target pixels are extracted from each group by interval sampling, and the pixel values of those target pixels are then summed to generate the first summation result corresponding to the group. On the one hand, interval sampling greatly reduces the amount of computation; on the other hand, for scenes with fine stripes, interval sampling can better preserve image detail and improve the accuracy of the subsequent phase difference calculation.
The above embodiment described extracting the first-class target pixels from each group by interval sampling and then summing their pixel values to generate the first summation result corresponding to the group. This embodiment further describes another specific implementation of summing the pixel values of each group of first-class pixels in the first phase detection image to generate the first summation result corresponding to each group, including:
clipping the first phase detection image according to a preset focusing area to generate a first target phase detection image;
the pixel values of the first type pixels of each group in the first target phase detection image are summed to generate a first summation result corresponding to each group of first type pixels.
In some embodiments, after the first phase detection image and the second phase detection image are separated from the bayer array image, pixel values of each group of first type pixels in the first phase detection image may be directly summed to generate a first summation result corresponding to each group of first type pixels. Wherein each group of first-class pixels includes a plurality of pixels having the same first-direction (horizontal axis) coordinate value. Then, for each group of first-class pixels, a first summation result corresponding to the first-class pixels is used as a second direction coordinate value, and a first direction coordinate value of the first-class pixels is used as a first direction coordinate value, so that a first phase curve corresponding to the first phase detection image is generated.
Since the number of pixels included in the first phase detection image is large, the first phase detection image may be clipped to reduce the image to generate the first target phase detection image. In some embodiments, the first phase detection image may be cropped according to a preset focusing area, and the first phase detection image may also be cropped based on an area with higher definition, and the first phase detection image may be cropped based on a central area, which is, of course, not limited in this application. At this time, the size of the first target phase detection image is smaller than that of the first phase detection image, and it is apparent that the calculation amount is greatly reduced in the subsequent phase difference calculation process. Here, the preset focusing area may be a focusing area selected by the user on the preview interface of the camera, or may be a focusing area automatically recognized by the electronic device, which is not limited in this application. In some embodiments, the preset focusing area may be an area including a subject, for example, the subject may be a human face, a human body, an animal, etc., which is not limited in this application.
For example, clipping an image area corresponding to a preset focusing area in the first phase detection image to be used as a first target phase detection image; clipping out a region with higher definition in the first phase detection image to be used as a first target phase detection image; the center area in the first phase detection image is clipped out as the first target phase detection image, which is, of course, not limited in this application.
Then, the pixel values of each group of first-class pixels in the first target phase detection image are summed to generate the first summation result corresponding to each group. Here, the pixel values of all the pixels in each group may be summed, or only those of some pixels in each group; the present application is not limited thereto. When all pixels are summed, the pixel values of the pixels in the same column of the first target phase detection image are summed along the vertical direction to generate the first summation result corresponding to that column (that group of first-class pixels). Alternatively, the first-class target pixels may be extracted along the second direction from each group of first-class pixels in the first target phase detection image, and their pixel values summed to generate the first summation result corresponding to each group.
Finally, for each group of first-class pixels, a first summation result corresponding to the first-class pixels is taken as a second direction coordinate value, and a first direction coordinate value of the first-class pixels is taken as a first direction coordinate value, so that a first phase curve corresponding to the first phase detection image is generated.
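Continuing the toy sketch, cropping to a focus window before summation shrinks every later per-shift computation; the window coordinates here are hypothetical stand-ins for the user-selected or auto-detected focusing area:

```python
r0, r1, c0, c1 = 0, 2, 0, 2              # hypothetical focus window bounds
target_img = pd_img[r0:r1, c0:c1]        # first target phase detection image
y_roi = target_img.sum(axis=0)           # first summation results on the crop
roi_curve = list(zip(range(c0, c1), y_roi))
```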
In this embodiment of the present application, when summing pixel values of each group of first-class pixels in the first phase detection image to generate a first summation result corresponding to each group of first-class pixels, the first phase detection image may be first cropped according to a preset focusing area to generate a first target phase detection image. At this time, the size of the first target phase detection image is smaller than the size of the first phase detection image. And then, summing pixel values of all groups of first type pixels in the first target phase detection image to generate a first summation result corresponding to all groups of first type pixels. Obviously, the calculated amount is greatly reduced in the subsequent phase difference calculation process, and the phase focusing rate is improved.
In the above-described embodiments, the detailed procedure of calculating the first phase curve corresponding to the first phase detection image from the pixel values in the first phase detection image was described. This embodiment further describes step 444, calculating a second phase curve corresponding to the second phase detection image from the pixel values in the second phase detection image, including:
summing pixel values of each group of second type pixels in the second phase detection image to generate a second summation result corresponding to each group of second type pixels; each group of second type pixels comprises a plurality of pixels with the same first direction coordinate value;
For each group of second-class pixels, taking a second summation result corresponding to the second-class pixels as a second direction coordinate value, and taking a first direction coordinate value of the second-class pixels as a first direction coordinate value to generate a second phase curve corresponding to a second phase detection image; the first direction and the second direction form a preset included angle.
As shown in fig. 8, a left phase detection image (left PD image), namely the first phase detection image, and a right phase detection image (right PD image), namely the second phase detection image, are separated from the Bayer array image. Because the right PD image consists of the pixel values corresponding to a plurality of right-shielded phase detection pixel points, these phase detection pixel points can be divided into a plurality of groups of second-class pixels, where each group includes a plurality of pixels having the same coordinate value in the first direction (horizontal axis). The first direction and the second direction form a preset included angle; for example, when the preset included angle is 90 degrees, the first direction is the horizontal direction, the second direction is the vertical direction, the first direction coordinate value is the horizontal-axis coordinate value, and the second direction coordinate value is the vertical-axis coordinate value. Of course, the present application is not limited thereto. The pixels with the same horizontal-axis coordinate value in the right PD image are then divided into one group of second-class pixels; that is, along the vertical direction, the pixels in the same column form one group of second-class pixels.
Then, the pixel values of each group of second-class pixels in the second phase detection image are summed to generate the second summation result corresponding to each group. That is, along the vertical direction, the pixel values of the pixels in the same column of the right PD image are summed, generating the second summation result corresponding to that column of pixels (that group of second-class pixels).
Finally, for each group of second-class pixels, the second summation result corresponding to that group is taken as the second direction coordinate value, and the first direction coordinate value of that group is taken as the first direction coordinate value, so as to generate the second phase curve corresponding to the second phase detection image. For example, as shown in fig. 8, the image size of the right PD image is 3×3; then, for the first group of second-class pixels (the first column of pixels), the pixel values of the first column are summed to generate the second summation result y21 corresponding to that column; similarly, for the second group (the second column of pixels), the pixel values are summed to generate the second summation result y22; and for the third group (the third column of pixels), the pixel values are summed to generate the second summation result y23.
Assuming that the first direction coordinate value of the first column of pixels is x21, a second target point (x21, y21) on the second phase curve corresponding to the second phase detection image is formed with x21 as the first direction coordinate value and the second summation result y21 corresponding to the first column as the second direction coordinate value. Similarly, the second target points (x22, y22) and (x23, y23) are formed. A new coordinate system is established, and the second target points (x21, y21), (x22, y22) and (x23, y23) are plotted in it, generating the second phase curve corresponding to the second phase detection image. As shown in part (b) of fig. 9, the first direction coordinates of the second phase curve represent the first direction coordinates of each group of second-class pixels, and the second direction coordinates represent the second summation results corresponding to each group.
In this embodiment, when calculating the second phase curve corresponding to the second phase detection image from the pixel values in the second phase detection image, the second phase detection image is a two-dimensional image with a large size. The pixel values of each group of second-class pixels in the second phase detection image are therefore summed to generate the second summation result corresponding to that group, and, for each group, the second summation result is used as the second direction coordinate value and the group's first direction coordinate value as the first direction coordinate value, generating the second phase curve corresponding to the second phase detection image. In this way, the two-dimensional image is converted into a one-dimensional curve, realizing dimension-reduction processing and reducing the amount of calculation in the subsequent phase difference calculation.
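To make this dimension-reduction step concrete, the following is a minimal sketch in Python/NumPy, assuming the separated PD image is already available as a 2-D array; the function name and the example pixel values are illustrative and not part of the embodiment.

```python
import numpy as np

def pd_image_to_phase_curve(pd_image: np.ndarray) -> np.ndarray:
    """Collapse a 2-D phase detection image into a 1-D phase curve.

    Pixels sharing the same first-direction (horizontal) coordinate form
    one group; summing each group along the second (vertical) direction
    yields one curve point per column, i.e. the (x, y) target points.
    """
    # axis=0 sums down each column, turning an (H, W) image into a
    # length-W curve indexed by the first-direction coordinate.
    return pd_image.sum(axis=0)

# Example with a 3x3 right PD map as in fig. 8 (pixel values assumed):
right_pd = np.array([[1, 2, 3],
                     [4, 5, 6],
                     [7, 8, 9]])
second_phase_curve = pd_image_to_phase_curve(right_pd)  # [12, 15, 18]
```

The same routine applied to the first phase detection image produces the first phase curve.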
In the above-described embodiments, a detailed procedure of calculating a first phase curve corresponding to a first phase detection image from pixel values in the first phase detection image and calculating a second phase curve corresponding to a second phase detection image from pixel values in the second phase detection image is described. In this embodiment, as shown in fig. 10, step 446 is further described, which calculates a cross-correlation function curve between the first phase curve and the second phase curve according to the first phase curve and the second phase curve, including:
in step 446a, in the first direction, the second phase curves are translated according to a plurality of preset translation distances, so as to generate a plurality of third phase curves.
Wherein the cross-correlation function is used for representing the correlation between one signal (function) and a time-shifted signal (function) of another signal, and the cross-correlation function curve is a curve corresponding to the cross-correlation function. Therefore, when calculating the cross-correlation function curve between the first phase curve and the second phase curve, the time-shifted signal corresponding to the second phase curve, that is, the third phase curve, can be calculated with the first phase curve as a reference. And then calculating the correlation between the first phase curve and the third phase curve, and further obtaining a cross-correlation function curve between the first phase curve and the second phase curve. The second phase curve may also be used as a reference to calculate a time-shifted signal corresponding to the first phase curve, i.e. a fourth phase curve. And then, calculating the correlation between the fourth phase curve and the second phase curve, and further obtaining a cross-correlation function curve between the first phase curve and the second phase curve.
In some embodiments, since the first direction coordinates of the second phase curve represent the first direction coordinates of each group of second-class pixels, the second phase curve may be translated in the first direction according to a plurality of preset translation distances, generating a plurality of third phase curves. Assuming the first direction is the horizontal direction and the second direction is the vertical direction, the second phase curve is translated horizontally according to the plurality of preset translation distances. The specific values of the preset translation distances may be determined according to the actual shooting situation or according to an empirical value; for example, the empirical range may be set between -48 pixels and +48 pixels, although the present application is of course not limited thereto. Here, -48 pixels means shifting the second phase curve 48 pixels to the left in the first direction (the horizontal direction, i.e., the horizontal axis of the second phase curve), and +48 pixels means shifting it 48 pixels to the right.
In this way, in the first direction (horizontal direction or horizontal axis direction of the second phase curve), the second phase curve is translated according to different preset translation distances S, so as to generate a plurality of different third phase curves. Fig. 10 is a schematic diagram of a third phase curve generated by performing a translation process on the second phase curve according to a preset translation distance S1 in a horizontal direction in an embodiment.
In step 446b, for each third phase curve, a cross-correlation operation is performed on the first phase curve and the third phase curve, and a cross-correlation function curve between the first phase curve and the second phase curve is generated.
After the plurality of different third phase curves are generated, a cross-correlation operation is performed on the first phase curve and each third phase curve, generating a cross-correlation operation result corresponding to that third phase curve, in other words, corresponding to its preset translation distance S. Here, the cross-correlation operation is an operation that calculates the correlation or similarity between the first phase curve and the third phase curve. That is, assuming m different third phase curves are generated with preset translation distances S1, S2, …, Sm, cross-correlation operation results corresponding to S1, S2, …, Sm are generated. The cross-correlation operation result characterizes the degree of correlation or similarity between the first phase curve and the third phase curve: the higher the correlation, that is, the smaller the phase difference between the two curves, the smaller the corresponding cross-correlation operation result.
Finally, based on the corresponding relation between the preset translation distance S and the cross-correlation operation result, a function curve can be constructed, wherein the function curve is the cross-correlation function curve between the first phase curve and the second phase curve.
In this embodiment of the present application, when calculating the cross-correlation function curve between the first phase curve and the second phase curve, the second phase curve is first translated in the first direction according to a plurality of preset translation distances to generate a plurality of third phase curves. Then, for each third phase curve, a cross-correlation operation is performed on the first phase curve and that third phase curve, yielding the cross-correlation operation result for the corresponding preset translation distance. Collecting the results for all the preset translation distances yields the cross-correlation function curve between the first phase curve and the second phase curve.
The higher the correlation or similarity between the first phase curve and the third phase curve, that is, the smaller the phase difference between them, the smaller the corresponding cross-correlation operation result. Therefore, the smaller cross-correlation operation results can be screened out from the results corresponding to the plurality of different preset translation distances, and the phase difference between the first phase curve and the second phase curve can be calculated from the preset translation distances corresponding to those smaller results. Because the cross-correlation operation suppresses phase noise well, calculating the phase difference in this way greatly improves its accuracy.
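The translation-and-scoring loop of steps 446a and 446b could be sketched as follows, assuming the two phase curves are equal-length 1-D NumPy arrays longer than the search window. The ±48-pixel window follows the empirical range given above, the `score` argument stands in for whichever cross-correlation operation is chosen (e.g. the SAD described in a later embodiment), and the handling of the non-overlapping curve ends is an implementation assumption.

```python
import numpy as np

def shifted_pairs(first_curve, second_curve, s):
    """Return the target point pairs: samples of the first phase curve
    and of the second curve translated by s (the third phase curve)
    that share the same first-direction coordinate."""
    n = len(first_curve)
    if s >= 0:
        return first_curve[s:], second_curve[:n - s]
    return first_curve[:n + s], second_curve[-s:]

def cross_correlation_curve(first_curve, second_curve, score, max_shift=48):
    """One (S, C) point per preset translation distance S in
    [-max_shift, +max_shift]; C scores the first curve against the
    shifted second curve (smaller C = more similar)."""
    shifts = np.arange(-max_shift, max_shift + 1)
    values = [score(*shifted_pairs(first_curve, second_curve, int(s)))
              for s in shifts]
    return shifts, np.array(values)
```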
In one embodiment, further describing step 446b, for each third phase curve, performing a cross-correlation operation on the first phase curve and the third phase curve to generate a cross-correlation function curve between the first phase curve and the second phase curve, including:
for each third phase curve, calculating a cross-correlation value between the first phase curve and the third phase curve; the cross-correlation value comprises a cross-correlation value between second direction coordinate values of the target point pairs; the target point pair comprises a first target point on the first phase curve and a second target point which is the same as the first direction coordinate value of the first target point on the third phase curve;
And generating a cross-correlation function curve between the first phase curve and the second phase curve based on the preset translation distance corresponding to each third phase curve and the cross-correlation value between the first phase curve and the third phase curve.
Firstly, in the first direction, the second phase curve is translated according to a plurality of preset translation distances to generate a plurality of third phase curves. Next, for each third phase curve, the cross-correlation value between the first phase curve and the third phase curve is calculated. In some embodiments, the first phase curve includes a plurality of first target points, and the third phase curve includes a plurality of second target points. When calculating the cross-correlation value between the first phase curve and a certain third phase curve, the cross-correlation values between the second direction coordinate values of the target point pairs on the two curves are calculated. A target point pair comprises a first target point on the first phase curve and the second target point on the third phase curve having the same first direction coordinate value as that first target point.
For example, assuming that the first direction coordinate value of the first target point selected from the first phase curve is x, the first direction coordinate value of the second target point selected from the third phase curve to match the first target point is also x. Thus, a target point pair is obtained based on the first target point and the second target point, and for each target point pair, a cross correlation value between the second direction coordinate values of the target point pair is calculated. Based on the cross-correlation values between the second direction coordinate values of all the target point pairs on the first phase curve and the third phase curve, the cross-correlation values between the first phase curve and the third phase curve are obtained. Here, the cross-correlation value between the second direction coordinate values of all the target point pairs may be calculated to obtain the cross-correlation value between the first phase curve and the third phase curve.
And finally, obtaining a preset translation distance S corresponding to each third phase curve and a cross correlation value C between the first phase curve and the third phase curve. Then, the preset translation distance S corresponding to each third phase curve is used as a first direction coordinate value, the cross-correlation value C between the first phase curve and the third phase curve is used as a second direction coordinate value, so that a function curve is obtained, and the function curve is used as a cross-correlation function curve between the first phase curve and the second phase curve. For example, a preset translation distance S1 corresponding to a certain third phase curve is taken as a first direction coordinate value, and a cross correlation value C1 between the first phase curve and the third phase curve is taken as a second direction coordinate value; taking a preset translation distance S2 corresponding to a certain third phase curve as a first direction coordinate value, and taking a cross correlation value C2 between the first phase curve and the third phase curve as a second direction coordinate value; and so on, a function curve is obtained. The function curve is used as a cross-correlation function curve between the first phase curve and the second phase curve.
FIG. 11 is a schematic diagram of a cross-correlation function curve between a first phase curve and a second phase curve in one embodiment. The cross-correlation function curve comprises a plurality of points (S, C), wherein S represents a preset translation distance, and C represents a cross-correlation value between a first phase curve and a third phase curve after the second phase curve is translated according to the preset translation distance.
In the embodiment of the present application, when performing the cross-correlation operation on the first phase curve and each third phase curve to generate the cross-correlation function curve between the first phase curve and the second phase curve, in some embodiments, the cross-correlation value between the first phase curve and each third phase curve is calculated first; the cross-correlation value includes the cross-correlation value between the second direction coordinate values of the target point pairs. Then, the cross-correlation function curve between the first phase curve and the second phase curve is generated based on the preset translation distance corresponding to each third phase curve and the cross-correlation value between the first phase curve and that third phase curve. Because the cross-correlation operation suppresses phase noise well, calculating the phase difference based on the cross-correlation function curve between the first phase curve and the second phase curve greatly improves the calculation accuracy of the phase difference.
In the above embodiment, it is described that for each third phase curve, the cross-correlation value between the first phase curve and the third phase curve is calculated, and then the cross-correlation function curve between the first phase curve and the second phase curve is calculated based on the preset translation distance corresponding to each third phase curve and the cross-correlation value between the first phase curve and the third phase curve. In this embodiment, the specific implementation step of calculating the cross-correlation value between the first phase curve and the third phase curve is further described, including:
Calculating a difference value between second direction coordinate values of the target point pairs for each target point pair on the first phase curve and the third phase curve;
and calculating the sum of absolute values of differences of the second direction coordinate values of each target point pair, and taking the sum of absolute values as a cross-correlation value between the first phase curve and the third phase curve.
In some embodiments, for example, assuming that the first direction coordinate value of the first target point (x, y1) selected from the first phase curve is x, the first direction coordinate value of the second target point (x, y2) selected from the third phase curve to match the first target point is also x. A target point pair is thus obtained from the first target point and the second target point. In one implementation, for each target point pair, first, the difference y1-y2 between the second direction coordinate values of the pair is calculated; next, the sum of the absolute values |y1-y2| over all target point pairs is calculated and used as the cross-correlation value, i.e., the SAD (Sum of Absolute Differences), between the first phase curve and the third phase curve.
In another implementation, the cross-correlation value between the second direction coordinate values of the target point pairs is calculated as follows: first, the mean square error of the differences y1-y2 between the second direction coordinate values of the target point pairs is calculated, and the mean square error is used as the cross-correlation value between the first phase curve and the third phase curve. Of course, in other implementations, other arithmetic operations may be performed on the differences y1-y2, which is not limited in this application.
In this embodiment, when calculating the cross-correlation value between the first phase curve and each third phase curve, in some embodiments the difference between the second direction coordinate values of each target point pair on the first phase curve and the third phase curve may be calculated; the sum of the absolute values of these differences is then calculated and used as the cross-correlation value between the first phase curve and the third phase curve. Alternatively, the mean square error of the differences may be calculated and used as the cross-correlation value. The above provides multiple ways of calculating the cross-correlation value between the first phase curve and the third phase curve, so that cross-correlation values of different dimensions can be obtained, and multiple cross-correlation function curves between the first phase curve and the second phase curve can be obtained based on them. Different cross-correlation function curves thus represent the phase difference information in more dimensions, improving the accuracy of the finally calculated phase difference and, in turn, the focusing precision.
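As a sketch, the two scoring choices just described can be written as follows, reusing `cross_correlation_curve` from the earlier sketch; the names are again illustrative, and the two scores differ only in the arithmetic applied to the differences y1-y2.

```python
import numpy as np

def sad_score(a, b):
    """Sum of absolute differences |y1 - y2| over the second-direction
    coordinates of all target point pairs (smaller = more similar)."""
    return np.abs(a - b).sum()

def mse_score(a, b):
    """Mean square error variant of the cross-correlation value."""
    return np.mean((a - b) ** 2)

# e.g., with the two 1-D phase curves from the earlier sketch:
# shifts, scores = cross_correlation_curve(first_phase_curve,
#                                          second_phase_curve, sad_score)
```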
In the above embodiment, a procedure of how to obtain various cross-correlation function curves between the first phase curve and the second phase curve is described. In this embodiment, further describing step 460, calculating a phase difference according to the cross-correlation function curve, driving the lens to perform phase focusing based on the phase difference includes:
sorting the cross-correlation values of each point in the cross-correlation function curve according to the order from small to large to generate a sorting result;
and extracting a preset number of third target points which are ranked at the front from the sequencing result, calculating a phase difference according to preset translation distances corresponding to the preset number of third target points, and driving the lens to perform phase focusing based on the phase difference.
In some embodiments, after obtaining multiple cross-correlation function curves between the first phase curve and the second phase curve, the cross-correlation values of the points in the cross-correlation function curves may be ranked in order from small to large, to generate a ranking result. FIG. 11 is a schematic diagram of a cross-correlation function curve between a first phase curve and a second phase curve in one embodiment. The cross-correlation function curve comprises a plurality of points (S, C), wherein a first direction coordinate value (a horizontal axis coordinate value) S represents a preset translation distance, and a second direction coordinate value (a vertical axis coordinate value) C represents a cross-correlation value between a first phase curve and a third phase curve after the second phase curve is translated according to the preset translation distance.
Therefore, the cross-correlation values (vertical axis coordinate values) of the points in the cross-correlation function curve are sorted in order from small to large, and a sorting result is generated. And extracting a preset number of third target points which are ranked at the front from the sequencing result, wherein the preset number of third target points are target points corresponding to smaller cross correlation values. Since the smaller the cross-correlation value on the cross-correlation function curve between the first phase curve and the second phase curve, the closer the preset translation distance corresponding to the smaller cross-correlation value is to the phase difference, the phase difference can be accurately calculated based on the abscissa values (preset translation distances) of the preset number of third target points that are positioned at the front. In some embodiments, the preset number may be one or more. Assuming that the preset number is one here, the abscissa value (preset translation distance) of the point in the cross-correlation function curve where the cross-correlation value is the smallest can be directly used as the phase difference. Furthermore, the phase focusing is performed based on the phase difference driving lens, so that the accuracy of the phase focusing can be greatly improved.
In another implementation manner, when calculating the phase difference according to the preset translation distances corresponding to the preset number of third target points, a parabola may be fitted based on the coordinate values of those third target points, the vertex coordinates of the parabola obtained, and the phase difference calculated from the vertex coordinates. Assuming the preset number is more than one (e.g., three), the abscissa and ordinate values of the top three third target points are obtained, and a parabola y = ax² + bx + c is fitted to them, specifically by calculating the coefficients a, b and c. Finally, the vertex coordinates of the fitted parabola are calculated, and the phase difference is obtained from them; specifically, the abscissa value of the vertex is taken as the phase difference. The abscissa of the vertex need not be an integer multiple of a pixel; it may be, for example, 12.1 pixels. Thus, because the vertex of the parabola is not constrained to integer pixels, the phase difference calculated from it allows the camera to achieve sub-pixel phase focusing, further improving the accuracy of phase focusing.
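A possible implementation of this sub-pixel refinement is sketched below, assuming the cross-correlation curve is given as the `shifts` and `scores` arrays of the earlier sketches and that the parabola is fitted through the minimum point and its two neighbours (one common choice of the "preset number"); the fallback behaviour at the window edges is an assumption.

```python
import numpy as np

def sub_pixel_phase_difference(shifts, scores):
    """Phase difference from the vertex of a parabola fitted to the
    smallest points of the cross-correlation curve."""
    i = int(np.argmin(scores))
    if i == 0 or i == len(scores) - 1:
        # No neighbour on one side: fall back to the integer minimum.
        return float(shifts[i])
    # Fit y = a*x^2 + b*x + c through the minimum and its neighbours.
    a, b, c = np.polyfit(shifts[i - 1:i + 2], scores[i - 1:i + 2], 2)
    if a == 0:
        # Degenerate (flat) fit: fall back to the integer minimum.
        return float(shifts[i])
    # Vertex abscissa -b/(2a) need not be an integer number of pixels.
    return -b / (2.0 * a)
```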
When phase focusing is performed by driving the lens based on the phase difference, the defocus distance may be calculated based on the phase difference. In some embodiments, the correspondence between the phase difference value and the defocus distance value may be obtained by calibration.
The corresponding relation between the defocus distance value and the phase difference value is as follows:
defocus = PD × slope(DCC), where DCC (Defocus Conversion Coefficient) is obtained by calibration, defocus is the defocus distance value, slope is a slope function, and PD is the phase difference value.
The calibration process for the correspondence between the phase difference value and the defocus distance value is as follows: divide the effective focusing stroke of the camera module into 10 equal parts, namely (near-focus DAC - far-focus DAC)/10, so as to cover the focusing range of the motor; focus at each focusing DAC position (the DAC may range from 0 to 1023) and record the phase difference at the current focusing DAC position; after the motor focusing stroke is completed, pair the group of 10 focusing DACs with the obtained PD values; this yields 10 similar ratios, and fitting the two-dimensional data consisting of the DACs and the PDs gives a straight line with slope K.
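A minimal sketch of this conversion and its calibration follows; the linear model matches the description above, but expressing the defocus distance in motor DAC code units and the direction of the fit (DAC as a function of PD) are assumptions for illustration.

```python
import numpy as np

def calibrate_dcc(focus_dacs, measured_pds):
    """Fit the (PD, DAC) pairs recorded over the focusing positions
    with a straight line; its slope K serves as the defocus
    conversion coefficient (DCC)."""
    k, _intercept = np.polyfit(measured_pds, focus_dacs, 1)
    return k

def defocus_distance(pd, dcc):
    """defocus = PD x slope(DCC): convert a measured phase difference
    into a defocus distance for driving the lens."""
    return pd * dcc
```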
In the embodiment of the application, the cross-correlation values (vertical axis coordinate values) of each point in the cross-correlation function curve are ordered in order from small to large, and an ordering result is generated. And extracting a preset number of third target points which are ranked at the front from the sequencing result, wherein the preset number of third target points are target points corresponding to smaller cross correlation values. Since the smaller the cross-correlation value on the cross-correlation function curve between the first phase curve and the second phase curve, the closer the preset translation distance corresponding to the smaller cross-correlation value is to the phase difference, the phase difference can be accurately calculated based on the abscissa values (preset translation distances) of the preset number of third target points that are positioned at the front. Furthermore, the phase focusing is performed based on the phase difference driving lens, so that the accuracy of the phase focusing is improved.
In one embodiment, further describing step 420, acquiring the first phase detection image and the second phase detection image by the phase detection pixels in the image sensor includes:
acquiring a first initial phase detection image and a second initial phase detection image through phase detection pixels in an image sensor;
preprocessing a first initial phase detection image and a second initial phase detection image to generate the first phase detection image and the second phase detection image; the preprocessing includes at least one of brightness compensation and filtering processing.
In some embodiments, the first and second initial phase detection images may be acquired using a pair of PD pixels as in fig. 2 or a pair of PD pixels as in fig. 5, respectively. The initial phase detection image includes pixel values of PD pixels located in the same azimuth in the PD pixel pairs.
After the first initial phase detection image and the second initial phase detection image are acquired, the first initial phase detection image and the second initial phase detection image may be preprocessed to generate the first phase detection image and the second phase detection image. In some embodiments, at least one of the brightness compensation and the filtering process may be performed on the first initial phase detection image and the second initial phase detection image, respectively, to generate the first phase detection image and the second phase detection image.
For example, the first initial phase detection image and the second initial phase detection image may be subjected to brightness compensation, respectively, to generate the first phase detection image and the second phase detection image. In general, since the lens is circular, the light received by the image sensor also presents a circular gradient phenomenon, i.e. a bright middle and dark surrounding, and finally, the brightness of the obtained initial phase detection image also presents a non-uniform phenomenon, i.e. a bright middle and dark surrounding. In order to avoid the problem of inaccurate phase difference caused by calculating the phase difference based on the initial phase detection image with uneven brightness, the initial phase detection image with uneven brightness can be subjected to brightness compensation to obtain the initial phase detection image with even brightness. In some embodiments, the pixel values of the pixels in the initial phase detection image may be multiplied by different gain values, respectively, to perform brightness compensation. For example, since the brightness of the initial phase detection image exhibits a non-uniformity phenomenon in which the middle is bright and the periphery is dark, a larger gain value may be configured for the edge region of the initial phase detection image and a smaller gain value may be configured for the center region of the initial phase detection image. It is assumed that a larger gain value of 3.0 is arranged in the most edge region of the initial phase detection image, a smaller gain value of 1.0 is arranged in the center of the initial phase detection image, and the gain values are sequentially increased from the center to the outside.
For example, the first initial phase detection image and the second initial phase detection image may be subjected to filtering processing, respectively, to eliminate noise in the images, so as to generate the first phase detection image and the second phase detection image. Here, the bilateral filtering mode can be adopted to perform filtering processing, so that the edge details of the image can be better kept while noise is eliminated. Of course, other filtering methods may be used for processing, which is not limited in this application.
Of course, the first initial phase detection image and the second initial phase detection image may be subjected to brightness compensation and filtering, respectively, to generate the first phase detection image and the second phase detection image. Here, the order in which the luminance compensation and the filtering process are performed is not limited.
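As an illustrative sketch of this preprocessing, the radial gain map below is an assumption chosen to match the 1.0-3.0 gain example above, and OpenCV's bilateral filter provides the edge-preserving smoothing; the filter parameters are likewise illustrative.

```python
import numpy as np
import cv2

def preprocess_pd_image(pd_image: np.ndarray) -> np.ndarray:
    """Brightness compensation followed by bilateral filtering."""
    h, w = pd_image.shape
    # Radial gain map: ~1.0 at the centre, rising to ~3.0 at the
    # corners, countering lens shading (bright middle, dark edges).
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - (h - 1) / 2.0, xx - (w - 1) / 2.0)
    gain = 1.0 + 2.0 * (r / r.max())
    compensated = pd_image.astype(np.float32) * gain.astype(np.float32)
    # Edge-preserving noise reduction that keeps the image details
    # carrying the phase information.
    return cv2.bilateralFilter(compensated, d=5,
                               sigmaColor=25.0, sigmaSpace=5.0)
```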
In the embodiment of the application, a first initial phase detection image and a second initial phase detection image are acquired through phase detection pixels in an image sensor. Preprocessing the first initial phase detection image and the second initial phase detection image to generate a first phase detection image and a second phase detection image; the preprocessing includes at least one of brightness compensation and filtering processing. Thus, the obtained initial phase detection image is preprocessed, image noise is eliminated, and the brightness of the image is more uniform. Therefore, the phase difference is calculated based on the phase detection image obtained after the processing, and it is apparent that the accuracy of the calculated phase difference is greatly improved.
In a specific embodiment, as shown in fig. 12, there is provided a phase focusing method, including:
step 1202, acquiring a first initial phase detection image and a second initial phase detection image through phase detection pixels in an image sensor;
step 1204, preprocessing (image calibration) the first initial phase detection image (left PD image) and the second initial phase detection image (right PD image) to generate a first phase detection image (calibrated left PD image) and a second phase detection image (calibrated right PD image); the preprocessing includes at least one of brightness compensation and filtering processing;
fig. 13 is a flowchart of a phase focusing method according to another embodiment. As shown in fig. 13, the right PD pattern and the left PD pattern are separated from the original image captured by the image sensor. And (3) carrying out brightness compensation on the left PD image aiming at the left PD image, and then carrying out filtering processing to obtain a calibrated left PD image. And (3) performing brightness compensation on the right PD image aiming at the right PD image, and performing filtering processing to obtain a calibrated right PD image.
Step 1206, calculating a first phase curve corresponding to the first phase detection image according to the pixel values in the first phase detection image; comprising the following steps:
Step 1206a, clipping the first phase detection image according to a preset focusing area to generate a first target phase detection image;
referring to fig. 13, the calibrated left PD map is clipped according to the preset focusing area, and the first phase curve, i.e., the left PD curve, is calculated based on the cropped left PD map. Likewise, the calibrated right PD map is clipped according to the preset focusing area, and the second phase curve, i.e., the right PD curve, is calculated based on the cropped right PD map.
Step 1206b, summing pixel values of each group of first type pixels in the first target phase detection image to generate a first summation result corresponding to each group of first type pixels; each group of first-class pixels comprises a plurality of pixels with the same first-direction coordinate value;
step 1206c, for each group of first-class pixels, generating a first phase curve corresponding to the first phase detection image by using the first summation result corresponding to the first-class pixels as the second direction coordinate value and the first direction coordinate value of the first-class pixels as the first direction coordinate value; wherein, the first direction and the second direction form a preset included angle.
Step 1208, calculating a second phase curve corresponding to the second phase detection image according to the pixel values in the second phase detection image; comprising the following steps:
Step 1208a, summing pixel values of each group of second type pixels in the second phase detection image to generate a second summation result corresponding to each group of second type pixels; each group of second type pixels comprises a plurality of pixels with the same first direction coordinate value;
step 1208b, for each group of second-class pixels, generating a second phase curve corresponding to the second phase detection image by using the second summation result corresponding to the second-class pixels as the second direction coordinate value and the first direction coordinate value of the second-class pixels as the first direction coordinate value.
Step 1210, performing a translation process on the second phase curve according to a plurality of preset translation distances in a first direction (horizontal axis), so as to generate a plurality of third phase curves;
step 1212, for each third phase curve, calculating a difference between the second direction coordinate values of the target point pairs for each target point pair on the first phase curve and the third phase curve;
step 1214, calculating the sum of absolute values of the differences of the second direction coordinate values of each target point pair, and taking the sum of absolute values as the cross-correlation value between the first phase curve and the third phase curve;
step 1216, generating a cross-correlation function curve between the first phase curve and the second phase curve based on the preset translation distance corresponding to each third phase curve and the cross-correlation value between the first phase curve and the third phase curve, and using the cross-correlation function curve as the cross-correlation function curve between the pixel values in the first phase detection image and the second phase detection image;
Referring to fig. 13, based on steps 1210 to 1216, the cross-correlation function curve between the left PD curve and the right PD curve is calculated from the left PD curve and the right PD curve. Then, based on steps 1218 to 1220, the phase difference is calculated, and the lens is driven to perform phase focusing based on the phase difference.
Step 1218, sorting the cross-correlation values of each point in the cross-correlation function curve according to the order from small to large, and generating a sorting result;
step 1220, extracting the preset number of third target points ranked at the front from the sorting result, calculating the phase difference according to the preset translation distances corresponding to the preset number of third target points, and driving the lens to perform phase focusing based on the phase difference.
In the embodiment of the application, the first phase detection image and the second phase detection image are acquired through the phase detection pixels in the image sensor. And calculating a cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image, calculating a phase difference according to the cross-correlation function curve, and driving the lens to perform phase focusing based on the phase difference. Because the correlation function can better represent the correlation between one signal (function) and the time-shifted signal (function) of the other signal, the correlation function curve between the pixel values in the first phase detection image and the second phase detection image is calculated, and the correlation function curve can better represent the correlation between the pixel values in the first phase detection image and the second phase detection image, so that the phase difference can be calculated based on the correlation function curve, and the accuracy of the calculated phase difference is improved. Finally, the phase difference drives the lens to perform phase focusing, so that the focusing accuracy is improved.
It should be understood that, although the steps in the above-described flowcharts are shown in order as indicated by the arrows, these steps are not necessarily performed in order as indicated by the arrows. The steps are not strictly limited to the order of execution unless explicitly recited herein, and the steps may be executed in other orders. Moreover, at least some of the steps in the flowcharts described above may include a plurality of sub-steps or stages that are not necessarily performed at the same time, but may be performed at different times, and the order of execution of the sub-steps or stages is not necessarily sequential, but may be performed alternately or alternately with at least a part of the sub-steps or stages of other steps or other steps.
In one embodiment, as shown in fig. 14, a phase focusing apparatus 1400 is provided, applied to an electronic device, the electronic device including an image sensor, the phase focusing apparatus 1400 including:
a phase detection image acquisition module 1420 for acquiring a first phase detection image and a second phase detection image by phase detection pixels in the image sensor;
The cross-correlation function curve calculation module 1440 is configured to calculate a cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image;
the phase difference calculating and focusing module 1460 is configured to calculate a phase difference according to the cross correlation function curve, and drive the lens to perform phase focusing based on the phase difference.
In one embodiment, the cross-correlation function curve calculation module 1440 includes:
a first phase curve calculation unit for calculating a first phase curve corresponding to the first phase detection image based on pixel values in the first phase detection image;
a second phase curve calculation unit for calculating a second phase curve corresponding to the second phase detection image based on pixel values in the second phase detection image;
and the cross-correlation function curve calculation unit is used for calculating a cross-correlation function curve between the first phase curve and the second phase curve according to the first phase curve and the second phase curve, and taking the cross-correlation function curve as the cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image.
In one embodiment, the first phase curve calculation unit includes:
a first summation result calculation subunit, configured to sum pixel values of each group of first type pixels in the first phase detection image, and generate a first summation result corresponding to each group of first type pixels; each group of first-class pixels comprises a plurality of pixels with the same first-direction coordinate value;
A first phase curve generating subunit, configured to generate, for each group of first-class pixels, a first phase curve corresponding to the first phase detection image, with a first summation result corresponding to the first-class pixels as a second direction coordinate value, and with a first direction coordinate value of the first-class pixels as a first direction coordinate value; wherein, the first direction and the second direction form a preset included angle.
In an embodiment, the first summation result calculation subunit is further configured to, for each group of first-class pixels in the first phase detection image, extract first-class target pixels from the first-class pixels at intervals along the second direction; and sum the pixel values of the first-class target pixels to generate the first summation result corresponding to each group of first-class pixels.
In one embodiment, the first summation result calculation subunit is further configured to crop the first phase detection image according to a preset focusing area, to generate a first target phase detection image; the pixel values of the first type pixels of each group in the first target phase detection image are summed to generate a first summation result corresponding to each group of first type pixels.
In one embodiment, the second phase curve calculation unit includes:
A second summation result calculation subunit, configured to sum pixel values of each group of second class pixels in the second phase detection image, and generate a second summation result corresponding to each group of second class pixels; each group of second type pixels comprises a plurality of pixels with the same first direction coordinate value;
a second phase curve calculation subunit, configured to generate, for each group of second-class pixels, a second phase curve corresponding to the second-phase detection image by using, as a second direction coordinate value, a second summation result corresponding to the second-class pixel, and using, as a first direction coordinate value, a first direction coordinate value of the second-class pixel; the first direction and the second direction form a preset included angle.
In one embodiment, the cross-correlation function curve calculation unit includes:
the third phase curve generation subunit is used for carrying out translation processing on the second phase curve according to a plurality of preset translation distances in the first direction to generate a plurality of third phase curves;
and the cross-correlation operation subunit is used for carrying out cross-correlation operation on the first phase curve and the third phase curve aiming at each third phase curve to generate a cross-correlation function curve between the first phase curve and the second phase curve.
In one embodiment, the cross-correlation operator unit is further configured to calculate, for each third phase curve, a cross-correlation value between the first phase curve and the third phase curve; the cross-correlation value comprises a cross-correlation value between second direction coordinate values of the target point pairs; the target point pair comprises a first target point on the first phase curve and a second target point which is the same as the first direction coordinate value of the first target point on the third phase curve; and generating a cross-correlation function curve between the first phase curve and the second phase curve based on the preset translation distance corresponding to each third phase curve and the cross-correlation value between the first phase curve and the third phase curve.
In one embodiment, the cross-correlation operator unit is further configured to calculate, for each target point pair on the first phase curve and the third phase curve, a difference value between second direction coordinate values of the target point pair; and calculating the sum of absolute values of differences of the second direction coordinate values of each target point pair, and taking the sum of absolute values as a cross-correlation value between the first phase curve and the third phase curve.
In one embodiment, after calculating the difference between the second direction coordinate values of the target point pairs for each of the target point pairs on the first phase curve and the third phase curve, the cross-correlation operator unit is further configured to:
And calculating the mean square error of the difference value of the coordinate values of the second direction of each target point pair, and taking the mean square error as the cross correlation value between the first phase curve and the third phase curve.
In one embodiment, the phase difference calculating and focusing module 1460 is configured to sort the cross-correlation values of each point in the cross-correlation function curve in order from small to large, and generate a sorting result; and extracting a preset number of third target points which are ranked at the front from the sequencing result, calculating a phase difference according to preset translation distances corresponding to the preset number of third target points, and driving the lens to perform phase focusing based on the phase difference.
In one embodiment, the phase difference calculating and focusing module 1460 is configured to fit a parabola based on coordinate values of a preset number of third target points, and obtain vertex coordinates of the parabola; the phase difference is calculated from the vertex coordinates of the parabola.
In one embodiment, a phase detection image acquisition module 1420 is used to acquire a first initial phase detection image and a second initial phase detection image by phase detection pixels in an image sensor; preprocessing a first initial phase detection image and a second initial phase detection image to generate the first phase detection image and the second phase detection image; the preprocessing includes at least one of brightness compensation and filtering processing.
The division of the various modules in the phase focus device described above is for illustration only, and in other embodiments, the phase focus device may be divided into different modules as desired to perform all or part of the functions of the phase focus device described above.
For specific limitations of the phase focusing apparatus, reference may be made to the above limitations of the phase focusing method, and no further description is given here. The respective modules in the phase focusing apparatus described above may be implemented in whole or in part by software, hardware, and combinations thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
Fig. 15 is a schematic diagram of an internal structure of an electronic device in one embodiment. The electronic device may be any terminal device such as a mobile phone, a tablet computer, a notebook computer, a desktop computer, a PDA (Personal Digital Assistant ), a POS (Point of Sales), a car-mounted computer, and a wearable device. The electronic device includes a processor and a memory connected by a system bus. Wherein the processor may comprise one or more processing units. The processor may be a CPU (Central Processing Unit ) or DSP (Digital Signal Processing, digital signal processor), etc. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program is executable by a processor for implementing a phase focusing method provided in the following embodiments. The internal memory provides a cached operating environment for operating system computer programs in the non-volatile storage medium.
The implementation of each module in the phase focusing apparatus provided in the embodiments of the present application may be in the form of a computer program. The computer program may be run on an electronic device. Program modules of the computer program may be stored in the memory of the electronic device. Which when executed by a processor, performs the steps of the methods described in the embodiments of the present application.
Embodiments of the present application also provide a computer-readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of a phase focusing method.
Embodiments of the present application also provide a computer program product containing instructions that, when run on a computer, cause the computer to perform a phase focusing method.
It should be noted that, user information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. The non-volatile memory may include ROM (Read-Only Memory), PROM (Programmable ROM), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), or flash memory. Volatile memory may include RAM (Random Access Memory), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as SRAM (Static RAM), DRAM (Dynamic RAM), SDRAM (Synchronous DRAM), DDR SDRAM (Double Data Rate SDRAM), ESDRAM (Enhanced SDRAM), SLDRAM (Sync Link DRAM), RDRAM (Rambus DRAM), and DRDRAM (Direct Rambus DRAM).
The foregoing examples represent only a few embodiments of the present application, which are described in more detail and are not thereby to be construed as limiting the scope of the present application. It should be noted that it would be apparent to those skilled in the art that various modifications and improvements could be made without departing from the spirit of the present application, which would be within the scope of the present application. Accordingly, the scope of protection of the present application is to be determined by the claims appended hereto.

Claims (15)

1. A phase focusing method, characterized by being applied to an electronic device, the electronic device including an image sensor, the method comprising:
acquiring a first phase detection image and a second phase detection image through phase detection pixels in the image sensor;
calculating a cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image;
and calculating a phase difference according to the cross correlation function curve, and driving the lens to perform phase focusing based on the phase difference.
2. The method of claim 1, wherein the calculating a cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image comprises:
Calculating a first phase curve corresponding to the first phase detection image according to pixel values in the first phase detection image;
calculating a second phase curve corresponding to the second phase detection image according to the pixel values in the second phase detection image;
and calculating a cross-correlation function curve between the first phase curve and the second phase curve according to the first phase curve and the second phase curve, and taking the cross-correlation function curve as a cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image.
3. The method of claim 2, wherein calculating a first phase curve corresponding to the first phase detection image from pixel values in the first phase detection image comprises:
summing pixel values of each group of first type pixels in the first phase detection image to generate a first summation result corresponding to each group of first type pixels; each group of the first type pixels comprises a plurality of pixels with the same first direction coordinate values;
for each group of the first type pixels, taking a first summation result corresponding to the first type pixels as a second direction coordinate value, and taking the first direction coordinate value of the first type pixels as a first direction coordinate value to generate a first phase curve corresponding to the first phase detection image; wherein, the first direction and the second direction form a preset included angle.
4. A method according to claim 3, wherein summing pixel values of each group of first class pixels in the first phase-detected image generates a first summation result corresponding to each group of first class pixels, comprising:
extracting first type target pixels from the first type pixels at intervals along the second direction for each group of the first type pixels in the first phase detection image;
and summing the pixel values of the first type target pixels to generate first summation results corresponding to each group of first type pixels.
5. A method according to claim 3, wherein summing pixel values of each group of first class pixels in the first phase-detected image generates a first summation result corresponding to each group of first class pixels, comprising:
clipping the first phase detection image according to a preset focusing area to generate a first target phase detection image;
and summing pixel values of each group of first type pixels in the first target phase detection image to generate a first summation result corresponding to each group of first type pixels.
6. The method of claim 2, wherein calculating a second phase curve corresponding to the second phase detection image from pixel values in the second phase detection image comprises:
Summing pixel values of each group of second type pixels in the second phase detection image to generate second summation results corresponding to each group of second type pixels; each group of the second type pixels comprises a plurality of pixels with the same coordinate value in the first direction;
for each group of the second-class pixels, taking a second summation result corresponding to the second-class pixels as a second direction coordinate value, and taking the first direction coordinate value of the second-class pixels as a first direction coordinate value to generate a second phase curve corresponding to the second phase detection image; the first direction and the second direction form a preset included angle.
7. The method according to any of claims 3-6, wherein said calculating a cross-correlation function curve between the first phase curve and the second phase curve from the first phase curve and the second phase curve comprises:
in the first direction, carrying out translation processing on the second phase curves according to a plurality of preset translation distances to generate a plurality of third phase curves;
and performing cross-correlation operation on the first phase curve and the third phase curve for each third phase curve to generate a cross-correlation function curve between the first phase curve and the second phase curve.
8. The method of claim 7, wherein the cross-correlating the first phase curve with the third phase curve for each of the third phase curves to generate a cross-correlation function curve between the first phase curve and the second phase curve comprises:
for each third phase curve, calculating a cross-correlation value between the first phase curve and the third phase curve, wherein the cross-correlation value is calculated between the second-direction coordinate values of target point pairs, each target point pair comprising a first target point on the first phase curve and a second target point on the third phase curve having the same first-direction coordinate value as the first target point; and
generating the cross-correlation function curve between the first phase curve and the second phase curve based on the preset translation distance corresponding to each third phase curve and the cross-correlation value between the first phase curve and that third phase curve.
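The claims do not fix the correlation measure. The sketch below assumes a sum of absolute differences over point pairs sharing a first-direction coordinate, which is consistent with claim 9 sorting values in ascending order (smaller value, better alignment); a product-based correlation would instead be maximized:

```python
import numpy as np

def cross_correlation_curve(first_curve: np.ndarray, second_curve: np.ndarray, shifts):
    # One point per preset translation distance: (distance, correlation value).
    points = []
    for d in shifts:
        third = np.roll(second_curve, d)                  # third phase curve
        value = float(np.abs(first_curve - third).sum())  # assumed SAD-style measure
        points.append((d, value))
    return points
```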
9. The method of claim 1, wherein calculating the phase difference from the cross-correlation function curve and driving the lens based on the phase difference to perform phase focusing comprises:
sorting the cross-correlation values of the points in the cross-correlation function curve in ascending order to generate a sorting result; and
extracting a preset number of front-ranked third target points from the sorting result, calculating the phase difference from the preset translation distances corresponding to the preset number of third target points, and driving the lens based on the phase difference to perform phase focusing.
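A small sketch of claim 9's selection step under the same assumption (ascending sort, so the front-ranked points are the best-aligned shifts):

```python
def front_ranked_points(corr_points, preset_number=3):
    # corr_points: list of (preset translation distance, cross-correlation
    # value) pairs; keep the preset number of front-ranked "third target points".
    return sorted(corr_points, key=lambda p: p[1])[:preset_number]
```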
10. The method of claim 9, wherein calculating the phase difference from the preset translation distances corresponding to the preset number of third target points comprises:
fitting a parabola to the coordinate values of the preset number of third target points and obtaining the vertex coordinates of the parabola; and
calculating the phase difference from the vertex coordinates of the parabola.
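Claim 10's parabola fit admits a closed-form vertex. A sketch, treating x as the preset translation distance and y as the cross-correlation value (reading the phase difference straight off the vertex abscissa is one plausible mapping):

```python
import numpy as np

def phase_difference_from_vertex(target_points) -> float:
    # Fit y = a*x^2 + b*x + c and return the vertex abscissa -b / (2a),
    # giving a sub-shift phase difference estimate.
    xs = np.array([p[0] for p in target_points], dtype=float)
    ys = np.array([p[1] for p in target_points], dtype=float)
    a, b, _c = np.polyfit(xs, ys, 2)
    return -b / (2.0 * a)

pd = phase_difference_from_vertex([(-1, 5.0), (0, 2.0), (1, 3.0)])  # 0.25
```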
11. The method of claim 1, wherein acquiring the first phase detection image and the second phase detection image through the phase detection pixels in the image sensor comprises:
acquiring a first initial phase detection image and a second initial phase detection image through the phase detection pixels in the image sensor; and
preprocessing the first initial phase detection image and the second initial phase detection image to generate the first phase detection image and the second phase detection image, wherein the preprocessing includes at least one of brightness compensation and filtering.
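A sketch of claim 11's preprocessing, with an assumed per-pixel gain map for brightness compensation and a box filter for the filtering step; the claim only requires at least one of the two operations:

```python
import numpy as np

def preprocess(image: np.ndarray, gain_map=None, kernel: int = 3) -> np.ndarray:
    # Brightness compensation (optional gain map), then a mean filter.
    out = image * gain_map if gain_map is not None else image.astype(float)
    pad = kernel // 2
    padded = np.pad(out, pad, mode="edge")
    filtered = np.zeros_like(out, dtype=float)
    for dy in range(kernel):
        for dx in range(kernel):
            filtered += padded[dy:dy + out.shape[0], dx:dx + out.shape[1]]
    return filtered / (kernel * kernel)
```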
12. A phase focusing apparatus, applied to an electronic device including an image sensor, the apparatus comprising:
a phase detection image acquisition module, configured to acquire a first phase detection image and a second phase detection image through phase detection pixels in the image sensor;
a cross-correlation function curve calculation module, configured to calculate a cross-correlation function curve between pixel values in the first phase detection image and the second phase detection image; and
a phase difference calculation and focusing module, configured to calculate a phase difference from the cross-correlation function curve and drive a lens based on the phase difference to perform phase focusing.
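Tying claim 12's three modules together, a toy end-to-end composition; the class name, the drive_lens callback, and all numeric choices are illustrative, and the SAD-style measure assumed earlier is reused:

```python
import numpy as np

class PhaseFocuser:
    def __init__(self, shifts, drive_lens):
        self.shifts = list(shifts)
        self.drive_lens = drive_lens  # callable accepting a phase difference

    def focus(self, first_image: np.ndarray, second_image: np.ndarray) -> float:
        # Module 1 output -> phase curves; module 2 -> correlation curve;
        # module 3 -> phase difference and lens drive.
        first_curve = first_image.sum(axis=0)
        second_curve = second_image.sum(axis=0)
        corr = [(d, float(np.abs(first_curve - np.roll(second_curve, d)).sum()))
                for d in self.shifts]
        best = sorted(corr, key=lambda p: p[1])[:3]
        xs = np.array([p[0] for p in best], dtype=float)
        ys = np.array([p[1] for p in best], dtype=float)
        a, b, _ = np.polyfit(xs, ys, 2)
        pd = float(-b / (2.0 * a))
        self.drive_lens(pd)
        return pd

focuser = PhaseFocuser(range(-8, 9), drive_lens=lambda pd: print("phase difference:", pd))
```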
13. An electronic device comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the phase focusing method of any one of claims 1 to 11.
14. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the phase focusing method of any one of claims 1 to 11.
15. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the steps of the phase focusing method of any one of claims 1 to 11.
CN202211162623.XA, filed 2022-09-23: Phase focusing method, device, electronic equipment, storage medium and product (CN117835054A, pending)

Priority / Applications Claiming Priority (1)

Application Number: CN202211162623.XA
Priority Date / Filing Date: 2022-09-23
Title: Phase focusing method, device, electronic equipment, storage medium and product
Publication (1)

Publication Number: CN117835054A
Publication Date: 2024-04-05

Family

Family ID: 90513863
Family Application (1): CN202211162623.XA, pending, filed 2022-09-23
Country Status (1): CN, CN117835054A

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination