CN112866550B - Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium


Info

Publication number
CN112866550B
CN112866550B
Authority
CN
China
Prior art keywords
phase difference
image
pixel point
pixel
target
Prior art date
Legal status
Active
Application number
CN201911101457.0A
Other languages
Chinese (zh)
Other versions
CN112866550A (en)
Inventor
贾玉虎
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911101457.0A
Priority to PCT/CN2020/122790 (WO2021093537A1)
Publication of CN112866550A
Application granted
Publication of CN112866550B
Legal status: Active

Classifications

    • H  ELECTRICITY
    • H04  ELECTRIC COMMUNICATION TECHNIQUE
    • H04N  PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00  Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60  Control of cameras or camera modules
    • H04N23/67  Focus control based on electronic image sensor signals
    • H04N23/672  Focus control based on electronic image sensor signals based on the phase difference signals
    • H  ELECTRICITY
    • H04  ELECTRIC COMMUNICATION TECHNIQUE
    • H04N  PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00  Cameras or camera modules comprising electronic image sensors; Control thereof
    • H  ELECTRICITY
    • H04  ELECTRIC COMMUNICATION TECHNIQUE
    • H04N  PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00  Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60  Control of cameras or camera modules
    • H  ELECTRICITY
    • H04  ELECTRIC COMMUNICATION TECHNIQUE
    • H04N  PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00  Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80  Camera processing pipelines; Components thereof

Abstract

The application relates to a phase difference acquisition method and apparatus, an electronic device, and a computer-readable storage medium. The method comprises the following steps: acquiring a first phase difference and a second phase difference of a pixel point, wherein the first direction corresponding to the first phase difference and the second direction corresponding to the second phase difference form a preset included angle; acquiring the position of the pixel point in the image; and determining the target phase difference of the pixel point according to the position of the pixel point in the image and the first and second phase differences of the pixel point. By adopting the scheme of the application, the accuracy of phase difference acquisition can be improved.

Description

Phase difference acquisition method and device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for acquiring a phase difference, an electronic device, and a computer-readable storage medium.
Background
When an image is captured, focusing is generally required on the electronic device to ensure that the captured image is clear; focusing refers to the process of adjusting the distance between the lens and the image sensor. A common focusing method at present is phase detection auto focus. To perform phase detection auto focus, phase detection pixel points are generally set in pairs among the pixel points of the image sensor, and the phase differences acquired by these phase detection pixel points are used for focusing. Because aberration exists in the camera, the traditional phase difference acquisition method suffers from low precision.
Disclosure of Invention
The embodiment of the application provides a phase difference acquisition method, a phase difference acquisition device, an electronic device and a computer readable storage medium, which can improve the phase difference acquisition precision.
A phase difference acquisition method, the method comprising:
acquiring a first phase difference and a second phase difference of a pixel point in an image, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
acquiring the position of a pixel point in an image;
and determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point.
A phase difference acquisition apparatus comprising:
the first obtaining module is used for obtaining a first phase difference and a second phase difference of a pixel point in an image, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
the second acquisition module is used for acquiring the positions of the pixel points in the image;
and the target phase difference determining module is used for determining the target phase difference of the pixel points according to the positions of the pixel points in the image and the first phase difference and the second phase difference of the pixel points.
An electronic device comprising a memory and a processor, the memory having a computer program stored thereon, the computer program, when executed by the processor, causing the processor to perform the steps of:
acquiring a first phase difference and a second phase difference of a pixel point in an image, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
acquiring the positions of pixel points in an image;
and determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring a first phase difference and a second phase difference of a pixel point in an image, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
acquiring the positions of pixel points in an image;
and determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point.
According to the phase difference acquisition method and apparatus, the electronic device, and the computer-readable storage medium, a first phase difference and a second phase difference of a pixel point in an image are acquired, where the first direction corresponding to the first phase difference and the second direction corresponding to the second phase difference form a preset included angle, so that one pixel point corresponds to two phase differences. The position of the pixel point in the image is acquired, and the target phase difference of the pixel point is determined according to the position of the pixel point in the image and the first and second phase differences of the pixel point. The target phase difference can thus be determined according to the position of the pixel point, which improves the accuracy of phase difference acquisition, effectively avoids image blurring caused by aberration, and makes imaging clearer.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of image processing circuitry in one embodiment;
FIG. 2 is a flow chart of a phase difference acquisition method in one embodiment;
FIG. 3 is a schematic illustration of a smear in one embodiment;
FIG. 4 is a schematic illustration of an image field range in one embodiment;
FIG. 5 is a diagram illustrating an exemplary angular division of pixels;
FIG. 6 is a flow diagram illustrating the configuration of the first weights and the second weights in one embodiment;
FIG. 7 is a diagram illustrating an exemplary pixel structure;
FIG. 8 is a schematic diagram of an image coordinate system and a pixel coordinate system in one embodiment;
FIG. 9 is a diagram showing an internal structure of an image sensor according to an embodiment;
FIG. 10 is a diagram illustrating a pixel group Z according to an embodiment;
FIG. 11 is a schematic diagram of a process for obtaining a first phase difference and a second phase difference according to one embodiment;
FIG. 12 is a schematic diagram of the determination of pixels at the same location from respective intermediate phase difference maps in one embodiment;
FIG. 13 is a schematic illustration of a target phase difference map in one embodiment;
FIG. 14 is a diagram illustrating a slicing process performed on a target luminance graph in a first direction according to one embodiment;
FIG. 15 is a diagram illustrating a slicing process performed on a target luminance graph in a second direction in one embodiment;
FIG. 16 is a flow chart illustrating the process of determining phase difference values between matched pixels according to one embodiment;
fig. 17 is a block diagram showing a configuration of a phase difference acquisition apparatus according to an embodiment;
fig. 18 is a schematic internal structure diagram of an electronic device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more clearly understood, the present application is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like as used herein may be used to describe various data, but the data are not limited by these terms; these terms are only used to distinguish one datum from another. For example, a first phase difference may be referred to as a second phase difference, and similarly, a second phase difference may be referred to as a first phase difference, without departing from the scope of the present application. The first phase difference and the second phase difference are both phase differences, but they are not the same phase difference.
The embodiment of the application provides electronic equipment. The electronic device includes therein an Image Processing circuit, which may be implemented using hardware and/or software components, and may include various Processing units that define an ISP (Image Signal Processing) pipeline. FIG. 1 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 1, for convenience of explanation, only aspects of the image processing technology related to the embodiments of the present application are shown.
As shown in fig. 1, the image processing circuit includes an ISP processor 140 and control logic 150. The image data captured by the imaging device 110 is first processed by the ISP processor 140, and the ISP processor 140 analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of the imaging device 110. The imaging device 110 may include a camera having one or more lenses 112 and an image sensor 114. The image sensor 114 may include an array of color filters (e.g., Bayer filters), and the image sensor 114 may acquire light intensity and wavelength information captured with each imaged pixel of the image sensor 114 and provide a set of raw image data that may be processed by the ISP processor 140. The attitude sensor 120 (e.g., three-axis gyroscope, hall sensor, accelerometer) may provide parameters of the acquired image processing (e.g., anti-shake parameters) to the ISP processor 140 based on the type of interface of the attitude sensor 120. The attitude sensor 120 interface may utilize an SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the above.
In addition, the image sensor 114 may also send raw image data to the attitude sensor 120; the attitude sensor 120 may provide the raw image data to the ISP processor 140 through the interface of the attitude sensor 120, or store the raw image data in the image memory 130.
The ISP processor 140 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel point may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 140 may perform one or more image processing operations on the raw image data, collecting statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
The ISP processor 140 may also receive image data from the image memory 130. For example, the attitude sensor 120 interface sends raw image data to the image memory 130, and the raw image data in the image memory 130 is then provided to the ISP processor 140 for processing. The image Memory 130 may be a portion of a Memory device, a storage device, or a separate dedicated Memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from the image sensor 114 interface or from the attitude sensor 120 interface or from the image memory 130, the ISP processor 140 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to image memory 130 for additional processing before being displayed. ISP processor 140 receives the processed data from image memory 130 and performs image data processing on the processed data in the raw domain and in the RGB and YCbCr color spaces. The image data processed by ISP processor 140 may be output to display 160 for viewing by a user and/or further processed by a Graphics engine or GPU (Graphics Processing Unit). Further, the output of the ISP processor 140 may also be sent to the image memory 130, and the display 160 may read image data from the image memory 130. In one embodiment, image memory 130 may be configured to implement one or more frame buffers.
The statistics determined by ISP processor 140 may be sent to control logic 150 unit. For example, the statistical data may include image sensor 114 statistics such as vibration frequency of a gyroscope, auto exposure, auto white balance, auto focus, flicker detection, black level compensation, lens 112 shading correction, and the like. Control logic 150 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of imaging device 110 and ISP processor 140 based on the received statistical data. For example, the control parameters of the imaging device 110 may include attitude sensor 120 control parameters (e.g., gain, integration time of exposure control, anti-shake parameters, etc.), camera flash control parameters, camera anti-shake displacement parameters, lens 112 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 112 shading correction parameters.
In one embodiment, the image sensor 114 in the imaging device (camera) may include a plurality of pixel point groups arranged in an array, each pixel point group including M × N pixel points arranged in an array, and each pixel point corresponding to a photosensitive unit, where M and N are both natural numbers greater than or equal to 2.
A first image is acquired by the lens 112 and the image sensor 114 in the imaging device (camera) 110 and sent to the ISP processor 140. After receiving the first image, the ISP processor 140 may perform subject detection on the first image to obtain the region of interest in the first image; alternatively, the region of interest may be obtained as a region selected by the user, or in other manners, which is not limited here.
The ISP processor 140 is configured to obtain a first phase difference and a second phase difference of a pixel point, where a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle; acquiring the positions of pixel points in an image; and determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point. The ISP processor 140 may send information regarding the phase, etc. to the control logic 150.
After receiving the information about the target area, the control logic 150 controls the lens 112 in the imaging device (camera) to move so as to focus on the position in the actual scene corresponding to the target area.
Fig. 2 is a flowchart of a phase difference acquisition method in one embodiment. The phase difference acquisition method in this embodiment is described by taking its operation on the electronic device of fig. 1 as an example. As shown in fig. 2, the phase difference acquisition method includes steps 202 to 206.
Step 202, a first phase difference and a second phase difference of a pixel point in an image are obtained, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle.
The phase difference refers to a difference in position of an image formed by imaging light rays incident on the lens from different directions on the image sensor. Each pixel point has a corresponding first phase difference and a corresponding second phase difference.
Specifically, the electronic device obtains a first phase difference and a second phase difference of a pixel point in an image. The first direction corresponding to the first phase difference may be the image vertical direction and the second direction corresponding to the second phase difference the image horizontal direction; alternatively, the first direction may be the image horizontal direction and the second direction the image vertical direction. A preset included angle is formed between the first direction corresponding to the first phase difference and the second direction corresponding to the second phase difference, and the preset included angle may be any angle other than 0 degrees, 180 degrees, and 360 degrees. For example, the first direction corresponding to the first phase difference may be a 45-degree direction and the second direction corresponding to the second phase difference a 135-degree direction, and so on; this is not limited here.
In this embodiment, the electronic device may obtain the first phase difference and the second phase difference of the pixel points in the image, and determine the first phase difference and the second phase difference corresponding to the pixel point group according to the first phase difference and the second phase difference of the pixel points in the image.
Step 204, acquiring the position of the pixel point in the image.
Specifically, the position of the pixel point in the image may be the angle of the pixel point in the image, the angle and direction of the pixel point in the image, or the region of the pixel point in the image, which is not limited here. A pixel is the minimum constituent unit of an image and is generally represented by a number sequence, which may be referred to as the pixel point value of the pixel. The electronic device obtains the position of the pixel point in the image according to the coordinates of the pixel point. The coordinates of the pixel point may include its pixel coordinates and its image coordinates.
In this embodiment, the electronic device may obtain, according to the pixel point coordinate, the angle at which the pixel point is located from the correspondence between the pixel point coordinate and the angle. Or, the electronic device may obtain the region corresponding to the pixel point from the correspondence between the pixel point coordinates and the region according to the pixel point coordinates. Or, the electronic device may calculate the angle of the pixel in the image coordinate system according to the coordinates of the pixel.
Step 206, determining the target phase difference of the pixel point according to the position of the pixel point in the image and the first phase difference and the second phase difference of the pixel point.
In an actual optical system, the result of non-paraxial ray tracing does not match the result of paraxial ray tracing; the deviation from the ideal state of Gaussian optics is called aberration. Imaging aberrations include spherical aberration, coma, curvature of field, astigmatism, chromatic aberration, and distortion. Apart from distortion, the other kinds of aberration reduce imaging quality at the edges. FIG. 3 is a schematic illustration of smearing in one embodiment; the elliptically circled portion of FIG. 3 is the smear produced by the presence of aberration. FIG. 4 is a schematic illustration of an image field range in one embodiment. The meridional curve and the sagittal curve reflect the optical quality, astigmatism, contrast, resolution, and so on of the lens. Meridian plane: the plane formed by the principal ray of an off-axis object point and the principal axis of the optical system is called the meridian plane of the optical system; the image plane where the meridional image points lie is called the meridional image plane. Sagittal plane: the plane that contains the principal ray of the off-axis object point and is perpendicular to the meridian plane is called the sagittal plane of the optical system; the image plane where the sagittal image points lie is called the sagittal image plane. FIG. 5 is a diagram illustrating an exemplary angular division of pixel points. A lens is generally thick in the middle and thin at the edges, i.e., the larger the radius, the thinner the lens, while the thickness is the same everywhere on a circumference of the same radius. Therefore, smearing is not likely to occur on lines extending from the center to the periphery, but is likely to occur on vertical or horizontal lines, as shown in FIG. 3. Accordingly, in FIG. 5, lines near 0 degrees and 180 degrees can be calculated using the vertical phase difference (the up-down phase difference), and lines near 90 degrees and 270 degrees can be calculated using the horizontal phase difference (the left-right phase difference).
Specifically, the electronic device determines the target phase difference of the pixel point according to the position of the pixel point in the image and the first phase difference and the second phase difference of the pixel point. For example, the electronic device may decide whether to select the first phase difference or the second phase difference as the target phase difference according to the angle at which the pixel point is located. Alternatively, the electronic device may determine the angle range of the pixel point according to its angle, determine a first weight of the first phase difference and a second weight corresponding to the second phase difference, and weight the first and second phase differences accordingly to obtain the target phase difference. Alternatively, the electronic device may determine the first weight of the first phase difference and the second weight corresponding to the second phase difference directly according to the target sub-region of the image in which the pixel point is located, and then calculate the target phase difference of the pixel point. Or the electronic device may decide to select the first phase difference or the second phase difference as the target phase difference according to the target sub-region in which the pixel point is located.
In this embodiment, the electronic device may control the lens to move for focusing according to the target phase difference. Because the accuracy of the target phase difference is improved, focusing is carried out only according to the target phase difference during focusing, multiple times of movement are not needed, and the focusing efficiency is improved.
The phase difference acquisition method in this embodiment obtains a first phase difference and a second phase difference of a pixel point in an image, where the first direction corresponding to the first phase difference and the second direction corresponding to the second phase difference form a preset included angle, so that one pixel point corresponds to two phase differences. The position of the pixel point in the image is obtained, and the target phase difference of the pixel point is determined according to the position of the pixel point in the image and the first and second phase differences of the pixel point. The target phase difference can thus be determined according to the position of the pixel point, which improves the accuracy of phase difference acquisition, effectively avoids image blurring caused by aberration, and makes imaging clearer.
In one embodiment, determining the target phase difference of the pixel point according to the position of the pixel point in the image includes: acquiring a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference according to the position of the pixel point in the image; and performing corresponding weighting processing on the first phase difference and the second phase difference according to the first weight and the second weight to obtain a target phase difference of the pixel point.
The position of the pixel point in the image corresponds to a first weight and a second weight. The first weight and the second weight are calibrated in advance and can be obtained directly in actual use. Pixel points at the same position share the same first weight and the same second weight. The first weight corresponds to the first phase difference and the second weight to the second phase difference. For example, suppose the first phase difference is the horizontal phase difference and the second phase difference the vertical phase difference; the first weight corresponding to the horizontal phase difference may then differ from position to position, and so may the second weight corresponding to the vertical phase difference. For example, if position A of pixel point 1 corresponds to first weight a and second weight b, then position A of pixel point 2 also corresponds to first weight a and second weight b. The sum of the first weight and the second weight may be 1.
Specifically, the electronic device obtains the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference according to the position of the pixel point in the image. The electronic device weights the first phase difference by the first weight and the second phase difference by the second weight, and combines the two to obtain the target phase difference of the pixel point. For example, if the first weight is a, the second weight is b, the first phase difference is X, and the second phase difference is Y, the target phase difference is aX + bY. Alternatively, the electronic device may take the weighted average, in which case the target phase difference is (aX + bY)/(a + b).
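As a minimal illustrative sketch (in Python; the function name and the numeric values are assumptions introduced here, not taken from the patent), the weighted-average variant above may be expressed as:

    def weighted_target_pd(pd_first, pd_second, w_first, w_second):
        # Weighted combination of the two phase differences; dividing by the
        # weight sum gives the weighted-average variant described above.
        return (w_first * pd_first + w_second * pd_second) / (w_first + w_second)

    # Illustrative values only: a = 0.3, b = 0.7, X = 1.8, Y = 2.4
    target_pd = weighted_target_pd(1.8, 2.4, 0.3, 0.7)

If the calibrated weights already sum to 1, the division leaves the plain weighted sum aX + bY unchanged, so one function covers both variants.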
In the phase difference acquisition method in this embodiment, the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference are obtained according to the position of the pixel point in the image, and the first and second phase differences are weighted accordingly to obtain the target phase difference of the pixel point. The two phase differences can thus be combined into the target phase difference, which improves the accuracy of phase difference acquisition, avoids unclear images caused by aberration, and improves the definition of captured images.
In one embodiment, obtaining a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference according to a position of the pixel point in the image includes: and acquiring a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference in the target subarea according to the target subarea where the pixel point is located, wherein the target subarea is determined from at least two candidate subareas, and the candidate subareas are obtained by dividing the image according to a first preset angle by taking the center of the image as a vertex according to an image coordinate system.
Wherein each image comprises at least two candidate sub-regions. The candidate sub-regions are obtained by dividing the image according to a first preset angle with the center O of the image as a vertex according to the image coordinate system xOy shown in fig. 8.
Specifically, the electronic device may determine the target sub-region in which the pixel point is located according to its pixel coordinates, and obtain the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference in that target sub-region. Alternatively, the electronic device may obtain the target sub-region according to the image coordinates corresponding to the pixel point. For example, if the electronic device determines from the pixel coordinates that the target sub-region is region A, it obtains the first weight a of the first phase difference and the second weight b of the second phase difference corresponding to region A; if the target sub-region is region B, it obtains the first weight c of the first phase difference and the second weight d of the second phase difference corresponding to region B.
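A possible sketch of this lookup (Python; the table contents, the 120-degree division, and the function name are illustrative assumptions, not values from the patent):

    import math

    # Hypothetical pre-calibrated table mapping a candidate sub-region index
    # to (first weight, second weight); all values are illustrative.
    CALIBRATED_WEIGHTS = {0: (0.3, 0.7), 1: (0.6, 0.4), 2: (0.5, 0.5)}

    def weights_for_pixel(x, y, first_preset_angle=120.0):
        # Map the pixel's image coordinates (origin at the image center) to
        # the index of the candidate sub-region containing it, then look up
        # the calibrated weights for that sub-region.
        angle = math.degrees(math.atan2(y, x)) % 360.0
        region = int(angle // first_preset_angle)
        return CALIBRATED_WEIGHTS[region]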
According to the phase difference acquisition method in this embodiment, the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference in the target sub-region are obtained according to the target sub-region in which the pixel point is located, so the two phase differences can be combined to calculate the target phase difference. This improves the accuracy of phase difference acquisition, avoids image blurring caused by aberration, and improves the definition of captured images.
In one embodiment, fig. 6 is a schematic configuration flow diagram of the first weight and the second weight in one embodiment. As shown in fig. 6, the arrangement of the first weight and the second weight includes:
step 602, an image coordinate system is obtained, wherein the image coordinate system is established by taking the center of the image as an origin.
Specifically, the center of the image may be the intersection of the image diagonals. For example, if an image is 1024 × 720 and the pixel coordinate system is established with the upper left corner of the image as the origin, the coordinates of the center of the image in the pixel coordinate system are (512, 360). The electronic device acquires the image coordinate system established with the center of the image as the origin.
Step 604, dividing the image into at least two candidate sub-regions according to a first preset angle by taking the center of the image as a vertex according to an image coordinate system.
The division according to the first preset angle is not limited to the division shown in fig. 5. The first preset angle may be 1°, 2°, 3°, 5°, 10°, 15°, etc., but is not limited thereto; the more candidate sub-regions there are, the more accurate the finally acquired phase difference. The electronic device may divide the image into at least two candidate sub-regions at intervals of 1°, 2°, 3°, 5°, 10°, 15°, and so on. The image may be an arbitrary image; it may specifically be an image such as the one shown in fig. 3, containing horizontal lines and vertical lines.
Specifically, the electronic device divides the image into different candidate sub-regions according to the first preset angle, with the origin of the image coordinate system, i.e., the center of the image, as the vertex. By default, the positive x-axis direction is taken as 0 degrees; alternatively, the negative x-axis direction, the positive y-axis direction, or any direction at a fixed angle to the positive x-axis may be taken as 0 degrees. For example, as shown in fig. 5, the image region enclosed between 0° and 45° may be one candidate sub-region, the region between 45° and 90° another, the region between 90° and 135° another, the region between 135° and 180° another, and so on up to the region between 315° and 0°. Alternatively, the electronic device may take 0° to 10° as one candidate sub-region, 10° to 25° as another, 25° to 50° as another, and so on, but is not limited thereto.
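As a small sketch of this division step (Python; the function name is an assumption for illustration), the angular bounds of the candidate sub-regions may be enumerated as:

    def candidate_subregion_bounds(first_preset_angle):
        # Enumerate the angular spans of the candidate sub-regions obtained
        # by sweeping `first_preset_angle` degrees at a time around the image
        # center, with the positive x-axis taken as 0 degrees (the default).
        count = int(360.0 // first_preset_angle)
        return [(i * first_preset_angle, (i + 1) * first_preset_angle)
                for i in range(count)]

    # candidate_subregion_bounds(45.0)
    # -> [(0.0, 45.0), (45.0, 90.0), ..., (315.0, 360.0)]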
Step 606, obtaining the definition of the lines in the first direction and the definition of the lines in the second direction in each candidate sub-region.
Specifically, the lines in the first direction refer to lines within a range around the first direction, and likewise the lines in the second direction refer to lines within a range around the second direction. For example, the lines in the first direction may be lines in a horizontal range, i.e., lines whose included angle with the horizontal is within a preset range; they may also be lines in a vertical range, i.e., lines whose included angle with the vertical is within a preset range. The lines in the first direction may be perpendicular to the lines in the second direction. The lines in the first direction correspond to the first direction in this application, and the lines in the second direction to the second direction. That is, when the lines in the first direction are horizontal lines, the lines in the second direction are vertical lines; when the lines in the first direction are vertical lines, the lines in the second direction are horizontal lines. The electronic device acquires the definition of the lines in the first direction and the definition of the lines in the second direction in each of the at least two candidate sub-regions.
Step 608, when there is a first candidate sub-region having a definition of a line in the first direction greater than a definition of a line in the second direction, configuring a second weight corresponding to the first candidate sub-region to be greater than a first weight corresponding to the first candidate sub-region.
Fig. 7 is a schematic structural diagram of a pixel point in one embodiment. As shown in fig. 7, 702 may be a pixel point or a pixel point group. Taking the example in which each pixel point includes sub-pixel point 1, sub-pixel point 2, sub-pixel point 3, and sub-pixel point 4: sub-pixel points 1 and 2 can be combined and sub-pixel points 3 and 4 combined to form a PD pixel pair in the up-down direction, which yields a phase difference in the vertical direction and can detect horizontal edges; sub-pixel points 1 and 3 can be combined and sub-pixel points 2 and 4 combined to form a PD pixel pair in the left-right direction, which yields a phase difference in the horizontal direction and can detect vertical edges.
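A brief sketch of this sub-pixel combination (Python; the layout placing sub-pixel points 1 and 2 on the top row and 3 and 4 on the bottom row follows the description above but is otherwise an assumption):

    def pd_pixel_pairs(sub1, sub2, sub3, sub4):
        # Assumed layout: sub-pixel points 1 and 2 form the top row,
        # 3 and 4 the bottom row, as in the description of FIG. 7.
        up, down = sub1 + sub2, sub3 + sub4      # up-down pair: vertical
                                                 # PD, detects horizontal edges
        left, right = sub1 + sub3, sub2 + sub4   # left-right pair: horizontal
                                                 # PD, detects vertical edges
        return (up, down), (left, right)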
Specifically, take the lines in the first direction as horizontal lines and the lines in the second direction as vertical lines. When there is a first candidate sub-region in which the definition of horizontal lines is greater than that of vertical lines, the horizontal lines in that sub-region are clearer and less prone to aberration. Since horizontal lines are detected by the up-down PD pixel pairs, i.e., by the vertical phase difference, the second weight corresponding to the vertical phase difference in the first candidate sub-region is configured to be greater than the first weight corresponding to the horizontal phase difference. Because the weight of the phase difference associated with the sharper lines is increased, images captured according to that phase difference are clearer.
Step 610, when there is a second candidate sub-region in which the definition of the line in the first direction is smaller than the definition of the line in the second direction, configuring the first weight corresponding to the second candidate sub-region to be larger than the second weight corresponding to the second candidate sub-region.
Specifically, again take the lines in the first direction as horizontal lines and the lines in the second direction as vertical lines. When there is a second candidate sub-region in which the definition of horizontal lines is smaller than that of vertical lines, the vertical lines in that sub-region are clearer and less prone to aberration. Since vertical lines are detected by the left-right PD pixel pairs, i.e., by the horizontal phase difference, the first weight corresponding to the horizontal phase difference in the second candidate sub-region is configured to be greater than the second weight corresponding to the vertical phase difference. Because the weight of the phase difference associated with the sharper lines is increased, images captured according to that phase difference are clearer.
In one embodiment, when there is a third candidate sub-region having a sharpness of the line in the first direction equal to the sharpness of the line in the second direction, the first weight and the second weight corresponding to the third candidate sub-region may be the same.
According to the phase difference acquisition method in this embodiment, the definition of lines in the first direction and the definition of lines in the second direction are acquired in each candidate sub-region. When there is a first candidate sub-region in which the definition of lines in the first direction is greater than that of lines in the second direction, the second weight corresponding to the first candidate sub-region is configured to be greater than the first weight; when there is a second candidate sub-region in which the definition of lines in the first direction is smaller than that of lines in the second direction, the first weight corresponding to the second candidate sub-region is configured to be greater than the second weight. The weights of different sub-regions in the image can thus be calibrated according to the definition of lines in different directions, with the larger weight given to the phase difference associated with the sharper lines, which improves the accuracy of phase difference acquisition and therefore the definition of captured images.
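One possible calibration rule consistent with steps 608 and 610 (a sketch under the assumption that each weight is made proportional to the sharpness of the opposite line direction; the patent only fixes the ordering of the weights, not a formula):

    def calibrate_weights(sharpness_first, sharpness_second, eps=1e-6):
        # Lines in the first direction are detected by the *second* phase
        # difference (steps 608/610), so the sharper the first-direction
        # lines, the larger the second weight, and vice versa. Equal
        # sharpness yields equal weights, matching the third-candidate case.
        total = sharpness_first + sharpness_second + eps
        second_weight = sharpness_first / total
        first_weight = sharpness_second / total
        return first_weight, second_weight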
In one embodiment, determining the target phase difference of the pixel point according to the position of the pixel point in the image includes: when the position of the pixel point in the image is in a range corresponding to a first direction, taking a second phase difference as a target phase difference, wherein the range corresponding to the first direction is an image range which is formed by a line passing through an origin and is surrounded by a second preset angle with a first direction coordinate axis in an image coordinate system, and the image coordinate system is established by taking the center of the image as the origin; and when the position of the pixel point in the image is in a range corresponding to a second direction, taking the first phase difference as a target phase difference, wherein the range corresponding to the second direction is an image range which is defined by a line passing through the origin and forming a third preset angle with a second direction coordinate axis in the image coordinate system.
When the first direction is the horizontal direction, the second direction is the vertical direction; when the first direction is the vertical direction, the second direction is the horizontal direction. The second preset angle and the third preset angle may be the same or different. Taking the first direction as the vertical direction for example, the range corresponding to the first direction may be the image range enclosed by lines passing through the origin at the second preset angle to the y-axis of the image coordinate system; the second direction is then the horizontal direction, and the range corresponding to the second direction may be the image range enclosed by lines passing through the origin at the third preset angle to the x-axis of the image coordinate system. A line passing through the origin may be a straight line, a line of a preset shape, and so on, but is not limited thereto.
Specifically, as shown in fig. 7, the PD pixel pairs in the up-down direction yield the vertical phase difference and can detect horizontal edges, while the PD pixel pairs in the left-right direction yield the horizontal phase difference and can detect vertical edges. Accordingly, when the electronic device detects that the position of the pixel point in the image is within the range corresponding to the horizontal direction, it takes the vertical phase difference as the target phase difference; when it detects that the position is within the range corresponding to the vertical direction, it takes the horizontal phase difference as the target phase difference. For example, suppose the first direction is the vertical direction, the second preset angle is 45 degrees, the second direction is the horizontal direction, and the third preset angle is 45 degrees. The lines through the origin at 45 degrees to the vertical (the y-axis) are the lines at 45 and 225 degrees and the lines at 135 and 315 degrees, so two ranges corresponding to the first direction are obtained: the first consists of the range enclosed between the 45-degree line and the y-axis together with the range enclosed between the 225-degree line and the y-axis; the second consists of the range enclosed between the 135-degree line and the y-axis together with the range enclosed between the 315-degree line and the y-axis. In summary, the electronic device uses the horizontal phase difference at 45° to 135° and 225° to 315° to calculate the phase difference of vertical lines (the 90° direction), and uses the vertical phase difference at 135° to 225° and 315° to 45° to calculate the phase difference of horizontal lines (the 0° direction).
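A minimal sketch of this range-based selection with the 45-degree split from the example above (Python; function and parameter names are illustrative assumptions):

    import math

    def select_target_pd(x, y, horizontal_pd, vertical_pd):
        # Angles within 45 degrees of the y-axis (45-135 and 225-315
        # degrees) lie in the range corresponding to the vertical direction,
        # so the horizontal (left-right) phase difference is used there;
        # elsewhere the vertical (up-down) phase difference is used.
        angle = math.degrees(math.atan2(y, x)) % 360.0
        if 45.0 <= angle < 135.0 or 225.0 <= angle < 315.0:
            return horizontal_pd
        return vertical_pd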
In the phase difference obtaining method in this embodiment, when the position of the pixel point in the image is within the range corresponding to the first direction, the second phase difference is used as the target phase difference, and when the position of the pixel point in the image is within the range corresponding to the second direction, the first phase difference is used as the target phase difference, so that the phase difference in the corresponding direction can be directly selected as the target phase difference according to the position of the pixel point in the image, and the phase difference obtaining accuracy is improved.
In one embodiment, the phase difference acquisition method further includes: and when the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference are not obtained according to the position of the pixel point in the image, executing a step of taking the second phase difference as a target phase difference when the position of the pixel point in the image is in the range corresponding to the first direction.
Specifically, when a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference cannot be obtained according to the position of the pixel point in the image, whether the pixel point is in the range corresponding to the first direction or the range corresponding to the second direction is detected. And when the position of the pixel point in the image is in the range corresponding to the first direction, taking the second phase difference as the target phase difference. And when the position of the pixel point in the image is in the range corresponding to the second direction, taking the first phase difference as a target phase difference.
In the phase difference acquisition method in this embodiment, when the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference cannot be obtained according to the position of the pixel point, the second phase difference is taken as the target phase difference when the position of the pixel point in the image is within the range corresponding to the first direction. That is, even if the image capturing apparatus of the electronic device has not been calibrated in advance, the position of the pixel point in the image can be determined directly and the phase difference in the corresponding direction used as the target phase difference, which improves the accuracy of phase difference acquisition.
In one embodiment, the phase difference acquisition method further includes: acquiring the reliability of the first phase difference and the reliability of the second phase difference; obtaining a reliability difference according to the reliability of the first phase difference and the reliability of the second phase difference; and when the credibility difference is within the preset range, executing the step of acquiring the position of the pixel point in the image.
The reliability is used to indicate how reliable a phase difference calculation result is. The reliability gap amount represents the gap between the reliability of the first phase difference and the reliability of the second phase difference. For example, the reliability gap amount may be the difference between the two reliabilities, the absolute value of that difference, or the ratio of the two reliabilities. The reliability gap being within the preset range means the gap is small. The preset range corresponds to the form of the gap amount: when the gap amount is the absolute value of the difference it corresponds to one preset range α, and when it is the ratio of the reliabilities it corresponds to another preset range β, but this is not limited here. The preset range may be from zero to a certain threshold.
Specifically, the electronic device acquires the reliability of the first phase difference and the reliability of the second phase difference. And the electronic equipment calculates the reliability difference according to the reliability of the first phase difference and the reliability of the second phase difference. And when the reliability difference is within a preset range, namely when the reliability difference is smaller, the step of obtaining the position of the pixel point in the image and determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point is executed.
In this embodiment, taking the calculation of the horizontal phase difference as an example, a pair of images is divided into a left image and a right image, each of size M × N. For the left-right phase difference, the whole image is first divided into N lines, and one of them is computed at a time. To avoid mismatching, the pixel itself together with several pixels on its left and right are used to form the brightness feature of one pixel point. For example, if the length is set to 5, the combination is x-2, x-1, x, x+1, x+2; it may also be set to 7, 9, and so on. The longer the pixel combination, the smaller the probability of mismatching, but the larger the amount of calculation. With the x-2 to x+2 pixels taken into account, the range of center pixels for which the phase difference can be calculated is 2 to M-1-2 (the first pixel having index 0). If the length is 7, the calculation range is 3 to M-1-3, and so on; i.e., assuming a length of 2n+1, the center pixel range is n to M-1-n. Since the distance from the lens to the image plane has a maximum range, the phase difference also has a maximum range, so a maximum shift is set when comparing the left and right images. Assuming the phase difference is calculated for position x of the left image and the shift range of the right image is -shift1 to shift2, the brightness values of x-n, x-n+1, … x, x+1 … x+n in the left image are compared in turn with the right-image brightness values at
x-n-shift1, x-n+1-shift1, … x-shift1, x+1-shift1 … x+n-shift1,
x-n-shift1+1, x-n+1-shift1+1, … x-shift1+1, x+1-shift1+1 … x+n-shift1+1,
……
x-n+shift2, x-n+1+shift2, … x+shift2, x+1+shift2 … x+n+shift2.
Taking the horizontal phase difference as an example, to calculate the phase difference at a certain line coordinate x in the image, the brightness values of the 5 pixel points x-2, x-1, x, x+1, x+2 in the left image are taken, and the right image is shifted over a range set to -10 to +10. Namely:
the right-image brightness values Rx-12, Rx-11, Rx-10, Rx-9, Rx-8 are compared for similarity with x-2, x-1, x, x+1, x+2;
the right-image brightness values Rx-11, Rx-10, Rx-9, Rx-8, Rx-7 are compared for similarity with x-2, x-1, x, x+1, x+2;
……
the right-image brightness values Rx-2, Rx-1, Rx, Rx+1, Rx+2 are compared for similarity with x-2, x-1, x, x+1, x+2;
the right-image brightness values Rx-1, Rx, Rx+1, Rx+2, Rx+3 are compared for similarity with x-2, x-1, x, x+1, x+2;
……
the right-image brightness values Rx+7, Rx+8, Rx+9, Rx+10, Rx+11 are compared for similarity with x-2, x-1, x, x+1, x+2;
the right-image brightness values Rx+8, Rx+9, Rx+10, Rx+11, Rx+12 are compared for similarity with x-2, x-1, x, x+1, x+2.
Take the five pixel point values of the right image as Rx-2, Rx-1, Rx, Rx+1, Rx+2 and the five pixel point values of the left image as Lx-2, Lx-1, Lx, Lx+1, Lx+2 for example. The similarity difference value may be |Rx-2-Lx-2| + |Rx-1-Lx-1| + |Rx-Lx| + |Rx+1-Lx+1| + |Rx+2-Lx+2|. The smaller the similarity difference value, the higher the similarity matching degree. The shift value corresponding to the minimum similarity difference can be converted into a phase difference value: for example, if the center pixel of the right image corresponding to the minimum similarity difference is Rx+i, then i may be the phase difference value. The reliability can be calculated from the minimum similarity difference, and the higher the similarity matching degree, the higher the reliability. For the upper and lower images, the brightness values of a line of pixel points in the upper image and of an equal number of pixel points in the lower image can be compared in the same way; the reliability acquisition process for the upper and lower images is similar to that for the left and right images and is not described in detail here.
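A minimal sketch of this sliding comparison (Python with NumPy; the function name, the boundary assertion, and the concrete confidence formula are assumptions introduced for illustration; the patent only states that higher similarity matching degree means higher reliability):

    import numpy as np

    def sad_phase_difference(left_line, right_line, x, n=2, shift1=10, shift2=10):
        # Slide the right-image window from -shift1 to +shift2 against the
        # 2n+1 left-image luminances centred at x, keeping the shift with the
        # smallest sum of absolute differences (the similarity difference
        # value described above). Both lines are equal-length NumPy arrays.
        assert n + shift1 <= x <= len(left_line) - 1 - n - shift2
        window = left_line[x - n:x + n + 1].astype(np.int64)
        best_shift, best_sad = 0, None
        for s in range(-shift1, shift2 + 1):
            candidate = right_line[x - n + s:x + n + 1 + s].astype(np.int64)
            sad = int(np.abs(candidate - window).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_shift = sad, s
        # Assumed confidence formula: smaller minimum SAD -> higher reliability.
        confidence = 1.0 / (1.0 + best_sad)
        return best_shift, confidence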
In the phase difference acquisition method in this embodiment, the reliability of the first phase difference and the reliability of the second phase difference are obtained, and the reliability gap amount is obtained from them. When the reliability gap is within the preset range, the step of obtaining the position of the pixel point in the image is performed. The position of the pixel point is thus used when the reliability gap between the first and second phase differences is small, so that an appropriate phase difference can be obtained according to the position of the pixel point in the image, which improves the accuracy of phase difference acquisition.
In one embodiment, the phase difference acquisition method further includes: when the reliability difference is not within the preset range, determining the maximum reliability value in the reliability of the first phase difference and the reliability of the second phase difference; and taking the phase difference corresponding to the maximum reliability as the target phase difference.
The reliability difference not being within the preset range means that the reliability difference is greater than the difference threshold. For example, if the reliability difference is the absolute value of the difference between the two reliabilities and equals 3, while the difference threshold is 2, the reliability difference is not within the preset range.
Specifically, when the electronic device detects that the reliability gap amount is not within the preset range, the maximum value of the reliability of the first phase difference and the reliability of the second phase difference is determined. The electronic equipment takes the phase difference corresponding to the maximum reliability value as a target phase difference.
In the phase difference obtaining method in this embodiment, when the difference between the degrees of reliability is not within the preset range, the maximum value of the degrees of reliability of the first phase difference and the second phase difference is determined, and the phase difference corresponding to the maximum value of the degrees of reliability is used as the target phase difference, that is, when the difference is large, the phase difference corresponding to the maximum value of the degrees of reliability is used as the target phase difference, so that the target phase difference can be obtained quickly, and the phase difference obtaining efficiency is improved.
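A minimal sketch of this fallback, assuming the reliability difference is an absolute difference and the threshold value is illustrative:

```python
def choose_by_reliability(pd1, rel1, pd2, rel2, gap_threshold=2.0):
    """Fall back to the more reliable phase difference when the gap is large."""
    if abs(rel1 - rel2) <= gap_threshold:   # gap within the preset range:
        return None                         # defer to the position-based path
    return pd1 if rel1 >= rel2 else pd2     # otherwise take the max-reliability PD
```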
In one embodiment, the position of the pixel point in the image includes an angle of the pixel point. Acquiring the position of a pixel point in an image, comprising: acquiring image coordinates of pixel points in an image coordinate system; and calculating according to the image coordinates to obtain the angle of the pixel point.
The pixel points may be pixel points in an image sensor or pixel points in software. And pixel points in the image sensor correspond to pixel points on the image one by one. FIG. 8 is a diagram illustrating an image coordinate system and a pixel coordinate system in one embodiment. Where xOy is the image coordinate system and uOv is the pixel coordinate system. The origin of xOy is at the center of the image and the origin of uOv is in the upper left corner of the image.
Specifically, when the coordinates of the pixel point in the image, that is, the coordinates in the image coordinate system, are obtained, the angle at which the pixel point is located is calculated from the image coordinates. For example, if the image coordinate is (x, y), the angle of the pixel point may be Angle = arctan(y/x). The range of the arctan function is -90 degrees to 90 degrees, so the electronic device may first determine the signs of x and y. When x is positive and y is positive, the pixel point is in the first quadrant, and the value range calculated according to the arctan function is 0 to 90 degrees; in this case Angle = arctan(y/x). When x is negative and y is negative, the pixel point is in the third quadrant, and the value range should be 180 to 270 degrees; since x and y then have the same signs as in the first quadrant, directly calculating arctan(y/x) gives the same result, so when both x and y are negative the Angle may be 180° + arctan(y/x).
When x is negative and y is positive, the pixel point is in the second quadrant, and the value range should be 90 to 180 degrees; the Angle may be 180° + arctan(y/x). When x is positive and y is negative, the pixel point is in the fourth quadrant, and the value range should be 270 to 360 degrees (where 360 degrees may also be called 0 degrees); the Angle may be 360° + arctan(y/x). Other algorithms that can distinguish the angle at which the pixel point is located are also within the scope of this embodiment.
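A minimal Python sketch of the quadrant handling above, mapping arctan onto the 0–360 degree range; the function name and the convention for points on the axes are assumptions:

```python
import math

def pixel_angle(x, y):
    """Map image coordinates (x, y) to an angle in [0, 360) degrees."""
    if x == 0:
        return 90.0 if y > 0 else 270.0 if y < 0 else 0.0  # on the y axis; origin -> 0 by convention
    base = math.degrees(math.atan(y / x))                  # arctan range: -90..90 degrees
    if x > 0 and y >= 0:
        return base                                        # first quadrant: 0..90
    if x < 0:
        return 180.0 + base                                # second/third quadrants: 90..270
    return 360.0 + base                                    # fourth quadrant: 270..360
```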
In this embodiment, the electronic device may obtain, according to the angle at which the pixel point is located, a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference in the angle; and weighting the first phase difference and the second phase difference according to the first weight and the second weight to obtain the target phase difference of the pixel point.
In this embodiment, when the angle at which the pixel point is located is within the range corresponding to the first direction, the second phase difference is taken as the target phase difference; when the angle is within the range corresponding to the second direction, the first phase difference is taken as the target phase difference. For example, if the angle at which the pixel point is located is 75 degrees and the first direction is the vertical direction, the range enclosed by a 45-degree included angle on either side of the vertical direction is the range corresponding to the first direction. Taking fig. 5 as an example, the range corresponding to the first direction is 45 to 135 degrees and 225 to 315 degrees. Since 75 degrees falls within the range corresponding to the first direction, the second phase difference is taken as the target phase difference.
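A minimal sketch of this range-based selection, assuming the first direction is vertical and the 45-degree half-angle of the example:

```python
def target_pd_by_angle(angle, pd_first, pd_second):
    """Pick the target phase difference from the angle of the pixel point."""
    in_first_range = 45 <= angle <= 135 or 225 <= angle <= 315
    return pd_second if in_first_range else pd_first
```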
In the phase difference obtaining method in this embodiment, image coordinates of pixel points in an image coordinate system are obtained; the angle of the pixel point is obtained through calculation according to the image coordinate, the angle of the pixel point can be calculated in real time, the target phase difference is determined according to the angle of the pixel point, the first phase difference and the second phase difference, and the phase difference obtaining precision is improved.
In one embodiment, obtaining image coordinates of a pixel point in an image coordinate system includes: acquiring pixel point coordinates of pixel points in a pixel coordinate system; the pixel coordinates are converted into image coordinates in an image coordinate system.
Since the coordinate origin of the pixel coordinate system is at the upper left corner of the image, all pixels fall within a single 90-degree quadrant. The electronic device therefore acquires the pixel point coordinates of the pixel point in the pixel coordinate system and converts them into image coordinates in the image coordinate system, whose origin is at the image center.
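A minimal sketch of the conversion, assuming an image of w × h pixels; the sign convention for the y axis (image y growing upward while pixel v grows downward) is an assumption:

```python
def pixel_to_image_coords(u, v, w, h):
    """Convert pixel coordinates (u, v) to image coordinates (x, y)."""
    x = u - w / 2.0   # shift the origin from the top-left corner to the image center
    y = h / 2.0 - v   # flip v, which grows downward in the pixel coordinate system
    return x, y
```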
In the phase difference obtaining method in this embodiment, pixel coordinates of a pixel in a pixel coordinate system are obtained; the pixel point coordinates are converted into the image coordinates in the image coordinate system, so that more angle ranges can be obtained, and the phase difference obtaining precision is improved.
Fig. 9 is a schematic diagram showing an internal structure of an image sensor in one embodiment, and an imaging device includes a lens and an image sensor. As shown in fig. 9, the image sensor includes a microlens 90, an optical filter 92, and a photosensitive unit 94. The micro-lens 90, the filter 92 and the photosensitive unit 94 are sequentially located on the incident light path, i.e. the micro-lens 90 is disposed on the filter 92, and the filter 92 is disposed on the photosensitive unit 94.
The optical filter 92 may be of three types, red, green and blue, each of which transmits only light of the corresponding wavelength; one filter 92 is disposed on each pixel point.
The micro-lens 90 is used for receiving incident light and transmitting it to the optical filter 92. The filter 92 filters the incident light by wavelength, and the filtered light is then incident on the photosensitive unit 94 on a per-pixel basis.
The light sensing unit 94 in the image sensor converts light incident from the optical filter 92 into a charge signal by a photoelectric effect and generates a pixel point signal in accordance with the charge signal. The charge signal corresponds to the received light intensity.
Please refer to fig. 10, which illustrates a schematic diagram of an exemplary pixel point group Z. As shown in fig. 10, the pixel point group Z includes 4 pixel points D arranged in an array of two rows and two columns. The color channel of the pixel point in the first row and first column is green, that is, its color filter is a green filter; the color channel of the pixel point in the first row and second column is red, that is, its color filter is a red filter; the color channel of the pixel point in the second row and first column is blue, that is, its color filter is a blue filter; and the color channel of the pixel point in the second row and second column is green, that is, its color filter is a green filter.
In one embodiment, fig. 11 is a schematic flowchart of acquiring the first phase difference and the second phase difference in one embodiment. As shown in fig. 11, acquiring the first phase difference and the second phase difference of the pixel point includes:
step 1102, obtaining a target brightness map according to the brightness values of the pixels included in each pixel group.
The brightness value of the pixel point of the image sensor can be represented by the brightness value of the sub-pixel point included by the pixel point. That is, the electronic device may obtain the target luminance graph according to the luminance values of the sub-pixels in the pixels included in each pixel group. The "brightness value of a subpixel" refers to the brightness value of the optical signal received by the subpixel.
Specifically, the subpixel included in the image sensor is a photosensitive element capable of converting an optical signal into an electrical signal, so that the intensity of the optical signal received by the subpixel can be obtained according to the electrical signal output by the subpixel, and the brightness value of the subpixel can be obtained according to the intensity of the optical signal received by the subpixel.
The target brightness map in the embodiment of the application is used for reflecting the brightness values of the sub pixel points in the image sensor, and the target brightness map can comprise a plurality of pixel points, wherein the pixel point value of each pixel point in the target brightness map is obtained according to the brightness value of the sub pixel point in the image sensor.
And 1104, performing segmentation processing on the target brightness map to obtain a first segmentation brightness map and a second segmentation brightness map.
The electronic device may perform segmentation processing on the target luminance map along the column direction (the y-axis direction in the image coordinate system); during this segmentation, each segmentation line is perpendicular to the column direction.
The electronic device may also perform segmentation processing on the target luminance map along the row direction (the x-axis direction in the image coordinate system); during this segmentation, each segmentation line is perpendicular to the row direction.
The first and second sliced luminance graphs obtained by slicing the target luminance graph in the column direction may be referred to as upper and lower graphs, respectively. The first and second sliced luminance maps obtained by slicing the target luminance map in the row direction may be referred to as a left map and a right map, respectively.
Step 1106, determining the phase difference value of the mutually matched pixels according to the position difference of the mutually matched pixels in the first segmentation luminance graph and the second segmentation luminance graph.
The "mutually matched pixels" means that pixel matrixes formed by the pixels themselves and surrounding pixels are mutually similar. For example, a pixel point a in the first tangential luminance graph and surrounding pixel points form a pixel point matrix with 3 rows and 3 columns, and the pixel point value of the pixel point matrix is as follows:
2 10 90
1 20 80
0 100 1
the pixel b and the surrounding pixels in the second segmentation brightness map also form a pixel matrix with 3 rows and 3 columns, and the pixel value of the pixel matrix is as follows:
1 10 90
1 21 80
0 100 2
as can be seen from the above, the two matrices are similar, and it can be considered that the pixel point a and the pixel point b are matched with each other. As for how to judge whether pixel point matrixes are similar, in practical application, there are many different methods, and one common method is to calculate the difference between pixel point values of pixels corresponding to two pixel point matrixes, add the absolute values of the calculated differences, and judge whether the pixel point matrixes are similar by using the result of the addition, that is, if the result of the addition is smaller than a preset threshold, the pixel point matrixes are considered to be similar, otherwise, the pixel point matrixes are considered to be dissimilar.
For example, for the two 3-row, 3-column pixel matrices above, 2 and 1 may be subtracted, 10 and 10 subtracted, 90 and 90 subtracted, and so on; the absolute values of the differences are then added to obtain a sum of 3. If this sum of 3 is smaller than the preset threshold, the two matrices are considered similar.
Another common method for judging whether pixel matrices are similar is to extract edge features of the matrices, for example by means of a Sobel convolution kernel or a Laplacian operator, and judge whether the matrices are similar through the edge features.
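As a concrete check of the numerical example above, a minimal numpy sketch (the preset threshold value is assumed):

```python
import numpy as np

# The two 3x3 pixel matrices from the text.
a = np.array([[2, 10, 90], [1, 20, 80], [0, 100, 1]])
b = np.array([[1, 10, 90], [1, 21, 80], [0, 100, 2]])

sad = np.abs(a - b).sum()   # sum of absolute differences -> 3, as computed above
similar = sad < 5           # assumed preset threshold
```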
In this embodiment of the present application, the "position difference between the mutually matched pixel points" refers to a difference between the position of the pixel point located in the first sliced luminance graph and the position of the pixel point located in the second sliced luminance graph among the mutually matched pixel points. As in the above example, the position difference between the pixel point a and the pixel point b that are matched with each other refers to the difference between the position of the pixel point a in the first segmentation luminance graph and the position of the pixel point b in the second segmentation luminance graph.
The matched pixel points respectively correspond to different images formed by imaging light rays entering the lens from different directions in the image sensor. For example, a pixel point a in the first split luminance graph and a pixel point B in the second split luminance graph are matched with each other, where the pixel point a may correspond to the image formed at the position a in fig. 1, and the pixel point B may correspond to the image formed at the position B in fig. 1.
Because the matched pixel points respectively correspond to different images formed by imaging light rays which are emitted into the lens from different directions in the image sensor, the phase difference of the matched pixel points can be determined according to the position difference of the matched pixel points.
Step 1108, determining a first phase difference and a second phase difference of the pixel points according to the phase difference values of the pixel points matched with each other.
Specifically, the electronic device determines the vertical phase difference of the pixel points according to the phase difference values of the mutually matched pixel points in the upper and lower maps, and determines the horizontal phase difference according to the phase difference values of the mutually matched pixel points in the left and right maps. Thus, when the first phase difference is a horizontal phase difference, the second phase difference is a vertical phase difference; when the first phase difference is a vertical phase difference, the second phase difference is a horizontal phase difference.
In the phase difference obtaining method in the embodiment of the application, a target luminance map is obtained according to the luminance values of the pixel points included in each pixel point group. After the target luminance map is obtained, it is segmented to obtain a first sliced luminance map and a second sliced luminance map. The phase difference of mutually matched pixel points is then determined according to the position difference of the mutually matched pixel points in the first and second sliced luminance maps, and a target phase difference map is generated from these phase differences. The phase difference can thus be obtained using the luminance values of the pixel points included in each pixel point group in the image sensor. Compared with the manner of obtaining the phase difference by sparsely arranged phase detection pixels, the target phase difference map in the embodiment of the application contains relatively rich phase difference information, so the accuracy of the obtained phase difference can be improved.
In one embodiment, determining the first phase difference and the second phase difference of the pixel points according to the phase difference values of the pixel points matched with each other includes: for each target brightness image, generating a middle phase difference image corresponding to the target brightness image according to the phase difference of the pixel points matched with each other; and determining a first phase difference and a second phase difference of the pixel points according to the intermediate phase difference graph corresponding to each target brightness graph.
Specifically, for each target luminance graph, the electronic device may obtain an intermediate phase difference graph according to a phase difference between pixel points that are matched with each other in the first and second split luminance graphs corresponding to the target luminance graph. Then, the electronic device may obtain the target phase difference map according to the intermediate phase difference map corresponding to each target luminance map. Therefore, the accuracy of acquiring the target phase difference diagram is high.
In one embodiment, determining the first phase difference and the second phase difference of the pixel points according to the intermediate phase difference map corresponding to each target luminance map includes:
and (a1) determining pixel points at the same position from each intermediate phase difference image to obtain a plurality of phase difference pixel point sets, wherein the positions of the pixel points included in each phase difference pixel point set in the intermediate phase difference image are the same.
As shown in fig. 12, fig. 12 is a schematic diagram of determining pixel points at the same position from each intermediate phase difference map in one embodiment. The electronic device determines pixel points at the same position from intermediate phase difference map 1, intermediate phase difference map 2, intermediate phase difference map 3 and intermediate phase difference map 4 respectively, and can obtain 4 phase difference pixel point sets Y1, Y2, Y3 and Y4. The set Y1 includes pixel point PD_Gr_1 in map 1, PD_R_1 in map 2, PD_B_1 in map 3 and PD_Gb_1 in map 4; the set Y2 includes PD_Gr_2 in map 1, PD_R_2 in map 2, PD_B_2 in map 3 and PD_Gb_2 in map 4; the set Y3 includes PD_Gr_3 in map 1, PD_R_3 in map 2, PD_B_3 in map 3 and PD_Gb_3 in map 4; and the set Y4 includes PD_Gr_4 in map 1, PD_R_4 in map 2, PD_B_4 in map 3 and PD_Gb_4 in map 4.
And (a2) for each phase difference pixel point set, splicing the pixel points in the phase difference pixel point set to obtain a sub-phase difference graph corresponding to the phase difference pixel point set.
The sub-phase difference graph comprises a plurality of pixel points, each pixel point corresponds to one pixel point in the phase difference pixel point set, and the pixel point value of each pixel point is equal to the pixel point value of the corresponding pixel point.
The electronic device then splices the obtained multiple sub-phase difference maps to obtain a target phase difference map.
As shown in fig. 13, fig. 13 is a schematic diagram of a target phase difference diagram in an embodiment, where the target phase difference diagram includes a sub-phase difference diagram 1, a sub-phase difference diagram 2, a sub-phase difference diagram 3, and a sub-phase difference diagram 4, where the sub-phase difference diagram 1 corresponds to a phase difference pixel set Y1, the sub-phase difference diagram 2 corresponds to a phase difference pixel set Y2, the sub-phase difference diagram 3 corresponds to a phase difference pixel set Y3, and the sub-phase difference diagram 4 corresponds to a phase difference pixel set Y4.
And (a3) splicing the obtained multiple sub-phase difference graphs, and determining a first phase difference and a second phase difference of the pixel points.
Specifically, the electronic device splices the obtained multiple sub-phase difference maps to obtain a target phase difference map. And the electronic equipment determines the first phase difference and the second phase difference of the pixel point according to the target phase difference diagram.
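A minimal numpy sketch of this splicing, assuming each intermediate phase difference map is an h × w array and the 2 × 2 layout of the pixel point group in fig. 10 (Gr, R / B, Gb); the function and array names are illustrative:

```python
import numpy as np

def splice_target_pd_map(pd_gr, pd_r, pd_b, pd_gb):
    """Interleave four intermediate phase difference maps into the target map."""
    h, w = pd_gr.shape
    target = np.empty((2 * h, 2 * w), dtype=pd_gr.dtype)
    target[0::2, 0::2] = pd_gr   # each same-position set Yi becomes one
    target[0::2, 1::2] = pd_r    # 2x2 sub-phase-difference block in the
    target[1::2, 0::2] = pd_b    # target phase difference map
    target[1::2, 1::2] = pd_gb
    return target
```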
In one embodiment, the segmenting the target luminance graph to obtain a first segmented luminance graph and a second segmented luminance graph includes:
and (b1) performing segmentation processing on the target brightness map to obtain a plurality of brightness map regions, wherein each brightness map region comprises one row of pixel points in the target brightness map, or each brightness map region comprises one column of pixel points in the target brightness map.
The electronic device can divide the target brightness image column by column along the row direction to obtain a plurality of pixel point columns of the target brightness image, wherein the pixel point columns are the brightness image areas.
The electronic device can divide the target luminance graph line by line along the column direction to obtain a plurality of pixel point lines of the target luminance graph, and the pixel point lines are the luminance graph area.
And (b2) acquiring a plurality of first brightness map areas and a plurality of second brightness map areas from the plurality of brightness map areas, wherein the first brightness map areas comprise pixel points in even rows in the target brightness map, or the first brightness map areas comprise pixel points in even columns in the target brightness map, and the second brightness map areas comprise pixel points in odd rows in the target brightness map, or the second brightness map areas comprise pixel points in odd columns in the target brightness map.
Specifically, in the case of column-by-column slicing of the target luminance map, the electronic device may determine odd-numbered columns as the first luminance map region and even-numbered columns as the second luminance map region.
In the case of line-by-line segmentation of the target luminance map, the electronic device may determine odd lines as the first luminance map region and even lines as the second luminance map region.
And (b3) forming a first segmentation luminance map by using the plurality of first luminance map regions and forming a second segmentation luminance map by using the plurality of second luminance map regions.
In this embodiment, fig. 14 is a schematic diagram illustrating slicing of the target luminance map in the first direction, and fig. 15 is a schematic diagram illustrating slicing in the second direction. As shown in fig. 14, assuming that the target luminance map includes 6 rows and 6 columns of pixel points, slicing the target luminance map column by column corresponds to slicing it in the first direction. The electronic device may determine the 1st, 3rd and 5th columns of pixel points of the target luminance map as first luminance map regions, and the 2nd, 4th and 6th columns as second luminance map regions. Then, the electronic device may splice the first luminance map regions to obtain a first sliced luminance map T1, which includes the 1st, 3rd and 5th columns of pixel points of the target luminance map, and splice the second luminance map regions to obtain a second sliced luminance map T2, which includes the 2nd, 4th and 6th columns.
As shown in fig. 15, assuming that the target luminance map includes 6 rows and 6 columns of pixel points, slicing the target luminance map row by row corresponds to slicing it in the second direction. The electronic device may determine the 1st, 3rd and 5th rows of pixel points as first luminance map regions and the 2nd, 4th and 6th rows as second luminance map regions. The electronic device may then splice the first luminance map regions to obtain a first sliced luminance map T3, which includes the 1st, 3rd and 5th rows of pixel points of the target luminance map, and splice the second luminance map regions to obtain a second sliced luminance map T4, which includes the 2nd, 4th and 6th rows.
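A minimal numpy sketch of both slicing directions; odd and even follow the 1-based numbering of the text, so the 1st, 3rd, 5th columns are indices 0, 2, 4 here:

```python
import numpy as np

def slice_luminance_map(luma, column_by_column=True):
    """Split a 2-D luminance map into the first and second sliced maps."""
    if column_by_column:                     # first direction: left/right maps (fig. 14)
        return luma[:, 0::2], luma[:, 1::2]  # T1, T2
    return luma[0::2, :], luma[1::2, :]      # second direction: upper/lower maps (fig. 15)
```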
In one embodiment, fig. 16 is a schematic flow chart illustrating a process of determining phase difference values of matched pixels according to one embodiment. As shown in fig. 16, determining the phase difference value of the mutually matched pixels according to the position difference between the mutually matched pixels in the first and second sliced luminance graphs includes:
step 1602, when the luminance map region includes a row of pixel points in the target luminance map, a first neighboring pixel point set is determined in each row of pixel points included in the first sliced luminance map, where the pixel points included in the first neighboring pixel point set correspond to the same pixel point group.
Specifically, when the luminance map region includes a row of pixel points in the target luminance map, that is, when the electronic device segments the target luminance map row by row along the column direction, the two pixel points of the first row of a sub-luminance map are located in the same pixel point row of the target luminance map after slicing, and therefore fall in the same luminance map region and in the same sliced luminance map. Similarly, the two pixel points of the second row of the sub-luminance map are located in the same luminance map region and in the other sliced luminance map. Assuming that the first row of the sub-luminance map is located in an even pixel point row of the target luminance map, the two pixel points of the first row of the sub-luminance map are located in the first sliced luminance map, and the two pixel points of the second row are located in the second sliced luminance map.
The column wise manner is similar to the row wise manner and will not be described further herein.
Step 1604, for each first neighboring pixel point set, searching the second segmentation luminance graph for a first matching pixel point set corresponding to the first neighboring pixel point set.
Specifically, for each first neighboring pixel point set, the electronic device may obtain a plurality of pixel points around the set in the first sliced luminance map, and form a search pixel matrix from the set and these surrounding pixel points. For example, the search pixel matrix may include 9 pixel points in 3 rows and 3 columns. The electronic device may then search the second sliced luminance map for a pixel matrix similar to the search pixel matrix.
After searching for a pixel matrix similar to the searched pixel matrix in the second sliced luminance graph, the electronic device can extract a first set of matched pixels from the searched pixel matrix.
The pixels in the first matched pixel set and the pixels in the first adjacent pixel set obtained by searching respectively correspond to different images formed in the image sensor by imaging light rays entering the lens from different directions.
Step 1606, according to the position difference between each first neighboring pixel point set and each first matching pixel point set, determining the phase difference between the first neighboring pixel point set and the first matching pixel point set corresponding to each other.
The difference in position of the first set of neighboring pixels from the first set of matched pixels refers to: the difference in the position of the first set of neighboring pixels in the first sliced luminance map and the position of the first set of matching pixels in the second sliced luminance map.
Where the obtained first and second sliced luminance maps are referred to as the upper map and the lower map respectively, the phase difference obtained from the upper and lower maps can reflect the imaging position difference of an object in the vertical direction.
In one embodiment, a phase difference acquisition method includes:
and (c1) acquiring an image coordinate system, wherein the image coordinate system is established by taking the center of the image as an origin.
And (c2) dividing the image, according to the image coordinate system, into at least two candidate sub-regions by a first preset angle, with the center of the image as the vertex.
And (c3) acquiring the definition of the lines in the first direction and the definition of the lines in the second direction in each candidate subarea.
And (c4), when there is a first candidate subregion with the definition of the line in the first direction being greater than the definition of the line in the second direction, configuring the second weight corresponding to the first candidate subregion to be greater than the first weight corresponding to the first candidate subregion.
And (c5) when a second candidate subregion, of which the definition of the line in the first direction is smaller than that of the line in the second direction, exists, configuring a first weight corresponding to the second candidate subregion to be larger than a second weight corresponding to the second candidate subregion.
And (c6), when a third candidate sub-region exists, wherein the definition of the line in the first direction is equal to the definition of the line in the second direction, configuring the first weight and the second weight corresponding to the third candidate sub-region to be equal.
And (c7) acquiring a first phase difference and a second phase difference of the pixel points, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle.
And (c8) obtaining the reliability of the first phase difference and the reliability of the second phase difference.
And (c9) obtaining a reliability difference according to the reliability of the first phase difference and the reliability of the second phase difference.
And (c10) when the credibility difference is within a preset range, acquiring the positions of the pixel points in the image, wherein the positions of the pixel points in the image comprise the target sub-regions of the pixel points in the image.
And (c11) acquiring a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference in the target subarea according to the target subarea where the pixel points are located.
And (c12) performing corresponding weighting processing on the first phase difference and the second phase difference according to the first weight and the second weight to obtain a target phase difference of the pixel point.
And (c13) when the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference are not obtained according to the target sub-region where the pixel is located, that is, when the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference cannot be obtained according to the steps (c11) and (c12), executing a step of taking the second phase difference as the target phase difference when the position of the pixel in the image is within a range corresponding to a first direction, wherein the range corresponding to the first direction is an image range which is defined by a line passing through an origin and is surrounded by a first direction coordinate axis in an image coordinate system by a second preset angle.
And (c14) when the positions of the pixels in the image are within a range corresponding to a second direction, taking the first phase difference as a target phase difference, wherein the range corresponding to the second direction is an image range which is defined by a line passing through the origin and forming a third preset angle with a second direction coordinate axis in the image coordinate system.
And (c15) when the confidence level difference quantity is not in the preset range, determining the maximum confidence level value of the confidence level of the first phase difference and the confidence level of the second phase difference.
And (c16) setting the phase difference corresponding to the maximum reliability as the target phase difference.
Although steps (c1) to (c16) are numbered sequentially, these steps are not necessarily performed in the order indicated by the numerals. Unless explicitly stated herein, the steps are not strictly limited to being performed in the exact order illustrated and may be performed in other orders.
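Putting steps (c7) to (c16) together, the following is a minimal Python sketch of the decision flow; the normalized weighted average, the gap threshold, the 45-degree ranges, and all names are illustrative assumptions rather than the disclosed implementation:

```python
def target_phase_difference(pd1, rel1, pd2, rel2, angle, region_weights,
                            region, gap_threshold=2.0):
    """pd1/rel1: first phase difference and its reliability; pd2/rel2: the
    second one; angle: pixel angle in degrees; region_weights: calibrated
    {region: (w1, w2)} table, possibly empty."""
    if abs(rel1 - rel2) > gap_threshold:           # (c15)/(c16): large reliability gap,
        return pd1 if rel1 >= rel2 else pd2        # take the more reliable one
    weights = region_weights.get(region)           # (c10)/(c11): per-sub-region calibration
    if weights is not None:
        w1, w2 = weights
        return (w1 * pd1 + w2 * pd2) / (w1 + w2)   # (c12): assumed normalized weighting
    if 45 <= angle <= 135 or 225 <= angle <= 315:  # (c13): first-direction range
        return pd2
    return pd1                                     # (c14): second-direction range
```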
According to the phase difference obtaining method in the embodiment of the application, a first phase difference and a second phase difference of a pixel point are obtained, where the first direction corresponding to the first phase difference and the second direction corresponding to the second phase difference form a preset included angle, so that one pixel point corresponds to two phase differences. The position of the pixel point in the image is obtained, and the target phase difference of the pixel point is determined according to the position of the pixel point in the image and the first and second phase differences of the pixel point, for example by weighting the first phase difference and the second phase difference according to the first weight and the second weight. The first phase difference and the second phase difference can thus be combined to obtain the target phase difference, the accuracy of phase difference acquisition is improved, image blurring caused by aberration is effectively avoided, and imaging is clearer.
It should be understood that although the various steps in the flowcharts of figs. 2, 6, 11 and 16 are shown in order as indicated by the arrows, the steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not strictly limited in order and may be performed in other orders. Moreover, at least some of the steps in figs. 2, 6, 11 and 16 may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times; the order of performing these sub-steps or stages is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Fig. 17 is a block diagram showing a configuration of a phase difference acquisition device according to an embodiment. As shown in fig. 17, a phase difference obtaining apparatus includes a first obtaining module 1702, a second obtaining module 1704 and a target phase difference determining module 1706, wherein:
a first obtaining module 1702, configured to obtain a first phase difference and a second phase difference of a pixel, where a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
a second obtaining module 1704, configured to obtain a position of the pixel point in the image;
the target phase difference determining module 1706 is configured to determine a target phase difference of the pixel according to a position of the pixel in the image.
The phase difference obtaining device in this embodiment obtains a first phase difference and a second phase difference of a pixel point, where the first direction corresponding to the first phase difference and the second direction corresponding to the second phase difference form a preset included angle, so that one pixel point corresponds to two phase differences. The position of the pixel point in the image is obtained, and the target phase difference of the pixel point is determined according to the position of the pixel point in the image and the first and second phase differences. The target phase difference can thus be determined according to the position of the pixel point, the accuracy of phase difference acquisition is improved, image blurring caused by aberration is effectively avoided, and imaging is clearer.
In an embodiment, the target phase difference determining module 1706 is configured to obtain a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference according to a position of the pixel point in the image; and performing corresponding weighting processing on the first phase difference and the second phase difference according to the first weight and the second weight to obtain a target phase difference of the pixel point.
The phase difference obtaining device in this embodiment obtains a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference according to the position of the pixel point in the image; the first phase difference and the second phase difference are subjected to corresponding weighting processing according to the first weight and the second weight to obtain the target phase difference of the pixel points, the first phase difference and the second phase difference can be combined together to obtain the target phase difference, the accuracy of phase difference obtaining is improved, the phenomenon that images are not clear due to aberration is avoided, and the definition of shot images is improved.
In an embodiment, the target phase difference determining module 1706 is configured to obtain, according to a target sub-region where the pixel is located, a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference in the target sub-region, where the target sub-region is determined from at least two candidate sub-regions, and the candidate sub-regions are obtained by dividing the image according to the image coordinate system and by using a center of the image as a vertex and according to a first preset angle.
According to the phase difference obtaining device in the embodiment of the application, the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference in the target sub-region are obtained according to the target sub-region where the pixel point is located, so that the first phase difference and the second phase difference can be combined to obtain the target phase difference, the accuracy of phase difference acquisition is improved, image blurring caused by aberration is avoided, and the sharpness of the captured image is improved.
In one embodiment, the phase difference obtaining apparatus further includes a weight obtaining module. The weight acquisition module is used for acquiring an image coordinate system, wherein the image coordinate system is established by taking the center of an image as an origin; dividing the image into at least two candidate sub-regions according to an image coordinate system and a first preset angle by taking the center of the image as a vertex; obtaining the definition of lines in the first direction and the definition of lines in the second direction in each candidate sub-region; when a first candidate sub-region exists, wherein the definition of the line in the first direction is greater than that of the line in the second direction, a second weight corresponding to the first candidate sub-region is configured to be greater than a first weight corresponding to the first candidate sub-region; when a second candidate subarea with the definition of the line in the first direction smaller than or equal to the definition of the line in the second direction exists, the first weight corresponding to the second candidate subarea is configured to be larger than the second weight corresponding to the second candidate subarea.
The phase difference acquisition device in the embodiment of the application acquires the sharpness of lines in the first direction and the sharpness of lines in the second direction in each candidate sub-region. When there is a first candidate sub-region in which the sharpness of lines in the first direction is greater than that in the second direction, the second weight corresponding to the first candidate sub-region is configured to be greater than the first weight corresponding to it; when there is a second candidate sub-region in which the sharpness of lines in the first direction is less than or equal to that in the second direction, the first weight corresponding to the second candidate sub-region is configured to be greater than the second weight corresponding to it. The weights of different sub-regions can thus be calibrated according to the sharpness of lines in different directions, with a larger weight corresponding to the phase difference of higher sharpness, which improves the accuracy of phase difference acquisition and thereby the sharpness of the captured image.
In one embodiment, the target phase difference determining module 1706 is configured to, when the position of the pixel point in the image is within a range corresponding to a first direction, take the second phase difference as the target phase difference, where the range corresponding to the first direction is an image range that is surrounded by a line passing through an origin and a second preset angle with respect to a first direction coordinate axis in an image coordinate system, where the image coordinate system is established with a center of the image as the origin; and when the position of the pixel point in the image is in a range corresponding to a second direction, taking the first phase difference as a target phase difference, wherein the range corresponding to the second direction is an image range which is defined by a line passing through the origin and forming a third preset angle with a second direction coordinate axis in the image coordinate system.
In the phase difference acquiring device in this embodiment, when the position of the pixel point in the image is within the range corresponding to the first direction, the second phase difference is taken as the target phase difference, and when the position of the pixel point in the image is within the range corresponding to the second direction, the first phase difference is taken as the target phase difference, so that the phase difference in the corresponding direction can be directly selected as the target phase difference according to the position of the pixel point in the image, and the phase difference acquiring accuracy is improved.
In an embodiment, the target phase difference determining module 1706 is configured to, when a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference are not obtained according to an area where the pixel is located, and when a position of the pixel in the image is within a range corresponding to the first direction, take the second phase difference as the target phase difference.
In the phase difference obtaining device in this embodiment, when the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference are not obtained according to the area where the pixel is located, a step of taking the second phase difference as the target phase difference when the location of the pixel in the image is within the range corresponding to the first direction is executed, that is, if the image capturing device of the electronic device is not calibrated in advance, the location of the pixel in the image can be directly determined, and the phase difference in the corresponding direction is taken as the target phase difference, so that the phase difference obtaining accuracy is improved.
In one embodiment, the second obtaining module 1704 is configured to obtain a confidence level of the first phase difference and a confidence level of the second phase difference; obtaining a reliability difference according to the reliability of the first phase difference and the reliability of the second phase difference; and when the credibility difference is within a preset range, acquiring the position of the pixel point in the image.
The phase difference acquisition device in the embodiment acquires the reliability of the first phase difference and the reliability of the second phase difference, and obtains the reliability difference amount according to the reliability of the first phase difference and the reliability of the second phase difference; when the credibility difference is within the preset range, the step of obtaining the position of the pixel point in the image is executed, when the credibility difference between the first phase difference and the second phase difference is small, the position of the pixel point can be obtained, the proper phase difference can be obtained according to the position of the pixel point in the image, and the phase difference obtaining precision is improved.
In one embodiment, the target phase difference determining module 1706 is configured to determine a maximum value of the confidence level of the first phase difference and the confidence level of the second phase difference when the confidence level gap amount is not within the preset range; and taking the phase difference corresponding to the maximum reliability as the target phase difference.
In the phase difference acquisition device in this embodiment, when the difference in the confidence levels is not within the preset range, the maximum confidence level value of the confidence levels of the first phase difference and the second phase difference is determined, and the phase difference corresponding to the maximum confidence level value is used as the target phase difference, that is, when the difference is large, the phase difference corresponding to the maximum confidence level value is used as the target phase difference, so that the target phase difference can be obtained quickly, and the phase difference acquisition efficiency is improved.
In one embodiment, the second obtaining module 1704 is configured to obtain image coordinates of the pixel point in an image coordinate system; and calculating according to the image coordinates to obtain the angle of the pixel point.
The phase difference obtaining device in this embodiment obtains image coordinates of pixel points in an image coordinate system; the angle of the pixel point is obtained through calculation according to the image coordinate, the angle of the pixel point can be calculated in real time, the target phase difference is determined according to the angle of the pixel point, the first phase difference and the second phase difference, and the phase difference obtaining precision is improved.
In one embodiment, the second obtaining module 1704 is configured to: acquiring pixel point coordinates of pixel points in a pixel coordinate system; the pixel coordinates are converted into image coordinates in an image coordinate system.
The phase difference obtaining device in this embodiment obtains pixel coordinates of a pixel in a pixel coordinate system; the pixel point coordinates are converted into the image coordinates in the image coordinate system, so that more angle ranges can be obtained, and the phase difference obtaining precision is improved.
In an embodiment, the first obtaining module 1702 is configured to obtain a target luminance map according to luminance values of pixels included in each pixel group; performing segmentation processing on the target brightness image to obtain a first segmentation brightness image and a second segmentation brightness image; determining the phase difference value of the pixel points matched with each other according to the position difference of the pixel points matched with each other in the first segmentation brightness graph and the second segmentation brightness graph; and determining a first phase difference and a second phase difference of the pixel points according to the phase difference values of the pixel points which are matched with each other.
The phase difference obtaining device in the embodiment of the present application obtains the target luminance map according to the luminance values of the pixel points included in each pixel point group. After the target luminance map is obtained, it is segmented, and a first sliced luminance map and a second sliced luminance map are obtained from the segmentation result. The phase difference of mutually matched pixel points is then determined according to the position difference of the mutually matched pixel points in the first and second sliced luminance maps, and a target phase difference map is generated from these phase differences. The phase difference can thus be obtained using the luminance values of the pixel points included in each pixel point group in the image sensor. Compared with the manner of obtaining the phase difference by sparsely arranged phase detection pixels, the target phase difference map in the embodiment of the present application contains relatively rich phase difference information, so the accuracy of the obtained phase difference can be improved.
In one embodiment, the first obtaining module 1702 is configured to, for each target luminance graph, generate an intermediate phase difference graph corresponding to the target luminance graph according to phase differences of the pixel points that are matched with each other; and determining a first phase difference and a second phase difference of the pixel points according to the intermediate phase difference graph corresponding to each target brightness graph.
In an embodiment, the first obtaining module 1702 is configured to determine, from each intermediate phase difference map, pixel points at the same position, and obtain a plurality of phase difference pixel point sets, where positions of the pixel points included in each phase difference pixel point set in the intermediate phase difference map are the same; for each phase difference pixel point set, splicing the pixel points in the phase difference pixel point set to obtain a sub-phase difference graph corresponding to the phase difference pixel point set; and splicing the obtained multiple sub-phase difference graphs to determine a first phase difference and a second phase difference of the pixel points.
In an embodiment, the first obtaining module 1702 is configured to perform segmentation processing on the target luminance graph to obtain a plurality of luminance graph regions, where each luminance graph region includes a row of pixel points in the target luminance graph, or each luminance graph region includes a column of pixel points in the target luminance graph; acquiring a plurality of first brightness map regions and a plurality of second brightness map regions from the plurality of brightness map regions, wherein the first brightness map regions comprise pixel points in even rows in a target brightness map, or the first brightness map regions comprise pixel points in even columns in the target brightness map, and the second brightness map regions comprise pixel points in odd rows in the target brightness map, or the second brightness map regions comprise pixel points in odd columns in the target brightness map; the first segmentation luminance map is composed of a plurality of first luminance map regions, and the second segmentation luminance map is composed of a plurality of second luminance map regions.
In one embodiment, the first obtaining module 1702 is configured to: when each luminance map region is one row of pixel points in the target luminance map, determine the first neighboring pixel point sets in each row of the first segmentation luminance map, where the pixel points in a first neighboring pixel point set correspond to the same pixel point group; for each first neighboring pixel point set, search the second segmentation luminance map for the first matching pixel point set corresponding to it; and determine the phase difference between each mutually corresponding first neighboring pixel point set and first matching pixel point set according to the position difference between them.
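As a hedged illustration of this matching step, the sketch below slides each window of pixel points along the corresponding row of the second segmentation luminance map and takes the offset with the minimum sum of absolute differences as the position difference. The window size, the search radius, the SAD cost, and the row-wise search direction are assumptions of the sketch, not requirements of the embodiment:

```python
import numpy as np

def match_phase_difference(first_map, second_map, win=8, search=16):
    # Cast to float so the SAD cost below cannot underflow on uint8 input
    first_map = np.asarray(first_map, dtype=np.float32)
    second_map = np.asarray(second_map, dtype=np.float32)
    h, w = first_map.shape
    pd = np.zeros((h, w), dtype=np.float32)
    for i in range(h):
        for j in range(w - win):
            ref = first_map[i, j:j + win]        # a neighboring pixel set
            best_cost, best_off = np.inf, 0
            # Search the same row of the second map for the best match
            for off in range(-search, search + 1):
                k = j + off
                if k < 0 or k + win > w:
                    continue
                cost = np.abs(second_map[i, k:k + win] - ref).sum()
                if cost < best_cost:
                    best_cost, best_off = cost, off
            pd[i, j] = best_off  # position difference, i.e. phase difference
    return pd
```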
The division of the phase difference acquisition apparatus into the above modules is merely illustrative; in other embodiments, the apparatus may be divided into different modules as required to implement all or part of its functions.
For specific limitations of the phase difference acquisition apparatus, reference may be made to the limitations of the phase difference acquisition method above, which are not repeated here. Each module in the apparatus may be implemented wholly or partially by software, by hardware, or by a combination of the two. The modules may be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke them and execute the operations corresponding to each module.
Fig. 18 is a schematic diagram of the internal structure of an electronic device in one embodiment. As shown in Fig. 18, the electronic device includes a processor and a memory connected by a system bus. The processor provides computation and control capabilities and supports the operation of the entire electronic device. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program is executable by the processor to implement the phase difference acquisition method provided in the foregoing embodiments. The internal memory provides a cached execution environment for the operating system and the computer program in the non-volatile storage medium. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like.
Each module in the phase difference acquisition apparatus provided in the embodiments of the present application may be implemented in the form of a computer program. The computer program may run on a terminal or a server, and the program modules formed by it may be stored in the memory of the terminal or the server. When the computer program is executed by a processor, the steps of the methods described in the embodiments of the present application are performed.
The embodiments of the present application also provide a computer readable storage medium: one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the phase difference acquisition method.
A computer program product containing instructions is also provided, which, when run on a computer, causes the computer to perform the phase difference acquisition method.
Any reference to memory, storage, a database, or another medium used herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as an external cache. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above embodiments express only several implementations of the present application, and although their description is specific and detailed, they should not be construed as limiting the scope of the application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (14)

1. A phase difference acquisition method, comprising:
acquiring a first phase difference and a second phase difference of the same pixel point in an image, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
obtaining the reliability of the first phase difference and the reliability of the second phase difference;
obtaining a reliability difference according to the reliability of the first phase difference and the reliability of the second phase difference;
when the reliability difference is within a preset range, acquiring the position of the pixel point in the image;
and determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point.
2. The method of claim 1, wherein determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point comprises:
according to the position of the pixel point in the image, acquiring a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference;
and weighting the first phase difference and the second phase difference according to the first weight and the second weight, respectively, to obtain the target phase difference of the pixel point.
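By way of illustration only — the claim prescribes no formula — the weighting of claim 2 can be read as a weighted combination such as the following sketch, in which normalisation by the weight sum is an added assumption:

```python
def target_phase_difference(pd1, pd2, w1, w2):
    # Weighted combination of the two directional phase differences;
    # dividing by (w1 + w2) is an assumption of this sketch, since the
    # claim only requires "corresponding weighting processing".
    return (w1 * pd1 + w2 * pd2) / (w1 + w2)
```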
3. The method according to claim 2, wherein the obtaining a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference according to a position of the pixel point in the image comprises:
and acquiring, according to the target sub-region where the pixel point is located, a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference in the target sub-region, wherein the target sub-region is determined from at least two candidate sub-regions, and the candidate sub-regions are obtained by dividing the image, according to an image coordinate system, at a first preset angle with the center of the image as a vertex.
4. The method of claim 2, wherein the first weight and the second weight are configured in a manner comprising:
acquiring an image coordinate system, wherein the image coordinate system is established by taking the center of the image as an origin;
dividing the image into at least two candidate sub-regions at a first preset angle, with the center of the image as a vertex, according to the image coordinate system;
acquiring the sharpness of the lines in the first direction and the sharpness of the lines in the second direction in each candidate sub-region;
when there is a first candidate sub-region in which the sharpness of the lines in the first direction is greater than the sharpness of the lines in the second direction, configuring the second weight corresponding to the first candidate sub-region to be larger than the first weight corresponding to the first candidate sub-region;
and when there is a second candidate sub-region in which the sharpness of the lines in the first direction is smaller than the sharpness of the lines in the second direction, configuring the first weight corresponding to the second candidate sub-region to be larger than the second weight corresponding to the second candidate sub-region.
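A hedged sketch of the weight configuration of claim 4; the concrete 0.7/0.3 values and the equal-sharpness fallback are invented for illustration, since the claim fixes only the ordering of the two weights:

```python
def configure_weights(sharp_first, sharp_second, hi=0.7, lo=0.3):
    # sharp_first / sharp_second: sharpness of lines in the first /
    # second direction within one candidate sub-region.
    if sharp_first > sharp_second:
        # First-direction lines dominate: favour the second phase difference
        return lo, hi   # (first weight, second weight)
    if sharp_first < sharp_second:
        # Second-direction lines dominate: favour the first phase difference
        return hi, lo
    return 0.5, 0.5     # equal sharpness: claim is silent, split evenly
```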
5. The method of claim 1, wherein determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point comprises:
when the position of the pixel point in the image is within a range corresponding to the first direction, taking the second phase difference as the target phase difference, wherein the range corresponding to the first direction is the image range bounded by lines that pass through the origin and form a second preset angle with the first-direction coordinate axis of an image coordinate system, and the image coordinate system is established with the center of the image as the origin;
and when the position of the pixel point in the image is within a range corresponding to the second direction, taking the first phase difference as the target phase difference, wherein the range corresponding to the second direction is the image range bounded by lines that pass through the origin and form a third preset angle with the second-direction coordinate axis of the image coordinate system; the first direction and the second direction are perpendicular to each other, and the sum of the second preset angle and the third preset angle is a right angle.
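Illustratively — and only as a sketch — the range test of claim 5 can be pictured as folding the pixel's angle into [0°, 90°] and comparing it against the second preset angle; taking the first direction as the horizontal axis is an assumption of this sketch:

```python
import math

def select_by_range(x, y, pd1, pd2, second_angle_deg=45.0):
    # (x, y): image coordinates with the origin at the image centre.
    angle = abs(math.degrees(math.atan2(y, x)))  # 0..180 from the x axis
    angle = min(angle, 180.0 - angle)            # fold into 0..90
    if angle <= second_angle_deg:
        return pd2   # within the range corresponding to the first direction
    return pd1       # within the range corresponding to the second direction
```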
6. The method of claim 5, further comprising:
and when the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference are not obtained according to the position of the pixel point in the image, performing the step of taking the second phase difference as the target phase difference when the position of the pixel point in the image is within the range corresponding to the first direction.
7. The method of claim 1, wherein the reliability difference is determined in any one of the following manners:
taking the difference between the reliability of the first phase difference and the reliability of the second phase difference as the reliability difference;
taking the absolute value of the difference as the reliability difference;
or taking the ratio of the reliability of the first phase difference to the reliability of the second phase difference as the reliability difference.
8. The method of claim 1, further comprising:
when the reliability difference is not within the preset range, determining the larger of the reliability of the first phase difference and the reliability of the second phase difference;
and taking the phase difference corresponding to the larger reliability as the target phase difference.
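The decision flow across claims 1, 7 and 8 might be sketched as follows; the preset range, the default gap mode, and the None sentinel meaning "fall through to the position-based combination of claims 2 to 6" are all assumptions of this illustration:

```python
def resolve_target(pd1, conf1, pd2, conf2, lo=0.0, hi=0.2, mode="abs"):
    # Reliability difference, in one of the three claimed manners
    if mode == "diff":
        gap = conf1 - conf2           # signed difference
    elif mode == "abs":
        gap = abs(conf1 - conf2)      # absolute value of the difference
    else:
        gap = conf1 / conf2           # ratio of the two reliabilities
    if lo <= gap <= hi:
        # Comparable reliabilities: defer to the position-based
        # combination of the two phase differences.
        return None
    # Otherwise take the phase difference with the larger reliability
    return pd1 if conf1 >= conf2 else pd2
```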
9. The method according to any one of claims 1 to 8, wherein the position of the pixel point in the image comprises an angle of the pixel point;
the obtaining of the position of the pixel point in the image includes:
acquiring the image coordinates of the pixel point in the image coordinate system;
and calculating the angle of the pixel point according to the image coordinates.
10. The method of claim 9, wherein the obtaining image coordinates of the pixel point in an image coordinate system comprises:
acquiring the pixel point coordinates of the pixel point in a pixel coordinate system;
and converting the pixel point coordinates into image coordinates in the image coordinate system.
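A compact sketch of claims 9 and 10 together: convert pixel point coordinates (origin at the top-left corner, an assumed convention) into image coordinates centred on the image, then compute the pixel's angle; the upward-positive y axis is likewise an assumption:

```python
import math

def pixel_angle(u, v, width, height):
    # Pixel coordinate system -> image coordinate system (origin at centre)
    x = u - width / 2.0
    y = height / 2.0 - v      # flip so that y grows upwards (assumption)
    # Angle of the pixel point in the image coordinate system, in degrees
    return math.degrees(math.atan2(y, x))
```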
11. The method of claim 1, applied to an electronic device comprising an image sensor, wherein the image sensor comprises a plurality of pixel point groups arranged in an array, each pixel point group comprises M × N pixel points arranged in an array, and each pixel point corresponds to one photosensitive unit, M and N being natural numbers greater than or equal to 2;
the acquiring a first phase difference and a second phase difference of the same pixel point in the image comprises:
acquiring a target luminance map according to the luminance values of the pixel points included in each pixel point group;
performing segmentation processing on the target luminance map to obtain a first segmentation luminance map and a second segmentation luminance map;
determining the phase differences of mutually matched pixel points according to the position differences of the mutually matched pixel points between the first segmentation luminance map and the second segmentation luminance map;
and determining the first phase difference and the second phase difference of the pixel point according to the phase differences of the mutually matched pixel points.
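As a hedged sketch of the first step of claim 11, one target luminance map could be formed by averaging the luminance of each M × N pixel point group; averaging is an assumption, since the claim only requires the map to be obtained "according to" those luminance values:

```python
import numpy as np

def group_luminance_map(luma, m=2, n=2):
    # Crop so the luminance image tiles exactly into M x N pixel groups,
    # then average each group into one value of the target luminance map.
    h, w = luma.shape
    groups = luma[:h - h % m, :w - w % n].reshape(h // m, m, w // n, n)
    return groups.mean(axis=(1, 3))   # one luminance value per pixel group
```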
12. A phase difference acquisition apparatus, comprising:
the first obtaining module is used for obtaining a first phase difference and a second phase difference of the same pixel point in an image, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
the second obtaining module is used for obtaining the reliability of the first phase difference and the reliability of the second phase difference, obtaining a reliability difference according to the two reliabilities, and acquiring the position of the pixel point in the image when the reliability difference is within a preset range;
and the target phase difference determining module is used for determining the target phase difference of the pixel point according to the position of the pixel point in the image and the first phase difference and the second phase difference of the pixel point.
13. An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of the phase difference acquisition method according to any one of claims 1 to 11.
14. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 11.
CN201911101457.0A 2019-11-12 2019-11-12 Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium Active CN112866550B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911101457.0A CN112866550B (en) 2019-11-12 2019-11-12 Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium
PCT/CN2020/122790 WO2021093537A1 (en) 2019-11-12 2020-10-22 Phase difference acquisition method and device, electronic apparatus, and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911101457.0A CN112866550B (en) 2019-11-12 2019-11-12 Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN112866550A (en) 2021-05-28
CN112866550B (en) 2022-07-15

Family

Family ID: 75911951

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911101457.0A Active CN112866550B (en) 2019-11-12 2019-11-12 Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN112866550B (en)
WO (1) WO2021093537A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013141108A (en) * 2011-12-29 2013-07-18 Nikon Corp Interchangeable lens and camera body
CN108337424A (en) * 2017-01-17 2018-07-27 中兴通讯股份有限公司 A kind of phase focusing method and its device
CN109905600A (en) * 2019-03-21 2019-06-18 上海创功通讯技术有限公司 Imaging method, imaging device and computer readable storage medium
CN110246853A (en) * 2018-03-09 2019-09-17 三星电子株式会社 Imaging sensor and image pick-up device including phase-detection pixel

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014155806A1 (en) * 2013-03-29 2014-10-02 富士フイルム株式会社 Imaging device, and focus control method
KR102125561B1 (en) * 2013-12-03 2020-06-22 삼성전자주식회사 Photographing apparatus and method
US9445018B2 (en) * 2014-05-01 2016-09-13 Semiconductor Components Industries, Llc Imaging systems with phase detection pixels
US9749556B2 (en) * 2015-03-24 2017-08-29 Semiconductor Components Industries, Llc Imaging systems having image sensor pixel arrays with phase detection capabilities
US10440301B2 (en) * 2017-09-08 2019-10-08 Apple Inc. Image capture device, pixel, and method providing improved phase detection auto-focus performance

Also Published As

Publication number Publication date
CN112866550A (en) 2021-05-28
WO2021093537A1 (en) 2021-05-20

Similar Documents

Publication Publication Date Title
US11570423B2 (en) System and methods for calibration of an array camera
CN112866549B (en) Image processing method and device, electronic equipment and computer readable storage medium
WO2020010945A1 (en) Image processing method and apparatus, electronic device and computer-readable storage medium
US8929685B2 (en) Device having image reconstructing function, method, and recording medium
CN109559353B (en) Camera module calibration method and device, electronic equipment and computer readable storage medium
WO2023016144A1 (en) Focusing control method and apparatus, imaging device, electronic device, and computer readable storage medium
WO2009095422A2 (en) Methods and apparatuses for addressing chromatic aberrations and purple fringing
JP2013061850A (en) Image processing apparatus and image processing method for noise reduction
CN108156383B (en) High-dynamic billion pixel video acquisition method and device based on camera array
CN112866553B (en) Focusing method and device, electronic equipment and computer readable storage medium
JP2020088689A (en) Image processing apparatus, imaging apparatus, and image processing method
CN112866655B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112866550B (en) Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium
CN112866547B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866554B (en) Focusing method and device, electronic equipment and computer readable storage medium
WO2021093528A1 (en) Focusing method and apparatus, and electronic device and computer readable storage medium
JP6755737B2 (en) Distance measuring device, imaging device, and distance measuring method
CN112866552B (en) Focusing method and device, electronic equipment and computer readable storage medium
EP2306397A1 (en) Method and system for optimizing lens aberration detection
CN110728714B (en) Image processing method and device, storage medium and electronic equipment
JP2005216191A (en) Stereo image processing apparatus and method
CN112866674B (en) Depth map acquisition method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant