CN112866550A - Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium


Info

Publication number
CN112866550A
Authority
CN
China
Prior art keywords
phase difference
image
pixel point
pixel
target
Legal status
Granted
Application number
CN201911101457.0A
Other languages
Chinese (zh)
Other versions
CN112866550B (en)
Inventor
贾玉虎
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911101457.0A
Priority to PCT/CN2020/122790 (WO2021093537A1)
Publication of CN112866550A
Application granted
Publication of CN112866550B
Current status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/672 Focus control based on electronic image sensor signals based on the phase difference signals
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to a phase difference acquisition method and device, an electronic device and a computer readable storage medium. The method comprises the following steps: acquiring a first phase difference and a second phase difference of a pixel point, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle; acquiring the position of the pixel point in the image; and determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point. By adopting the scheme of the application, the accuracy of phase difference acquisition can be improved.

Description

Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for acquiring a phase difference, an electronic device, and a computer-readable storage medium.
Background
When capturing an image, focusing is generally required on the electronic device to ensure that the captured image is clear; focusing refers to the process of adjusting the distance between the lens and the image sensor. Currently, a common focusing method is phase detection auto focus. To perform phase detection auto focus, phase detection pixel points are generally arranged in pairs among the pixel points included in the image sensor, and the phase difference acquired by these phase detection pixel points is used for focusing. Owing to the existence of aberration in the camera, the traditional phase difference acquisition method has the problem of low precision.
Disclosure of Invention
The embodiment of the application provides a phase difference acquisition method, a phase difference acquisition device, an electronic device and a computer readable storage medium, which can improve the phase difference acquisition precision.
A phase difference acquisition method, the method comprising:
acquiring a first phase difference and a second phase difference of a pixel point in an image, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
acquiring the position of a pixel point in an image;
and determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point.
A phase difference acquisition apparatus comprising:
the first obtaining module is used for obtaining a first phase difference and a second phase difference of a pixel point in an image, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
the second acquisition module is used for acquiring the positions of the pixel points in the image;
and the target phase difference determining module is used for determining the target phase difference of the pixel points according to the positions of the pixel points in the image, the first phase difference and the second phase difference of the pixel points.
An electronic device comprising a memory and a processor, the memory having a computer program stored thereon, the computer program, when executed by the processor, causing the processor to perform the steps of:
acquiring a first phase difference and a second phase difference of a pixel point in an image, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
acquiring the position of a pixel point in an image;
and determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring a first phase difference and a second phase difference of a pixel point in an image, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
acquiring the position of a pixel point in an image;
and determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point.
According to the phase difference acquisition method and apparatus, the electronic device, and the computer-readable storage medium, a first phase difference and a second phase difference of a pixel point in an image are acquired, where the first direction corresponding to the first phase difference and the second direction corresponding to the second phase difference form a preset included angle, so that one pixel point corresponds to two phase differences. The position of the pixel point in the image is then acquired, and the target phase difference of the pixel point is determined according to that position together with the first phase difference and the second phase difference of the pixel point. Since the target phase difference can be determined according to the position of the pixel point, the precision of phase difference acquisition is improved, image blurring caused by aberration is effectively avoided, and imaging is clearer.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic diagram of an image processing circuit in one embodiment;
FIG. 2 is a flow chart of a phase difference acquisition method in one embodiment;
FIG. 3 is a schematic illustration of a smear in one embodiment;
FIG. 4 is a schematic illustration of an image field range in one embodiment;
FIG. 5 is a diagram illustrating the angular division of pixel points in one embodiment;
FIG. 6 is a flow diagram illustrating the configuration of the first weights and the second weights in one embodiment;
FIG. 7 is a diagram illustrating a structure of a pixel in one embodiment;
FIG. 8 is a diagram of an image coordinate system and a pixel coordinate system in one embodiment;
FIG. 9 is a schematic diagram showing an internal structure of an image sensor according to an embodiment;
FIG. 10 is a diagram illustrating a pixel group Z according to an embodiment;
FIG. 11 is a schematic diagram of a process for obtaining a first phase difference and a second phase difference according to one embodiment;
FIG. 12 is a schematic diagram of the determination of pixels at the same location from respective intermediate phase difference maps in one embodiment;
FIG. 13 is a schematic illustration of a target phase difference map in one embodiment;
FIG. 14 is a diagram illustrating a slicing process performed on a target luminance graph in a first direction according to one embodiment;
FIG. 15 is a diagram illustrating a slicing process performed on a target luminance graph in a second direction in one embodiment;
FIG. 16 is a flow chart illustrating the process of determining phase difference values between matched pixels according to one embodiment;
FIG. 17 is a block diagram showing the configuration of a phase difference acquisition apparatus in one embodiment;
FIG. 18 is a schematic diagram of the internal structure of an electronic device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like as used herein may be used to describe various data, but the data is not limited by these terms. These terms are only used to distinguish one datum from another. For example, a first phase difference may be referred to as a second phase difference, and similarly, a second phase difference may be referred to as a first phase difference, without departing from the scope of the present application. The first phase difference and the second phase difference are both phase differences, but they are not the same phase difference.
The embodiment of the application provides electronic equipment. The electronic device includes therein an Image Processing circuit, which may be implemented using hardware and/or software components, and may include various Processing units defining an ISP (Image Signal Processing) pipeline. FIG. 1 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 1, for convenience of explanation, only aspects of the image processing technology related to the embodiments of the present application are shown.
As shown in fig. 1, the image processing circuit includes an ISP processor 140 and control logic 150. Image data captured by the imaging device 110 is first processed by the ISP processor 140, which analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of the imaging device 110. The imaging device 110 may include a camera having one or more lenses 112 and an image sensor 114. The image sensor 114 may include an array of color filters (e.g., Bayer filters); it may acquire the light intensity and wavelength information captured by each of its imaging pixels and provide a set of raw image data that can be processed by the ISP processor 140. The attitude sensor 120 (e.g., a three-axis gyroscope, Hall sensor, or accelerometer) may provide acquired image processing parameters (e.g., anti-shake parameters) to the ISP processor 140 based on the interface type of the attitude sensor 120. The attitude sensor 120 interface may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of the above.
In addition, the image sensor 114 may also send raw image data to the attitude sensor 120; the attitude sensor 120 may provide the raw image data to the ISP processor 140 based on the type of interface of the attitude sensor 120, or may store the raw image data in the image memory 130.
The ISP processor 140 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel point may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 140 may perform one or more image processing operations on the raw image data, collecting statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
The ISP processor 140 may also receive image data from the image memory 130. For example, the attitude sensor 120 interface sends raw image data to the image memory 130, and the raw image data in the image memory 130 is then provided to the ISP processor 140 for processing. The image Memory 130 may be a portion of a Memory device, a storage device, or a separate dedicated Memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from the image sensor 114 interface or from the attitude sensor 120 interface or from the image memory 130, the ISP processor 140 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to image memory 130 for additional processing before being displayed. ISP processor 140 receives processed data from image memory 130 and performs image data processing on the processed data in the raw domain and in the RGB and YCbCr color spaces. The image data processed by ISP processor 140 may be output to display 160 for viewing by a user and/or further processed by a Graphics Processing Unit (GPU). Further, the output of the ISP processor 140 may also be sent to the image memory 130, and the display 160 may read image data from the image memory 130. In one embodiment, image memory 130 may be configured to implement one or more frame buffers.
The statistical data determined by the ISP processor 140 may be transmitted to the control logic 150 unit. For example, the statistical data may include image sensor 114 statistics such as gyroscope vibration frequency, auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 112 shading correction, and the like. The control logic 150 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of the imaging device 110 and control parameters of the ISP processor 140 based on the received statistical data. For example, the control parameters of the imaging device 110 may include attitude sensor 120 control parameters (e.g., gain, integration time of exposure control, anti-shake parameters, etc.), camera flash control parameters, camera anti-shake displacement parameters, lens 112 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 112 shading correction parameters.
In one embodiment, the image sensor 114 in the imaging device (camera) may include a plurality of pixel point groups arranged in an array, each pixel point group including M × N pixel points arranged in an array, and each pixel point corresponding to one photosensitive unit, where M and N are both natural numbers greater than or equal to 2.
A first image is acquired through the lens 112 and the image sensor 114 of the imaging device (camera) 110 and sent to the ISP processor 140. After receiving the first image, the ISP processor 140 may perform subject detection on it to obtain the region of interest in the first image; alternatively, the region of interest may be obtained from a region selected by the user, or in other manners, without limitation.
The ISP processor 140 is configured to obtain a first phase difference and a second phase difference of a pixel point, where a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle; acquiring the position of a pixel point in an image; and determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point. The ISP processor 140 may send information regarding the phase, etc. to the control logic 150.
After receiving the information about the target area, the control logic 150 controls the lens 112 in the imaging device (camera) to move so as to focus on the position in the actual scene corresponding to the target area.
Fig. 2 is a flowchart of a phase difference acquisition method in one embodiment. The phase difference obtaining method in the present embodiment is described by taking the electronic device operating in fig. 1 as an example. As shown in fig. 2, the phase difference acquisition method includes steps 202 to 206.
Step 202, a first phase difference and a second phase difference of a pixel point in an image are obtained, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle.
The phase difference refers to the difference in the positions, on the image sensor, of the images formed by light rays entering the lens from different directions. Each pixel point has a corresponding first phase difference and a corresponding second phase difference.
Specifically, the electronic device acquires a first phase difference and a second phase difference of a pixel point in an image. The first direction corresponding to the first phase difference may be the image vertical direction and the second direction corresponding to the second phase difference the image horizontal direction; alternatively, the first direction may be the image horizontal direction and the second direction the image vertical direction. A preset included angle is formed between the first direction corresponding to the first phase difference and the second direction corresponding to the second phase difference, where the preset included angle may be any angle other than 0 degrees, 180 degrees, and 360 degrees. For example, the first direction corresponding to the first phase difference may be a 45-degree direction and the second direction corresponding to the second phase difference a 135-degree direction, and so on; the present application is not limited thereto.
In this embodiment, the electronic device may obtain the first phase difference and the second phase difference of the pixel points in the image, and determine the first phase difference and the second phase difference corresponding to the pixel point group according to the first phase difference and the second phase difference of the pixel points in the image.
And 204, acquiring the positions of the pixel points in the image.
Specifically, the position of the pixel point in the image may be the angle of the pixel point in the image, the angle and direction of the pixel point in the image, or the region of the pixel point in the image, without being limited thereto. A pixel point is the minimum unit of an image and is generally represented by a number sequence, which is usually referred to as the pixel value of that pixel point. The electronic device acquires the position of the pixel point in the image according to the coordinates of the pixel point, which may include the pixel coordinates and the image coordinates of the pixel point.
In this embodiment, the electronic device may obtain the angle at which the pixel point is located from the corresponding relationship between the coordinate of the pixel point and the angle according to the coordinate of the pixel point. Or, the electronic device may obtain the region corresponding to the pixel point from the correspondence between the pixel point coordinates and the region according to the pixel point coordinates. Or the electronic device can calculate the angle of the pixel point in the image coordinate system according to the coordinate of the pixel point.
And step 206, determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point.
In an actual optical system, the result of non-paraxial ray tracing does not match the result of paraxial ray tracing, and this deviation from the ideal state of Gaussian optics is called aberration. Aberrations include spherical aberration, coma, curvature of field, astigmatism, chromatic aberration, and distortion. Apart from distortion, the other kinds of aberration degrade imaging quality at the edges. FIG. 3 is a schematic illustration of a smear in one embodiment; the portion circled by the ellipse in fig. 3 is the smear produced by the presence of aberration. FIG. 4 is a schematic illustration of an image field range in one embodiment. The meridional curve and the sagittal curve can reflect the optical quality of the lens: astigmatism, contrast, resolution, and the like. Meridian plane: the plane formed by the principal ray of an off-axis object point and the principal axis of the optical system is called the meridian plane of the optical system, and the image plane where the meridional image points are located is called the meridional image plane. Sagittal plane: the plane perpendicular to the meridian plane and containing the principal ray of the off-axis object point is called the sagittal plane of the optical system, and the image plane where the sagittal image points are located is called the sagittal image plane. FIG. 5 is a diagram illustrating the angular division of pixel points in one embodiment. Because a lens is generally thick in the middle and thin at the edges (the larger the radius, the thinner the lens), the circumference at any given radius has the same thickness. Therefore, smears are not likely to occur along lines extending from the center to the periphery, whereas smears are likely to occur on vertical or horizontal lines, as shown in fig. 3. Accordingly, in fig. 5, lines near 0 degrees and 180 degrees can be calculated using the vertical phase difference, i.e., the phase difference between the upper and lower images, and lines near 90 degrees and 270 degrees can be calculated using the horizontal phase difference, i.e., the phase difference between the left and right images.
Specifically, the electronic device determines the target phase difference of the pixel point according to the position of the pixel point in the image and the first phase difference and second phase difference of the pixel point. For example, the electronic device may decide whether to select the first phase difference or the second phase difference as the target phase difference according to the angle at which the pixel point is located. Alternatively, the electronic device may determine the angle range of the pixel point from its angle, determine a first weight for the first phase difference and a second weight for the second phase difference, and weight the first phase difference and the second phase difference accordingly to obtain the target phase difference. Alternatively, the electronic device may determine the first weight and the second weight directly according to the target sub-region of the image in which the pixel point is located and calculate the target phase difference of the pixel point from them. Alternatively, the electronic device may decide to select the first phase difference or the second phase difference as the target phase difference according to the target sub-region in which the pixel point is located.
In this embodiment, the electronic device can control the lens to move for focusing according to the target phase difference. Because the accuracy of the target phase difference is improved, focusing can be carried out according to the target phase difference alone, without multiple lens movements, which improves focusing efficiency.
The phase difference acquisition method in this embodiment acquires a first phase difference and a second phase difference of a pixel point in an image, where the first direction corresponding to the first phase difference and the second direction corresponding to the second phase difference form a preset included angle, so that one pixel point corresponds to two phase differences. The position of the pixel point in the image is acquired, and the target phase difference of the pixel point is determined according to that position together with the first phase difference and the second phase difference of the pixel point. The target phase difference can thus be determined according to the position of the pixel point, which improves the precision of phase difference acquisition, effectively avoids image blurring caused by aberration, and makes imaging clearer.
In one embodiment, determining the target phase difference of the pixel point according to the position of the pixel point in the image includes: acquiring a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference according to the position of the pixel point in the image; and performing corresponding weighting processing on the first phase difference and the second phase difference according to the first weight and the second weight to obtain a target phase difference of the pixel point.
The position of the pixel point in the image corresponds to a first weight and a second weight. The first weight and the second weight are calibrated in advance and can be obtained directly in actual use. Pixel points at the same position share the same first weight and the same second weight. The first weight corresponds to the first phase difference and the second weight corresponds to the second phase difference. For example, suppose the first phase difference is the horizontal phase difference and the second phase difference the vertical phase difference; the first weight corresponding to the horizontal phase difference may then differ from position to position, as may the second weight corresponding to the vertical phase difference. For example, position a of pixel point 1 corresponds to first weight a and second weight b, and position a of pixel point 2 also corresponds to first weight a and second weight b. The sum of the first weight and the second weight may be 1.
Specifically, the electronic device obtains a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference according to the position of the pixel point in the image. And the electronic equipment weights the first phase difference according to the first weight, weights the second phase difference according to the second weight, and processes the first phase difference and the second phase difference to obtain the target phase difference of the pixel point. For example, the first weight is a, the second weight is b, the first phase difference is X, the second phase difference is Y, and the target phase difference is aX + bY. Or the electronic equipment weights the first phase difference according to the first weight, weights the second phase difference according to the second weight, and averages to obtain the target phase difference of the pixel point. Namely, the target phase difference is (aX + bY)/(a + b).
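A minimal sketch of this computation (assumed Python; the function name and example values are illustrative, not from the patent):

```python
def combine_phase_differences(pd_first, pd_second, w_first, w_second):
    """Weighted combination of the two phase differences described above:
    aX + bY normalized by (a + b), which equals aX + bY when a + b == 1."""
    return (w_first * pd_first + w_second * pd_second) / (w_first + w_second)

# Example with assumed weights a = 0.3, b = 0.7 and phase differences X = 1.8, Y = 2.4:
target_pd = combine_phase_differences(1.8, 2.4, 0.3, 0.7)  # (0.3*1.8 + 0.7*2.4) / 1.0
```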
In the phase difference acquisition method of this embodiment, a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference are obtained according to the position of the pixel point in the image, and the first phase difference and the second phase difference are weighted accordingly to obtain the target phase difference of the pixel point. Combining the first phase difference and the second phase difference to obtain the target phase difference improves the precision of phase difference acquisition, avoids image blur caused by aberration, and improves the sharpness of the captured image.
In one embodiment, obtaining a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference according to a position of the pixel point in the image includes: and acquiring a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference in the target subarea according to the target subarea where the pixel point is located, wherein the target subarea is determined from at least two candidate subareas, and the candidate subareas are obtained by dividing the image according to a first preset angle by taking the center of the image as a vertex according to an image coordinate system.
Wherein each image comprises at least two candidate sub-regions. The candidate sub-regions are obtained by dividing the image according to a first preset angle with the center O of the image as a vertex according to the image coordinate system xOy shown in fig. 8.
Specifically, the electronic device may determine the target sub-region in which the pixel point is located according to the pixel coordinates, and then acquire the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference in that target sub-region. Alternatively, the electronic device may obtain the target sub-region from the image coordinates corresponding to the pixel point. For example, if the electronic device determines from the pixel coordinates that the target sub-region containing the pixel point is region A, it obtains the first weight a of the first phase difference and the second weight b of the second phase difference corresponding to region A; if the target sub-region is region B, it obtains the first weight c and the second weight d corresponding to region B.
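As an illustration only (the region labels and weight values here are hypothetical), the calibrated weights can be held in a per-sub-region lookup:

```python
# Hypothetical calibration table: target sub-region label -> (first weight, second weight).
SUBREGION_WEIGHTS = {
    "A": (0.3, 0.7),
    "B": (0.6, 0.4),
}

def weights_for(target_subregion):
    """Return the calibrated (first weight, second weight) for the
    target sub-region containing the pixel point."""
    return SUBREGION_WEIGHTS[target_subregion]
```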
According to the phase difference acquisition method in this embodiment of the application, the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference in the target sub-region are obtained according to the target sub-region in which the pixel point is located, so the first phase difference and the second phase difference can be combined to calculate the target phase difference. This improves the precision of phase difference acquisition, avoids image blurring caused by aberration, and improves the sharpness of the captured image.
In one embodiment, fig. 6 is a flow chart illustrating the configuration of the first weight and the second weight in one embodiment. As shown in fig. 6, the arrangement of the first weight and the second weight includes:
step 602, an image coordinate system is obtained, wherein the image coordinate system is established with the center of the image as an origin.
Specifically, the center of the image may be the intersection of the image diagonals. For example, if an image is 1024 × 720 and the pixel coordinate system is established with the upper left corner of the image as the origin, the coordinates of the image center in the pixel coordinate system are (512, 360). The electronic device acquires the image coordinate system established with the center of the image as the origin.
Step 604, dividing the image into at least two candidate sub-regions according to a first preset angle by taking the center of the image as a vertex according to an image coordinate system.
Here, the division according to the first preset angle is not limited to the division manner shown in fig. 5. The first preset angle may be 1°, 2°, 3°, 5°, 10°, 15°, or the like, without being limited thereto; the more candidate sub-regions there are, the more accurate the phase difference that is finally obtained. The image may be any image, for example the image shown in fig. 3, which contains horizontal and vertical lines.
Specifically, the electronic device divides the image into different sub-regions according to the first preset angle, with the origin of the image coordinate system, i.e., the center of the image, as the vertex. Typically the positive x-axis direction is taken as 0 degrees by default, although the negative x-axis direction, the positive y-axis direction, or any direction at a given angle to the positive x-axis may also serve as the 0-degree direction. For example, as shown in fig. 5, the image region enclosed between 0° and 45° may be one candidate sub-region, the region between 45° and 90° another, the region between 90° and 135° another, the region between 135° and 180° another, and so on, up to the region between 315° and 360°. Alternatively, the electronic device may set 0° to 10° as one candidate sub-region, 10° to 25° as another, 25° to 50° as another, and so on, without being limited thereto.
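A sketch of this division (assumed Python; the helper name and the 45° default are illustrative):

```python
import math

def candidate_subregion_index(x, y, first_preset_angle=45.0):
    """Index of the candidate sub-region containing image coordinate (x, y):
    the full circle around the image center is divided into bins of
    `first_preset_angle` degrees, with 0 degrees along the positive x-axis
    (the default 0-degree direction described above)."""
    angle = math.degrees(math.atan2(y, x)) % 360.0
    return int(angle // first_preset_angle)
```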
Step 606, acquiring the sharpness of lines in the first direction and the sharpness of lines in the second direction in each candidate sub-region.
Specifically, lines in the first direction are lines within a range around the first direction, and likewise lines in the second direction are lines within a range around the second direction. For example, lines in the first direction may be lines in the horizontal range, i.e., lines whose included angle with the horizontal is within a preset range; they may equally be lines in the vertical range, i.e., lines whose included angle with the vertical is within a preset range. Lines in the first direction may be perpendicular to lines in the second direction. Lines in the first direction correspond to the first direction in this application and lines in the second direction to the second direction: when lines in the first direction are horizontal lines, lines in the second direction are vertical lines, and when lines in the first direction are vertical lines, lines in the second direction are horizontal lines. The electronic device acquires the sharpness of the lines in the first direction and the sharpness of the lines in the second direction in each of the at least two candidate sub-regions.
Step 608, when there is a first candidate sub-region in which the sharpness of lines in the first direction is greater than the sharpness of lines in the second direction, configuring the second weight corresponding to the first candidate sub-region to be greater than the first weight corresponding to the first candidate sub-region.
Fig. 7 is a schematic structural diagram of a pixel point in one embodiment. As shown in fig. 7, 702 can be a pixel point or a pixel point group. Taking a pixel point that includes sub-pixel point 1, sub-pixel point 2, sub-pixel point 3, and sub-pixel point 4 as an example: sub-pixel points 1 and 2 can be combined and sub-pixel points 3 and 4 combined to form a PD pixel pair in the up-down direction, so that a phase difference in the vertical direction can be obtained and horizontal edges detected; sub-pixel points 1 and 3 can be combined and sub-pixel points 2 and 4 combined to form a PD pixel pair in the left-right direction, so that a phase difference in the horizontal direction can be obtained and vertical edges detected.
Specifically, take lines in the first direction to be horizontal lines and lines in the second direction to be vertical lines. When there is a first candidate sub-region in which the sharpness of horizontal lines is greater than that of vertical lines, the horizontal lines in that sub-region are clearer and less prone to aberration. Since horizontal lines are detected by the up-down PD pixel pairs, i.e., using the vertical phase difference, the second weight corresponding to the vertical phase difference in the first candidate sub-region is configured to be greater than the first weight corresponding to the horizontal phase difference. Because the weight of the phase difference with the higher sharpness is raised, images captured according to the resulting phase difference are clearer.
Step 610, when there is a second candidate sub-region in which the sharpness of lines in the first direction is smaller than the sharpness of lines in the second direction, configuring the first weight corresponding to the second candidate sub-region to be greater than the second weight corresponding to the second candidate sub-region.
Specifically, again take lines in the first direction to be horizontal lines and lines in the second direction to be vertical lines. When there is a second candidate sub-region in which the sharpness of horizontal lines is smaller than that of vertical lines, the vertical lines in that sub-region are clearer and less prone to aberration. Since vertical lines are detected by the left-right PD pixel pairs, i.e., using the horizontal phase difference, the first weight corresponding to the horizontal phase difference in the second candidate sub-region is configured to be greater than the second weight corresponding to the vertical phase difference. Because the weight of the phase difference with the higher sharpness is raised, images captured according to the resulting phase difference are clearer.
In one embodiment, when there is a third candidate sub-region where the sharpness of the line in the first direction is equal to the sharpness of the line in the second direction, the first weight and the second weight corresponding to the third candidate sub-region may be the same.
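Steps 608 and 610, together with the equal-sharpness case, can be sketched as follows (assumed Python; the numeric weight values are assumptions, only their ordering follows the text):

```python
def configure_weights(sharpness_first_dir, sharpness_second_dir):
    """Configure (first weight, second weight) for one candidate sub-region
    from the measured line sharpness in the two directions: sharper
    first-direction lines raise the second weight (step 608), sharper
    second-direction lines raise the first weight (step 610), and equal
    sharpness yields equal weights."""
    if sharpness_first_dir > sharpness_second_dir:
        return 0.3, 0.7   # second weight > first weight (assumed values)
    if sharpness_first_dir < sharpness_second_dir:
        return 0.7, 0.3   # first weight > second weight (assumed values)
    return 0.5, 0.5
```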
The phase difference acquisition method in this embodiment of the application acquires the sharpness of lines in the first direction and the sharpness of lines in the second direction in each candidate sub-region. When there is a first candidate sub-region in which the sharpness of lines in the first direction is greater than that of lines in the second direction, the second weight corresponding to that sub-region is configured to be greater than the first weight; when there is a second candidate sub-region in which the sharpness of lines in the first direction is smaller than that of lines in the second direction, the first weight corresponding to that sub-region is configured to be greater than the second weight. The weights of different sub-regions of the image can thus be calibrated according to the sharpness of lines in different directions, with the larger weight assigned to the phase difference of higher sharpness, which improves the precision of phase difference acquisition and thereby the sharpness of the captured image.
In one embodiment, determining the target phase difference of the pixel point according to the position of the pixel point in the image includes: when the position of the pixel point in the image is in a range corresponding to a first direction, taking a second phase difference as a target phase difference, wherein the range corresponding to the first direction is an image range which is formed by a line passing through an origin and is surrounded by a second preset angle with a first direction coordinate axis in an image coordinate system, and the image coordinate system is established by taking the center of the image as the origin; and when the position of the pixel point in the image is in a range corresponding to a second direction, taking the first phase difference as a target phase difference, wherein the range corresponding to the second direction is an image range which is defined by a line passing through the origin and forming a third preset angle with a coordinate axis of the second direction in the image coordinate system.
When the first direction is the horizontal direction, the second direction is the vertical direction; when the first direction is the vertical direction, the second direction is the horizontal direction. The second preset angle and the third preset angle may be the same or different. Taking the first direction as the vertical direction as an example, the range corresponding to the first direction may be the image range enclosed by lines passing through the origin at the second preset angle to the y-axis of the image coordinate system. The second direction is then the horizontal direction, and the range corresponding to the second direction may be the image range enclosed by lines passing through the origin at the third preset angle to the x-axis of the image coordinate system. A line passing through the origin may be a straight line or a line of a predetermined shape, without being limited thereto.
Specifically, as shown in fig. 7, the PD pixel pairs in the up-down direction, i.e., the vertical phase difference, can detect horizontal edges, and the PD pixel pairs in the left-right direction, i.e., the horizontal phase difference, can detect vertical edges. Accordingly, when the electronic device detects that the position of the pixel point in the image is within the range corresponding to the horizontal direction, it takes the vertical phase difference as the target phase difference; when it detects that the position is within the range corresponding to the vertical direction, it takes the horizontal phase difference as the target phase difference. For example, suppose the first direction is the vertical direction with a second preset angle of 45 degrees, and the second direction is the horizontal direction with a third preset angle of 45 degrees. Lines through the origin at 45 degrees to the vertical, i.e., the y-axis, are the lines at 45 degrees and 225 degrees and the lines at 135 degrees and 315 degrees, giving two ranges corresponding to the first direction: the range enclosed between 45 degrees and the y-axis together with the range enclosed between 225 degrees and the y-axis, and the range enclosed between 135 degrees and the y-axis together with the range enclosed between 315 degrees and the y-axis. In summary, the electronic device calculates lines in the vertical direction, i.e., 90°, using the horizontal phase difference within 45° to 135° and 225° to 315°, and calculates lines in the horizontal direction, i.e., 0°, using the vertical phase difference within 135° to 225° and 315° to 45°.
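A sketch of this selection rule for the 45-degree example above (assumed Python; the angle bookkeeping is an assumption):

```python
import math

def select_target_phase_difference(x, y, pd_horizontal, pd_vertical):
    """Pick the target phase difference from the angular sector of image
    coordinate (x, y), per the 45-degree example above."""
    angle = math.degrees(math.atan2(y, x)) % 360.0
    # 45..135 and 225..315 degrees straddle the vertical (y) axis:
    # vertical lines there are detected with the left-right phase difference.
    if 45.0 <= angle < 135.0 or 225.0 <= angle < 315.0:
        return pd_horizontal
    # Sectors straddling the horizontal (x) axis use the up-down phase difference.
    return pd_vertical
```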
In the phase difference obtaining method in this embodiment, when the position of the pixel point in the image is within the range corresponding to the first direction, the second phase difference is used as the target phase difference, and when the position of the pixel point in the image is within the range corresponding to the second direction, the first phase difference is used as the target phase difference, so that the phase difference in the corresponding direction can be directly selected as the target phase difference according to the position of the pixel point in the image, and the phase difference obtaining accuracy is improved.
In one embodiment, the phase difference acquisition method further includes: and when the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference are not obtained according to the position of the pixel point in the image, executing a step of taking the second phase difference as a target phase difference when the position of the pixel point in the image is in the range corresponding to the first direction.
Specifically, when a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference cannot be obtained according to the position of the pixel point in the image, whether the pixel point is in the range corresponding to the first direction or the range corresponding to the second direction is detected. And when the position of the pixel point in the image is in the range corresponding to the first direction, taking the second phase difference as the target phase difference. And when the position of the pixel point in the image is in the range corresponding to the second direction, taking the first phase difference as a target phase difference.
In the phase difference acquisition method of this embodiment, when the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference are not obtained according to the position of the pixel point, the second phase difference is taken as the target phase difference when the position of the pixel point in the image is within the range corresponding to the first direction. That is, if the image capture device of the electronic device has not been calibrated in advance, the position of the pixel point in the image can be determined directly and the phase difference of the corresponding direction taken as the target phase difference, which improves the precision of phase difference acquisition.
In one embodiment, the phase difference acquisition method further includes: acquiring the reliability of the first phase difference and the reliability of the second phase difference; obtaining a reliability gap amount from the reliability of the first phase difference and the reliability of the second phase difference; and when the reliability gap amount is within a preset range, executing the step of acquiring the position of the pixel point in the image.
The reliability indicates how trustworthy the phase difference calculation result is. The reliability gap amount represents the gap between the reliability of the first phase difference and the reliability of the second phase difference. For example, the reliability gap amount may be the difference between the two reliabilities, the absolute value of that difference, or the ratio of the reliability of the first phase difference to the reliability of the second phase difference. The reliability gap amount being within a preset range means that the gap is small. The preset range corresponds to the chosen gap measure: when the gap amount is the absolute value of the difference between the two reliabilities, it corresponds to one preset range α; when it is the ratio of the two reliabilities, it corresponds to another preset range β; this is not limiting. The preset range may be from zero up to some gap threshold.
Specifically, the electronic device acquires the reliability of the first phase difference and the reliability of the second phase difference and calculates the reliability gap amount from them. When the gap amount is within the preset range, i.e., when the gap is small, the electronic device executes the step of acquiring the position of the pixel point in the image and determines the target phase difference of the pixel point according to that position together with the first phase difference and the second phase difference of the pixel point.
In the present embodiment, take the calculation of the horizontal phase difference as an example. A pair of images is divided into a left image and a right image, each of size M × N. For the left-right phase difference, the whole image is first divided into N lines, and one line is calculated at a time. To avoid mismatching, the pixels to the left and right of a pixel, together with the pixel itself, are used to form the luminance feature of that pixel point. For example, if the window length is set to 5, the combination is x-2, x-1, x, x+1, x+2; it can also be set to 7, 9, and so on. The longer the pixel combination, the smaller the probability of a mismatch, but the larger the amount of calculation. Since the pixels x-2 through x+2 are considered, the range of center pixels for which the phase difference can be calculated is 2 to M-1-2 (the first pixel having index 0); if the length is 7, the range is 3 to M-1-3, and so on. That is, for a window of length 2n+1, the center pixel ranges over n to M-1-n. Since the distance from the lens to the image plane has a maximum range, the phase difference also has a maximum range, so a maximum shift is set when comparing the left and right images. Assuming the phase difference is calculated for position x of the left image and the shift range of the right image is -shift1 to shift2, the luminances of the left image at x-n, x-n+1, … x, x+1 … x+n are compared in turn with the luminances of the right image at:

x-n-shift1, x-n+1-shift1, … x-shift1, x+1-shift1 … x+n-shift1,
x-n-shift1+1, x-n+1-shift1+1, … x-shift1+1, x+1-shift1+1 … x+n-shift1+1,
……
x-n+shift2, x-n+1+shift2, … x+shift2, x+1+shift2 … x+n+shift2.
Taking the horizontal phase difference as an example, to calculate the phase difference at coordinate x of a certain line in the image, the brightness values of the 5 pixels x-2, x-1, x, x+1, x+2 of the left image are taken, and the right image is shifted over a range set to -10 to +10. Namely:
compare the right-image brightness values Rx-12, Rx-11, Rx-10, Rx-9, Rx-8 with the left-image values at x-2, x-1, x, x+1, x+2;
compare the right-image brightness values Rx-11, Rx-10, Rx-9, Rx-8, Rx-7 with the left-image values at x-2, x-1, x, x+1, x+2;
……
compare the right-image brightness values Rx-2, Rx-1, Rx, Rx+1, Rx+2 with the left-image values at x-2, x-1, x, x+1, x+2;
compare the right-image brightness values Rx-1, Rx, Rx+1, Rx+2, Rx+3 with the left-image values at x-2, x-1, x, x+1, x+2;
……
compare the right-image brightness values Rx+7, Rx+8, Rx+9, Rx+10, Rx+11 with the left-image values at x-2, x-1, x, x+1, x+2;
compare the right-image brightness values Rx+8, Rx+9, Rx+10, Rx+11, Rx+12 with the left-image values at x-2, x-1, x, x+1, x+2.
Take the five pixel values of the right image to be Rx-2, Rx-1, Rx, Rx+1, Rx+2 and the five pixel values of the left image to be Lx-2, Lx-1, Lx, Lx+1, Lx+2. The similarity difference value may then be |Rx-2 - Lx-2| + |Rx-1 - Lx-1| + |Rx - Lx| + |Rx+1 - Lx+1| + |Rx+2 - Lx+2|. The smaller the similarity difference value, the higher the degree of similarity matching. The shift value corresponding to the minimum similarity difference can be converted into a phase difference value: for example, if the center pixel of the right image at the minimum similarity difference is Rx+i, then i may be taken as the phase difference value. The reliability can be calculated from the minimum similarity difference; the higher the degree of similarity matching, the higher the reliability. For the upper and lower images, the brightness values of a column of pixel points in the upper image are compared in the same way with the brightness values of an equal number of pixel points in the lower image, and the reliability acquisition process is similar to that of the left and right images, so it is not described in detail here.
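A minimal sketch of this left/right window matching (assumed Python with NumPy; the boundary handling and the use of the minimum SAD as a reliability proxy are assumptions, not the patent's exact procedure):

```python
import numpy as np

def line_phase_difference(left_row, right_row, x, n=2, shift1=10, shift2=10):
    """Compare the left-image window centered at x against right-image
    windows shifted by -shift1..shift2 and return (phase_difference,
    min_similarity_difference). A smaller minimum means a better match,
    hence a higher reliability. Assumes n <= x <= len(left_row)-1-n."""
    window = left_row[x - n : x + n + 1].astype(np.int64)
    best_shift, best_sad = 0, None
    for s in range(-shift1, shift2 + 1):
        lo = x - n + s
        if lo < 0 or lo + 2 * n + 1 > len(right_row):
            continue  # skip shifts whose window falls outside the line
        sad = np.abs(window - right_row[lo : lo + 2 * n + 1].astype(np.int64)).sum()
        if best_sad is None or sad < best_sad:
            best_sad, best_shift = sad, s
    return best_shift, best_sad
```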
In the phase difference acquisition method of this embodiment, the reliability of the first phase difference and the reliability of the second phase difference are acquired, and the reliability gap amount is obtained from them. When the gap amount is within the preset range, the step of acquiring the position of the pixel point in the image is executed. Thus, when the reliability gap between the first phase difference and the second phase difference is small, the position of the pixel point is used to select an appropriate phase difference, which improves the precision of phase difference acquisition.
In one embodiment, the phase difference acquisition method further includes: when the reliability difference is not within the preset range, determining the maximum reliability value in the reliability of the first phase difference and the reliability of the second phase difference; and taking the phase difference corresponding to the maximum reliability as the target phase difference.
The reliability difference not being within the preset range means that the reliability difference is greater than the difference threshold. For example, if the reliability difference, taken here as the absolute value of the difference between the two reliabilities, is 3 and the difference threshold is 2, then the reliability difference is not within the preset range.
Specifically, when the electronic device detects that the reliability gap amount is not within the preset range, the maximum value of the reliability of the first phase difference and the reliability of the second phase difference is determined. The electronic equipment takes the phase difference corresponding to the maximum reliability value as a target phase difference.
In the phase difference obtaining method in this embodiment, when the difference between the degrees of reliability is not within the preset range, the maximum value of the degrees of reliability of the first phase difference and the second phase difference is determined, and the phase difference corresponding to the maximum value of the degrees of reliability is used as the target phase difference, that is, when the difference is large, the phase difference corresponding to the maximum value of the degrees of reliability is used as the target phase difference, so that the target phase difference can be obtained quickly, and the phase difference obtaining efficiency is improved.
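Putting the two embodiments together, the decision between position-based weighting and the maximum-reliability shortcut can be sketched as follows; the function shape, the callable standing in for the weighting step, and the plain absolute-difference gap are assumptions.

```python
def select_target_phase(pd1, conf1, pd2, conf2, weighted_phase, position,
                        gap_threshold=2.0):
    """Choose the target phase difference from two directional estimates.

    If the reliability gap is within the preset range, fall back to the
    position-dependent combination (weighted_phase is a callable standing
    in for the weighting described earlier); otherwise simply keep the
    phase difference with the higher reliability.
    """
    if abs(conf1 - conf2) <= gap_threshold:
        return weighted_phase(pd1, pd2, position)
    return pd1 if conf1 >= conf2 else pd2
```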
In one embodiment, the position of the pixel in the image includes an angle of the pixel. Acquiring the position of a pixel point in an image, comprising: acquiring image coordinates of pixel points in an image coordinate system; and calculating according to the image coordinates to obtain the angle of the pixel point.
The pixel points may refer to pixel points in an image sensor or pixel points in software. The pixel points in the image sensor correspond to the pixel points on the image one by one. FIG. 8 is a diagram illustrating an image coordinate system and a pixel coordinate system in one embodiment. Where xOy is the image coordinate system and uOv is the pixel coordinate system. The origin of xOy is at the center of the image and the origin of uOv is in the upper left corner of the image.
Specifically, when the coordinates of the pixel point in the image, that is, its coordinates in the image coordinate system, are obtained, the angle at which the pixel point is located is calculated from the image coordinates. For example, if the image coordinate is (x, y), the angle of the pixel point may be Angle = arctan(y/x). The range of the arctan function is -90 degrees to 90 degrees, so the electronic device may first determine the signs of x and y. When x is positive and y is positive, the pixel point is in the first quadrant, the value calculated by the arctan function lies in 0 to 90 degrees, and Angle = arctan(y/x) may be used directly. When x is negative and y is negative, the pixel point is in the third quadrant, and the angle should lie in 180 to 270 degrees; however, since the signs of x and y cancel, directly calculating arctan(y/x) gives the same result as in the first quadrant. Therefore, when both x and y are negative, the angle may be taken as Angle = 180° + arctan(y/x).
When x is negative and y is positive, the pixel point is in the second quadrant, and the angle should lie in 90 to 180 degrees; in this case Angle = 180° + arctan(y/x), since arctan(y/x) is negative there. When x is positive and y is negative, the pixel point is in the fourth quadrant, and the angle should lie in 270 to 360 degrees (360 degrees also being 0 degrees); in this case Angle = 360° + arctan(y/x). Other algorithms that can distinguish the angle at which the pixel point is located also fall within the scope of this embodiment.
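A minimal sketch of the quadrant handling described above (the function name is assumed; math.atan2 performs the same sign analysis internally):

```python
import math

def pixel_angle(x, y):
    """Angle of image coordinate (x, y) in degrees, in [0, 360).

    Implements the quadrant rules described above; the result can also
    be written as math.degrees(math.atan2(y, x)) % 360.
    """
    if x == 0:  # on the vertical axis; arctan(y/x) is undefined there
        return 90.0 if y > 0 else 270.0
    base = math.degrees(math.atan(y / x))
    if x > 0 and y >= 0:        # first quadrant: 0 to 90 degrees
        return base
    if x < 0:                   # second and third quadrants: 90 to 270
        return 180.0 + base
    return 360.0 + base         # fourth quadrant (x > 0, y < 0): 270 to 360
```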
In this embodiment, the electronic device may obtain, according to the angle at which the pixel point is located, a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference in the angle; and respectively carrying out weighting processing on the first phase difference and the second phase difference according to the first weight and the second weight to obtain the target phase difference of the pixel point.
In this embodiment, when the angle of the pixel point is within the range corresponding to the first direction, the second phase difference is used as the target phase difference; when the angle of the pixel point is within the range corresponding to the second direction, the first phase difference is used as the target phase difference. For example, if the angle at which the pixel point is located is 75 degrees and the first direction is the vertical direction, the range enclosed by an included angle of 45 degrees with the vertical direction is the range corresponding to the first direction. Taking fig. 5 as an example, the range corresponding to the first direction is 45 to 135 degrees and 225 to 315 degrees. Since the 75-degree angle of the pixel point falls within the range corresponding to the first direction, the second phase difference is used as the target phase difference.
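Taking the first direction to be vertical and the half-angle to be 45 degrees, as in the example just given (both values come from the example, not from the method in general), the selection can be sketched as:

```python
def phase_by_angle(angle, pd1, pd2):
    """Pick a phase difference by the angle (degrees) of the pixel point.

    Angles within 45 degrees of the vertical axis fall in the range of
    the first (vertical) direction, so the second phase difference is
    used there; otherwise the first phase difference is used.
    """
    in_first_range = 45 <= angle <= 135 or 225 <= angle <= 315
    return pd2 if in_first_range else pd1
```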
In the phase difference obtaining method in this embodiment, the image coordinates of a pixel point in the image coordinate system are obtained, and the angle of the pixel point is calculated from the image coordinates. The angle of the pixel point can thus be calculated in real time, and the target phase difference is determined from the angle of the pixel point, the first phase difference and the second phase difference, which improves the accuracy of phase difference acquisition.
In one embodiment, obtaining image coordinates of the pixel point in an image coordinate system includes: acquiring pixel point coordinates of pixel points in a pixel coordinate system; the pixel coordinates are converted into image coordinates in an image coordinate system.
In the pixel coordinate system, the coordinate origin is at the upper left corner of the image, so all pixel points lie in a single quadrant and the angles over the whole image span only 90 degrees. The electronic device therefore obtains the pixel point coordinates of the pixel point in the pixel coordinate system and converts them into image coordinates in the image coordinate system, whose origin is at the center of the image and which therefore covers the full range of angles.
In the phase difference obtaining method in this embodiment, pixel coordinates of a pixel in a pixel coordinate system are obtained; the pixel point coordinates are converted into image coordinates in an image coordinate system, so that more angle ranges can be obtained, and the phase difference obtaining precision is improved.
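Assuming the conventions of fig. 8, with the pixel origin at the top-left corner (v increasing downward, which is an assumption here) and the image origin at the center, the conversion can be sketched as:

```python
def pixel_to_image_coords(u, v, width, height):
    """Convert pixel coordinates (u, v) to centred image coordinates (x, y).

    The pixel origin is at the top-left corner with v increasing downward;
    the image origin is at the image centre with y increasing upward, so
    angles computed from (x, y) cover the full 0-360 degree range.
    """
    x = u - width / 2.0
    y = height / 2.0 - v
    return x, y
```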
Fig. 9 is a schematic diagram showing the internal structure of an image sensor in one embodiment; the imaging device includes a lens and an image sensor. As shown in fig. 9, the image sensor includes a microlens 90, an optical filter 92, and a photosensitive unit 94. The microlens 90, the optical filter 92 and the photosensitive unit 94 are located in sequence on the incident light path; that is, the microlens 90 is disposed on the optical filter 92, and the optical filter 92 is disposed on the photosensitive unit 94.
The optical filter 92 may be one of three types, red, green and blue, each of which transmits only light at the wavelengths corresponding to its color. One optical filter 92 is disposed on each pixel point.
The microlens 90 is used to receive incident light and transmit it to the optical filter 92. The optical filter 92 filters the incident light by wavelength, and the filtered light is incident on the photosensitive unit 94 on a per-pixel-point basis.
The light sensing unit 94 in the image sensor converts light incident from the optical filter 92 into a charge signal by a photoelectric effect and generates a pixel point signal in accordance with the charge signal. The charge signal corresponds to the received light intensity.
Please refer to fig. 10, which illustrates a schematic diagram of an exemplary pixel point group Z. As shown in fig. 10, the pixel point group Z includes 4 pixel points D arranged in an array of two rows and two columns. The color channel of the pixel point in the first row and first column is green, that is, its color filter is a green filter; the color channel of the pixel point in the first row and second column is red, that is, its color filter is a red filter; the color channel of the pixel point in the second row and first column is blue, that is, its color filter is a blue filter; and the color channel of the pixel point in the second row and second column is green, that is, its color filter is a green filter.
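For reference, the color layout of the pixel point group Z can be written down directly as a small descriptive structure (illustrative only):

```python
# Color channels of pixel point group Z, indexed as [row][column]
# (a GRBG Bayer-style layout, as described for fig. 10).
PIXEL_GROUP_Z = [
    ["green", "red"],    # first row:  green, red
    ["blue",  "green"],  # second row: blue,  green
]
```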
In one embodiment, fig. 11 is a schematic flow chart of acquiring the first phase difference and the second phase difference in one embodiment. As shown in fig. 11, acquiring the first phase difference and the second phase difference of the pixel point includes:
step 1102, obtaining a target brightness map according to the brightness values of the pixels included in each pixel group.
The brightness value of the pixel point of the image sensor can be represented by the brightness value of the sub-pixel point included in the pixel point. That is, the electronic device may obtain the target luminance map according to the luminance values of the sub-pixels in the pixels included in each pixel group. The "brightness value of a sub-pixel" refers to the brightness value of the optical signal received by the sub-pixel.
Specifically, the sub-pixel included in the image sensor is a photosensitive element capable of converting an optical signal into an electrical signal, so that the intensity of the optical signal received by the sub-pixel can be obtained according to the electrical signal output by the sub-pixel, and the brightness value of the sub-pixel can be obtained according to the intensity of the optical signal received by the sub-pixel.
The target brightness map in the embodiment of the application is used for reflecting the brightness values of the sub pixel points in the image sensor, and the target brightness map can comprise a plurality of pixel points, wherein the pixel point value of each pixel point in the target brightness map is obtained according to the brightness value of the sub pixel point in the image sensor.
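How the sub-pixel luminances are aggregated into one luminance value per pixel point is not pinned down above; a minimal sketch that simply averages the sub-pixel brightness values of each pixel point (both the 2 × 2 sub-pixel layout and the averaging are assumptions) is:

```python
import numpy as np

def target_luminance_map(sub_pixel_luma, sub_rows=2, sub_cols=2):
    """Build a target luminance map from a map of sub-pixel brightness values.

    sub_pixel_luma has shape (H * sub_rows, W * sub_cols); each pixel point
    owns a sub_rows x sub_cols block of sub-pixel points, whose brightness
    values are averaged here to give that pixel point's luminance.
    """
    h, w = sub_pixel_luma.shape
    blocks = sub_pixel_luma.reshape(h // sub_rows, sub_rows,
                                    w // sub_cols, sub_cols)
    return blocks.mean(axis=(1, 3))
```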
And 1104, performing segmentation processing on the target brightness map to obtain a first segmentation brightness map and a second segmentation brightness map.
The electronic device may perform the segmentation processing on the target luminance map along the column direction (the y-axis direction in the image coordinate system); in this case, each segmentation line used in the segmentation is perpendicular to the column direction.
Alternatively, the electronic device may perform the segmentation processing on the target luminance map along the row direction (the x-axis direction in the image coordinate system); in this case, each segmentation line used in the segmentation is perpendicular to the row direction.
The first and second sliced luminance graphs obtained by slicing the target luminance graph in the column direction may be referred to as upper and lower graphs, respectively. The first and second sliced luminance maps obtained by slicing the target luminance map in the row direction may be referred to as a left map and a right map, respectively.
Step 1106, determining the phase difference value of the matched pixels according to the position difference of the matched pixels in the first segmentation brightness map and the second segmentation brightness map.
The "mutually matched pixels" means that pixel matrixes formed by the pixels themselves and the surrounding pixels are mutually similar. For example, a pixel point a in the first tangential luminance graph and surrounding pixel points form a pixel point matrix with 3 rows and 3 columns, and the pixel point value of the pixel point matrix is as follows:
2 10 90
1 20 80
0 100 1
the pixel b and the surrounding pixels in the second segmentation brightness graph also form a pixel matrix with 3 rows and 3 columns, and the pixel value of the pixel matrix is as follows:
1 10 90
1 21 80
0 100 2
As can be seen from the above, the two matrices are similar, and the pixel point a and the pixel point b can be considered to match each other. As for how to judge whether pixel point matrices are similar, there are many different methods in practical application. One common method is to calculate the differences between the pixel point values of corresponding pixel points in the two matrices, add up the absolute values of those differences, and judge similarity by the sum: if the sum is smaller than a preset threshold, the matrices are considered similar; otherwise, they are considered dissimilar.
For example, for the two pixel point matrices with 3 rows and 3 columns above, 2 and 1 may be subtracted, 10 and 10 subtracted, 90 and 90 subtracted, and so on; adding the absolute values of the resulting differences gives a sum of 3 (the entries |2-1|, |20-21| and |1-2| each contribute 1). If the sum 3 is smaller than the preset threshold, the two matrices are considered similar.
Another common method for judging whether pixel point matrices are similar is to extract the edge features of the matrices, for example by convolution with a Sobel kernel or with a Laplacian operator, and to judge similarity through those edge features.
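The absolute-difference method can be checked directly against the two matrices above (the threshold value of 5 is an assumption):

```python
import numpy as np

def matrices_similar(block_a, block_b, threshold=5):
    """Judge two pixel point matrices similar if the sum of absolute
    differences of corresponding pixel point values is below a threshold."""
    diff = np.abs(block_a.astype(np.int64) - block_b.astype(np.int64))
    return diff.sum() < threshold

a = np.array([[2, 10, 90], [1, 20, 80], [0, 100, 1]])
b = np.array([[1, 10, 90], [1, 21, 80], [0, 100, 2]])
print(matrices_similar(a, b))  # SAD = 3, so True for threshold 5
```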
In this embodiment of the present application, the "position difference between the mutually matched pixel points" refers to a difference between the position of the pixel point located in the first sliced luminance graph and the position of the pixel point located in the second sliced luminance graph among the mutually matched pixel points. As in the above example, the position difference between the pixel point a and the pixel point b that are matched with each other refers to the difference between the position of the pixel point a in the first segmentation luminance graph and the position of the pixel point b in the second segmentation luminance graph.
The matched pixel points respectively correspond to different images formed by imaging light rays entering the lens from different directions in the image sensor. For example, a pixel point a in the first sliced luminance graph and a pixel point B in the second sliced luminance graph are matched with each other, where the pixel point a may correspond to an image formed at the position a in fig. 1, and the pixel point B may correspond to an image formed at the position B in fig. 1.
Because the mutually matched pixel points respectively correspond to different images formed by imaging light rays which enter the lens from different directions in the image sensor, the phase difference of the mutually matched pixel points can be determined according to the position difference of the mutually matched pixel points.
Step 1108, determining a first phase difference and a second phase difference of the pixel points according to the phase difference values of the pixel points matched with each other.
Specifically, the electronic device determines the vertical phase difference of the pixel points according to the phase difference values of the mutually matched pixel points in the upper and lower maps, and determines the horizontal phase difference according to the phase difference values of the mutually matched pixel points in the left and right maps. Thus, when the first phase difference is the horizontal phase difference, the second phase difference is the vertical phase difference; and when the first phase difference is the vertical phase difference, the second phase difference is the horizontal phase difference.
In the phase difference obtaining method in the embodiment of the application, a target luminance map is obtained according to the luminance values of the pixel points included in each pixel point group. After the target luminance map is obtained, it is sliced to obtain a first sliced luminance map and a second sliced luminance map. The phase differences of mutually matched pixel points are then determined according to the position differences of the mutually matched pixel points in the two sliced luminance maps, and a target phase difference map is generated from those phase differences. In this way, the phase difference is obtained using the luminance values of the pixel points included in every pixel point group of the image sensor. Compared with obtaining the phase difference from sparsely arranged phase detection pixel points, the target phase difference map in the embodiment of the application contains relatively rich phase difference information, so the accuracy of the obtained phase difference can be improved.
In one embodiment, determining the first phase difference and the second phase difference of the pixel points according to the phase difference values of the pixel points matched with each other includes: for each target brightness graph, generating an intermediate phase difference graph corresponding to the target brightness graph according to the phase difference of the mutually matched pixel points; and determining a first phase difference and a second phase difference of the pixel points according to the intermediate phase difference graph corresponding to each target brightness graph.
Specifically, for each target luminance graph, the electronic device may obtain an intermediate phase difference graph according to a phase difference between pixel points that are matched with each other in the first and second split luminance graphs corresponding to the target luminance graph. Then, the electronic device may obtain the target phase difference maps according to the intermediate phase difference maps corresponding to each target luminance map. Thus, the accuracy of acquiring the target phase difference diagram is high.
In one embodiment, determining the first phase difference and the second phase difference of the pixel points according to the intermediate phase difference map corresponding to each target luminance map includes:
and (a1) determining pixel points at the same position from each intermediate phase difference image to obtain a plurality of phase difference pixel point sets, wherein the positions of the pixel points included in each phase difference pixel point set in the intermediate phase difference image are the same.
And the positions of the pixels included in each phase difference pixel point set in the middle phase difference image are the same.
As shown in fig. 12, fig. 12 is a schematic diagram of determining pixel points at the same position from each intermediate phase difference map in one embodiment. The electronic device determines the pixel points at the same position from intermediate phase difference map 1, intermediate phase difference map 2, intermediate phase difference map 3 and intermediate phase difference map 4 respectively, obtaining 4 phase difference pixel point sets Y1, Y2, Y3 and Y4:
the set Y1 includes pixel point PD_Gr_1 in map 1, PD_R_1 in map 2, PD_B_1 in map 3 and PD_Gb_1 in map 4;
the set Y2 includes pixel point PD_Gr_2 in map 1, PD_R_2 in map 2, PD_B_2 in map 3 and PD_Gb_2 in map 4;
the set Y3 includes pixel point PD_Gr_3 in map 1, PD_R_3 in map 2, PD_B_3 in map 3 and PD_Gb_3 in map 4;
the set Y4 includes pixel point PD_Gr_4 in map 1, PD_R_4 in map 2, PD_B_4 in map 3 and PD_Gb_4 in map 4.
And (a2) for each phase difference pixel point set, splicing the pixel points in the phase difference pixel point set to obtain a sub-phase difference graph corresponding to the phase difference pixel point set.
The sub-phase difference graph comprises a plurality of pixel points, each pixel point corresponds to one pixel point in the phase difference pixel point set, and the pixel point value of each pixel point is equal to the pixel point value of the corresponding pixel point.
The electronic device then splices the obtained multiple sub-phase difference maps to obtain a target phase difference map.
As shown in fig. 13, fig. 13 is a schematic diagram of a target phase difference diagram in an embodiment, where the target phase difference diagram includes a sub-phase difference diagram 1, a sub-phase difference diagram 2, a sub-phase difference diagram 3, and a sub-phase difference diagram 4, where the sub-phase difference diagram 1 corresponds to a phase difference pixel set Y1, the sub-phase difference diagram 2 corresponds to a phase difference pixel set Y2, the sub-phase difference diagram 3 corresponds to a phase difference pixel set Y3, and the sub-phase difference diagram 4 corresponds to a phase difference pixel set Y4.
And (a3) splicing the obtained multiple sub-phase difference graphs to determine a first phase difference and a second phase difference of the pixel points.
Specifically, the electronic device splices the obtained multiple sub-phase difference maps to obtain a target phase difference map. And the electronic equipment determines the first phase difference and the second phase difference of the pixel point according to the target phase difference diagram.
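A sketch of steps (a1)-(a3) for four intermediate phase difference maps follows. Gathering the four same-position pixel points into a 2 × 2 sub-phase difference map arranged like the pixel point group, and tiling those sub-maps, is an assumed reading of fig. 13, not something the text fixes.

```python
import numpy as np

def target_phase_difference_map(maps):
    """Assemble a target phase difference map from four intermediate maps.

    maps is a sequence of four (H, W) arrays, one per color channel
    (Gr, R, B, Gb). The pixel points at position (i, j) of the four maps
    form one phase difference pixel point set; splicing each set as a
    2 x 2 sub-phase difference map and tiling the sub-maps gives a
    (2H, 2W) target map. The Gr/R over B/Gb arrangement mirrors the
    pixel point group layout and is an assumption here.
    """
    gr, r, b, gb = (np.asarray(m) for m in maps)
    h, w = gr.shape
    target = np.empty((2 * h, 2 * w), dtype=gr.dtype)
    target[0::2, 0::2] = gr   # top-left of each 2x2 sub-map
    target[0::2, 1::2] = r    # top-right
    target[1::2, 0::2] = b    # bottom-left
    target[1::2, 1::2] = gb   # bottom-right
    return target
```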
In one embodiment, the segmenting the target luminance graph to obtain a first segmented luminance graph and a second segmented luminance graph includes:
and (b1) performing segmentation processing on the target brightness map to obtain a plurality of brightness map regions, wherein each brightness map region comprises a row of pixel points in the target brightness map, or each brightness map region comprises a column of pixel points in the target brightness map.
Each brightness map region comprises a row of pixel points in the target brightness map, or each brightness map region comprises a column of pixel points in the target brightness map.
The electronic device can divide the target brightness image column by column along the row direction to obtain a plurality of pixel point columns of the target brightness image, wherein the pixel point columns are the brightness image areas.
The electronic device can divide the target luminance graph line by line along the column direction to obtain a plurality of pixel point lines of the target luminance graph, wherein the pixel point lines are luminance graph areas.
And (b2) acquiring a plurality of first luminance map regions and a plurality of second luminance map regions from the plurality of luminance map regions, wherein the first luminance map regions include the pixel points in the odd rows of the target luminance map, or the pixel points in the odd columns of the target luminance map, and the second luminance map regions include the pixel points in the even rows of the target luminance map, or the pixel points in the even columns of the target luminance map.
Specifically, in the case of column-by-column slicing of the target luminance map, the electronic device may determine odd-numbered columns as the first luminance map region and even-numbered columns as the second luminance map region.
In the case of line-by-line segmentation of the target luminance map, the electronic device may determine odd lines as the first luminance map region and even lines as the second luminance map region.
And (b3) forming a first segmentation luminance map by using the plurality of first luminance map regions and forming a second segmentation luminance map by using the plurality of second luminance map regions.
In this embodiment, fig. 14 is a schematic diagram illustrating the slicing of a target luminance map in the first direction in one embodiment, and fig. 15 is a schematic diagram illustrating the slicing of a target luminance map in the second direction in one embodiment. As shown in fig. 14, assuming that the target luminance map includes 6 rows and 6 columns of pixel points, slicing the target luminance map column by column means slicing it in the first direction. The electronic device may determine the 1st, 3rd, and 5th columns of pixel points of the target luminance map as first luminance map regions, and the 2nd, 4th, and 6th columns as second luminance map regions. The electronic device may then splice the first luminance map regions to obtain a first sliced luminance map T1, which includes the 1st, 3rd, and 5th columns of pixel points of the target luminance map, and splice the second luminance map regions to obtain a second sliced luminance map T2, which includes the 2nd, 4th, and 6th columns of pixel points of the target luminance map.
As shown in fig. 15, assuming that the target luminance map includes 6 rows and 6 columns of pixel points, slicing the target luminance map row by row means slicing it in the second direction. The electronic device may determine the 1st, 3rd, and 5th rows of pixel points of the target luminance map as first luminance map regions, and the 2nd, 4th, and 6th rows as second luminance map regions. The electronic device may then splice the first luminance map regions to obtain a first sliced luminance map T3, which includes the 1st, 3rd, and 5th rows of pixel points of the target luminance map, and splice the second luminance map regions to obtain a second sliced luminance map T4, which includes the 2nd, 4th, and 6th rows of pixel points of the target luminance map.
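The slicing of figs. 14 and 15 is a stride-2 selection of columns or rows; a minimal numpy sketch (function name assumed, 0-based indices, so index 0 is the "1st column" of the description):

```python
import numpy as np

def slice_luminance_map(target, direction="column"):
    """Split a target luminance map into first/second sliced maps.

    direction="column": odd columns (1st, 3rd, ...) form the first map
    and even columns the second, as in fig. 14; direction="row" does the
    same row-wise, as in fig. 15.
    """
    if direction == "column":
        return target[:, 0::2], target[:, 1::2]
    return target[0::2, :], target[1::2, :]

t = np.arange(36).reshape(6, 6)
t1, t2 = slice_luminance_map(t, "column")   # each result is 6 x 3
```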
In one embodiment, fig. 16 is a schematic flow chart illustrating a process of determining phase difference values of pixels matched with each other in one embodiment. As shown in fig. 16, determining the phase difference value of the mutually matched pixels according to the position difference between the mutually matched pixels in the first and second sliced luminance graphs includes:
step 1602, when the luminance graph region includes a row of pixel points in the target luminance graph, a first neighboring pixel point set is determined in each row of pixel points included in the first tangent luminance graph, and the pixel points included in the first neighboring pixel point set correspond to the same pixel point group.
Specifically, when the luminance map region includes one row of pixel points in the target luminance map, that is, when the electronic device slices the target luminance map row by row along the column direction, the two pixel points in the first row of a sub-luminance map lie in the same row of pixel points of the target luminance map. Therefore, after slicing, the two pixel points in the first row of the sub-luminance map are located in the same luminance map region and thus in the same sliced luminance map; similarly, the two pixel points in the second row of the sub-luminance map are located in another luminance map region and thus in the other sliced luminance map. Assuming that the first row of the sub-luminance map lies in an odd pixel point row of the target luminance map, the two pixel points of its first row are located in the first sliced luminance map, and the two pixel points of its second row are located in the second sliced luminance map.
The column wise manner is similar to the row wise manner and will not be described further herein.
Step 1604, for each first neighboring pixel point set, searching the second segmentation luminance graph for a first matching pixel point set corresponding to the first neighboring pixel point set.
Specifically, for each first neighboring pixel point set, the electronic device may obtain a plurality of pixel points around the set in the first sliced luminance map and form a search pixel point matrix from the set and those surrounding pixel points. For example, the search pixel point matrix may include 9 pixel points in 3 rows and 3 columns. The electronic device may then search the second sliced luminance map for a pixel point matrix similar to the search pixel point matrix.
After searching for a pixel matrix similar to the searched pixel matrix in the second sliced luminance graph, the electronic device may extract a first set of matched pixels from the searched pixel matrix.
The pixels in the first matched pixel set and the pixels in the first adjacent pixel set obtained by searching respectively correspond to different images formed in the image sensor by imaging light rays entering the lens from different directions.
Step 1606, according to the position difference between each first neighboring pixel point set and each first matching pixel point set, determining the phase difference between the first neighboring pixel point set and the first matching pixel point set corresponding to each other.
The difference in position of the first set of neighboring pixels from the first set of matched pixels refers to: the difference in the position of the first set of neighboring pixels in the first sliced luminance map and the position of the first set of matching pixels in the second sliced luminance map.
When the obtained first and second sliced luminance maps are the upper and lower maps respectively, the phase difference obtained from them reflects the difference in the imaging positions of objects in the vertical direction.
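A sketch of steps 1602-1606 for the row-wise case follows; the 3 × 3 search matrix, the SAD similarity, and the purely vertical search range are assumptions consistent with the earlier examples.

```python
import numpy as np

def match_phase_difference(first_map, second_map, i, j, max_shift=10):
    """Phase difference for the pixel point pair centred at (i, j).

    A 3 x 3 search matrix around (i, j) in the first sliced map is
    compared, by SAD, against 3 x 3 matrices in the second sliced map
    shifted vertically by -max_shift .. +max_shift; the best shift is
    taken as the phase difference between the matched pixel point sets.
    Assumes (i, j) is at least one pixel away from the map borders.
    """
    ref = first_map[i - 1 : i + 2, j - 1 : j + 2].astype(np.int64)
    best_shift, best_sad = 0, None
    for s in range(-max_shift, max_shift + 1):
        r = i + s
        if r - 1 < 0 or r + 2 > second_map.shape[0]:
            continue
        cand = second_map[r - 1 : r + 2, j - 1 : j + 2].astype(np.int64)
        sad = np.abs(cand - ref).sum()
        if best_sad is None or sad < best_sad:
            best_shift, best_sad = s, sad
    return best_shift
```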
In one embodiment, a phase difference acquisition method includes:
and (c1) acquiring an image coordinate system, wherein the image coordinate system is established by taking the center of the image as an origin.
And (c2) dividing the image into at least two candidate sub-regions according to a first preset angle by taking the center of the image as a vertex according to the image coordinate system.
And (c3) acquiring the definition of the lines in the first direction and the definition of the lines in the second direction in each candidate subarea.
And (c4), when there is a first candidate subregion with the definition of the line in the first direction being greater than the definition of the line in the second direction, configuring the second weight corresponding to the first candidate subregion to be greater than the first weight corresponding to the first candidate subregion.
And (c5), when a second candidate subregion with the definition of the line in the first direction smaller than that of the line in the second direction exists, configuring the first weight corresponding to the second candidate subregion to be larger than the second weight corresponding to the second candidate subregion.
And (c6), when a third candidate sub-region exists, wherein the definition of the line in the first direction is equal to the definition of the line in the second direction, configuring the first weight and the second weight corresponding to the third candidate sub-region to be equal.
And (c7) acquiring a first phase difference and a second phase difference of the pixel points, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle.
And (c8) acquiring the reliability of the first phase difference and the reliability of the second phase difference.
And (c9) obtaining a reliability difference according to the reliability of the first phase difference and the reliability of the second phase difference.
And (c10) when the credibility difference is within the preset range, acquiring the position of the pixel point in the image, wherein the position of the pixel point in the image comprises the target subregion of the pixel point in the image.
And (c11) acquiring a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference in the target subarea according to the target subarea where the pixel points are located.
And (c12) performing corresponding weighting processing on the first phase difference and the second phase difference according to the first weight and the second weight to obtain a target phase difference of the pixel point.
And (c13) when the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference cannot be obtained according to the target sub-region where the pixel point is located, that is, when steps (c11) and (c12) cannot be carried out, executing a step of taking the second phase difference as the target phase difference when the position of the pixel point in the image is within the range corresponding to the first direction, where the range corresponding to the first direction is the image range enclosed by lines that pass through the origin at a second preset angle to the first-direction coordinate axis of the image coordinate system.
And (c14) when the position of the pixel point in the image is within the range corresponding to the second direction, taking the first phase difference as the target phase difference, where the range corresponding to the second direction is the image range enclosed by lines that pass through the origin at a third preset angle to the second-direction coordinate axis of the image coordinate system.
And (c15) determining the maximum value of the reliability of the first phase difference and the reliability of the second phase difference when the reliability difference quantity is not in the preset range.
And (c16) setting the phase difference corresponding to the maximum reliability as the target phase difference.
Although the steps (c1) to (c16) above are numbered sequentially, they are not necessarily performed in the order indicated by the numbering. Unless explicitly stated otherwise herein, the execution of these steps is not strictly limited to that order, and they may be performed in other orders.
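Steps (c1) to (c6) amount to calibrating per-subregion weights from line definition. A minimal sketch of that calibration follows, in which the gradient-based definition measure, the mapping of array axes to directions, and the concrete weight values are all assumptions.

```python
import numpy as np

def calibrate_subregion_weights(region, base=0.5, boost=0.8):
    """Configure (first_weight, second_weight) for one candidate subregion.

    Definition (sharpness) of lines in the first and second directions is
    estimated from mean absolute differences along the two array axes;
    the weight paired with the phase difference of the sharper direction
    is configured larger, following steps (c4)-(c6).
    """
    r = region.astype(np.int64)
    sharp_first = np.abs(np.diff(r, axis=0)).mean()   # assumed axis mapping
    sharp_second = np.abs(np.diff(r, axis=1)).mean()
    if sharp_first > sharp_second:      # step (c4): second weight larger
        return 1.0 - boost, boost
    if sharp_first < sharp_second:      # step (c5): first weight larger
        return boost, 1.0 - boost
    return base, base                   # step (c6): equal weights
```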
According to the phase difference obtaining method in the embodiment of the application, a first phase difference and a second phase difference of a pixel point are obtained, where the first direction corresponding to the first phase difference and the second direction corresponding to the second phase difference form a preset included angle, so that one pixel point corresponds to two phase differences. The position of the pixel point in the image is obtained, and the target phase difference of the pixel point is determined according to that position, the first phase difference and the second phase difference: the first phase difference and the second phase difference are weighted according to the first weight and the second weight to obtain the target phase difference of the pixel point. The two phase differences can thus be combined into the target phase difference, which improves the accuracy of phase difference acquisition, effectively avoids image blurring caused by aberration, and makes imaging clearer.
It should be understood that although the various steps in the flowcharts of fig. 2, 6, 11 and 16 are shown in order as indicated by the arrows, the steps are not necessarily performed in order as indicated by the arrows. The steps are not performed in the exact order shown and described, and may be performed in other orders, unless explicitly stated otherwise. Moreover, at least some of the steps in fig. 2, 6, 11, and 16 may include multiple sub-steps or multiple stages that are not necessarily performed at the same time, but may be performed at different times, and the order of performing the sub-steps or stages is not necessarily sequential, but may be performed alternately or alternatingly with other steps or at least some of the sub-steps or stages of other steps.
Fig. 17 is a block diagram showing a configuration of a phase difference acquisition apparatus according to an embodiment. As shown in fig. 17, a phase difference acquiring apparatus includes a first acquiring module 1702, a second acquiring module 1704, and a target phase difference determining module 1706, wherein:
a first obtaining module 1702, configured to obtain a first phase difference and a second phase difference of a pixel point, where a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
a second obtaining module 1704, configured to obtain a position of the pixel point in the image;
the target phase difference determining module 1706 is configured to determine the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point.
The phase difference obtaining device in this embodiment obtains a first phase difference and a second phase difference of a pixel point, where the first direction corresponding to the first phase difference and the second direction corresponding to the second phase difference form a preset included angle, so that one pixel point corresponds to two phase differences. The position of the pixel point in the image is obtained, and the target phase difference of the pixel point is determined according to that position, the first phase difference and the second phase difference. The target phase difference can thus be determined according to the position of the pixel point, which improves the accuracy of phase difference acquisition, effectively avoids image blurring caused by aberration, and makes imaging clearer.
In an embodiment, the target phase difference determining module 1706 is configured to obtain a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference according to a position of the pixel point in the image; and performing corresponding weighting processing on the first phase difference and the second phase difference according to the first weight and the second weight to obtain a target phase difference of the pixel point.
The phase difference obtaining device in this embodiment obtains a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference according to the position of the pixel point in the image; and performing corresponding weighting processing on the first phase difference and the second phase difference according to the first weight and the second weight to obtain a target phase difference of the pixel point, combining the first phase difference and the second phase difference to obtain the target phase difference, improving the accuracy of phase difference acquisition, avoiding image unsharp caused by aberration, and improving the definition of a shot image.
In an embodiment, the target phase difference determining module 1706 is configured to obtain, according to a target sub-region where the pixel point is located, a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference in the target sub-region, where the target sub-region is determined from at least two candidate sub-regions, and the candidate sub-regions are obtained by dividing the image according to the image coordinate system and according to a first preset angle with the center of the image as a vertex.
According to the phase difference obtaining device in the embodiment of the application, the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference in the target subarea are obtained according to the target subarea where the pixel points are located, the first phase difference and the second phase difference can be combined together to obtain the target phase difference, the phase difference obtaining accuracy is improved, the image blurring caused by the image difference is avoided, and the definition of the shot image is improved.
In one embodiment, the phase difference obtaining apparatus further includes a weight obtaining module. The weight acquisition module is used for acquiring an image coordinate system, wherein the image coordinate system is established by taking the center of an image as an origin; dividing the image into at least two candidate sub-regions according to an image coordinate system and a first preset angle by taking the center of the image as a vertex; acquiring the definition of lines in a first direction and the definition of lines in a second direction in each candidate subregion; when a first candidate sub-region exists, wherein the definition of the line in the first direction is greater than that of the line in the second direction, a second weight corresponding to the first candidate sub-region is configured to be greater than a first weight corresponding to the first candidate sub-region; when a second candidate subregion with the definition of the line in the first direction smaller than or equal to the definition of the line in the second direction exists, the first weight corresponding to the second candidate subregion is configured to be larger than the second weight corresponding to the second candidate subregion.
The phase difference acquisition device in the embodiment of the application acquires the definition of the lines in the first direction and the definition of the lines in the second direction in each candidate subregion. When there is a first candidate subregion in which the definition of the lines in the first direction is greater than that in the second direction, the second weight corresponding to that subregion is configured to be greater than the first weight; when there is a second candidate subregion in which the definition of the lines in the first direction is less than or equal to that in the second direction, the first weight corresponding to that subregion is configured to be greater than the second weight. The weights of different subregions can thus be calibrated according to the definition of lines in different directions, with the larger weight given to the phase difference in the direction of higher definition, which improves the accuracy of phase difference acquisition and thus the definition of the shot image.
In an embodiment, the target phase difference determining module 1706 is configured to, when the position of the pixel point in the image is within a range corresponding to a first direction, use the second phase difference as the target phase difference, where the range corresponding to the first direction is an image range that is surrounded by a line passing through an origin and forming a second preset angle with a first direction coordinate axis in an image coordinate system, where the image coordinate system is established with a center of the image as the origin; and when the position of the pixel point in the image is in a range corresponding to a second direction, taking the first phase difference as a target phase difference, wherein the range corresponding to the second direction is an image range which is defined by a line passing through the origin and forming a third preset angle with a coordinate axis of the second direction in the image coordinate system.
In the phase difference obtaining device in this embodiment, when the position of the pixel point in the image is within the range corresponding to the first direction, the second phase difference is used as the target phase difference, and when the position of the pixel point in the image is within the range corresponding to the second direction, the first phase difference is used as the target phase difference, so that the phase difference in the corresponding direction can be directly selected as the target phase difference according to the position of the pixel point in the image, and the phase difference obtaining accuracy is improved.
In an embodiment, the target phase difference determining module 1706 is configured to, when a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference are not obtained according to an area where the pixel point is located, and when a position of the pixel point in the image is within a range corresponding to the first direction, take the second phase difference as the target phase difference.
In the phase difference obtaining device in this embodiment, when the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference are not obtained according to the area where the pixel is located, when the location of the pixel in the image is within the range corresponding to the first direction, the second phase difference is used as the target phase difference, that is, if the image capturing device of the electronic device is not calibrated in advance, the location of the pixel in the image can be directly determined, and the phase difference in the corresponding direction is used as the target phase difference, so that the phase difference obtaining accuracy is improved.
In one embodiment, the second obtaining module 1704 is configured to obtain a confidence level of the first phase difference and a confidence level of the second phase difference; obtaining a reliability difference according to the reliability of the first phase difference and the reliability of the second phase difference; and when the credibility difference is within a preset range, acquiring the position of the pixel point in the image.
The phase difference obtaining device in this embodiment obtains the reliability of the first phase difference and the reliability of the second phase difference, and obtains the reliability difference amount according to the reliability of the first phase difference and the reliability of the second phase difference; when the credibility difference is within the preset range, the step of obtaining the position of the pixel point in the image is executed, when the credibility difference between the first phase difference and the second phase difference is small, the position of the pixel point can be obtained, the proper phase difference can be obtained according to the position of the pixel point in the image, and the phase difference obtaining precision is improved.
In one embodiment, the target phase difference determining module 1706 is configured to determine a maximum value of the confidence level of the first phase difference and the confidence level of the second phase difference when the confidence level gap amount is not within the preset range; and taking the phase difference corresponding to the maximum reliability as the target phase difference.
In the phase difference acquisition device in this embodiment, when the difference in the confidence levels is not within the preset range, the maximum confidence level value of the confidence levels of the first phase difference and the second phase difference is determined, and the phase difference corresponding to the maximum confidence level value is used as the target phase difference, that is, when the difference is large, the phase difference corresponding to the maximum confidence level value is used as the target phase difference, so that the target phase difference can be obtained quickly, and the phase difference acquisition efficiency is improved.
In one embodiment, the second obtaining module 1704 is configured to obtain image coordinates of the pixel point in an image coordinate system; and calculating according to the image coordinates to obtain the angle of the pixel point.
The phase difference obtaining device in this embodiment obtains image coordinates of the pixel points in an image coordinate system; the angle of the pixel point is obtained through calculation according to the image coordinate, the angle of the pixel point can be calculated in real time, the target phase difference is determined according to the angle of the pixel point, the first phase difference and the second phase difference, and the phase difference obtaining precision is improved.
In one embodiment, the second obtaining module 1704 is configured to: acquiring pixel point coordinates of pixel points in a pixel coordinate system; the pixel coordinates are converted into image coordinates in an image coordinate system.
The phase difference obtaining device in this embodiment obtains pixel coordinates of a pixel in a pixel coordinate system; the pixel point coordinates are converted into image coordinates in an image coordinate system, so that more angle ranges can be obtained, and the phase difference obtaining precision is improved.
In an embodiment, the first obtaining module 1702 is configured to obtain a target luminance map according to luminance values of pixels included in each pixel group; carrying out segmentation processing on the target brightness graph to obtain a first segmentation brightness graph and a second segmentation brightness graph; determining the phase difference value of the matched pixel points according to the position difference of the matched pixel points in the first segmentation brightness graph and the second segmentation brightness graph; and determining a first phase difference and a second phase difference of the pixel points according to the phase difference values of the pixel points which are matched with each other.
The phase difference obtaining device in the embodiment of the application obtains the target luminance map according to the luminance values of the pixel points included in each pixel point group, slices the target luminance map to obtain the first and second sliced luminance maps, determines the phase differences of mutually matched pixel points according to their position differences in the two sliced luminance maps, and generates the target phase difference map from those phase differences. The phase difference is thus obtained using the luminance values of the pixel points in every pixel point group of the image sensor; compared with obtaining the phase difference from sparsely arranged phase detection pixel points, the target phase difference map contains relatively rich phase difference information, so the accuracy of the obtained phase difference can be improved.
In an embodiment, the first obtaining module 1702 is configured to, for each target luminance graph, generate an intermediate phase difference graph corresponding to the target luminance graph according to phase differences of the pixel points that are matched with each other; and determining a first phase difference and a second phase difference of the pixel points according to the intermediate phase difference graph corresponding to each target brightness graph.
In an embodiment, the first obtaining module 1702 is configured to determine, from each intermediate phase difference map, pixel points at the same position, and obtain a plurality of phase difference pixel point sets, where positions of the pixel points included in each phase difference pixel point set in the intermediate phase difference map are the same; for each phase difference pixel point set, splicing the pixel points in the phase difference pixel point set to obtain a sub-phase difference graph corresponding to the phase difference pixel point set; and splicing the obtained multiple sub-phase difference graphs to determine a first phase difference and a second phase difference of the pixel points.
In an embodiment, the first obtaining module 1702 is configured to perform segmentation processing on the target luminance graph to obtain a plurality of luminance graph regions, where each luminance graph region includes a row of pixel points in the target luminance graph, or each luminance graph region includes a column of pixel points in the target luminance graph; acquiring a plurality of first brightness map regions and a plurality of second brightness map regions from the plurality of brightness map regions, wherein the first brightness map regions comprise pixel points in even rows in a target brightness map, or the first brightness map regions comprise pixel points in even columns in the target brightness map, and the second brightness map regions comprise pixel points in odd rows in the target brightness map, or the second brightness map regions comprise pixel points in odd columns in the target brightness map; the first segmentation luminance map is composed of a plurality of first luminance map regions, and the second segmentation luminance map is composed of a plurality of second luminance map regions.
In one embodiment, the first obtaining module 1702 is configured to determine a first neighboring pixel point set in each row of pixel points included in the first cut-and-divide luminance graph when the luminance graph region includes a row of pixel points in the target luminance graph, where the pixel points included in the first neighboring pixel point set correspond to a same pixel point group; for each first adjacent pixel point set, searching a first matching pixel point set corresponding to the first adjacent pixel point set in the second segmentation brightness graph; and determining the phase difference between the first adjacent pixel point set and the first matching pixel point set corresponding to each other according to the position difference between each first adjacent pixel point set and each first matching pixel point set.
The division of the modules in the phase difference obtaining apparatus above is merely for illustration; in other embodiments, the phase difference obtaining apparatus may be divided into different modules as needed to complete all or part of its functions.
For the specific limitations of the phase difference acquisition apparatus, reference may be made to the limitations of the phase difference acquisition method above, which are not repeated here. Each module in the phase difference acquisition apparatus may be implemented wholly or partially by software, by hardware, or by a combination of the two. The modules may be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to the modules.
Fig. 18 is a schematic diagram of the internal structure of an electronic device in one embodiment. As shown in Fig. 18, the electronic device includes a processor and a memory connected by a system bus. The processor provides computing and control capabilities and supports the operation of the entire electronic device. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program; the computer program can be executed by the processor to implement the phase difference acquisition method provided in the embodiments of the present application. The internal memory provides a cached execution environment for the operating system and the computer program in the non-volatile storage medium. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like.
Each module in the phase difference acquisition apparatus provided in the embodiments of the present application may be implemented in the form of a computer program. The computer program may run on a terminal or a server, and the program modules constituted by the computer program may be stored in the memory of the terminal or server. When the computer program is executed by a processor, the steps of the method described in the embodiments of the present application are performed.
The embodiments of the present application also provide a computer-readable storage medium: one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the phase difference acquisition method.
The embodiments of the present application also provide a computer program product containing instructions which, when run on a computer, cause the computer to perform the phase difference acquisition method.
Any reference to memory, storage, a database, or another medium used herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as an external cache. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above embodiments express only several implementations of the present application, and their description is specific and detailed, but they should not therefore be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (14)

1. A phase difference acquisition method, comprising:
acquiring a first phase difference and a second phase difference of a pixel point in an image, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
acquiring the position of the pixel point in the image;
and determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point.
2. The method of claim 1, wherein determining the target phase difference of the pixel point according to the position of the pixel point in the image comprises:
acquiring a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference according to the position of the pixel point in the image;
and carrying out corresponding weighting processing on the first phase difference and the second phase difference according to the first weight and the second weight to obtain the target phase difference of the pixel point.
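By way of illustration (outside the claim language), a minimal sketch of this weighting; normalising the two weights so that they sum to one is an assumption:

```python
# Hedged sketch of the weighting in claim 2: combine the first and second
# phase differences with their weights. The normalisation is an assumption.
def target_phase_difference(pd1: float, pd2: float, w1: float, w2: float) -> float:
    total = w1 + w2
    return (w1 * pd1 + w2 * pd2) / total if total else 0.0
```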
3. The method according to claim 2, wherein the obtaining a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference according to the position of the pixel point in the image comprises:
and acquiring, according to a target sub-region where the pixel point is located, a first weight corresponding to the first phase difference and a second weight corresponding to the second phase difference in the target sub-region, wherein the target sub-region is determined from at least two candidate sub-regions, and the candidate sub-regions are obtained by dividing the image according to an image coordinate system and a first preset angle, with the center of the image as a vertex.
4. The method of claim 2, wherein the first weight and the second weight are configured in a manner that comprises:
acquiring an image coordinate system, wherein the image coordinate system is established by taking the center of the image as an origin;
dividing the image into at least two candidate sub-regions according to the image coordinate system and a first preset angle by taking the center of the image as a vertex;
acquiring the sharpness of lines in the first direction and the sharpness of lines in the second direction in each candidate sub-region;
when there is a first candidate sub-region in which the sharpness of lines in the first direction is greater than the sharpness of lines in the second direction, configuring the second weight corresponding to the first candidate sub-region to be greater than the first weight corresponding to the first candidate sub-region; and
when there is a second candidate sub-region in which the sharpness of lines in the first direction is less than the sharpness of lines in the second direction, configuring the first weight corresponding to the second candidate sub-region to be greater than the second weight corresponding to the second candidate sub-region.
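By way of illustration, one gradient-based reading of this configuration rule; estimating line sharpness from gradient energy, identifying the first direction with the horizontal axis, and the fixed 0.7/0.3 split are all assumptions:

```python
# Hedged sketch of claim 4's weight configuration: sharper horizontal lines
# favour the second weight, sharper vertical lines favour the first weight.
import numpy as np

def configure_weights(region: np.ndarray):
    gy, gx = np.gradient(region.astype(float))
    horiz_sharpness = float(np.abs(gy).mean())  # horizontal lines -> vertical gradients
    vert_sharpness = float(np.abs(gx).mean())   # vertical lines -> horizontal gradients
    if horiz_sharpness > vert_sharpness:
        return 0.3, 0.7    # (first weight, second weight): favour the second phase difference
    if horiz_sharpness < vert_sharpness:
        return 0.7, 0.3    # favour the first phase difference
    return 0.5, 0.5
```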
5. The method of claim 1, wherein determining the target phase difference of the pixel point according to the position of the pixel point in the image comprises:
when the position of the pixel point in the image is within a range corresponding to the first direction, taking the second phase difference as the target phase difference, wherein the range corresponding to the first direction is the image range enclosed by lines that pass through the origin and form a second preset angle with the coordinate axis of the first direction in an image coordinate system, the image coordinate system being established with the center of the image as the origin; and
when the position of the pixel point in the image is within a range corresponding to the second direction, taking the first phase difference as the target phase difference, wherein the range corresponding to the second direction is the image range enclosed by lines that pass through the origin and form a third preset angle with the coordinate axis of the second direction in the image coordinate system.
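By way of illustration, a sketch of this range test using the angle of the pixel point's position vector; the 45-degree values standing in for the second and third preset angles, the identification of the first direction with the x-axis, and the fallback for positions outside both ranges are assumptions:

```python
# Hedged sketch of claim 5: pick a phase difference by the direction of the
# pixel point relative to the image centre (origin of the image coordinates).
import math

def select_phase_difference(x: float, y: float, pd1: float, pd2: float,
                            angle1: float = 45.0, angle2: float = 45.0) -> float:
    theta = math.degrees(math.atan2(y, x)) % 180.0   # direction, folded to [0, 180)
    if theta <= angle1 or theta >= 180.0 - angle1:   # range around the first-direction axis
        return pd2
    if abs(theta - 90.0) <= angle2:                  # range around the second-direction axis
        return pd1
    return 0.5 * (pd1 + pd2)                         # fallback outside both ranges (assumption)
```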
6. The method of claim 5, further comprising:
when the first weight corresponding to the first phase difference and the second weight corresponding to the second phase difference cannot be obtained according to the position of the pixel point in the image, performing the step of taking the second phase difference as the target phase difference when the position of the pixel point in the image is within the range corresponding to the first direction.
7. The method of claim 1, further comprising:
obtaining the reliability of the first phase difference and the reliability of the second phase difference;
obtaining a reliability difference according to the reliability of the first phase difference and the reliability of the second phase difference;
and when the reliability difference is within a preset range, performing the step of acquiring the position of the pixel point in the image.
8. The method of claim 7, further comprising:
when the reliability difference is not within the preset range, determining the maximum of the reliability of the first phase difference and the reliability of the second phase difference; and
taking the phase difference corresponding to the maximum reliability as the target phase difference.
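By way of illustration, the gate of claims 7 and 8 can be sketched as follows, reusing select_phase_difference from the sketch after claim 5; the threshold value is an assumption:

```python
# Hedged sketch of claims 7 and 8: a large reliability difference keeps the
# more reliable phase difference; otherwise the pixel position decides.
def reliability_gate(pd1: float, pd2: float, rel1: float, rel2: float,
                     x: float, y: float, threshold: float = 0.2) -> float:
    if abs(rel1 - rel2) > threshold:                 # outside the preset range
        return pd1 if rel1 >= rel2 else pd2          # maximum-reliability phase difference
    return select_phase_difference(x, y, pd1, pd2)   # fall through to the position test
```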
9. The method according to any one of claims 1 to 8, wherein the position of the pixel point in the image comprises an angle of the pixel point;
the obtaining of the position of the pixel point in the image includes:
acquiring the image coordinates of the pixel points in the image coordinate system;
and calculating the angle of the pixel point according to the image coordinate.
10. The method of claim 9, wherein the obtaining the image coordinates of the pixel points in the image coordinate system comprises:
acquiring pixel point coordinates of the pixel points in a pixel coordinate system;
and converting the pixel point coordinates into image coordinates in the image coordinate system.
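By way of illustration, the conversion of claims 9 and 10 can be sketched as below; the top-left origin of the pixel coordinate system (with y growing downwards) is an assumption:

```python
# Hedged sketch of claims 9 and 10: convert pixel coordinates to image
# coordinates centred on the image, then compute the pixel point's angle.
import math

def pixel_to_image_coords(u: float, v: float, width: int, height: int):
    x = u - width / 2.0
    y = height / 2.0 - v      # flip so y grows upwards in image coordinates
    return x, y

def pixel_angle(u: float, v: float, width: int, height: int) -> float:
    x, y = pixel_to_image_coords(u, v, width, height)
    return math.degrees(math.atan2(y, x))
```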
11. The method of claim 1, applied to an electronic device comprising an image sensor, wherein the image sensor comprises a plurality of pixel point groups arranged in an array, each pixel point group comprises M × N pixel points arranged in an array, and each pixel point corresponds to one photosensitive unit, where M and N are both natural numbers greater than or equal to 2;
the acquiring of the first phase difference and the second phase difference of the pixel points includes:
acquiring a target luminance map according to the luminance values of the pixel points included in each pixel point group;
performing segmentation processing on the target luminance map to obtain a first segmentation luminance map and a second segmentation luminance map;
determining the phase difference values of the mutually matched pixel points according to the position difference of the mutually matched pixel points in the first segmentation luminance map and the second segmentation luminance map; and
determining the first phase difference and the second phase difference of the pixel points according to the phase difference values of the mutually matched pixel points.
12. A phase difference acquisition apparatus, comprising:
the first obtaining module is used for obtaining a first phase difference and a second phase difference of a pixel point in an image, wherein a first direction corresponding to the first phase difference and a second direction corresponding to the second phase difference form a preset included angle;
the second acquisition module is used for acquiring the positions of the pixel points in the image;
and the target phase difference determining module is used for determining the target phase difference of the pixel point according to the position of the pixel point in the image, the first phase difference and the second phase difference of the pixel point.
13. An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of the phase difference acquisition method according to any one of claims 1 to 11.
14. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 11.
CN201911101457.0A 2019-11-12 2019-11-12 Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium Active CN112866550B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911101457.0A CN112866550B (en) 2019-11-12 2019-11-12 Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium
PCT/CN2020/122790 WO2021093537A1 (en) 2019-11-12 2020-10-22 Phase difference acquisition method and device, electronic apparatus, and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911101457.0A CN112866550B (en) 2019-11-12 2019-11-12 Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN112866550A (en) 2021-05-28
CN112866550B CN112866550B (en) 2022-07-15

Family

ID=75911951

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911101457.0A Active CN112866550B (en) 2019-11-12 2019-11-12 Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN112866550B (en)
WO (1) WO2021093537A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013141108A (en) * 2011-12-29 2013-07-18 Nikon Corp Interchangeable lens and camera body
CN108337424A (en) * 2017-01-17 2018-07-27 中兴通讯股份有限公司 A kind of phase focusing method and its device
CN109905600A (en) * 2019-03-21 2019-06-18 上海创功通讯技术有限公司 Imaging method, imaging device and computer readable storage medium
CN110246853A (en) * 2018-03-09 2019-09-17 三星电子株式会社 Imaging sensor and image pick-up device including phase-detection pixel

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105074528B (en) * 2013-03-29 2017-07-18 富士胶片株式会社 Camera device and focusing control method
KR102125561B1 (en) * 2013-12-03 2020-06-22 삼성전자주식회사 Photographing apparatus and method
US9445018B2 (en) * 2014-05-01 2016-09-13 Semiconductor Components Industries, Llc Imaging systems with phase detection pixels
US9749556B2 (en) * 2015-03-24 2017-08-29 Semiconductor Components Industries, Llc Imaging systems having image sensor pixel arrays with phase detection capabilities
US10440301B2 (en) * 2017-09-08 2019-10-08 Apple Inc. Image capture device, pixel, and method providing improved phase detection auto-focus performance


Also Published As

Publication number Publication date
WO2021093537A1 (en) 2021-05-20
CN112866550B (en) 2022-07-15

Similar Documents

Publication Publication Date Title
US10958892B2 (en) System and methods for calibration of an array camera
WO2020010945A1 (en) Image processing method and apparatus, electronic device and computer-readable storage medium
CN112866549B (en) Image processing method and device, electronic equipment and computer readable storage medium
US9473700B2 (en) Camera systems and methods for gigapixel computational imaging
CN109685853B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
WO2023016144A1 (en) Focusing control method and apparatus, imaging device, electronic device, and computer readable storage medium
CN109559353B (en) Camera module calibration method and device, electronic equipment and computer readable storage medium
US8929685B2 (en) Device having image reconstructing function, method, and recording medium
CN112019734B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN109559352B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN108156383B (en) High-dynamic billion pixel video acquisition method and device based on camera array
CN112866553B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866655B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112866547B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866550B (en) Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium
CN112866554B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866552B (en) Focusing method and device, electronic equipment and computer readable storage medium
WO2021093528A1 (en) Focusing method and apparatus, and electronic device and computer readable storage medium
JP2005216191A (en) Stereo image processing apparatus and method
CN112866674B (en) Depth map acquisition method and device, electronic equipment and computer readable storage medium
CN110728714B (en) Image processing method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant