CN113487492A - Parallax value correction method, parallax value correction device, electronic apparatus, and storage medium - Google Patents


Info

Publication number
CN113487492A
CN113487492A (application CN202110606123.XA)
Authority
CN
China
Prior art keywords
pixel point
value
parallax
target pixel
angle image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110606123.XA
Other languages
Chinese (zh)
Inventor
唐金伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Megvii Technology Co Ltd
Original Assignee
Beijing Megvii Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Megvii Technology Co Ltd filed Critical Beijing Megvii Technology Co Ltd
Priority to CN202110606123.XA priority Critical patent/CN113487492A/en
Publication of CN113487492A publication Critical patent/CN113487492A/en
Pending legal-status Critical Current

Classifications

    • G06T5/80
    • G06T7/50 Depth or shape recovery (G — Physics; G06 — Computing, calculating or counting; G06T — Image data processing or generation, in general; G06T7/00 — Image analysis)
    • G06T7/70 Determining position or orientation of objects or cameras (G06T7/00 — Image analysis)

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

Embodiments of the present application provide a parallax value correction method, a parallax value correction device, an electronic device, and a storage medium. The method comprises the following steps: performing stereo matching on a first field angle image and a second field angle image of a target scene to obtain a cost cube between the first field angle image and the second field angle image, wherein the cost cube records the cost values of the pixel points in the first field angle image under a plurality of parallax values; for a repeated texture area in the first field angle image, acquiring a phase difference corresponding to a target pixel point in the repeated texture area; and obtaining a corrected parallax value of the target pixel point according to the phase difference corresponding to the target pixel point and the minimum and second-minimum cost values among the cost values corresponding to the target pixel point.

Description

Parallax value correction method, parallax value correction device, electronic apparatus, and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a parallax value correction method, apparatus, electronic device, and storage medium.
Background
In the existing large-aperture blurring (bokeh) mode, the parallax value of each pixel point is calculated through stereo matching and then converted into depth (distance) information; finally, the blurring radius of each pixel point is determined according to the focus distance and the distances of the other pixel points in the full image, and each pixel point is blurred according to its blurring radius.
When calculating the parallax value of a pixel point, a winner-take-all algorithm is usually adopted. The winner-take-all algorithm directly takes the parallax value corresponding to the minimum cost value as the parallax value of the pixel point; in repeated texture areas, however, this choice is wrong with high probability, so the accuracy of the calculated parallax values is low.
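For concreteness, the winner-take-all rule described above can be sketched as follows (an illustrative Python/NumPy snippet; the function name and array layout are our assumptions, not part of the patent):

```python
import numpy as np

def winner_take_all(cost_cube):
    """Return, for every pixel, the disparity index with the lowest cost.

    cost_cube: array of shape (H, W, D) holding the matching cost of
    each pixel at each of D candidate disparities.
    """
    return np.argmin(cost_cube, axis=2)

# Toy 1x1 image with four disparity candidates: the minimum cost (0.1)
# sits at disparity index 2, so winner-take-all picks 2 for that pixel.
cost_cube = np.array([[[0.9, 0.5, 0.1, 0.4]]])
print(winner_take_all(cost_cube))  # [[2]]
```

In a repeated texture area several disparities yield nearly identical low costs, which is exactly why this hard minimum is unreliable there.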
Disclosure of Invention
An object of the embodiments of the present application is to provide a disparity value correction method, device, electronic device, and storage medium, so as to improve the accuracy of disparity value calculation.
In a first aspect, a disparity value correction method is provided, including:
performing stereo matching on a first field angle image and a second field angle image of a target scene to obtain a cost cube between the first field angle image and the second field angle image, wherein the cost cube records the cost values of pixel points in the first field angle image under a plurality of parallax values;
for the repeated texture area in the first field angle image, acquiring a phase difference corresponding to a target pixel point in the repeated texture area;
and obtaining a corrected parallax value of the target pixel point according to the phase difference corresponding to the target pixel point and the minimum cost value and the secondary minimum cost value in each cost value corresponding to the target pixel point.
In this parallax value correction method, the parallax value is related to distance (the distance between an object point and the imaging device), while the phase difference is related to front/rear position (whether the object point lies in front of or behind the focused object point). Obtaining the parallax value of a target pixel point in a repeated texture region from the phase difference together with the minimum and second-minimum cost values among the cost values corresponding to the target pixel point is therefore more accurate than directly taking the parallax value corresponding to the minimum cost value as the parallax value of the pixel point.
In some embodiments of the present application, the phase difference is obtained after light of an object point corresponding to the target pixel point passes through a dual-core sensor, where a plurality of photoelectric groups are disposed in the dual-core sensor, and each photoelectric group includes two photoelectric imaging units.
In some embodiments of the present application, the obtaining a corrected parallax value of the target pixel point according to the phase difference corresponding to the target pixel point, and the minimum cost value and the second minimum cost value of the cost values corresponding to the target pixel point includes: determining the relative positions of the object point corresponding to the target pixel point and the focusing object point corresponding to the target scene according to the phase difference corresponding to the target pixel point; acquiring a reference parallax value corresponding to the focusing object point; acquiring a first parallax value corresponding to the minimum cost value and a second parallax value corresponding to the secondary minimum cost value; and obtaining a corrected parallax value of the target pixel point from the first parallax value and the second parallax value according to the relative position and the reference parallax value.
In some embodiments of the present application, the first and second field angle images are acquired by an imaging device; the determining the relative position of the object point corresponding to the target pixel point and the focusing object point corresponding to the target scene according to the phase difference corresponding to the target pixel point includes: if the phase difference corresponding to the target pixel point is smaller than 0, the object point corresponding to the target pixel point is located in the rear direction of the focusing object point, and the rear direction is a direction far away from the imaging device; if the phase difference corresponding to the target pixel point is larger than 0, the object point corresponding to the target pixel point is located in the front direction of the focusing object point, and the front direction is the direction close to the imaging device.
In some embodiments of the present application, the obtaining a corrected parallax value of the target pixel point from the first parallax value and the second parallax value according to the relative position and the reference parallax value includes: if the relative position is that the object point corresponding to the target pixel point is located in the front direction of the focused object point, taking the one of the first parallax value and the second parallax value that is larger than the reference parallax value as the corrected parallax value; and if the relative position is that the object point corresponding to the target pixel point is located in the rear direction of the focused object point, taking the one of the first parallax value and the second parallax value that is smaller than the reference parallax value as the corrected parallax value.
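The selection rule above can be sketched as follows (illustrative Python; the fallback used when neither candidate satisfies the positional constraint is our assumption, as the text does not specify one):

```python
def corrected_disparity(phase_diff, ref_disparity, d_first, d_second):
    """Pick the corrected parallax value from the disparities of the
    minimum cost (d_first) and second-minimum cost (d_second).

    phase_diff > 0: the object point is in front of the focused object
    point (closer to the imaging device), so its disparity must be
    larger than the reference disparity of the focused object point.
    phase_diff < 0: the object point is behind the focused object
    point, so its disparity must be smaller than the reference.
    """
    if phase_diff > 0:
        candidates = [d for d in (d_first, d_second) if d > ref_disparity]
    else:
        candidates = [d for d in (d_first, d_second) if d < ref_disparity]
    # Fallback policy (our assumption): keep the minimum-cost disparity
    # when neither candidate satisfies the positional constraint.
    return candidates[0] if candidates else d_first

# Object point in front of focus -> take the larger-than-reference value:
print(corrected_disparity(+1.0, ref_disparity=10, d_first=8, d_second=15))  # 15
# Object point behind focus -> take the smaller-than-reference value:
print(corrected_disparity(-1.0, ref_disparity=10, d_first=8, d_second=15))  # 8
```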
In some embodiments of the present application, before the obtaining, for a repeated texture region in the first field angle image, a phase difference corresponding to a target pixel point in the repeated texture region, the method further includes: according to the cost cube, obtaining each cost value corresponding to each pixel point in the first field angle image; and determining whether the pixel point is located in the repeated texture region according to each cost value corresponding to each pixel point in the first view field angle image so as to obtain the repeated texture region.
In some embodiments of the present application, the determining whether the pixel point is located in the repeated texture region according to the cost value corresponding to each pixel point in the first view angle image includes: calculating a cost value variance for each pixel point in the first view angle image according to each cost value corresponding to the pixel point; and determining whether the pixel point is positioned in the repeated texture region according to the cost value variance corresponding to the pixel point.
In some embodiments of the present application, the determining whether the pixel point is located in the repeated texture region according to the variance of the cost value corresponding to the pixel point includes: if the cost value variance is larger than or equal to a preset variance, acquiring the minimum cost value and the secondary minimum cost value of the pixel point from each cost value corresponding to the pixel point; obtaining the minimum cost value of the pixel point and the parallax value corresponding to the second minimum cost value; calculating a parallax difference value according to the minimum cost value and the parallax value corresponding to the second minimum cost value of the pixel point; and if the absolute value of the parallax difference value is greater than a preset value, the pixel point is in the repeated texture area.
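The two-stage test above (cost-value variance first, then the disparity gap between the two lowest costs) can be sketched per pixel as follows (illustrative Python/NumPy; the threshold values are hypothetical parameters):

```python
import numpy as np

def in_repeated_texture(costs, disparities, var_thresh=0.05, disp_thresh=1):
    """Decide whether a single pixel lies in a repeated texture region.

    costs: 1-D array of the pixel's cost values, one per candidate
    disparity; disparities: the corresponding parallax values.
    """
    if np.var(costs) < var_thresh:      # costs too uniform: not flagged
        return False
    order = np.argsort(costs)           # indices in ascending cost order
    d_min = disparities[order[0]]       # disparity of the minimum cost
    d_second = disparities[order[1]]    # disparity of the second-minimum cost
    # Two well-separated near-minimal costs suggest a repeated pattern.
    return abs(d_min - d_second) > disp_thresh

# Costs dip at disparities 1 and 3, far apart -> flagged as repeated texture:
print(in_repeated_texture(np.array([0.9, 0.1, 0.8, 0.15]),
                          np.array([0, 1, 2, 3])))  # True
```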
In a second aspect, there is provided a parallax value correction apparatus including:
the matching module is used for performing stereo matching on a first field angle image and a second field angle image of a target scene to obtain a cost cube between the first field angle image and the second field angle image, wherein the cost cube records the cost values of pixel points in the first field angle image under a plurality of parallax values;
the phase module is used for acquiring a phase difference corresponding to a target pixel point in a repeated texture area in the first field angle image;
and the correction module is used for obtaining a corrected parallax value of the target pixel point according to the phase difference corresponding to the target pixel point and the minimum cost value and the secondary minimum cost value in each cost value corresponding to the target pixel point.
In some embodiments of the present application, the phase difference is obtained after light of an object point corresponding to the target pixel point passes through a dual-core sensor, where a plurality of photoelectric groups are disposed in the dual-core sensor, and each photoelectric group includes two photoelectric imaging units.
In some embodiments of the present application, the correction module is specifically configured to: determine the relative positions of the object point corresponding to the target pixel point and the focusing object point corresponding to the target scene according to the phase difference corresponding to the target pixel point; acquire a reference parallax value corresponding to the focusing object point; acquire a first parallax value corresponding to the minimum cost value and a second parallax value corresponding to the second-minimum cost value; and obtain a corrected parallax value of the target pixel point from the first parallax value and the second parallax value according to the relative position and the reference parallax value.
In some embodiments of the present application, the first and second field angle images are acquired by an imaging device; the correction module is specifically configured to: if the phase difference corresponding to the target pixel point is smaller than 0, the object point corresponding to the target pixel point is located in the rear direction of the focusing object point, and the rear direction is a direction far away from the imaging device; if the phase difference corresponding to the target pixel point is larger than 0, the object point corresponding to the target pixel point is located in the front direction of the focusing object point, and the front direction is the direction close to the imaging device.
In some embodiments of the present application, the correction module is specifically configured to: if the relative position is that the object point corresponding to the target pixel point is located in the front direction of the focused object point, take the one of the first parallax value and the second parallax value that is larger than the reference parallax value as the corrected parallax value; and if the relative position is that the object point corresponding to the target pixel point is located in the rear direction of the focused object point, take the one of the first parallax value and the second parallax value that is smaller than the reference parallax value as the corrected parallax value.
In some embodiments of the present application, the parallax value correction apparatus further includes: a cost value module, configured to obtain, according to the cost cube, each cost value corresponding to each pixel point in the first field angle image; and a duplication determination module, configured to determine whether the pixel point is located in the repeated texture region according to each cost value corresponding to each pixel point in the first field angle image, so as to obtain the repeated texture region.
In some embodiments of the present application, the duplication determination module is specifically configured to: calculating a cost value variance for each pixel point in the first view angle image according to each cost value corresponding to the pixel point; and determining whether the pixel point is positioned in the repeated texture region according to the cost value variance corresponding to the pixel point.
In some embodiments of the present application, the duplication determination module is specifically configured to: if the cost value variance is larger than or equal to a preset variance, acquiring the minimum cost value and the secondary minimum cost value of the pixel point from each cost value corresponding to the pixel point; obtaining the minimum cost value of the pixel point and the parallax value corresponding to the second minimum cost value; calculating a parallax difference value according to the minimum cost value and the parallax value corresponding to the second minimum cost value of the pixel point; and if the absolute value of the parallax difference value is greater than a preset value, the pixel point is in the repeated texture area.
In a third aspect, an electronic device is provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the disparity value correction method according to the first aspect.
In a fourth aspect, there is provided a computer-readable storage medium having stored therein computer program instructions which, when read and executed by a processor, perform the steps of the disparity value correction method according to the first aspect.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting its scope; those skilled in the art can obtain other related drawings from these drawings without inventive effort.
Fig. 1 is a flowchart illustrating an implementation of a method for correcting a disparity value according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of shooting a target scene at different positions according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a cost cube provided by an embodiment of the present application;
FIG. 4 is a diagram illustrating a relationship between a disparity value and an abscissa provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of a repetitive texture region provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of the parallax values and distances provided by an embodiment of the present application;
fig. 7 is a schematic diagram of a photosensitive pixel in a PD pixel pair according to an embodiment of the present application;
FIG. 8 is a graph illustrating pre-focus and post-focus corresponding ray energies provided by an embodiment of the present application;
FIG. 9 is a schematic diagram illustrating different relative positions of an object point and a focused object point according to an embodiment of the disclosure;
FIG. 10 is a schematic diagram of an example of determining a corrected disparity value according to an embodiment of the present application;
fig. 11 is a schematic structural diagram illustrating a parallax correction apparatus according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In recent years, technical research based on artificial intelligence, such as computer vision, deep learning, machine learning, image processing, and image recognition, has been actively developed. Artificial Intelligence (AI) is an emerging scientific technology for studying and developing theories, methods, techniques and application systems for simulating and extending human Intelligence. The artificial intelligence subject is a comprehensive subject and relates to various technical categories such as chips, big data, cloud computing, internet of things, distributed storage, deep learning, machine learning and neural networks. Computer vision is used as an important branch of artificial intelligence, particularly a machine is used for identifying the world, and the computer vision technology generally comprises the technologies of face identification, living body detection, fingerprint identification and anti-counterfeiting verification, biological feature identification, face detection, pedestrian detection, target detection, pedestrian identification, image processing, image identification, image semantic understanding, image retrieval, character identification, video processing, video content identification, behavior identification, three-dimensional reconstruction, virtual reality, augmented reality, synchronous positioning and map construction (SLAM), computational photography, robot navigation and positioning and the like. 
With the research and progress of artificial intelligence technology, the technology is applied to various fields, such as security, city management, traffic management, building management, park management, face passage, face attendance, logistics management, warehouse management, robots, intelligent marketing, computational photography, mobile phone images, cloud services, smart homes, wearable equipment, unmanned driving, automatic driving, smart medical treatment, face payment, face unlocking, fingerprint unlocking, testimony verification, smart screens, smart televisions, cameras, mobile internet, live webcasts, beauty treatment, medical beauty treatment, intelligent temperature measurement and the like.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
In some embodiments of the present application, a disparity value correction method is provided. The execution subject of the method is an apparatus capable of implementing the disparity value correction method, which may include, but is not limited to, a terminal, a server, and an imaging apparatus, each of which is capable of capturing an image. The terminal includes desktop terminals and mobile terminals: desktop terminals include, but are not limited to, desktop computers and vehicle-mounted computers; mobile terminals include, but are not limited to, mobile phones, tablets, notebook computers, and smart watches. The server includes high-performance computers and high-performance computer clusters.
Referring to fig. 1, fig. 1 is a flowchart of a parallax value correction method according to an embodiment of the present application; the method includes the following steps:
step 100, performing stereo matching on a first field angle image and a second field angle image of a target scene to obtain a cost cube between the first field angle image and the second field angle image, wherein the cost cube records the cost values of pixel points in the first field angle image under a plurality of parallax values.
The target scene is a real application scene shot by the imaging device, and in the embodiment of the invention, the application scene shot by the imaging device in the stereo matching process is specifically referred to.
The first field angle image is an image obtained by shooting from a first field angle by an imaging device in the process of stereo matching; the second field angle image is an image captured by the imaging device from the second field angle during stereo matching. The first and second viewing angles are two different viewing angle positions, for example, as shown in fig. 2, P is an object point in the target scene, and the target scene is photographed from positions 1 and 2, respectively, to obtain a first and second viewing angle images.
And stereo matching, namely acquiring a disparity map by using a stereo matching algorithm through the first view angle image and the second view angle image, and further acquiring a depth map. Wherein, a cost cube is obtained in the process of acquiring the disparity map. And the cost cube records the cost values of the pixel points in the first view angle image under a plurality of parallax values.
The cost cube shown in fig. 3 is indexed by (x, y, d), where (x, y) are the coordinates of a pixel point in the first field angle image and d is a parallax value. Each plane M in the cost cube describes the cost values of the pixel points of the first field angle image at one parallax value d1: a value j at position (xi, yi, d1) in plane M means that pixel point (xi, yi) in the first field angle image has cost value j at parallax value d1.
The cost value reflects the difference between the two pixel points matched under a given parallax value: the larger the cost value, the larger the difference between the two pixel points, and the smaller the cost value, the smaller the difference. Illustratively, the cost value is expressed by the distance between pixel points: for pixel point A(x1, y1) and pixel point B(x2, y2), the distance between A and B is (x1 - x2)^2 + (y1 - y2)^2, i.e., the cost value between pixel points A and B is (x1 - x2)^2 + (y1 - y2)^2.
In the embodiment of the present application, the first field angle image and the second field angle image are images after epipolar rectification: the image planes of the corresponding imaging devices are parallel, the optical axes are parallel, and an object point is imaged at the same height (i.e., the same vertical coordinate) in both images. The parallax value is then easy to obtain: it is the difference between the horizontal coordinates of two parallel pixel points in the first and second field angle images (two pixel points with the same vertical coordinate are defined as parallel pixel points). As shown in fig. 4, the parallax value between pixel point k and pixel point ki is i.
As can be seen from fig. 4, for a pixel point k, the cost value of pixel point k at parallax value i is the cost value between pixel point k in the first field angle image and pixel point ki in the second field angle image. For example, for pixel point k(x1, y1) and pixel point ki(x1 + i, y1), the cost value between pixel point k and pixel point ki is (x1 - (x1 + i))^2 + (y1 - y1)^2 = i^2.
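Following the convention above (the matching pixel in the second field angle image sits at x + i), a cost cube can be assembled for a rectified pair like this (a sketch using absolute intensity difference as the cost, which is our assumption for illustration; the squared-coordinate example above is only a toy metric):

```python
import numpy as np

def build_cost_cube(first, second, max_disp):
    """Build an (H, W, D) cost cube from a rectified image pair.

    Pixel (x, y) of the first image is compared with (x + d, y) of the
    second image; columns with no counterpart keep an infinite cost.
    """
    h, w = first.shape
    cube = np.full((h, w, max_disp), np.inf)
    for d in range(max_disp):
        cube[:, : w - d, d] = np.abs(first[:, : w - d] - second[:, d:])
    return cube

first = np.array([[1.0, 2.0, 3.0, 4.0]])
second = np.array([[0.0, 1.0, 2.0, 3.0]])   # first, shifted right by one pixel
cube = build_cost_cube(first, second, max_disp=3)
# [[1 1 1 0]]: disparity 1 wins wherever a counterpart exists;
# the last column has none at d = 1, so its argmin falls back to 0.
print(np.argmin(cube, axis=2))
```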
Step 200, for the repeated texture area in the first field angle image, obtaining a phase difference corresponding to a target pixel point in the repeated texture area.
A repeated texture region is a region in which the texture repeats; the region selected by the dashed frame in fig. 5 is a repeated texture region. A target pixel point is a pixel point within the repeated texture region.
The phase difference refers to the difference between the phases of two electrical signal waveforms of the same frequency. It should be noted that when the object points corresponding to different pixel points lie at different front/rear positions in the actual scene, the phase differences corresponding to those pixel points also differ; the front/rear position of the object point corresponding to a pixel point can therefore be determined from the phase difference.
And 300, obtaining a corrected parallax value of the target pixel point according to the phase difference corresponding to the target pixel point and the minimum cost value and the secondary minimum cost value in each cost value corresponding to the target pixel point.
The minimum cost value is the smallest of the cost values corresponding to the pixel point; the second-minimum cost value is the second smallest of the cost values corresponding to the pixel point.
Illustratively, a method of obtaining the minimum cost value and the second-minimum cost value is provided: sort the cost values corresponding to the pixel point in descending order to obtain a sorting result, and select the minimum and second-minimum cost values from it. For example, the cost values corresponding to pixel point A are [j1, j2, j3, j4] with j3 > j2 > j4 > j1, so the sorting result is [j3, j2, j4, j1], with sort indices 1, 2, 3 and 4 for j3, j2, j4 and j1, respectively. The cost value with the largest sort index, j1, is taken as the minimum cost value, and the cost value with the second-largest sort index, j4, as the second-minimum cost value.
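Expressed directly in code, the step above amounts to an ascending sort whose first two entries are the minimum and second-minimum cost values (illustrative NumPy):

```python
import numpy as np

# Cost values [j1, j2, j3, j4] with j3 > j2 > j4 > j1, as in the example:
costs = np.array([1.0, 4.0, 5.0, 2.0])
order = np.argsort(costs)          # indices in ascending cost order
min_cost = costs[order[0]]         # j1, the minimum cost value
second_min = costs[order[1]]       # j4, the second-minimum cost value
print(min_cost, second_min)        # 1.0 2.0
```

For a full cost cube, np.partition(costs, 1) retrieves the two smallest values without performing a complete sort.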
The following describes the relationship between the parallax value and the distance, so as to obtain the corrected parallax of the target pixel point according to the relationship.
As shown in fig. 6, the imaging device includes two optical lenses. O1 and O2 are the optical centers of the two lenses, and the length of segment O1O2 is the baseline between the two optical centers, denoted b. P is an object point in the target scene. Segment T0T1 is the focal length, denoted f, and segment PT1 is the distance from the optical center to the object point P, denoted Z. M1 and M2 are the two image planes, and P1 and P2 are the imaging points (i.e., pixel points) of the object point P on M1 and M2, respectively. Segment P0P1 is the pixel distance from imaging point P1 to the left edge of image plane M1, denoted X1, and segment P0P2 is the pixel distance from imaging point P2 to the left edge of image plane M2, denoted X2. By similar triangles, b/Z = (b + X2 - X1)/(Z - f), from which it follows that:
Z = (b × f)/(X1 - X2) (1).
Here X1 - X2 is the parallax value: the closer the object point is to the image planes, the larger its parallax value between the left and right views; the farther the object point is from the image planes, the smaller its parallax value. The parallax value therefore has a definite relationship with the distance between the object point and the imaging device.
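Equation (1) can be transcribed directly into code (a sketch; the parameter names and sample numbers are illustrative):

```python
def depth_from_disparity(b, f, disparity):
    """Z = (b * f) / (X1 - X2), with b the baseline between the two
    optical centers, f the focal length, and disparity = X1 - X2."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    return b * f / disparity

# A closer object point has the larger disparity and the smaller depth:
print(depth_from_disparity(b=60.0, f=1000.0, disparity=20.0))  # 3000.0
print(depth_from_disparity(b=60.0, f=1000.0, disparity=40.0))  # 1500.0
```

Note that b and f must be expressed in consistent units for Z to come out in the same length unit as b.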
It can be seen that when the phase difference is known, the front/rear position of the object point in the target scene is known (and the front/rear position is related to the distance between the object point and the imaging device); when the minimum and second-minimum cost values are known, the parallax values corresponding to those two cost values are known. Therefore, when the phase difference, the minimum cost value and the second-minimum cost value are all known, the corrected parallax value of the target pixel point can be obtained from them.
In the parallax value correction method, because parallax is related to distance (between the object point and the imaging device) and the phase difference is related to the front-rear position (of the object point relative to the focused object point), the parallax value of a target pixel point in a repeated texture region is obtained from the phase difference together with the minimum and second-minimum of the cost values corresponding to the target pixel point. This is more accurate than directly taking the parallax value corresponding to the minimum cost value as the parallax value of the pixel point.
In some embodiments of the present application, the phase difference is obtained after light of an object point corresponding to the target pixel point passes through a dual-core sensor, where a plurality of photoelectric groups are disposed in the dual-core sensor, and each photoelectric group includes two photoelectric imaging units.
A photoelectric imaging unit converts an optical signal into an electrical signal and forms an image from it. Each photoelectric imaging unit in each photoelectric group of the dual-core sensor (arranged in the imaging device) can image independently, and the two photoelectric imaging units in a group image the same object point with different phases, so that the object point exhibits a certain phase difference after passing through the two photoelectric imaging units of the group.
Illustratively, the dual-core sensor is a sensor with a phase-focusing function that includes PD pixel pairs (photoelectric groups), each consisting of a photosensitive pixel L and a photosensitive pixel R (the photoelectric imaging units). When an object point is imaged on a PD pixel pair, the shielding structures over pixels L and R sit at different positions, so the light energy received by pixels L and R differs, as shown in fig. 7; from this difference, the phase difference of the object point can be obtained through the PD pixel pair. To illustrate further that pixels L and R receive different light energies, as shown in fig. 8, the light energies received by pixels L and R are reversed between front focus and back focus: when the PD pixel pair is in front focus, pixel L receives the upper portion of the light energy, and when it is in back focus, pixel L receives the lower portion.
The above embodiment provides a simple and fast way to obtain the phase difference: it is read directly from the dual-core sensor, with no need to compute it through a complicated formula.
In some embodiments of the present application, the obtaining, in step 300, the corrected parallax value of the target pixel point according to the phase difference corresponding to the target pixel point, and the minimum cost value and the next minimum cost value in each cost value corresponding to the target pixel point includes:
step 301, determining the relative position of the object point corresponding to the target pixel point and the focused object point corresponding to the target scene according to the phase difference corresponding to the target pixel point.
The focused object point is the object point in the target scene that is in focus. As described above, because the phase difference differs depending on where an object point lies relative to the focused object point, once the phase difference is determined, the relative position of the object point and the focused object point can be determined from it.
Step 302, obtaining a reference parallax value corresponding to the focused object point.
The parallax value of the focused object point is determined in advance and used as the reference parallax value, so that the corrected parallax value can be obtained from it.
Step 303, obtaining a first parallax value corresponding to the minimum cost value and a second parallax value corresponding to the second minimum cost value.
Since the cost cube between the first field angle image and the second field angle image is obtained through stereo matching, after the minimum cost value and the second minimum cost value corresponding to the target pixel point are known, the first disparity value corresponding to the minimum cost value and the second disparity value corresponding to the second minimum cost value can be obtained from the cost cube.
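As a hedged sketch of step 303 (the array layout of the cost cube is our own assumption; the application does not specify one), the first and second parallax values can be read off one pixel's slice of the cost cube, with candidate parallax values used as array indices:

```python
import numpy as np

def min_and_second_min_disparities(costs):
    """costs: 1-D array of cost values indexed by candidate parallax value.

    Returns (first_parallax, second_parallax): the parallax values of the
    minimum cost and of the second-minimum cost, respectively.
    """
    order = np.argsort(costs)            # parallax indices by ascending cost
    return int(order[0]), int(order[1])

costs = np.array([9.0, 3.0, 7.0, 1.0, 8.0])   # costs at parallaxes 0..4
d_first, d_second = min_and_second_min_disparities(costs)
# d_first is the parallax of the minimum cost, d_second of the second minimum
```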
Step 304, obtaining a corrected parallax value of the target pixel point from the first parallax value and the second parallax value according to the relative position and the reference parallax value.
Because the relative position of the object point and the focused object point is known, it is known whether the object point is nearer to or farther from the imaging device than the focused object point. Since distance is related to the parallax value, once the relative position and the reference parallax value are known, the corrected parallax value of the target pixel point can be selected from the first parallax value and the second parallax value.
In the above embodiment, because distance determines the magnitude of the disparity value, the relationship between the disparity value of the target pixel point and the reference disparity value can be determined from the relative position of the corresponding object point and the focused object point together with the reference disparity value. One of the first and second disparity values is then selected as the corrected disparity value, improving the accuracy of the disparity value of the target pixel point.
In some embodiments of the present application, the first and second field angle images are acquired by an imaging device; step 301, determining the relative position between the object point corresponding to the target pixel point and the focused object point corresponding to the target scene according to the phase difference corresponding to the target pixel point, including:
Step 301A, if the phase difference corresponding to the target pixel point is smaller than 0, the object point corresponding to the target pixel point is located in a rear direction of the focused object point, where the rear direction is a direction away from the imaging device.
Step 301B, if the phase difference corresponding to the target pixel point is greater than 0, the object point corresponding to the target pixel point is located in a front direction of the focused object point, where the front direction is a direction close to the imaging device.
When the object point is located behind the focused object point, the phase difference is less than 0; when it is located in front of the focused object point, the phase difference is greater than 0; and when it coincides with the focused object point, the phase difference is 0. Therefore, once the phase difference is obtained, the relative position of the object point and the focused object point can be judged from whether the phase difference is positive, zero, or negative. To assist understanding, fig. 9 shows the positional relationship between the object points and the imaging device.
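Steps 301A and 301B reduce to a sign test on the phase difference; a minimal sketch (the string labels are our own, not the application's):

```python
# Sign of the phase difference -> relative position (steps 301A/301B).

def relative_position(phase_diff):
    if phase_diff > 0:
        return "front"     # nearer to the imaging device than the focused object point
    if phase_diff < 0:
        return "rear"      # farther from the imaging device
    return "in_focus"      # coincides with the focused object point

assert relative_position(0.4) == "front"
assert relative_position(-0.4) == "rear"
assert relative_position(0.0) == "in_focus"
```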
The above-described embodiment sets out the relationship between the phase difference corresponding to the target pixel point and the position (the front-rear position of the corresponding object point relative to the focused object point), so that the corrected parallax value can be determined from this relationship.
In some embodiments of the present application, the obtaining a corrected parallax value of the target pixel point from the first parallax value and the second parallax value according to the relative position and the reference parallax value in step 304 includes:
Step 304A, if the relative position is that the object point corresponding to the target pixel point is located in the front direction of the focused object point, taking whichever of the first parallax value and the second parallax value is larger than the reference parallax value as the corrected parallax value.
When the object point corresponding to the target pixel point is located in the front direction of the focused object point, that is, the object point is closer to the imaging device, it follows from formula (1) that the parallax value corresponding to the target pixel point should be larger than the reference parallax value corresponding to the focused object point. Therefore, the parallax value larger than the reference parallax value is selected from the first parallax value and the second parallax value as the corrected parallax value.
Step 304B, if the relative position is that the object point corresponding to the target pixel point is located in the rear direction of the focused object point, taking whichever of the first parallax value and the second parallax value is smaller than the reference parallax value as the corrected parallax value.
When the object point corresponding to the target pixel point is located in the rear direction of the focused object point, that is, the object point is farther from the imaging device, it follows from formula (1) that the parallax value corresponding to the target pixel point should be smaller than the reference parallax value corresponding to the focused object point. Therefore, the parallax value smaller than the reference parallax value is selected from the first parallax value and the second parallax value as the corrected parallax value.
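Steps 304A and 304B can be sketched as follows; the fallback when neither candidate satisfies the comparison is our own assumption, since the application does not address that case:

```python
# Select the corrected parallax value from the two candidates
# (steps 304A/304B), given the relative position and reference parallax.

def corrected_disparity(position, reference, d_first, d_second):
    if position == "front":   # nearer than the focused object point -> larger parallax
        chosen = [d for d in (d_first, d_second) if d > reference]
    else:                     # "rear": farther away -> smaller parallax
        chosen = [d for d in (d_first, d_second) if d < reference]
    # Assumed fallback: keep the minimum-cost parallax if neither qualifies.
    return chosen[0] if chosen else d_first

# Values from the fig. 10 example below: reference 7, candidates 8 and 2.
assert corrected_disparity("rear", 7, 8, 2) == 2
assert corrected_disparity("front", 7, 8, 2) == 8
```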
In the foregoing embodiment, one of the first parallax value and the second parallax value is directly used as the corrected parallax value, which improves the accuracy of the obtained parallax value while keeping the calculation steps simple.
In some embodiments of the present application, before obtaining, in step 200, for a repeated texture region in the first field angle image, a phase difference corresponding to a target pixel point in the repeated texture region, the method further includes:
Step 001, acquiring, according to the cost cube, each cost value corresponding to each pixel point in the first field angle image.
Since the cost cube records the cost values of the pixel points under different parallax values, a plurality of cost values (the number of the cost values is the same as the number of the parallax values) can be obtained for each pixel point.
Step 002, determining, according to each cost value corresponding to each pixel point in the first field angle image, whether the pixel point is located in the repeated texture region, so as to obtain the repeated texture region.
As shown in fig. 5, the non-texture regions framed by the oval are relatively flat; that is, the similarity between neighbouring pixel points is very high (the similarity between pixel points can be judged, for example, from the absolute values of the differences between their pixel values), and no regular repetition occurs.
The foregoing embodiment provides a method for determining a repeated texture region, and specifically, because the neighborhood similarity of each pixel point of a non-texture region is very high, the cost values obtained by calculation are very small, and therefore, whether the pixel point is located in the repeated texture region can be determined by each cost value corresponding to each pixel point in a first view angle image.
In some embodiments of the present application, the step 002 of determining whether the pixel point is located in the repeated texture region according to the respective cost value corresponding to each pixel point in the first view angle image includes:
and 002A, calculating cost value variance for each pixel point in the first view angle image according to each cost value corresponding to the pixel point.
For example, for a pixel point a with cost values j1, j2, j3, and j4, the mean is t = (j1 + j2 + j3 + j4)/4, and the variance of j1, j2, j3, and j4 is: ((j1 - t)² + (j2 - t)² + (j3 - t)² + (j4 - t)²)/4.
Step 002B, determining whether the pixel point is located in the repeated texture region according to the cost value variance corresponding to the pixel point.
A small variance means the cost values differ little, i.e., the pixel points differ little; a large variance means the cost values differ considerably, i.e., the pixel points differ considerably. Therefore, whether a pixel point is located in the repeated texture region can be determined from the calculated variance.
The above embodiment provides a way to evaluate the difference between cost values: variance measures the spread of a set of data, so the difference between the cost values of a pixel point can be evaluated by the variance, a non-texture region being a flat region with a small variance.
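The mean-and-variance computation of step 002A, sketched directly from the j1…j4 example above:

```python
# Variance of a pixel point's cost values (step 002A).
# A small variance suggests a flat (non-texture) neighbourhood.

def cost_variance(costs):
    t = sum(costs) / len(costs)                         # mean cost value
    return sum((j - t) ** 2 for j in costs) / len(costs)

assert cost_variance([2.0, 2.0, 2.0, 2.0]) == 0.0       # flat region
assert cost_variance([1.0, 3.0, 1.0, 3.0]) == 1.0       # varied costs
```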
In some embodiments of the present application, the determining, according to the cost value variance corresponding to the pixel point, whether the pixel point is located in the repeated texture region in step 002B includes:
Step 002B1, if the variance of the cost values is greater than or equal to a preset variance, obtaining the minimum cost value and the second-minimum cost value of the pixel point from the cost values corresponding to the pixel point.
The preset variance is a cost value variance threshold set in advance.
When the cost value variance is greater than or equal to the preset variance, the differences between the cost values are considered relatively large, and the pixel point is not a pixel point of a non-texture region.
Step 002B2, acquiring the parallax values corresponding to the minimum cost value and the second-minimum cost value of the pixel point.
According to the cost cube, the minimum cost value of the pixel point a(x1, y1) in the first field angle image is the cost value between pixel point a and the pixel point a1(x2, y1) in the second field angle image, so the parallax value corresponding to the minimum cost value is x2 - x1; the second-minimum cost value of pixel point a is the cost value between pixel point a and the pixel point a2(x3, y1) in the second field angle image, so the parallax value corresponding to the second-minimum cost value is x3 - x1.
Step 002B3, calculating a parallax difference from the parallax values corresponding to the minimum cost value and the second-minimum cost value of the pixel point.
The parallax difference is: (x2 - x1) - (x3 - x1) = x2 - x3.
Step 002B4, if the absolute value of the parallax difference is greater than a preset value, the pixel point is in the repeated texture region.
The preset value is a preset parallax difference value.
A repeated texture region is a region in which a texture repeats several times. When the minimum cost value and the second-minimum cost value are found from the cost cube, these two cost values are taken to correspond to repeated pixel points in two repetitions of the texture, such as the pixel points marked by the two triangles shown in fig. 5. If the parallax between these two pixel points is large, such as the parallax value d marked by the white dotted line in fig. 5, the texture is considered to repeat, and the pixel point is determined to be a pixel point in the repeated texture region.
In the above embodiment, a repeated texture region is one where the texture repeats, so the minimum and second-minimum cost values of the pixel point are found; these are the cost values of the repeated pixel points in the two repetitions of the texture, and because there are two such regions, the parallax between the two matches exceeds a set value. Therefore, the parallax values corresponding to the minimum and second-minimum cost values are obtained, and when the absolute value of their difference is greater than the preset value, the texture is judged to repeat, i.e., the pixel point is determined to be in the repeated texture region.
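Putting steps 002B1 through 002B4 together, a hypothetical sketch of the repeated-texture test (the threshold values and the cost-slice layout are assumptions of the example, not values given by the application):

```python
import numpy as np

def in_repeated_texture(costs, var_threshold, disp_threshold):
    """costs: 1-D array of cost values indexed by candidate parallax value."""
    if np.var(costs) < var_threshold:         # step 002B1: flat, non-texture pixel
        return False
    order = np.argsort(costs)                 # steps 002B2/002B3: two best parallaxes
    d_min, d_second = int(order[0]), int(order[1])
    return abs(d_min - d_second) > disp_threshold   # step 002B4

# Two similar cost minima far apart in parallax suggest a repeated texture.
repeated = np.array([1.0, 9.0, 8.0, 9.0, 8.0, 9.0, 1.1])
flat = np.array([5.0, 5.0, 5.0, 5.0])
assert in_repeated_texture(repeated, var_threshold=1.0, disp_threshold=2)
assert not in_repeated_texture(flat, var_threshold=1.0, disp_threshold=2)
```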
To better illustrate the embodiment, as shown in fig. 10, the reference parallax value of the focused object point (the human body) is 7. In the cost cube, for the repeated texture region selected by the white frame in fig. 10, the first parallax value and the second parallax value corresponding to the minimum and second-minimum cost values are 8 and 2, respectively. The phase difference is calculated to be less than 0, so the object point corresponding to the target pixel point is located in the rear direction of the human body; its parallax value should therefore be smaller than the reference parallax value 7, and the second parallax value 2 is taken as the corrected parallax value of the target pixel point.
In some embodiments of the present application, as shown in fig. 11, there is provided a parallax value correction apparatus 1100, the parallax value correction apparatus 1100 including:
the matching module 1101 is configured to perform stereo matching on a first field angle image and a second field angle image of a target scene to obtain a cost cube between the first field angle image and the second field angle image, where the cost cube records the cost values of pixel points in the first field angle image under multiple parallax values.
A phase module 1102, configured to acquire, for a repeated texture region in the first view angle image, a phase difference corresponding to a target pixel point in the repeated texture region.
A correcting module 1103, configured to obtain a corrected parallax value of the target pixel point according to the phase difference corresponding to the target pixel point, and a minimum cost value and a sub-minimum cost value in each cost value corresponding to the target pixel point.
In the parallax value correction device, because parallax is related to distance (between the object point and the imaging device) and the phase difference is related to the front-rear position (of the object point relative to the focused object point), the parallax value of a target pixel point in a repeated texture region is obtained from the phase difference together with the minimum and second-minimum of the cost values corresponding to the target pixel point, which is more accurate than directly using the parallax value corresponding to the minimum cost value as the parallax value of the pixel point.
In some embodiments of the present application, the phase difference is obtained after light of an object point corresponding to the target pixel point passes through a dual-core sensor, where a plurality of photoelectric groups are disposed in the dual-core sensor, and each photoelectric group includes two photoelectric imaging units.
In some embodiments of the present application, the rectification module 1103 is specifically configured to: determining the relative positions of the object point corresponding to the target pixel point and the focusing object point corresponding to the target scene according to the phase difference corresponding to the target pixel point; acquiring a reference parallax value corresponding to the focusing object point; acquiring a first parallax value corresponding to the minimum cost value and a second parallax value corresponding to the secondary minimum cost value; and obtaining a corrected parallax value of the target pixel point from the first parallax value and the second parallax value according to the relative position and the reference parallax value.
In some embodiments of the present application, the first and second field angle images are acquired by an imaging device; the correction module 1103 is specifically configured to: if the phase difference corresponding to the target pixel point is smaller than 0, the object point corresponding to the target pixel point is located in the rear direction of the focusing object point, and the rear direction is a direction far away from the imaging device; if the phase difference corresponding to the target pixel point is larger than 0, the object point corresponding to the target pixel point is located in the front direction of the focusing object point, and the front direction is the direction close to the imaging device.
In some embodiments of the present application, the rectification module 1103 is specifically configured to: if the relative position is that the object point corresponding to the target pixel point is located in the front direction of the focused object point, taking the parallax value larger than the reference parallax value in the first parallax and the second parallax as the corrected parallax value; and if the relative position is that the object point corresponding to the target pixel point is positioned in the rear direction of the focused object point, taking the parallax value smaller than the reference parallax value in the first parallax and the second parallax as the corrected parallax value.
In some embodiments of the present application, the apparatus 1100 further includes: a cost value module, configured to obtain, according to the cost cube, each cost value corresponding to each pixel point in the first field angle image; and a duplication determination module, configured to determine, according to each cost value corresponding to each pixel point in the first field angle image, whether the pixel point is located in the repeated texture region, so as to obtain the repeated texture region.
In some embodiments of the present application, the duplication determination module is specifically configured to: calculating a cost value variance for each pixel point in the first view angle image according to each cost value corresponding to the pixel point; and determining whether the pixel point is positioned in the repeated texture region according to the cost value variance corresponding to the pixel point.
In some embodiments of the present application, the duplication determination module is specifically configured to: if the cost value variance is larger than or equal to a preset variance, acquiring the minimum cost value and the secondary minimum cost value of the pixel point from each cost value corresponding to the pixel point; obtaining the minimum cost value of the pixel point and the parallax value corresponding to the second minimum cost value; calculating a parallax difference value according to the minimum cost value and the parallax value corresponding to the second minimum cost value of the pixel point; and if the absolute value of the parallax difference value is greater than a preset value, the pixel point is in the repeated texture area.
In some embodiments of the present application, as shown in fig. 12, an electronic device is provided, which may specifically be a terminal, a server, or an imaging device. The electronic device includes a processor, a memory, a network interface, and an imaging device connected through a system bus, where the imaging device includes a dual-core sensor and the memory includes a nonvolatile storage medium and an internal memory. The nonvolatile storage medium of the electronic device stores an operating system and may store a computer program which, when executed by the processor, causes the processor to implement the parallax value correction method. Nonvolatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM). The internal memory may also store a computer program which, when executed by the processor, causes the processor to perform the parallax value correction method. Those skilled in the art will appreciate that the structure shown in fig. 12 is a block diagram of only the portion of the structure relevant to the present disclosure and does not constitute a limitation on the electronic device to which the present disclosure may be applied; a particular electronic device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
The parallax value correction method provided by the present application may be implemented in the form of a computer program executable on an electronic device such as that shown in fig. 12. The memory of the electronic device may store the program templates constituting the parallax value correction device, such as a matching module, a phase module, and a correction module.
An electronic device comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of:
performing stereo matching on a first field angle image and a second field angle image of a target scene to obtain a cost cube between the first field angle image and the second field angle image, wherein the cost cube records the cost values of pixel points in the first field angle image under a plurality of parallax values;
for the repeated texture area in the first field angle image, acquiring a phase difference corresponding to a target pixel point in the repeated texture area;
and obtaining a corrected parallax value of the target pixel point according to the phase difference corresponding to the target pixel point and the minimum cost value and the secondary minimum cost value in each cost value corresponding to the target pixel point.
In some embodiments of the present application, there is provided a computer readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of:
performing stereo matching on a first field angle image and a second field angle image of a target scene to obtain a cost cube between the first field angle image and the second field angle image, wherein the cost cube records the cost values of pixel points in the first field angle image under a plurality of parallax values;
for the repeated texture area in the first field angle image, acquiring a phase difference corresponding to a target pixel point in the repeated texture area;
and obtaining a corrected parallax value of the target pixel point according to the phase difference corresponding to the target pixel point and the minimum cost value and the secondary minimum cost value in each cost value corresponding to the target pixel point.
It should be noted that the above-mentioned parallax correction method, parallax correction device, electronic apparatus, and computer-readable storage medium belong to one general inventive concept, and the contents of the embodiments of the parallax correction method, the parallax correction device, the electronic apparatus, and the computer-readable storage medium are mutually applicable.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or a plurality of modules may exist separately, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (11)

1. A parallax value correction method, comprising:
performing stereo matching on a first field angle image and a second field angle image of a target scene to obtain a cost cube between the first field angle image and the second field angle image, wherein the cost cube records the cost values of pixel points in the first field angle image under a plurality of parallax values;
for the repeated texture area in the first field angle image, acquiring a phase difference corresponding to a target pixel point in the repeated texture area;
and obtaining a corrected parallax value of the target pixel point according to the phase difference corresponding to the target pixel point and the minimum cost value and the secondary minimum cost value in each cost value corresponding to the target pixel point.
2. The method according to claim 1, wherein the phase difference is obtained after light of an object point corresponding to the target pixel point passes through a dual-core sensor, and a plurality of photoelectric groups are arranged in the dual-core sensor, and each photoelectric group comprises two photoelectric imaging units.
3. The method according to claim 1 or 2, wherein the obtaining a corrected parallax value of the target pixel point according to the phase difference corresponding to the target pixel point and the minimum cost value and the second-minimum cost value among the cost values corresponding to the target pixel point comprises:
determining the relative positions of the object point corresponding to the target pixel point and the focusing object point corresponding to the target scene according to the phase difference corresponding to the target pixel point;
acquiring a reference parallax value corresponding to the focusing object point;
acquiring a first parallax value corresponding to the minimum cost value and a second parallax value corresponding to the second-minimum cost value;
and obtaining a corrected parallax value of the target pixel point from the first parallax value and the second parallax value according to the relative position and the reference parallax value.
4. The method of claim 3, wherein the first and second field angle images are acquired by an imaging device;
the determining, according to the phase difference corresponding to the target pixel point, the relative position of the object point corresponding to the target pixel point and the focusing object point corresponding to the target scene comprises:
if the phase difference corresponding to the target pixel point is less than 0, determining that the object point corresponding to the target pixel point is located in the rear direction of the focusing object point, the rear direction being the direction away from the imaging device;
and if the phase difference corresponding to the target pixel point is greater than 0, determining that the object point corresponding to the target pixel point is located in the front direction of the focusing object point, the front direction being the direction toward the imaging device.
5. The method according to claim 3 or 4, wherein the obtaining a corrected parallax value of the target pixel point from the first parallax value and the second parallax value according to the relative position and the reference parallax value comprises:
if the relative position indicates that the object point corresponding to the target pixel point is located in the front direction of the focusing object point, taking, of the first parallax value and the second parallax value, the one greater than the reference parallax value as the corrected parallax value;
and if the relative position indicates that the object point corresponding to the target pixel point is located in the rear direction of the focusing object point, taking, of the first parallax value and the second parallax value, the one less than the reference parallax value as the corrected parallax value.
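For illustration only (not part of the claims): the selection rule of claims 4 and 5 can be sketched as below. The sign convention (positive phase difference means the object point is in front of the focusing object point and thus has a larger parallax than the reference) follows claims 4 and 5; the function name and the fallback for a zero phase difference are hypothetical.

```python
def corrected_disparity(phase_diff, ref_disparity, d_first, d_second):
    """Pick the corrected parallax value from the two candidate
    disparities (those of the minimum and second-minimum cost)."""
    if phase_diff > 0:    # object point in front of the focusing object point
        picks = [d for d in (d_first, d_second) if d > ref_disparity]
    elif phase_diff < 0:  # object point behind the focusing object point
        picks = [d for d in (d_first, d_second) if d < ref_disparity]
    else:                 # on the focal plane: keep the reference disparity
        return ref_disparity
    # fall back to the minimum-cost disparity if neither candidate qualifies
    return picks[0] if picks else d_first
```

For example, with a reference parallax of 10 and candidates 8 and 14, a positive phase difference selects 14 (in front, larger parallax) and a negative one selects 8.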
6. The method according to any one of claims 1 to 5, wherein before the acquiring, for the repeated texture region in the first field angle image, a phase difference corresponding to a target pixel point in the repeated texture region, the method further comprises:
obtaining, according to the cost cube, the cost values corresponding to each pixel point in the first field angle image;
and determining, according to the cost values corresponding to each pixel point in the first field angle image, whether the pixel point is located in the repeated texture region, so as to obtain the repeated texture region.
7. The method according to claim 6, wherein the determining, according to the cost values corresponding to each pixel point in the first field angle image, whether the pixel point is located in the repeated texture region comprises:
calculating, for each pixel point in the first field angle image, a cost value variance according to the cost values corresponding to the pixel point;
and determining, according to the cost value variance corresponding to the pixel point, whether the pixel point is located in the repeated texture region.
8. The method according to claim 7, wherein the determining, according to the cost value variance corresponding to the pixel point, whether the pixel point is located in the repeated texture region comprises:
if the cost value variance is greater than or equal to a preset variance, acquiring the minimum cost value and the second-minimum cost value of the pixel point from the cost values corresponding to the pixel point;
obtaining the parallax values corresponding to the minimum cost value and the second-minimum cost value of the pixel point;
calculating a parallax difference value from the parallax values corresponding to the minimum cost value and the second-minimum cost value of the pixel point;
and determining that the pixel point is located in the repeated texture region if the absolute value of the parallax difference value is greater than a preset value.
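For illustration only (not part of the claims): a minimal sketch of the two-stage test of claims 7 and 8 for a single pixel point, assuming the candidate parallax value equals the index into the pixel's cost array. The function name and both thresholds are hypothetical.

```python
import numpy as np

def in_repeated_texture(costs, var_thresh, disp_thresh):
    """Decide whether one pixel point lies in a repeated texture region.

    costs: 1-D array of cost values, one per candidate disparity.
    """
    costs = np.asarray(costs, dtype=float)
    if np.var(costs) < var_thresh:
        return False  # variance below the preset variance: not repeated texture
    order = np.argsort(costs)
    d_min, d_second = order[0], order[1]  # disparities of min / second-min cost
    # repeated texture: the two lowest-cost disparities are far apart
    return abs(int(d_min) - int(d_second)) > disp_thresh
```

Two near-equal cost minima at distant disparities (e.g. costs `[9, 1, 9, 9, 1.1, 9]`, minima at disparities 1 and 4) flag the pixel as repeated texture, while adjacent minima (disparities 1 and 2) do not.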
9. A parallax value correction device, comprising:
a matching module configured to perform stereo matching on a first field angle image and a second field angle image of a target scene to obtain a cost cube between the first field angle image and the second field angle image, wherein the cost cube records a cost value for each pixel point in the first field angle image at each of a plurality of parallax values;
a phase module configured to acquire, for a repeated texture region in the first field angle image, a phase difference corresponding to a target pixel point in the repeated texture region;
and a correction module configured to obtain a corrected parallax value of the target pixel point according to the phase difference corresponding to the target pixel point and the minimum cost value and the second-minimum cost value among the cost values corresponding to the target pixel point.
10. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method according to any one of claims 1 to 8 when executing the computer program.
11. A computer-readable storage medium, having stored thereon computer program instructions, which, when read and executed by a processor, perform the steps of the method of any one of claims 1 to 8.
CN202110606123.XA 2021-05-31 2021-05-31 Parallax value correction method, parallax value correction device, electronic apparatus, and storage medium Pending CN113487492A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110606123.XA CN113487492A (en) 2021-05-31 2021-05-31 Parallax value correction method, parallax value correction device, electronic apparatus, and storage medium

Publications (1)

Publication Number Publication Date
CN113487492A true CN113487492A (en) 2021-10-08

Family

ID=77933792


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination