CN112116660B - Disparity map correction method, device, terminal and computer readable medium - Google Patents

Disparity map correction method, device, terminal and computer readable medium

Info

Publication number
CN112116660B
CN112116660B (application CN201910533023.1A)
Authority
CN
China
Prior art keywords
point
parallax
pixel
value
pixel point
Prior art date
Legal status (assumption, not a legal conclusion; Google has not performed a legal analysis)
Active
Application number
CN201910533023.1A
Other languages
Chinese (zh)
Other versions
CN112116660A (en)
Inventor
孙一郎
侯一凡
Current Assignee (the listed assignees may be inaccurate)
BOE Technology Group Co Ltd
Beijing BOE Optoelectronics Technology Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Beijing BOE Optoelectronics Technology Co Ltd
Priority date (assumption, not a legal conclusion)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd, Beijing BOE Optoelectronics Technology Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN201910533023.1A priority Critical patent/CN112116660B/en
Priority to PCT/CN2020/096963 priority patent/WO2020253805A1/en
Publication of CN112116660A publication Critical patent/CN112116660A/en
Application granted granted Critical
Publication of CN112116660B publication Critical patent/CN112116660B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration

Abstract

The invention provides a disparity map correction method, device, terminal and computer readable medium. The method comprises: acquiring a disparity map to be corrected and the reference view used to generate the disparity map; extracting the contour of a target object in the reference view; judging, from the difference between the disparity value of each pixel point corresponding to the contour in the disparity map and the disparity values of its adjacent pixel points, whether that pixel point is a start error point; for any pixel point judged to be a start error point, detecting the position of the boundary error point located in the same row as the start error point; and acquiring the disparity values of a preset number of pixel points that are located in the same row as the boundary error point, on the side of the boundary error point away from the start error point, then correcting the disparity values of the start error point, the boundary error point, and the pixel points between them according to those disparity values. The invention corrects the distortion generated when forming a disparity map.

Description

Disparity map correction method, device, terminal and computer readable medium
Technical Field
The present invention relates to the field of image processing, and in particular to a disparity map correction method, device, terminal, and computer readable medium.
Background
Binocular stereo matching has long been a research focus in binocular vision: a binocular camera captures left and right viewpoint images of the same scene, a stereo matching algorithm produces a disparity map, and a depth map is then derived from it. Because a depth map records the distance between objects in the scene and the camera, it has a very wide range of applications, including measurement, three-dimensional reconstruction, and virtual viewpoint synthesis.
However, in the process of deriving the disparity map from the left and right viewpoint images, inconsistent information at the target contour in the two views distorts the disparity map, which seriously affects binocular vision applications.
Disclosure of Invention
The invention aims to solve at least one of the technical problems in the prior art, and provides a disparity map correction method, device, terminal and computer readable medium.
In order to achieve the above object, the present invention provides a disparity map correction method, wherein the method includes:
acquiring a parallax map to be corrected and a reference view for generating the parallax map;
extracting the outline of the target object in the reference view;
judging whether each pixel point corresponding to the outline in the parallax map is an initial error point or not according to the difference of the parallax values of each pixel point corresponding to the outline and the adjacent pixel points in the parallax map;
for any pixel point, when the pixel point is judged to be the initial error point, detecting the position of the boundary error point in the same line as the initial error point;
and acquiring parallax values of a preset number of pixel points which are positioned on the same line as the boundary error point and are positioned on one side of the boundary error point away from the initial error point, and correcting the initial error point, the boundary error point and the parallax values of the pixel points positioned between the initial error point and the boundary error point according to the parallax values of the preset number of pixel points.
Optionally, the determining whether each pixel point corresponding to the contour is a starting error point according to a difference between the disparity values of each pixel point corresponding to the contour and surrounding pixel points in the disparity map includes:
for any pixel point corresponding to the outline in the parallax map, extracting a first parallax value corresponding to the pixel point, a second parallax value corresponding to an adjacent pixel point positioned at one side of the pixel point and a third parallax value corresponding to an adjacent pixel point positioned at the other side of the pixel point from the parallax map; wherein, the adjacent pixel point on one side of the pixel point and the adjacent pixel point on the other side of the pixel point are in the same row with the pixel point;
judging whether the difference between the second parallax value and the first parallax value is smaller than a preset threshold value or not and whether the difference between the third parallax value and the first parallax value is smaller than the preset threshold value or not;
and when the difference between the second parallax value and the first parallax value is smaller than a preset threshold value and the difference between the third parallax value and the first parallax value is smaller than the preset threshold value, determining the pixel point as a starting error point.
Optionally, the detecting a position of a boundary error point located in the same line as the start error point includes:
sequentially extracting parallax values of pixel points in the same row as the initial error point from the parallax map along a preset direction by taking the initial error point as a starting point;
for each extracted parallax value, judging whether the difference between the parallax value and a first parallax value corresponding to the initial error point is larger than a preset threshold value or not;
if the difference between the parallax value and the first parallax value corresponding to the initial error point is greater than a preset threshold value, taking the pixel point corresponding to the parallax value as a boundary error point.
Optionally, the correcting the parallax values of the start error point, the boundary error point, and the pixels located between the start error point and the boundary error point according to the parallax values of the preset number of pixels includes:
extracting parallax values corresponding to the preset number of pixel points from the parallax map;
calculating the average value of the parallax values corresponding to the preset number of pixel points;
replacing the initial error point, the boundary error point and the disparity value of each pixel point between the initial error point and the boundary error point with the average value.
The invention also provides a disparity map correction device, wherein the device comprises: the device comprises an acquisition module, an extraction module, a judgment module, a detection module and a correction module;
the acquisition module is used for acquiring a disparity map to be corrected and the reference view used to generate the disparity map;
the extraction module is used for extracting the outline of the target object in the reference view;
the judging module is used for judging whether each pixel point on the outline is an initial error point or not according to the difference of parallax values corresponding to each pixel point on the outline and surrounding pixel points in the parallax map;
for any pixel point, the detection module is used for detecting the position of a boundary error point in the same line with the initial error point when judging that the pixel point is the initial error point;
the correction module is used for obtaining parallax values of a preset number of pixel points which are in the same row as the boundary error point and are positioned on one side of the boundary error point away from the initial error point, and correcting the initial error point, the boundary error point and the parallax values of the pixel points positioned between the initial error point and the boundary error point according to the parallax values of the preset number of pixel points.
Optionally, the judging module includes: a first extraction unit and a first judgment unit;
for any pixel point on the contour, the first extracting unit is configured to extract, from the disparity map, a first disparity value corresponding to the pixel point, a second disparity value corresponding to a pixel point located at one side of the pixel point, and a third disparity value corresponding to a pixel point located at the other side of the pixel point; wherein, the pixel point positioned at one side of the pixel point and the pixel point positioned at the other side of the pixel point are positioned in the same row with the pixel point;
the first judging unit is used for judging whether the pixel point is a start error point according to the first disparity value, the second disparity value, the third disparity value and a preset threshold value; when the difference between the second disparity value and the first disparity value is smaller than the preset threshold value and the difference between the third disparity value and the first disparity value is smaller than the preset threshold value, the pixel point is determined to be a start error point.
Optionally, the detection module includes: a second extraction unit and a second judgment unit;
the second extraction unit is used for sequentially extracting parallax values corresponding to pixel points in the same row with the initial error point from the parallax map along a preset direction by taking the initial error point as a starting point;
for each extracted parallax value, the second judging unit is used for judging whether the difference between the parallax value and the first parallax value is larger than a preset threshold value; if the difference between the parallax value and the first parallax value is greater than a preset threshold value, stopping extracting the parallax value, and taking the pixel point corresponding to the parallax value as a boundary error point.
Optionally, the correction module includes: a third extraction unit, a calculation unit, and a correction unit;
the third extraction unit is used for extracting parallax values corresponding to the preset number of pixel points from the parallax map;
the calculating unit is used for calculating the average value of the parallax values corresponding to the preset number of pixel points;
the correction unit is used for replacing the initial error point, the boundary error point and the parallax value of the pixel point between the initial error point and the boundary error point with the average value.
The invention also provides a terminal, which comprises: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executes the computer-executable instructions stored in the memory, so that the terminal performs the disparity map correction method described above.
The invention also provides a computer readable medium, wherein the computer readable medium stores computer-executable instructions which, when executed by a processor, implement the disparity map correction method described above.
Drawings
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification, illustrate the invention and together with the description serve to explain, without limitation, the invention. In the drawings:
fig. 1 is a flowchart of a disparity map correction method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of error point locations according to an embodiment of the present invention;
FIG. 3 is a flowchart of determining whether a pixel is an initial error point according to an embodiment of the present invention;
FIG. 4 is a flow chart of detecting boundary error points according to an embodiment of the present invention;
fig. 5 is a schematic diagram of an electronic device of a parallax map correction apparatus according to an embodiment of the present invention;
fig. 6 is a schematic diagram showing a hardware structure of an electronic device for performing the disparity map correction method according to an embodiment of the present invention.
Detailed Description
The following describes specific embodiments of the present invention in detail with reference to the drawings. It should be understood that the detailed description and specific examples, while indicating and illustrating the invention, are not intended to limit the invention.
The invention provides a disparity map correction method. Fig. 1 is a flowchart of the disparity map correction method provided by an embodiment of the invention. As shown in Fig. 1, the method comprises the following steps:
s110, acquiring a disparity map to be corrected and a reference view for generating the disparity map.
In this step, a left view and a right view are acquired by a binocular vision sensor. In the embodiment of the present invention, the left view is taken as the reference view and the right view as the auxiliary view; a disparity map corresponding to the left view is generated, and the disparity map and the left view are extracted. Of course, the right view may instead be used as the reference view when generating the disparity map, as determined by actual needs.
S120, extracting the outline of the target object in the reference view.
In this step, the left view is segmented with a pixel-level image segmentation algorithm so that the individual objects in it can be distinguished; the contour of the target object is then extracted from the segmented left view, and the coordinates of each pixel point on the contour are stored. The segmentation may also be performed with pixel-level morphological image processing.
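As a hedged illustration of this contour-extraction step: the patent only requires a pixel-level segmentation, so the sketch below assumes the target object is already available as a binary mask and defines a contour pixel as a mask pixel with at least one 4-neighbour outside the mask. The function name and array layout are illustrative assumptions, not part of the patent.

```python
import numpy as np

def contour_pixels(mask):
    """mask: 2D bool array (True = target object).
    Returns the (x, y) coordinates of contour pixels: mask pixels
    with at least one 4-neighbour outside the mask."""
    m = np.asarray(mask, dtype=bool)
    padded = np.pad(m, 1, constant_values=False)
    # A pixel is interior when all four of its neighbours are inside the mask.
    interior = (padded[1:-1, :-2] & padded[1:-1, 2:] &
                padded[:-2, 1:-1] & padded[2:, 1:-1])
    edge = m & ~interior
    ys, xs = np.nonzero(edge)
    return list(zip(xs.tolist(), ys.tolist()))  # stored as (x, y) pairs
```

For a 3x3 all-True mask, this returns the 8 border pixels and excludes the centre.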
S130, judging whether each pixel point corresponding to the outline in the parallax image is a starting error point according to the difference of the parallax values of each pixel point corresponding to the outline and the adjacent pixel points in the parallax image.
In this step, according to the contour of the target object, the disparity value of each pixel point on the contour and the disparity values of its adjacent pixel points can be extracted from the corresponding positions in the disparity map. For any pixel point corresponding to the contour, the adjacent pixel points are specifically the neighbouring pixel points located in the same row. For a single object, the disparity values of all its pixel points are similar. A preset threshold is therefore set, and the difference between the disparity value of a contour pixel point and the disparity values of its neighbours is computed: when the difference is smaller than the preset threshold, the two pixel points can be regarded as belonging to the same object; when it is larger than the preset threshold, they can be regarded as belonging to different objects. Because the disparity values are extracted from the disparity map at positions given by the contour of the target object in the reference view, an undistorted disparity map would have the two pixel points on either side of any contour pixel point belonging to different objects. Hence, when the calculation shows that a contour pixel point and both of its adjacent pixel points belong to the same object, that pixel point is determined to be a start error point.
S140, for any pixel point, when the pixel point is judged to be the initial error point, detecting the position of the boundary error point in the same line with the initial error point.
Fig. 2 is a schematic diagram of the positions of error points according to an embodiment of the present invention. As shown in Fig. 2, the boundary error point lies in the same row as the start error point and corresponds to the contour in the disparity map. After the start error point is determined, the error boundary must be found so that the region to be corrected can be delimited. Therefore, on the row containing the start error point, starting from it and moving point by point along a preset direction, each pixel point is tested for being a normal pixel point, i.e. one belonging to a different object than the start error point; the first normal pixel point found is taken as the boundary error point.
S150, obtaining parallax values of preset number of pixel points which are positioned on the same line with the boundary error points and on one side of the boundary error points away from the initial error point.
In this step, once the boundary error point is found, the region to be corrected consists of the start error point, the boundary error point, and the pixel points between them. The disparity values of a preset number of pixel points just outside this region, in the direction from the start error point to the boundary error point, are then extracted for use in the correction.
S160, correcting the parallax values of the initial error point, the boundary error point and the pixel points between the initial error point and the boundary error point according to the parallax values of the preset number of pixel points.
In this step, once the region to be corrected is determined, the disparity values of the preset number of pixel points outside it, extracted along the direction from the start error point to the boundary error point, can be regarded as the disparity values that should apply inside the region; the disparity values of the pixel points in the region are therefore replaced accordingly.
In the prior art, the left and right views are captured from different angles, so when the target object is occluded by an obstacle in one of the views, the information in the two views is not fully consistent, and the disparity map obtained by matching them is distorted at the contour boundary of the target. With the method of the embodiment of the present invention, the start error point and the boundary error point in the disparity map are determined from the contour of the target object in the reference view, thereby determining the pixel points to be corrected; these are then corrected according to the disparity values of the nearby normal pixel points. The distorted disparity map is thus corrected, and the accuracy of the formed disparity map is improved.
Fig. 3 is a flowchart of determining whether a pixel point is an initial error point according to an embodiment of the present invention, and as shown in fig. 3, when determining whether any pixel point corresponding to a contour in a disparity map is an initial error point, the method includes the following sub-steps:
s131, extracting a first parallax value corresponding to the pixel point, a second parallax value corresponding to an adjacent pixel point positioned at one side of the pixel point and a third parallax value corresponding to an adjacent pixel point positioned at the other side of the pixel point from the parallax map. Wherein, the adjacent pixel point on one side of the pixel point and the adjacent pixel point on the other side of the pixel point are in the same row with the pixel point.
In this step, for any pixel point corresponding to the contour in the disparity map, the first disparity value d of the pixel point, the second disparity value d1 of the adjacent pixel point on one side, and the third disparity value d2 of the adjacent pixel point on the other side are extracted from the disparity map according to the coordinates (x, y) of the pixel point. The coordinates of the two adjacent pixel points are (x-1, y) and (x+1, y), respectively.
S132, judging whether the difference between the second parallax value and the first parallax value is smaller than a preset threshold value and whether the difference between the third parallax value and the first parallax value is smaller than the preset threshold value.
In this step, it is determined whether d, d1 and d2 satisfy |d-d1| < T and |d-d2| < T. If so, S133 is executed; if not, the pixel point is considered to be a normal pixel point. The preset threshold T is a small value, typically set empirically to 3.
S133, determining the pixel point as a starting error point.
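A minimal Python sketch of this start-error-point test (S131-S133) follows. The function name, the array layout `disparity[y, x]`, and the default threshold of 3 (the empirical value mentioned above) are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def is_start_error_point(disparity, x, y, threshold=3):
    """disparity: 2D array indexed as disparity[y, x]; (x, y) is a contour pixel.
    A start error point has a disparity close to BOTH horizontal neighbours,
    i.e. |d - d1| < T and |d - d2| < T."""
    h, w = disparity.shape
    if x - 1 < 0 or x + 1 >= w:      # both neighbours must exist in the row
        return False
    d  = float(disparity[y, x])      # first disparity value
    d1 = float(disparity[y, x - 1])  # second disparity value (one side)
    d2 = float(disparity[y, x + 1])  # third disparity value (other side)
    return abs(d - d1) < threshold and abs(d - d2) < threshold
```

On the row `[10, 10, 10, 30]`, the contour pixel at x = 1 is flagged (both neighbours differ by 0), while x = 2 is not (its right neighbour differs by 20).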
In one embodiment, detecting the position of the boundary error point located in the same row as the start error point in step S140 includes: starting from the start error point, sequentially extracting from the disparity map, along a preset direction, the disparity values of the pixel points in the same row as the start error point, and judging whether the pixel point corresponding to each disparity value is the boundary error point. Fig. 4 is a flowchart of detecting the boundary error point according to an embodiment of the present invention. As shown in Fig. 4, the coordinates of the start error point are denoted (x, y) during detection, and its corresponding first disparity value is d. The abscissa of a pixel point is its column index, the ordinate its row index, and i is a non-zero integer.
As shown in fig. 4, S140 includes the steps of:
s141, setting the initial value of i to be 1 or-1.
S142, extracting from the disparity map the disparity value of the pixel point with coordinates (x+i, y), denoted d(x+i, y); that is, extracting the disparity value of a pixel point located in the same row as the start error point.
S143, judging whether the difference between the disparity value d(x+i, y) and the first disparity value d corresponding to the start error point is greater than the preset threshold.
In this step, i is a non-zero integer whose initial value is set to 1 or -1. On the row containing the start error point, disparity values may be extracted in turn from the pixel points on the left and right sides of the start error point, starting from it, and after each extraction the judgment of S143 is performed. Of course, the embodiment of the invention may also extract along a single direction, as determined by actual needs.
In this step, it is determined whether d and d(x+i, y) satisfy |d - d(x+i, y)| > T. If so, S144 is executed; if not, the pixel point corresponding to d(x+i, y) is determined to be an error pixel point, 1 is added to or subtracted from i, and S142 is performed again. The process repeats until the boundary error point is found.
S144, taking the pixel point corresponding to the disparity value as the boundary error point, whose coordinates are (x+i, y).
In step S140, the initial value of i, and whether 1 is added to or subtracted from i after the pixel point corresponding to d(x+i, y) is determined to be an error pixel point in S143, may be chosen according to the preset direction. For example, with the abscissa increasing from the left side of the image to the right: if the preset direction is horizontally to the right, the initial value of i is 1 and i is incremented by 1 after each error pixel point; if the preset direction is horizontally to the left, the initial value of i is -1 and i is decremented by 1 after each error pixel point.
As shown in Fig. 2, region A is a region where the information of the pixel points in the left and right views is inconsistent. Because the left and right views are captured from different angles, when the target object is occluded by an obstacle in one view, the information in the two views is not fully consistent, and the disparity map obtained by matching them is distorted at the contour boundary of the target. For example, when the disparity map is generated with the left view as the reference view and a pixel point a (not shown) on the left side of the object in the left view does not carry the same information as its corresponding pixel point b (not shown) in the right view, the matching traverses leftward along the row of pixel point a in the left view until a pixel point c (not shown) with higher similarity is found, and the disparity value of pixel point c is used as the disparity value of pixel point a. When every such pixel point goes through this process, a distorted disparity map as shown in Fig. 2 is obtained.
Therefore, in the embodiment of the present invention, the preset direction may be determined according to whether the reference view for generating the disparity map is the left view or the right view. For example, when the disparity map is generated with the left view as a reference view, the preset direction is a horizontal right direction; when the disparity map is generated with the right view as a reference view, the preset direction is a horizontal left direction.
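The scan of S141-S144, including the direction choice just described, can be sketched as follows. The signature and the `direction` parameter (+1 for a left-view reference, -1 for a right-view reference, matching the text above) are illustrative assumptions:

```python
import numpy as np

def find_boundary_error_point(disparity, x, y, threshold=3, direction=1):
    """Scan along row y from the start error point (x, y).
    Returns the (column, row) of the first pixel whose disparity differs
    from the start error point's by more than the threshold, or None."""
    d = float(disparity[y, x])       # first disparity value at the start error point
    w = disparity.shape[1]
    i = direction
    while 0 <= x + i < w:
        if abs(d - float(disparity[y, x + i])) > threshold:
            return x + i, y          # first "normal" pixel = boundary error point
        i += direction               # still an error pixel point; keep scanning
    return None                      # no boundary found in this row
```

On the row `[10, 10, 10, 30, 30]` with a start error point at x = 1, scanning rightward stops at column 3, where the disparity jumps from 10 to 30.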
In one embodiment, S160 includes the sub-steps of:
s161, extracting parallax values corresponding to a preset number of pixel points from the parallax map.
S162, calculating an average value of parallax values corresponding to the preset number of pixel points.
S163, replacing the initial error point, the boundary error point and the disparity value of each pixel point between the initial error point and the boundary error point by the average value.
Specifically, if the coordinates of the boundary error point are (x+i, y) and the preset number is 3, the pixel points at (x+i+1, y), (x+i+2, y) and (x+i+3, y) are taken, and their disparity values are d3, d4 and d5, respectively. Their average d6 is computed, and d6 then replaces the disparity value of the start error point, the boundary error point, and every pixel point between them, completing the correction.
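Sub-steps S161-S163 amount to the replacement below, sketched under the assumption of in-place NumPy edits; the function name and parameters are illustrative, not from the patent:

```python
import numpy as np

def correct_row(disparity, x_start, x_boundary, y, num_reference=3):
    """Replace the disparities from the start error point through the boundary
    error point (inclusive) with the mean of `num_reference` pixels just
    beyond the boundary, on the side away from the start error point."""
    step = 1 if x_boundary >= x_start else -1
    ref_cols = [x_boundary + step * k for k in range(1, num_reference + 1)]
    ref_cols = [c for c in ref_cols if 0 <= c < disparity.shape[1]]
    if not ref_cols:                 # no reference pixels available in this row
        return disparity
    mean_val = sum(float(disparity[y, c]) for c in ref_cols) / len(ref_cols)
    for c in range(min(x_start, x_boundary), max(x_start, x_boundary) + 1):
        disparity[y, c] = mean_val   # overwrite the whole error span
    return disparity
```

For the row `[10, 10, 10, 30, 30, 30, 30]` with the start error point at column 1 and the boundary error point at column 3, the three reference pixels average to 30, which overwrites columns 1 through 3.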
Based on the same inventive concept, an embodiment of the present invention further provides a parallax map correction device, and fig. 5 is a schematic diagram of an electronic device of the parallax map correction device provided by the embodiment of the present invention, as shown in fig. 5, where the parallax map correction device includes: the device comprises an acquisition module 510, an extraction module 520, a judgment module 530, a detection module 540 and a correction module 550.
The acquisition module 510 is configured to acquire a disparity map to be corrected and the reference view used to generate the disparity map.
The extraction module 520 is configured to extract a contour of the object in the reference view.
The determining module 530 is configured to determine whether each pixel on the contour is a starting error point according to differences between disparity values corresponding to each pixel on the contour and surrounding pixels in the disparity map.
For any pixel, the detecting module 540 is configured to detect the position of the boundary error point in the same line as the start error point when the pixel is determined to be the start error point.
The correction module 550 is configured to obtain parallax values of a preset number of pixels located on a side of the boundary error point away from the start error point and the same line as the boundary error point, and correct the start error point, the boundary error point, and the parallax values of the pixels located between the start error point and the boundary error point according to the parallax values of the preset number of pixels.
With the device provided by the embodiment of the present invention, the initial error point and the boundary error point in the parallax map are determined according to the contour of the target object in the reference view, thereby determining the pixel points to be corrected; these pixel points are then corrected according to the parallax values of the normal pixel points near them. In this way a distorted parallax map is corrected, and the accuracy of the generated parallax map is improved.
In one embodiment, the determining module 530 includes: the first extraction unit 531 and the first judgment unit 532.
For any pixel point on the contour, the first extraction unit 531 is configured to extract, from the disparity map, a first disparity value corresponding to the pixel point, a second disparity value corresponding to the pixel point located on one side of it, and a third disparity value corresponding to the pixel point located on the other side. The pixel points on both sides are in the same row as the pixel point itself.
The first determining unit 532 is configured to determine whether the pixel point is a start error point according to the first parallax value, the second parallax value, the third parallax value, and a preset threshold: when the difference between the second parallax value and the first parallax value is smaller than the preset threshold and the difference between the third parallax value and the first parallax value is smaller than the preset threshold, the pixel point is determined to be a start error point.
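As a sketch of this judgment, the following Python/NumPy function (names are illustrative, not from the patent) flags a contour pixel as a start error point when both same-row neighbours differ from it by less than the preset threshold:

```python
import numpy as np

def is_start_error_point(disparity, y, x, threshold):
    """Return True when the contour pixel at (x, y) is a start error point.

    d1 is the pixel's own (first) disparity value; d2 and d3 come from
    the same-row neighbours on either side.  Per the judgment above, the
    pixel is an error point when BOTH neighbour differences fall below
    the preset threshold, i.e. the disparity jump expected at an object
    contour is missing.
    """
    d1 = disparity[y, x]
    d2 = disparity[y, x - 1]  # neighbour on one side, same row
    d3 = disparity[y, x + 1]  # neighbour on the other side, same row
    return bool(abs(d2 - d1) < threshold and abs(d3 - d1) < threshold)
```

On a row where the disparity is flat across the contour the function returns True (an error point); where a clear jump exists on either side it returns False.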
In one embodiment, the detection module 540 includes: a second extraction unit 541 and a second determination unit 542.
The second extraction unit 541 is configured to sequentially extract, from the disparity map, disparity values corresponding to pixel points in the same row as the starting error point, in a preset direction, with the starting error point as a starting point.
For each extracted parallax value, the second determining unit 542 is configured to determine whether the difference between that parallax value and the first parallax value is greater than a preset threshold. If the difference is greater than the preset threshold, extraction of parallax values stops, and the pixel point corresponding to that parallax value is taken as the boundary error point.
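The row scan performed by the second extraction and judging units might be sketched as below; the function and parameter names are assumptions for illustration, with `direction` standing in for the preset direction (+1 or -1 along the row):

```python
import numpy as np

def find_boundary_error_point(disparity, y, x_start, threshold, direction=1):
    """Walk along row y from the start error point in the preset direction.

    The first pixel whose disparity differs from the start point's value
    (d1) by MORE than the preset threshold is taken as the boundary error
    point; extraction stops there.  Returns its x coordinate, or None if
    no such pixel exists within the row.
    """
    d1 = disparity[y, x_start]
    width = disparity.shape[1]
    x = x_start + direction
    while 0 <= x < width:
        if abs(disparity[y, x] - d1) > threshold:
            return x  # boundary error point found; stop extracting
        x += direction
    return None
```

For example, scanning the row [9, 9, 9, 5, 5] rightward from x = 0 with a threshold of 1 stops at x = 3, the first pixel whose disparity jumps away from 9.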
In one embodiment, the correction module 550 includes: a third extraction unit 551, a calculation unit 552, and a correction unit 553.
The third extraction unit 551 is configured to extract, from the disparity map, disparity values corresponding to a preset number of pixels.
The calculating unit 552 is configured to calculate an average value of parallax values corresponding to a preset number of pixel points.
The correction unit 553 is used to replace the parallax values of the start error point, the boundary error point, and the pixel points located between the start error point and the boundary error point with the average value.
The embodiment of the invention also provides a terminal, which comprises: at least one processor and a memory.
The memory stores computer-executable instructions.
At least one processor executes computer-executable instructions stored in the memory to cause the terminal to perform the method described above.
Fig. 6 is a schematic diagram showing a hardware structure of an electronic device for performing a disparity map correction method according to an embodiment of the present invention. As shown in fig. 6, the device includes:
one or more processors 610, and a memory 620, one processor 610 being illustrated in fig. 6.
The processor 610 and the memory 620 may be connected by a bus or otherwise; a bus connection is taken as an example in fig. 6.
The memory 620 is a non-volatile computer readable storage medium and may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions/modules corresponding to the disparity map correction method according to the present invention. The processor 610 runs the non-volatile software programs, instructions, and modules stored in the memory 620, thereby executing various functional applications and data processing of the server, i.e., implementing the disparity map correction method in the above-described embodiment of the present invention.
Memory 620 may include a storage program area and a storage data area, wherein the storage program area may store an operating system and at least one application program required for a function; the storage data area may store data created according to the use of any of the above methods, and the like. In addition, memory 620 may include high-speed random access memory, and may also include non-volatile memory, such as magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices. In a particular embodiment, memory 620 optionally includes memory remotely located relative to processor 610, which may be connected to the processor running any of the above methods via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
One or more modules are stored in the memory 620 that, when executed by the one or more processors 610, perform the disparity map correction method described above.
The embodiment of the invention also provides a computer readable medium storing computer-executable instructions; when the computer-executable instructions are executed by a processor, the method described above is implemented.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus and modules described above may refer to the corresponding process in the foregoing method embodiment, which is not repeated herein.
In the embodiments of the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. The above-described apparatus embodiments are merely illustrative. For example, the division of modules is merely a logical function division, and there may be other manners of division in actual implementation; for example, multiple modules may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented as indirect coupling or communication connection through some communication interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
The modules illustrated as separate components may or may not be physically separate, i.e., may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purposes of the embodiments of the present invention.
In addition, each functional module in the embodiment of the present invention may be integrated in one processing module, or each module may exist alone physically, or two or more modules may be integrated in one module.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a processor-executable non-volatile computer readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, which includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above embodiments are only specific implementations used to illustrate the technical solutions of the present invention, not to limit its scope. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent substitutions of some of the technical features, while remaining within the technical scope of the present disclosure; such modifications, changes, or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention and are intended to fall within the scope of the present invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (8)

1. A disparity map correction method, the method comprising:
acquiring a parallax map to be corrected and a reference view for generating the parallax map;
segmenting the reference view, and extracting the contour of the target object in the segmented reference view;
judging whether each pixel point corresponding to the outline in the parallax map is an initial error point or not according to the difference of the parallax values of each pixel point corresponding to the outline and the adjacent pixel points in the parallax map;
for any pixel point, when the pixel point is judged to be the initial error point, detecting the position of the boundary error point in the same line as the initial error point;
acquiring parallax values of a preset number of pixel points which are positioned on the same line as the boundary error point and are positioned on one side of the boundary error point away from the initial error point, and correcting the initial error point, the boundary error point and the parallax values of the pixel points positioned between the initial error point and the boundary error point according to the parallax values of the preset number of pixel points;
wherein the determining whether each pixel corresponding to the contour is an initial error point according to the difference between the disparity values of each pixel corresponding to the contour and surrounding pixels in the disparity map includes:
for any pixel point corresponding to the outline in the parallax map, extracting a first parallax value corresponding to the pixel point, a second parallax value corresponding to an adjacent pixel point positioned at one side of the pixel point and a third parallax value corresponding to an adjacent pixel point positioned at the other side of the pixel point from the parallax map; wherein, the adjacent pixel point on one side of the pixel point and the adjacent pixel point on the other side of the pixel point are in the same row with the pixel point;
judging whether the difference between the second parallax value and the first parallax value is smaller than a preset threshold value or not and whether the difference between the third parallax value and the first parallax value is smaller than the preset threshold value or not;
and when the difference between the second parallax value and the first parallax value is smaller than a preset threshold value and the difference between the third parallax value and the first parallax value is smaller than the preset threshold value, determining the pixel point as a starting error point.
2. The disparity map correction method according to claim 1, wherein the detecting the position of the boundary error point located in the same line as the start error point includes:
sequentially extracting parallax values of pixel points in the same row as the initial error point from the parallax map along a preset direction by taking the initial error point as a starting point;
for each extracted parallax value, judging whether the difference between the parallax value and a first parallax value corresponding to the initial error point is larger than a preset threshold value or not;
if the difference between the parallax value and the first parallax value corresponding to the initial error point is greater than a preset threshold value, taking the pixel point corresponding to the parallax value as a boundary error point.
3. The disparity map correction method according to claim 1, wherein correcting the disparity values of the start error point, the boundary error point, and the pixel points located between the start error point and the boundary error point according to the disparity values of the preset number of pixel points includes:
extracting parallax values corresponding to the preset number of pixel points from the parallax map;
calculating the average value of the parallax values corresponding to the preset number of pixel points;
replacing the initial error point, the boundary error point and the disparity value of each pixel point between the initial error point and the boundary error point with the average value.
4. A disparity map correcting apparatus, characterized in that the apparatus comprises: the device comprises an acquisition module, an extraction module, a judgment module, a detection module and a correction module;
the acquisition module is used for acquiring a parallax map to be corrected and a reference view for generating the parallax map;
the extraction module is used for segmenting the reference view and extracting the contour of the target object in the segmented reference view;
the judging module is used for judging whether each pixel point on the outline is an initial error point or not according to the difference of parallax values corresponding to each pixel point on the outline and surrounding pixel points in the parallax map;
for any pixel point, the detection module is used for detecting the position of a boundary error point in the same line with the initial error point when judging that the pixel point is the initial error point;
the correction module is used for obtaining parallax values of a preset number of pixel points which are in the same row as the boundary error point and are positioned on one side of the boundary error point away from the initial error point, and correcting the initial error point, the boundary error point and the parallax values of the pixel points positioned between the initial error point and the boundary error point according to the parallax values of the preset number of pixel points;
the judging module comprises: a first extraction unit and a first judgment unit;
for any pixel point on the contour, the first extracting unit is configured to extract, from the disparity map, a first disparity value corresponding to the pixel point, a second disparity value corresponding to a pixel point located at one side of the pixel point, and a third disparity value corresponding to a pixel point located at the other side of the pixel point; wherein, the pixel point positioned at one side of the pixel point and the pixel point positioned at the other side of the pixel point are positioned in the same row with the pixel point;
the first judging unit is used for judging whether the pixel point is an initial error point according to the first parallax value, the second parallax value, the third parallax value and a preset threshold value; and when the difference between the second parallax value and the first parallax value is smaller than the preset threshold value and the difference between the third parallax value and the first parallax value is smaller than the preset threshold value, determining the pixel point as a starting error point.
5. The disparity map correction apparatus according to claim 4, wherein the detection module includes: a second extraction unit and a second judgment unit;
the second extraction unit is used for sequentially extracting parallax values corresponding to pixel points in the same row with the initial error point from the parallax map along a preset direction by taking the initial error point as a starting point;
for each extracted parallax value, the second judging unit is used for judging whether the difference between the parallax value and the first parallax value is larger than a preset threshold value; if the difference between the parallax value and the first parallax value is greater than a preset threshold value, stopping extracting the parallax value, and taking the pixel point corresponding to the parallax value as a boundary error point.
6. The disparity map correction apparatus according to claim 4, wherein the correction module includes: a third extraction unit, a calculation unit, and a correction unit;
the third extraction unit is used for extracting parallax values corresponding to the preset number of pixel points from the parallax map;
the calculating unit is used for calculating the average value of the parallax values corresponding to the preset number of pixel points;
the correction unit is used for replacing the initial error point, the boundary error point and the parallax value of the pixel point between the initial error point and the boundary error point with the average value.
7. A terminal, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
at least one of the processors executes computer-executable instructions stored in the memory to cause the terminal to perform the method of any one of claims 1-3.
8. A computer readable medium having stored thereon computer executable instructions which, when executed by a processor, implement the method of any of claims 1-3.
CN201910533023.1A 2019-06-19 2019-06-19 Disparity map correction method, device, terminal and computer readable medium Active CN112116660B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910533023.1A CN112116660B (en) 2019-06-19 2019-06-19 Disparity map correction method, device, terminal and computer readable medium
PCT/CN2020/096963 WO2020253805A1 (en) 2019-06-19 2020-06-19 Disparity map correction method and apparatus, terminal, and non-transitory computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910533023.1A CN112116660B (en) 2019-06-19 2019-06-19 Disparity map correction method, device, terminal and computer readable medium

Publications (2)

Publication Number Publication Date
CN112116660A CN112116660A (en) 2020-12-22
CN112116660B true CN112116660B (en) 2024-03-29

Family

ID=73796654

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910533023.1A Active CN112116660B (en) 2019-06-19 2019-06-19 Disparity map correction method, device, terminal and computer readable medium

Country Status (2)

Country Link
CN (1) CN112116660B (en)
WO (1) WO2020253805A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114440775A (en) * 2021-12-29 2022-05-06 全芯智造技术有限公司 Characteristic dimension offset error calculation method and device, storage medium and terminal

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104915927A (en) * 2014-03-11 2015-09-16 株式会社理光 Parallax image optimization method and apparatus
CN105631887A (en) * 2016-01-18 2016-06-01 武汉理工大学 Two step parallax improvement method based on adaptive support weight matching algorithm and system
KR20180016823A (en) * 2016-08-08 2018-02-20 한국전자통신연구원 Apparatus for correcting image and method using the same
CN107909036A (en) * 2017-11-16 2018-04-13 海信集团有限公司 A kind of Approach for road detection and device based on disparity map
CN108401460A (en) * 2017-09-29 2018-08-14 深圳市大疆创新科技有限公司 Generate method, system, storage medium and the computer program product of disparity map
CN109215044A (en) * 2017-06-30 2019-01-15 京东方科技集团股份有限公司 Image processing method and system, storage medium and mobile system
CN109859253A (en) * 2018-12-17 2019-06-07 深圳市道通智能航空技术有限公司 A kind of solid matching method, device and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104252701B (en) * 2013-06-28 2017-08-29 株式会社理光 Correct the method and system of disparity map
CN105023263B (en) * 2014-04-22 2017-11-14 南京理工大学 A kind of method of occlusion detection and parallax correction based on region growing
JP6377970B2 (en) * 2014-06-12 2018-08-22 トヨタ自動車株式会社 Parallax image generation apparatus and parallax image generation method


Also Published As

Publication number Publication date
CN112116660A (en) 2020-12-22
WO2020253805A1 (en) 2020-12-24

Similar Documents

Publication Publication Date Title
US10043278B2 (en) Method and apparatus for reconstructing 3D face with stereo camera
US9070042B2 (en) Image processing apparatus, image processing method, and program thereof
JP6351238B2 (en) Image processing apparatus, imaging apparatus, and distance correction method
JP6663652B2 (en) Stereo source image correction method and apparatus
US20030198378A1 (en) Method and system for 3D smoothing within the bound of error regions of matching curves
EP3110149B1 (en) A system and a method for depth-image-based rendering
KR100953076B1 (en) Multi-view matching method and device using foreground/background separation
CN107016348B (en) Face detection method and device combined with depth information and electronic device
KR20140000195A (en) Autofocus for stereoscopic camera
CN109982064B (en) Naked eye 3D virtual viewpoint image generation method and portable terminal
US20150178573A1 (en) Ground plane detection
CN107977649B (en) Obstacle identification method and device and terminal
KR101683164B1 (en) Apparatus and method for inpainting in occlusion
EP4064177A1 (en) Image correction method and apparatus, and terminal device and storage medium
CN111223059A (en) Robust depth map structure reconstruction and denoising method based on guide filter
KR102009990B1 (en) System and method for acquisition of safe vision based on 3d bpc imaging technology
CN112116660B (en) Disparity map correction method, device, terminal and computer readable medium
KR100942271B1 (en) Method and Apparatus for Reconstruction Integral Image Using Depth
CN111192214B (en) Image processing method, device, electronic equipment and storage medium
KR20140001358A (en) Method and apparatus of processing image based on occlusion area filtering
JP2001184497A (en) Stereo image processor and recording medium
CN110800020A (en) Image information acquisition method, image processing equipment and computer storage medium
US20020110272A1 (en) Method and apparatus for improving object boundaries extracted from stereoscopic images
CN107680083B (en) Parallax determination method and parallax determination device
JP6601893B2 (en) Image processing apparatus, image processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant