CN107578389B - Plane-supervised image color depth information collaborative restoration system - Google Patents


Info

Publication number: CN107578389B
Authority: CN (China)
Prior art keywords: depth map, image, color image, plane, color
Legal status: Active (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Application number: CN201710823813.4A
Other languages: Chinese (zh)
Other versions: CN107578389A (en)
Inventors: 陈龙, 范蕾, 张朝强, 黎丹
Current Assignee: National Sun Yat Sen University (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Original Assignee: National Sun Yat Sen University
Priority date: (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by National Sun Yat Sen University
Priority to CN201710823813.4A
Publication of CN107578389A
Application granted
Publication of CN107578389B

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of images, in particular to a plane-supervised image color depth information collaborative restoration system. In the implementation, the algorithm restores the color image with a sample-based image restoration method, which simultaneously provides clues for depth restoration, and then segments the preliminarily restored color image with a super-pixel segmentation method. Meanwhile, an inclined-plane smoothing method is applied in the depth map restoration process, and the plane equation obtained by plane fitting is used to estimate the missing disparity values. The local outliers found during plane fitting, together with the segments whose boundaries are blocking or hinge surfaces, are recalculated in the next iteration. The restored depth map in turn gives feedback to the color image restoration. Finally, these steps are iterated to obtain the optimal color image and depth map restoration results, thereby achieving collaborative restoration of the color and depth information.

Description

Plane-supervised image color depth information collaborative restoration system
Technical Field
The invention relates to the technical field of images, in particular to a plane-supervised image color depth information collaborative restoration system.
Background
In the fields of autonomous navigation and intelligent driving, in order to perceive and navigate successfully in the three-dimensional world, a mobile robot or vehicle needs to compute its own position and information about the surrounding three-dimensional environment. Vision-based simultaneous localization and mapping allows the position of the mobile robot to be estimated while a three-dimensional map of the surroundings is built up step by step. When the mobile robot moves in an unknown environment, timely acquisition of three-dimensional information about the external surroundings is a key prerequisite for obstacle avoidance and path planning.
However, in real life, when unwanted objects such as moving objects appear, the three-dimensional reconstruction of images captured by a binocular camera is not always satisfactory. It is therefore necessary to remove such unwanted objects. Removing them from the image, however, leaves voids in the three-dimensional reconstructed map, severely affecting the completeness of the map, degrading its appearance and hindering further processing by the computer. To solve this problem, the blank regions left after object removal need to be repaired in both the depth map and the color image, so that the completeness of the three-dimensional reconstructed map is guaranteed.
At present, most existing work restores either the color image or the depth map independently; the two are not restored together, and the constraint relationship between them is not considered. In the few works that restore both, the color image is first partially restored through the coherence of the image pair, and the remaining unrepaired areas of the color image are then restored with the help of the fully restored depth map.
Disclosure of Invention
The invention provides a plane supervision image color depth information collaborative restoration system for overcoming at least one defect in the prior art, and the system finishes restoration of color depth information through the constraint between color information and depth information. The system effectively ensures the integrity of the damaged three-dimensional map and is beneficial to computer processing.
The technical scheme of the invention is as follows: a plane supervision image color depth information collaborative restoration system comprises a color image restoration module, an image segmentation module, a depth map restoration module and a color image restoration module under the guidance of a depth map;
a color image restoration module: filling color information obtained from an undamaged area in a color image into the damaged area by using a sample-based image repairing method to obtain a visually reasonable repairing result;
an image segmentation module: performing super-pixel segmentation on the color image obtained by the color image restoration module by using a simple linear iterative clustering algorithm to obtain segmented regions;
a depth map restoration module: for each segmented region obtained by the image segmentation module, fitting a plane equation by using a random sampling consensus algorithm; estimating missing disparity values using the plane equation; meanwhile, classifying the boundaries between the segmented regions; the local outliers found during plane fitting, and the segmented regions whose boundaries are blocking or hinge surfaces, are recalculated in the next iteration;
a color image restoration module under the guidance of the depth map: for the regions newly requiring repair fed back from the depth map restoration module, the similarity measure used to obtain color information from the undamaged regions takes the inter-plane differences into account.
Further, the damaged area occupies the same position in the color image and the depth map; that is, the color image and the depth map are consistent and mutually constrain each other. The restored and perfected color image provides clues for the depth map, and the restoration of the color image in turn receives feedback from the depth map.
Further, in the image restoration module, the damaged area is restored block by block; these blocks are called blocks to be repaired. The blocks to be repaired at the edge of the damaged area are filled according to priority, and the priority of each block to be repaired is determined before filling. The best matching block is searched for in the undamaged area; it is the block most similar, in the CIELab color space, to the block to be repaired with the highest priority.
Furthermore, in the image restoration module, the priority of a block to be repaired is determined by the product of a data term and a confidence term: the more known pixel points a block to be repaired contains, the larger its confidence term, and the stronger the linear structure it contains, the larger its data term. After a block is completely filled, the edge of the damaged area changes, so the confidence of the blocks to be repaired along that edge must be updated.
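Purely as an illustrative sketch, and not part of the patent text itself, the priority rule just described (priority = confidence term × data term, in the spirit of exemplar-based inpainting) might be computed as follows; the patch half-width, the grayscale gradient approximation of the linear-structure strength, and the normalisation constant `alpha` are all assumptions:

```python
import numpy as np

def patch_priority(confidence, gray, p, half=4, alpha=255.0):
    """Priority of the block to be repaired centred at p.

    confidence: per-pixel confidence map (1.0 in known regions, 0.0 in the hole)
    gray:       grayscale image used to approximate linear-structure strength
    p:          (row, col) centre of the candidate block
    """
    y, x = p
    patch = np.s_[y - half:y + half + 1, x - half:x + half + 1]
    # Confidence term: fraction of already-known pixels inside the block --
    # blocks containing more known pixels get a larger confidence term.
    c_term = confidence[patch].sum() / confidence[patch].size
    # Data term: gradient magnitude at p as a proxy for the strength of the
    # linear structure (isophote) flowing into the block.
    gy, gx = np.gradient(gray)  # recomputed per call here; cache in real use
    d_term = np.hypot(gx[y, x], gy[y, x]) / alpha
    return c_term * d_term
```

The block with the highest priority would be filled first; after filling, the confidence map is updated along the new edge of the damaged area.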
Further, in the image segmentation module, the color image obtained in the image restoration module is converted, by color space conversion, into the CIELab color space, forming a five-dimensional feature vector that also carries spatial information; pixel points with high similarity are clustered together according to the distance between their five-dimensional feature vectors.
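As a hedged sketch of the clustering distance implied above (a CIELab colour difference combined with a spatial distance), assuming the usual SLIC convention with a grid interval S and compactness weight m — both illustrative values, not taken from this document:

```python
import numpy as np

def slic_distance(px, center, S=20.0, m=10.0):
    """SLIC-style distance between a pixel and a cluster centre.

    Both px and center are 5-vectors (L, a, b, x, y): CIELab colour plus
    spatial position. The spatial part is normalised by the grid interval S
    and weighted by the compactness parameter m, so colour similarity and
    spatial proximity are traded off explicitly.
    """
    px, center = np.asarray(px, float), np.asarray(center, float)
    d_lab = np.linalg.norm(px[:3] - center[:3])   # colour distance in CIELab
    d_xy = np.linalg.norm(px[3:] - center[3:])    # spatial distance
    return np.hypot(d_lab, (m / S) * d_xy)
```

Each pixel would be assigned to the cluster centre minimising this distance, after which the centres are re-estimated and the assignment iterated.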
Further, in the depth map restoration module, for each segmented region, three different pixel points located in the undamaged area of the depth map are repeatedly and randomly selected to fit a plane equation, until the plane equation is supported by enough inliers (local interior points).
Further, in the depth map restoration module, the plane parameters calculated from pixel points with known disparity values and the plane parameters obtained during color image restoration are used as elements of a set; the plane equation of the block to be repaired in the depth map is selected from this set by the random sampling consensus algorithm, and the missing depth information is estimated from that plane equation.
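A minimal sketch of the plane-fitting step, under the assumption that each plane is parametrised as disparity d = a·x + b·y + c over pixel coordinates; the iteration count, inlier tolerance and random seed are illustrative choices, not values from the patent:

```python
import numpy as np

def ransac_disparity_plane(xs, ys, ds, iters=200, tol=1.0, seed=0):
    """Fit d = a*x + b*y + c to known disparities with RANSAC.

    Returns (a, b, c) and a boolean inlier mask; the outliers can be
    re-examined in the next iteration of the repair loop.
    """
    rng = np.random.default_rng(seed)
    pts = np.column_stack([xs, ys, np.ones_like(xs)])
    best_inliers, best_plane = None, None
    for _ in range(iters):
        idx = rng.choice(len(ds), size=3, replace=False)  # 3 random known pixels
        try:
            plane = np.linalg.solve(pts[idx], ds[idx])    # plane through 3 points
        except np.linalg.LinAlgError:
            continue                                      # degenerate sample
        inliers = np.abs(pts @ plane - ds) < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, plane
    # Refit on all inliers by least squares for a stable estimate.
    a, b, c = np.linalg.lstsq(pts[best_inliers], ds[best_inliers], rcond=None)[0]
    return (a, b, c), best_inliers

# A missing disparity at (x, y) is then estimated as a*x + b*y + c.
```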
Furthermore, in the depth map restoration module, the boundaries between the segmented regions are classified as coplanar surfaces, blocking surfaces or hinge surfaces.
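One plausible way (an assumption for illustration, not specified by this text) to assign a boundary between two fitted disparity planes to the coplanar / hinge / blocking categories is to compare the planes' predictions along the shared boundary; the tolerances here are arbitrary:

```python
import numpy as np

def classify_boundary(plane1, plane2, boundary_pts, jump_tol=1.0):
    """Label the boundary between two segments given their disparity planes.

    plane = (a, b, c) with d = a*x + b*y + c; boundary_pts is an (N, 2)
    array of (x, y) points on the shared boundary.
      - coplanar: the two planes are (nearly) identical
      - hinge:    the planes meet along the boundary but differ elsewhere
      - blocking: the planes disagree across the boundary (depth jump)
    """
    p1, p2 = np.asarray(plane1, float), np.asarray(plane2, float)
    pts = np.column_stack([boundary_pts, np.ones(len(boundary_pts))])
    gap = np.abs(pts @ p1 - pts @ p2)   # disparity disagreement on the boundary
    if np.allclose(p1, p2, atol=1e-3):
        return "coplanar"
    if gap.max() < jump_tol:
        return "hinge"                   # planes intersect along the boundary
    return "blocking"                    # occlusion: disparity jump
```

Segments touching a blocking or hinge boundary would then be queued for recalculation in the next iteration, as described above.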
Further, the color image restoration module, the image segmentation module, the depth map restoration module and the color image restoration module under the guidance of the depth map are executed iteratively; the iterative process in the method ends once a stable and natural repair result is obtained.
Compared with the prior art, the beneficial effects are: the invention realizes the restoration of the color depth information through an iteration mode and a constraint relation between the depth map and the color image. Meanwhile, the inclined plane smoothing method is applied to the depth map repairing work. The invention can be used for repairing the damage conditions of different degrees in different scenes, and is a repairing system which is more stable and more accordant with the visual characteristics of human eyes.
Drawings
FIG. 1 is a flow diagram of the plane-supervised image color depth information collaborative restoration system.
FIG. 2 is a diagram illustrating a relationship between boundaries of the partition areas.
FIG. 3 shows the color depth repairing result of the present invention.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the patent; for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product; it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted. The positional relationships depicted in the drawings are for illustrative purposes only and are not to be construed as limiting the present patent.
Example:
FIG. 1 is a flow chart of the plane-supervised image color depth information collaborative restoration system, which comprises four parts: a color image restoration module, an image segmentation module, a depth map restoration module and a color image restoration module under the guidance of the depth map.
The damaged area occupies the same location in the color image and the depth map. Within the damaged area, the color information and the depth information are consistent and mutually constrain each other.
In the image restoration module, a preliminarily but completely filled color image is obtained using a sample-based image restoration algorithm. The damaged area is restored block by block; these blocks are called blocks to be repaired. During restoration, the priority of each damaged block in the region to be repaired must be determined, and a best matching block must be found to repair it. The priority determines the filling order of the blocks to be repaired, and is given by the product of a data term and a confidence term: the more known pixel points a block contains, the larger its confidence term, and the stronger the linear structure it contains, the larger its data term. After a block is completely filled, the edge of the damaged area changes, so the confidence of the blocks to be repaired along that edge is updated. The best matching block is determined by the similarity between the known pixel points of the block to be repaired and the pixel points of equally sized blocks in the undamaged area; the similarity is measured by the sum of squared differences of the pixel values.
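The best-matching-block search by sum of squared differences over the known pixels can be sketched as follows; the exhaustive scan, the patch size and the array layout are assumptions for illustration:

```python
import numpy as np

def best_match(image, known, target_tl, size=9):
    """Find the undamaged patch most similar to the target patch.

    Similarity is the sum of squared differences (SSD), computed only over
    the pixels of the target patch that are already known.
    image:     H x W x 3 array (e.g. the three CIELab channels)
    known:     H x W boolean mask, True where pixels are valid
    target_tl: (row, col) of the target patch's top-left corner
    """
    ty, tx = target_tl
    target = image[ty:ty + size, tx:tx + size]
    tmask = known[ty:ty + size, tx:tx + size]
    best, best_tl = np.inf, None
    H, W = known.shape
    for y in range(H - size + 1):
        for x in range(W - size + 1):
            if not known[y:y + size, x:x + size].all():
                continue                         # candidate must be fully known
            diff = (image[y:y + size, x:x + size] - target)[tmask]
            ssd = np.sum(diff ** 2)
            if ssd < best:
                best, best_tl = ssd, (y, x)
    return best_tl, best
```

In practice the scan would be restricted to a search window around the hole for speed; the exhaustive loop is kept here only for clarity.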
In the image segmentation module, the preliminarily restored color image produced by the image restoration module is segmented. The color image is converted into the CIELab color space to form a five-dimensional feature vector with spatial information. The number of segmented regions is chosen, and pixel points with high similarity are clustered together according to the distance between their five-dimensional feature vectors. To avoid isolated points and over-segmentation, connectivity is enforced and undersized regions are removed: an undersized super-pixel block is merged into a neighboring block by comparing their distance in color space, a region being judged undersized when its color difference falls below a threshold.
In the depth map restoration module, for each segmented region obtained from the image segmentation module, three different pixel points in the undamaged area of the depth map are repeatedly and randomly selected to fit a plane equation, until the plane equation is supported by enough inliers. The plane parameters calculated from pixel points with known disparity values, together with the plane parameters obtained during color image restoration, form a candidate set; the plane equation of the block to be repaired in the depth map is selected from this set by the random sampling consensus algorithm, and the missing depth information is estimated from it. Meanwhile, the boundaries between the segmented regions are classified; the boundary types are shown in FIG. 2. The segmented regions whose boundaries are blocking or hinge surfaces, and the local outliers found during plane fitting, are recalculated in the next iteration.
After the depth map restoration module has run, the color image restoration module receives feedback from the depth map restoration result. The region requiring repair again is the union of the outlier pixels found during plane parameter estimation and the super-pixel segments whose boundary relations are blocking or hinge surfaces. When selecting the best matching block for this re-repaired region, the inter-plane difference must be taken into account in measuring the similarity between a matching block and the block to be repaired. Once a stable and natural repair result is obtained, the iterative process ends.
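The inter-plane difference term of the similarity measure is not given a formula in this text; one hedged realisation is to add a weighted squared difference of the two segments' fitted plane parameters to the colour SSD (the weight `lam` is an assumption):

```python
import numpy as np

def plane_aware_ssd(color_ssd, plane_target, plane_candidate, lam=10.0):
    """Augment a colour SSD with an inter-plane difference penalty.

    plane_* = (a, b, c), the fitted disparity-plane parameters of the
    segment each patch belongs to; candidate patches lying on a different
    plane are penalised, so the repair prefers samples from the same surface.
    """
    p_t = np.asarray(plane_target, float)
    p_c = np.asarray(plane_candidate, float)
    return color_ssd + lam * np.sum((p_t - p_c) ** 2)
```

With `lam = 0` this reduces to the plain colour SSD used in the first pass; the plane term only matters for the regions repaired again under depth-map guidance.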
Fig. 3 is a color depth restoration result of an application example of the present invention, and experiments show that the method can effectively restore image damage conditions of different degrees in different scenes, and realize restoration of color depth information.
It should be understood that the above-described embodiments of the present invention are merely examples for clearly illustrating the invention and are not intended to limit its embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to enumerate all embodiments exhaustively here. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the claims of the present invention.

Claims (9)

1. A plane-supervised image color depth information collaborative restoration system is characterized by comprising four parts, namely a color image restoration module, an image segmentation module, a depth map restoration module and a color image restoration module under the guidance of a depth map;
a color image restoration module: filling color information obtained from an undamaged area in a color image into the damaged area by using a sample-based image repairing method to obtain a visually reasonable repairing result;
an image segmentation module: performing super-pixel segmentation on the color image obtained by the color image restoration module by using a simple linear iterative clustering algorithm to obtain a segmented region;
a depth map restoration module: for each segmented region obtained by the image segmentation module, fitting a plane equation by using a random sampling consensus algorithm; estimating missing disparity values using the plane equation; meanwhile, classifying the boundaries between the segmented regions; the local outliers found during plane fitting, and the segmented regions whose boundaries are blocking or hinge surfaces, are recalculated in the next iteration;
a color image restoration module under the guidance of the depth map: for the regions newly requiring repair fed back from the depth map restoration module, the similarity measure used to obtain color information from the undamaged regions takes the inter-plane differences into account.
2. The system according to claim 1, wherein the system comprises: the damaged area occupies the same position in the color image and the depth map; that is, the color image and the depth map are consistent and mutually constrain each other.
3. The system according to claim 1, wherein the system comprises: in the image restoration module, the damaged area is restored block by block, and these blocks are called blocks to be repaired; the blocks to be repaired at the edge of the damaged area are filled according to priority, the priority of each block to be repaired being determined before filling; the best matching block is searched for in the undamaged area, the matching block being the one most similar, in the CIELab color space, to the block to be repaired with the highest priority.
4. The system according to claim 2, wherein the system comprises: in the image restoration module, the priority of a block to be repaired is determined by the product of a data term and a confidence term; the more known pixel points a block to be repaired contains, the larger its confidence term, and the stronger the linear structure it contains, the larger its data term; after a block is completely filled, the edge of the damaged area changes, and the confidence of the blocks to be repaired at that edge is updated.
5. The system according to claim 1, wherein the system comprises: in the image segmentation module, the color image obtained in the image restoration module is converted, by color space conversion, into the CIELab color space, forming a five-dimensional feature vector with spatial information; pixel points with high similarity are clustered together according to the distance between their five-dimensional feature vectors.
6. The system according to claim 1, wherein the system comprises: in the depth map restoration module, for each segmented region, three different pixel points in the undamaged area of the depth map are repeatedly and randomly selected to fit the plane equation, until the plane equation is supported by enough inliers.
7. The system according to claim 1, wherein the system comprises: in the depth map restoration module, the plane parameters calculated from pixel points with known disparity values and the plane parameters obtained during color image restoration are used as elements of a set; the plane equation of the block to be repaired in the depth map is selected from this set by the random sampling consensus algorithm, and the missing depth information is estimated from that plane equation.
8. The system according to claim 1, wherein the system comprises: in the depth map restoration module, the boundary categories among the divided regions can be divided into a coplanar surface, a blocking surface and a hinge surface.
9. The system according to claim 1, wherein the system comprises: the color image restoration module, the image segmentation module, the depth map restoration module and the color image restoration module under the guidance of the depth map are executed iteratively; the iterative process ends once a stable and natural repair result is obtained.
CN201710823813.4A 2017-09-13 2017-09-13 Plane-supervised image color depth information collaborative restoration system Active CN107578389B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710823813.4A CN107578389B (en) 2017-09-13 2017-09-13 Plane-supervised image color depth information collaborative restoration system


Publications (2)

Publication Number   Publication Date
CN107578389A (en)    2018-01-12
CN107578389B (en)    2021-01-08

Family

ID=61036020

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710823813.4A Active CN107578389B (en) 2017-09-13 2017-09-13 Plane-supervised image color depth information collaborative restoration system

Country Status (1)

Country Link
CN (1) CN107578389B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114972129B (en) * 2022-08-01 2022-11-08 电子科技大学 Image restoration method based on depth information

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7324704B2 (en) * 2003-09-11 2008-01-29 Primax Electronics Ltd. Method of repairing scratches in digital images
CN103489183B (en) * 2012-10-17 2017-10-10 深圳市瑞工科技有限公司 A kind of sectional perspective matching process split based on edge with seed point
CN104616286B (en) * 2014-12-17 2017-10-31 浙江大学 Quick semi-automatic multi views depth restorative procedure
CN105513064B (en) * 2015-12-03 2018-03-20 浙江万里学院 A kind of solid matching method based on image segmentation and adaptive weighting

Also Published As

Publication number Publication date
CN107578389A (en) 2018-01-12

Similar Documents

Publication Publication Date Title
CN106780524B (en) Automatic extraction method for three-dimensional point cloud road boundary
US9412040B2 (en) Method for extracting planes from 3D point cloud sensor data
Sun et al. Aerial 3D building detection and modeling from airborne LiDAR point clouds
Cheng et al. Integration of LiDAR data and optical multi-view images for 3D reconstruction of building roofs
CN103927717B (en) Depth image restoration methods based on modified model bilateral filtering
US20130129190A1 (en) Model-Based Stereo Matching
Broggi et al. Terrain mapping for off-road autonomous ground vehicles using rational b-spline surfaces and stereo vision
CN111598916A (en) Preparation method of indoor occupancy grid map based on RGB-D information
Hulik et al. Fast and accurate plane segmentation in depth maps for indoor scenes
CN107978017B (en) Indoor structure rapid modeling method based on frame line extraction
CN110706269B (en) Binocular vision SLAM-based dynamic scene dense modeling method
Kong et al. A method for learning matching errors for stereo computation.
EP3343507B1 (en) Producing a segmented image of a scene
Holzmann et al. Semantically aware urban 3d reconstruction with plane-based regularization
CN111652241B (en) Building contour extraction method integrating image features and densely matched point cloud features
US11657195B2 (en) Processing a 3D signal of a shape attribute over a real object
Alcantarilla et al. Large-scale dense 3D reconstruction from stereo imagery
Chuang et al. Dense stereo matching with edge-constrained penalty tuning
Ye et al. Integrated image matching and segmentation for 3D surface reconstruction in urban areas
CN110363178B (en) Airborne laser point cloud classification method based on local and global depth feature embedding
CN115222884A (en) Space object analysis and modeling optimization method based on artificial intelligence
CN107578389B (en) Plane-supervised image color depth information collaborative restoration system
CN106097336B (en) Front and back scape solid matching method based on belief propagation and self similarity divergence measurement
Song et al. Building extraction from high resolution color imagery based on edge flow driven active contour and JSEG
Ji et al. CNN-based dense image matching for aerial remote sensing images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant