CN112085834B - Wall reconstruction method in indoor environment - Google Patents

Wall reconstruction method in indoor environment

Info

Publication number
CN112085834B
CN112085834B (application CN202010897546.7A)
Authority
CN
China
Prior art keywords
point
wall
points
contour
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010897546.7A
Other languages
Chinese (zh)
Other versions
CN112085834A (en)
Inventor
宁小娟
王曼
王映辉
金海燕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian University of Technology
Original Assignee
Xian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian University of Technology filed Critical Xian University of Technology
Priority to CN202010897546.7A priority Critical patent/CN112085834B/en
Publication of CN112085834A publication Critical patent/CN112085834A/en
Application granted granted Critical
Publication of CN112085834B publication Critical patent/CN112085834B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F 18/2135 Feature extraction based on approximation criteria, e.g. principal component analysis
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G06T 7/162 Segmentation; Edge detection involving graph-based methods
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20072 Graph-based image processing
    • G06T 2210/00 Indexing scheme for image generation or computer graphics
    • G06T 2210/12 Bounding box

Abstract

The invention discloses a wall surface reconstruction method in an indoor environment, which is implemented according to the following steps: step 1, extracting contour points of wall point cloud data; step 2, obtaining an initial wall characteristic line according to the contour points of the wall extracted in the step 1, filtering redundant lines of the obtained initial wall characteristic line to obtain a final wall characteristic line, and forming a two-dimensional unit complex; step 3, analyzing the space occupation condition of each wall unit of the two-dimensional unit complex formed in the step 2 and marking the state of each wall unit; and 4, performing graph cutting operation on the wall data processed in the step 3, and reconstructing a complete wall structure. The wall reconstruction method in the indoor environment solves the problems that the prior art depends on the color information of data and is difficult to detect a complex window structure.

Description

Wall reconstruction method in indoor environment
Technical Field
The invention belongs to the technical field of computer vision and image processing methods, and relates to a wall reconstruction method in an indoor environment.
Background
Various scenes of the real world are the most important components of the human-perceived world, and buildings are among the most important of these; the establishment of three-dimensional building models is widely applied in fields such as architectural design, cultural relic protection, game design, automatic driving, virtual reality and augmented reality. Walls are the basis of building construction, and doors and windows are their most important elements; these details directly affect the realism of a three-dimensional building model. However, because of the materials doors and windows are made of and the occlusions they are subject to, reconstruction of the wall surface remains highly challenging.
Starting from the scattered point cloud data of a building scene acquired by three-dimensional laser scanning equipment, the door and window elements in the scene are detected according to the spatial layout and topological structure of the doors and windows on the walls, so as to achieve the reconstruction of the real wall. "Wall reconstruction in an indoor environment" is chosen as the subject of discussion and research mainly because, although many wall reconstruction methods already exist, they depend on the color information of the data and may fail when the colors are similar; moreover, some doors and windows with complex structures cannot be detected accurately, which makes the reconstruction results inaccurate. How to accurately and effectively detect the doors and windows in a scene and complete the reconstruction of the walls is a study that needs to be explored in depth.
Disclosure of Invention
The invention aims to provide a wall reconstruction method in an indoor environment, which solves the problems that the prior art depends on color information of data and is difficult to detect a complex window structure.
The technical scheme adopted by the invention is that the wall surface reconstruction method under the indoor environment is implemented according to the following steps:
step 1, extracting contour points of wall point cloud data;
step 2, obtaining an initial wall characteristic line according to the contour points of the wall extracted in the step 1, filtering redundant lines of the obtained initial wall characteristic line to obtain a final wall characteristic line, and forming a two-dimensional unit complex;
step 3, analyzing the space occupation condition of each wall unit of the two-dimensional unit complex formed in the step 2 and marking the state of each wall unit;
and 4, performing graph cutting operation on the wall data processed in the step 3, and reconstructing a complete wall structure.
The present invention is also characterized in that,
the step 1 specifically comprises the following steps:
step 1.1, obtaining normal vectors of each point in wall point cloud data through a PCA principal component analysis method;
and step 1.2, determining contour points of the wall point cloud data according to the normal vector of the points obtained in the step 1.1.
The step 1.1 specifically comprises the following steps:
Step 1.1.1, the wall point cloud data is P = {p_1, p_2, ..., p_n}, where p_i denotes the coordinate information (x_i, y_i, z_i) of the i-th point in the point cloud file, i = 1, 2, 3, ..., n, and n is the number of wall points in the wall point cloud data;
Step 1.1.2, for a point p_i of the wall point cloud data, query its k nearest neighboring points using a KD-tree; these k points are defined as the k-neighborhood of the sampling point;
Step 1.1.3, construct the covariances cov(X,X), cov(X,Y), cov(X,Z), cov(Y,Y), cov(Y,Z), cov(Z,Z) from the k neighboring points, where q is a neighboring point of the three-dimensional data point p_i, and q_x, q_y, q_z denote the x, y and z coordinates of that point;
Step 1.1.4, establish the covariance matrix C_i from these covariances;
Step 1.1.5, from the covariance matrix C_i compute the three eigenvalues λ_1, λ_2, λ_3 with λ_1 ≥ λ_2 ≥ λ_3 and the corresponding eigenvectors ω_1, ω_2, ω_3; the eigenvector ω_3 = (x_n, y_n, z_n) corresponding to the smallest eigenvalue λ_3 is the normal vector of point p_i, denoted n_i;
Step 1.1.6, repeat steps 1.1.2 to 1.1.5 to obtain the normal vectors of all points.
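The per-point normal estimation of step 1.1 can be illustrated with a short sketch. The following Python fragment is only a minimal illustration of the idea, not the implementation of the invention; it assumes NumPy and SciPy are available, and the function name estimate_normals and the neighborhood size k = 15 are illustrative choices (the patent does not fix k).

# Minimal sketch of step 1.1 (PCA normal estimation over k nearest neighbors).
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=15):
    """Return one unit normal per point, estimated by PCA over its k nearest neighbors."""
    points = np.asarray(points, dtype=float)
    tree = cKDTree(points)                      # step 1.1.2: KD-tree over the wall points
    normals = np.zeros_like(points)
    for i, p in enumerate(points):
        _, idx = tree.query(p, k=k)             # indices of the k nearest neighbors
        nbrs = points[idx]
        cov = np.cov(nbrs.T)                    # steps 1.1.3-1.1.4: 3x3 covariance matrix C_i
        eigvals, eigvecs = np.linalg.eigh(cov)  # step 1.1.5: eigenvalues in ascending order
        normals[i] = eigvecs[:, 0]              # eigenvector of the smallest eigenvalue = normal n_i
    return normals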
The step 1.2 specifically comprises the following steps:
Step 1.2.1, for each point p_i of the wall data, use its coordinates and normal vector n_i to obtain the local plane at that point, π_i: x_n(x - x_i) + y_n(y - y_i) + z_n(z - z_i) = 0;
Step 1.2.2, following the direction of the plane normal vector, project the neighboring points of p_i onto the local plane π_i, and record the projection of a neighboring point q_j as q'_j = (x_q, y_q, z_q);
Step 1.2.3, connect p_i with any one projection point q'_j, take the direction of this connecting line as the x-axis, obtain the y-axis from the condition that the x-axis and y-axis are mutually perpendicular, and establish a local coordinate system on the local plane;
Step 1.2.4, connect all projection points with the local coordinate origin, compute the included angles between all adjacent connecting lines, and if the maximum included angle is greater than a given threshold, mark the point p_i as a boundary point;
Step 1.2.5, all boundary points marked in step 1.2.4 are contour points, denoted a_m; they are placed in the set {a_1, a_2, ..., a_m}, where m is the number of contour points.
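A compact sketch of the boundary test of step 1.2 is given below. It is only an illustration under assumptions: the helper name is_boundary_point, the use of NumPy, and the 90-degree angle threshold are not specified by the patent, which only requires "a given threshold".

# Minimal sketch of step 1.2 (contour point detection via the largest angular gap).
import numpy as np

def is_boundary_point(p, neighbors, normal, angle_threshold_deg=90.0):
    """Project the neighbors of p onto its local plane and test the largest angular gap."""
    n = normal / np.linalg.norm(normal)
    # step 1.2.2: project each neighbor q onto the local plane pi_i through p
    proj = neighbors - np.outer((neighbors - p) @ n, n)
    # step 1.2.3: local frame whose x-axis points toward one projected neighbor
    x_axis = proj[0] - p
    x_axis /= np.linalg.norm(x_axis)
    y_axis = np.cross(n, x_axis)
    # step 1.2.4: angles of the connecting lines, sorted; the largest gap decides the label
    d = proj - p
    ang = np.sort(np.arctan2(d @ y_axis, d @ x_axis))
    gaps = np.diff(np.concatenate([ang, [ang[0] + 2 * np.pi]]))
    return np.degrees(gaps.max()) > angle_threshold_deg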
In the step 2, the initial wall characteristic line is obtained according to the contour points of the wall extracted in the step 1, and is specifically:
step 2.1.1, respectively determining the maximum value and the minimum value of the x coordinate and the y coordinate of the wall contour point, and marking as maxx, minx, maxy and miny;
Step 2.1.2, mark each wall contour point a_i and initialize the marks to 0: pointx[a_i] = 0 indicates that the i-th contour point has not yet been processed in the x-axis operation, and pointy[a_i] = 0 indicates that the i-th contour point has not yet been processed in the y-axis operation;
Step 2.1.3, for a wall contour point a_i, find its neighboring points: if pointx[a_i] = 0, go to step 2.1.4; otherwise select the next contour point and repeat the mark check;
Step 2.1.4, compute the difference between the x coordinate value of contour point a_i and that of each remaining contour point; whenever the difference is less than 0.02, add 1 to the number of neighboring points of a_i;
Step 2.1.5, if the number of neighboring points of contour point a_i is greater than the threshold σ, set the pointx mark of those neighboring points to 1 and compute the average of the x coordinate values of a_i and its neighboring points, where the threshold σ is the number of contour points divided by the number of contour categories;
Step 2.1.6, determine an initial feature line from this average x value together with maxy and miny;
Step 2.1.7, repeat steps 2.1.3 to 2.1.6 to determine the set of initial feature lines in the x-axis direction, coordinatex = {coordinatex[1], coordinatex[2], ..., coordinatex[kx]}, where kx is the number of initial feature lines in the x-axis direction and coordinatex[kx] is an initial feature line in the x-axis direction;
Step 2.1.8, for a wall contour point a_i, find its neighboring points: if pointy[a_i] = 0, go to step 2.1.9; otherwise select the next contour point and repeat the mark check;
Step 2.1.9, compute the difference between the y coordinate value of contour point a_i and that of each remaining contour point; whenever the difference is less than 0.02, add 1 to the number of neighboring points of a_i;
Step 2.1.10, if the number of neighboring points of contour point a_i is greater than the threshold σ, set the pointy mark of those neighboring points to 1 and compute the average of the y coordinate values of a_i and its neighboring points, where the threshold σ is the number of contour points divided by the number of contour categories;
Step 2.1.11, determine an initial feature line from this average y value together with maxx and minx;
Step 2.1.12, repeat steps 2.1.8 to 2.1.11 to determine the set of initial feature lines in the y-axis direction, coordinatey = {coordinatey[1], coordinatey[2], ..., coordinatey[ky]}, where ky is the number of initial feature lines in the y-axis direction and coordinatey[ky] is an initial feature line in the y-axis direction.
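The grouping of contour points into initial feature lines (shown here for the x-axis direction) can be sketched as follows. This is an illustrative Python sketch, not the patented implementation; the 0.02 grouping distance and the threshold σ follow the text, while the function name and data layout are assumptions.

# Minimal sketch of steps 2.1.3 to 2.1.7 for the x-axis direction.
import numpy as np

def initial_feature_lines_x(contour_points, num_categories):
    """Group contour points sharing (almost) the same x value into vertical feature lines."""
    pts = np.asarray(contour_points, dtype=float)
    sigma = len(pts) / num_categories          # threshold: number of contour points / number of categories
    processed = np.zeros(len(pts), dtype=bool) # pointx marks, initialised to 0
    lines_x = []
    for i in range(len(pts)):
        if processed[i]:
            continue
        close = np.abs(pts[:, 0] - pts[i, 0]) < 0.02   # step 2.1.4: x difference below 0.02
        if close.sum() > sigma:                        # step 2.1.5: enough neighboring points
            processed[close] = True
            lines_x.append(pts[close, 0].mean())       # average x value defines the line
    return sorted(lines_x)                             # the set coordinatex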
Filtering the redundant lines of the obtained initial wall surface characteristic lines in the step 2 to obtain final wall surface characteristic lines, wherein the formation of the two-dimensional unit complex is specifically as follows:
Step 2.2.1, apply bubble sort to order the initial feature lines in the x-axis direction and in the y-axis direction respectively, and update the order of the initial feature lines;
Step 2.2.2, if the distance between two feature lines satisfies coordinatex[i] - coordinatex[j] <= 0.045, compute the average of the x values of the two feature lines, assign it to coordinatex[i], delete coordinatex[j], and update the initial feature lines in the x-axis direction;
Step 2.2.3, repeat step 2.2.2 until every initial feature line has been processed, obtaining the final wall feature lines in the x-axis direction;
Step 2.2.4, if the distance between two feature lines satisfies coordinatey[i] - coordinatey[j] <= 0.04, compute the average of the y values of the two feature lines, assign it to coordinatey[i], delete coordinatey[j], and update the initial feature lines in the y-axis direction;
Step 2.2.5, repeat step 2.2.4 until every initial feature line has been processed, obtaining the final wall feature lines in the y-axis direction;
Step 2.2.6, the final wall feature lines in the y-axis direction and the final wall feature lines in the x-axis direction together form the two-dimensional unit complex of the wall surface.
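The redundant-line filtering of step 2.2 amounts to merging sorted feature-line coordinates that lie within a tolerance. A minimal sketch follows; the function name and list handling are assumptions, while the tolerances 0.045 (x) and 0.04 (y) come from the text.

# Minimal sketch of step 2.2: merge feature lines closer than a tolerance.
def merge_feature_lines(lines, tol):
    """Merge sorted feature-line coordinates whose mutual distance is within tol."""
    lines = sorted(lines)                         # step 2.2.1: order the initial feature lines
    merged = []
    for v in lines:
        if merged and abs(v - merged[-1]) <= tol:
            merged[-1] = (merged[-1] + v) / 2.0   # steps 2.2.2/2.2.4: replace the pair by its average
        else:
            merged.append(v)
    return merged

# Usage: final_x = merge_feature_lines(coordinatex, 0.045); final_y = merge_feature_lines(coordinatey, 0.04)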
The step 3 is specifically as follows:
step 3.1, obtaining an AABB bounding box of the wall according to the contour points of the wall point cloud data extracted in the step 1, dividing the bounding box into smaller cells, and specifically performing the following steps:
step 3.1.1, respectively determining the maximum value and the minimum value of the x coordinate, the y coordinate and the z coordinate of the wall profile point, and marking as maxx, minx, maxy, miny, maxz and minz;
step 3.1.2, determining eight vertexes of the bounding box according to the maximum value and the minimum value of the x coordinate, the y coordinate and the z coordinate of the wall data obtained in the step 3.1.1, so as to obtain the bounding box;
Step 3.1.3, determine the length, width and height of the cells according to the coordinate values of the wall point cloud data: the length of the cell is [formula], the width of the cell is [formula], and the height of the cell is maxz - minz;
step 3.1.4, dividing the AABB bounding box according to the cell data obtained in the step 3.1.3;
step 3.1.5, initializing the state of each cell, and marking as 0;
step 3.1.6, if the wall data point exists in the cell, changing the state of the cell to 1;
step 3.1.7, repeating the step 3.1.6 to finish marking all the cells;
step 3.2, marking the space occupation condition of each unit of the two-dimensional unit complex obtained in the step 2 according to the step 3.1, wherein the space occupation condition is respectively empty and occupied, and the method specifically comprises the following steps of:
step 3.2.1, recording the number recordk of the cells with the state 0 and the number recordz of the cells with the state 1 obtained in the step 3.1 for the cells of a certain two-dimensional cell complex;
step 3.2.2, marking the unit of the two-dimensional unit complex as 0 if the recordk > recordz, indicating that the unit is empty, otherwise marking as 1, indicating that the unit is occupied;
and 3.2.3, repeating the step 3.2.2, and marking all units of the two-dimensional unit complex to finish analysis of space occupation conditions.
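The occupancy analysis of step 3 can be sketched, in simplified form, as a 2D occupancy grid over the wall plane (the cells span the full height maxz - minz, so only x and y matter for the binning). The grid resolution nx, ny used below is an assumption, since the patent derives the cell length and width from the coordinate values and feature lines.

# Simplified sketch of step 3: mark cells as occupied (1) or empty (0).
import numpy as np

def occupancy_grid(points, nx, ny):
    """Mark an nx-by-ny grid over the wall plane: 1 where at least one wall point falls in the cell."""
    pts = np.asarray(points, dtype=float)
    minx, maxx = pts[:, 0].min(), pts[:, 0].max()
    miny, maxy = pts[:, 1].min(), pts[:, 1].max()
    grid = np.zeros((nx, ny), dtype=int)          # step 3.1.5: all cells start as 0 (empty)
    ix = np.clip(((pts[:, 0] - minx) / (maxx - minx) * nx).astype(int), 0, nx - 1)
    iy = np.clip(((pts[:, 1] - miny) / (maxy - miny) * ny).astype(int), 0, ny - 1)
    grid[ix, iy] = 1                              # step 3.1.6: a cell containing wall points becomes 1
    return grid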
The step 4 is specifically as follows:
step 4.1, establishing an undirected graph G according to the two-dimensional unit complex of the marked wall obtained in the step 3;
and 4.2, carrying out energy minimization solution on the undirected graph G by using a maximum flow/minimum cut algorithm, finding an augmented path from the source point S to the sink point T, using the augmented path as a minimum cut of the undirected graph G, and finally mapping the minimum cut onto a wall to reconstruct a complete wall structure.
The step 4.1 specifically comprises the following steps:
Step 4.1.1, convert the wall units of each two-dimensional unit complex into vertices of an undirected graph G and add edge information between wall units that can be connected to each other. Each edge carries a weight, and the weight of an edge is defined by the values of the two wall units it connects, where the value of a wall unit, denoted d, is set according to the unit state marked in step 3: if the state is 1, i.e. the wall unit is occupied, the value is a random number between 400 and 500; if the state is 0, i.e. the wall unit is empty, the value is a random number between 1 and 100;
Step 4.1.2, determine the weight w_<p,q> of each edge between vertices of the undirected graph G according to formula (8),
where p and q denote two vertices of the undirected graph G, i.e. two wall units, d_p and d_q are the values of the points p and q respectively, δ^2 denotes the variance, Ex denotes the mean, and n is the number of vertices in the undirected graph G;
step 4.1.3, adding 2 terminal vertexes, namely a source point S and a sink point T, into the undirected graph G, wherein the rest vertexes are connected with the terminal vertexes;
step 4.1.4, randomly selecting two points in different states from the vertex of the undirected graph G as seed points, wherein the occupied state seed points are marked as zseed, and the empty state seed points are marked as kzeed;
Step 4.1.5, determine the weights of the edges between the seed points and the terminal vertices: the edge connecting zseed to the source point is given weight s, the edge connecting kzeed to the source point is given weight 0, the edge connecting zseed to the sink point is given weight 0, and the edge connecting kzeed to the sink point is given weight s, with s = 5000000;
Step 4.1.6, determine the weights of the edges between the remaining vertices (other than the seed points) and the terminal vertices according to formulas (11) and (12),
where w_(s-t) is the weight of the edge between a wall unit and the source point, w_(t-t) is the weight of the edge between a wall unit and the sink point, V denotes the set of vertices of the undirected graph G, β_w is a constant with value 10^4, K_H and K_L are empirical constants with values 0.9 and 0.2 respectively, l_p is the label of a wall unit, Pr(l_p) denotes the probability of each wall unit label, Q_p is the unit density equation, K is a normalization factor, B_p denotes the number of points on the unit boundary, Cv_p denotes the points of the unit excluding boundary points, S_p denotes the area of the unit, and rD is a scale factor with value 10^3.
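The graph construction of step 4.1 can be sketched as below. Since formula (8) is not reproduced in this text, the Gaussian similarity weight used in the sketch is only a stand-in assumption for the edge weight; the value ranges for occupied (400-500) and empty (1-100) units follow step 4.1.1, and the 4-neighbor connectivity and function name are assumptions.

# Illustrative sketch of step 4.1: build the undirected graph over wall units.
import numpy as np

def build_unit_graph(states):
    """states: 2D array of 0/1 unit marks from step 3.  Returns unit values d and neighbor edge weights."""
    h, w = states.shape
    d = np.where(states == 1,
                 np.random.uniform(400, 500, states.shape),   # occupied units: random value in 400-500
                 np.random.uniform(1, 100, states.shape))     # empty units: random value in 1-100
    var = d.var()                                             # variance of the unit values (delta^2)
    edges = {}
    for i in range(h):
        for j in range(w):
            for ni, nj in ((i + 1, j), (i, j + 1)):           # connect each unit to its right/lower neighbor
                if ni < h and nj < w:
                    # stand-in for formula (8): similarity decays with the squared value difference
                    edges[((i, j), (ni, nj))] = np.exp(-(d[i, j] - d[ni, nj]) ** 2 / (2 * var))
    return d, edges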
The step 4.2 is specifically as follows:
Step 4.2.1, establish the energy equation of the undirected graph G, as shown in formula (15),
E = w_<p,q> + w_(s-t) + w_(t-t)    (15)
where w_<p,q> is the weight of the edges between wall units, w_(s-t) is the weight of the edges between wall units and the source point, and w_(t-t) is the weight of the edges between wall units and the sink point;
step 4.2.2, from the source point 'S' as a starting point, searching an augmented path which can reach the sink point 'T' in the undirected graph G by applying breadth-first search, wherein the maximum flow of the path is the minimum weight value in the edge;
step 4.2.3, subtracting the maximum flow of the path from all sides in the augmented path, updating the weight of the sides to an undirected graph G, and recording the sides with the size of 0 by using tranzero [ i ];
step 4.2.4, repeating 4.2.2-4.2.3 until there is no augmented path from source point "S" to sink point "T";
step 4.2.5, the edges recorded in tranzero [ i ] are the edges that need to be segmented;
and 4.2.6, mapping the edges obtained in the step 4.2.5 onto the wall to finish the reconstruction of the wall.
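The max-flow/min-cut search of step 4.2 follows the classical augmenting-path scheme with breadth-first search (the Edmonds-Karp variant of Ford-Fulkerson). The sketch below follows the patent text in recording the edges whose residual weight drops to 0 (tranzero); the dict-of-dicts capacity representation and the function name are assumptions, not the patented implementation.

# Minimal sketch of step 4.2: BFS augmenting paths until no path from source to sink remains.
from collections import deque

def max_flow_min_cut(cap, source, sink):
    """cap: dict of dicts of residual capacities, e.g. cap[u][v] = weight (mutated in place).
    Returns the edges saturated during augmentation, i.e. the cut to map back onto the wall."""
    tranzero = []                                      # edges whose residual weight reaches 0
    while True:
        parent = {source: None}
        q = deque([source])
        while q and sink not in parent:                # step 4.2.2: BFS for an augmenting path
            u = q.popleft()
            for v, c in cap.get(u, {}).items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if sink not in parent:                         # step 4.2.4: stop when no augmenting path remains
            break
        path, v = [], sink                             # recover the path from sink back to source
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        flow = min(cap[u][v] for u, v in path)         # bottleneck = minimum edge weight on the path
        for u, v in path:                              # step 4.2.3: update residual weights
            cap[u][v] -= flow
            back = cap.setdefault(v, {})
            back[u] = back.get(u, 0) + flow
            if cap[u][v] == 0:
                tranzero.append((u, v))                # step 4.2.5: these edges form the cut
    return tranzero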
The beneficial effects of the invention are as follows:
the invention solves the problem of reconstruction failure caused by data-dependent color information in the prior art, solves the problem of difficult detection of complex window structures, greatly improves the effectiveness, stability and accuracy of the wall reconstruction work, has better robustness, and enriches the method system of computer graphics and visual intelligence.
Drawings
FIG. 1 is a contour point of wall point cloud data obtained in step 1 of a wall surface reconstruction method in an indoor environment according to the present invention;
FIG. 2 is an initial characteristic line of a wall obtained in step 2 of a wall reconstruction method in an indoor environment according to the present invention;
FIG. 3 is a final characteristic line of the wall obtained in step 2 of the wall reconstruction method in an indoor environment according to the present invention;
fig. 4 is a diagram of dividing the AABB bounding box of the wall into individual cells in step 3 of the wall reconstruction method in an indoor environment according to the present invention;
FIG. 5 is a graph showing the result of marking cells in step 3 of a wall reconstruction method in an indoor environment according to the present invention;
fig. 6 is a result of reconstructing a complete wall in step 4 of a wall reconstruction method in an indoor environment according to the present invention.
Detailed Description
The invention will be described in detail below with reference to the drawings and the detailed description.
The invention discloses a wall surface reconstruction method in an indoor environment, which is implemented according to the following steps:
step 1, extracting contour points of wall point cloud data, as shown in fig. 1, specifically:
step 1.1, obtaining normal vectors of each point in wall point cloud data through PCA principal component analysis, wherein the normal vectors are specifically as follows:
Step 1.1.1, the wall point cloud data is P = {p_1, p_2, ..., p_n}, where p_i denotes the coordinate information (x_i, y_i, z_i) of the i-th point in the point cloud file, i = 1, 2, 3, ..., n, and n is the number of wall points in the wall point cloud data;
Step 1.1.2, for a point p_i of the wall point cloud data, query its k nearest neighboring points using a KD-tree; these k points are defined as the k-neighborhood of the sampling point;
Step 1.1.3, construct the covariances cov(X,X), cov(X,Y), cov(X,Z), cov(Y,Y), cov(Y,Z), cov(Z,Z) from the k neighboring points, where q is a neighboring point of the three-dimensional data point p_i, and q_x, q_y, q_z denote the x, y and z coordinates of that point;
Step 1.1.4, establish the covariance matrix C_i from these covariances;
Step 1.1.5, from the covariance matrix C_i compute the three eigenvalues λ_1, λ_2, λ_3 with λ_1 ≥ λ_2 ≥ λ_3 and the corresponding eigenvectors ω_1, ω_2, ω_3; the eigenvector ω_3 = (x_n, y_n, z_n) corresponding to the smallest eigenvalue λ_3 is the normal vector of point p_i, denoted n_i;
Step 1.1.6, repeat steps 1.1.2 to 1.1.5 to obtain the normal vectors of all points;
step 1.2, determining contour points of wall point cloud data according to the normal vector of the points obtained in the step 1.1, wherein the contour points specifically comprise:
Step 1.2.1, for each point p_i of the wall data, use its coordinates and normal vector n_i to obtain the local plane at that point, π_i: x_n(x - x_i) + y_n(y - y_i) + z_n(z - z_i) = 0;
Step 1.2.2, following the direction of the plane normal vector, project the neighboring points of p_i onto the local plane π_i, and record the projection of a neighboring point q_j as q'_j = (x_q, y_q, z_q);
Step 1.2.3, connect p_i with any one projection point q'_j, take the direction of this connecting line as the x-axis, obtain the y-axis from the condition that the x-axis and y-axis are mutually perpendicular, and establish a local coordinate system on the local plane;
Step 1.2.4, connect all projection points with the local coordinate origin, compute the included angles between all adjacent connecting lines, and if the maximum included angle is greater than a given threshold, mark the point p_i as a boundary point;
Step 1.2.5, all boundary points marked in step 1.2.4 are contour points, denoted a_m; they are placed in the set {a_1, a_2, ..., a_m}, where m is the number of contour points.
Step 2, obtaining an initial wall characteristic line according to the contour points of the wall extracted in the step 1, filtering redundant lines of the obtained initial wall characteristic line to obtain a final wall characteristic line, and forming a two-dimensional unit complex; the initial wall characteristic line obtained according to the contour points of the wall extracted in the step 1 is specifically:
step 2.1.1, respectively determining the maximum value and the minimum value of the x coordinate and the y coordinate of the wall contour point, and marking as maxx, minx, maxy and miny;
Step 2.1.2, mark each wall contour point a_i and initialize the marks to 0: pointx[a_i] = 0 indicates that the i-th contour point has not yet been processed in the x-axis operation, and pointy[a_i] = 0 indicates that the i-th contour point has not yet been processed in the y-axis operation;
Step 2.1.3, for a wall contour point a_i, find its neighboring points: if pointx[a_i] = 0, go to step 2.1.4; otherwise select the next contour point and repeat the mark check;
Step 2.1.4, compute the difference between the x coordinate value of contour point a_i and that of each remaining contour point; whenever the difference is less than 0.02, add 1 to the number of neighboring points of a_i;
Step 2.1.5, if the number of neighboring points of contour point a_i is greater than the threshold σ, set the pointx mark of those neighboring points to 1 and compute the average of the x coordinate values of a_i and its neighboring points, where the threshold σ is the number of contour points divided by the number of contour categories;
Step 2.1.6, determine an initial feature line from this average x value together with maxy and miny;
Step 2.1.7, repeat steps 2.1.3 to 2.1.6 to determine the set of initial feature lines in the x-axis direction, coordinatex = {coordinatex[1], coordinatex[2], ..., coordinatex[kx]}, where kx is the number of initial feature lines in the x-axis direction and coordinatex[kx] is an initial feature line in the x-axis direction;
Step 2.1.8, for a wall contour point a_i, find its neighboring points: if pointy[a_i] = 0, go to step 2.1.9; otherwise select the next contour point and repeat the mark check;
Step 2.1.9, compute the difference between the y coordinate value of contour point a_i and that of each remaining contour point; whenever the difference is less than 0.02, add 1 to the number of neighboring points of a_i;
Step 2.1.10, if the number of neighboring points of contour point a_i is greater than the threshold σ, set the pointy mark of those neighboring points to 1 and compute the average of the y coordinate values of a_i and its neighboring points, where the threshold σ is the number of contour points divided by the number of contour categories;
Step 2.1.11, determine an initial feature line from this average y value together with maxx and minx;
Step 2.1.12, repeat steps 2.1.8 to 2.1.11 to determine the set of initial feature lines in the y-axis direction, coordinatey = {coordinatey[1], coordinatey[2], ..., coordinatey[ky]}, where ky is the number of initial feature lines in the y-axis direction and coordinatey[ky] is an initial feature line in the y-axis direction, as shown in fig. 2;
the method comprises the steps of filtering redundant lines of the obtained initial wall surface characteristic lines to obtain final wall surface characteristic lines, and forming a two-dimensional unit complex specifically comprises the following steps:
Step 2.2.1, apply bubble sort to order the initial feature lines in the x-axis direction and in the y-axis direction respectively, and update the order of the initial feature lines;
Step 2.2.2, if the distance between two feature lines satisfies coordinatex[i] - coordinatex[j] <= 0.045, compute the average of the x values of the two feature lines, assign it to coordinatex[i], delete coordinatex[j], and update the initial feature lines in the x-axis direction;
Step 2.2.3, repeat step 2.2.2 until every initial feature line has been processed, obtaining the final wall feature lines in the x-axis direction;
Step 2.2.4, if the distance between two feature lines satisfies coordinatey[i] - coordinatey[j] <= 0.04, compute the average of the y values of the two feature lines, assign it to coordinatey[i], delete coordinatey[j], and update the initial feature lines in the y-axis direction;
Step 2.2.5, repeat step 2.2.4 until every initial feature line has been processed, obtaining the final wall feature lines in the y-axis direction, as shown in fig. 3;
Step 2.2.6, the final wall feature lines in the y-axis direction and the final wall feature lines in the x-axis direction together form the two-dimensional unit complex of the wall surface;
step 3, analyzing the space occupation condition of each wall unit of the two-dimensional unit complex formed in the step 2 and marking the state of each wall unit; the method comprises the following steps:
step 3.1, obtaining an AABB bounding box of the wall according to the contour points of the wall point cloud data extracted in the step 1, dividing the bounding box into smaller cells, and specifically performing the following steps:
step 3.1.1, respectively determining the maximum value and the minimum value of the x coordinate, the y coordinate and the z coordinate of the wall profile point, and marking as maxx, minx, maxy, miny, maxz and minz;
step 3.1.2, determining eight vertexes of the bounding box according to the maximum value and the minimum value of the x coordinate, the y coordinate and the z coordinate of the wall data obtained in the step 3.1.1, thereby obtaining the bounding box, as shown in fig. 4;
Step 3.1.3, determine the length, width and height of the cells according to the coordinate values of the wall point cloud data: the length of the cell is [formula], the width of the cell is [formula], and the height of the cell is maxz - minz;
step 3.1.4, dividing the AABB bounding box according to the cell data obtained in the step 3.1.3;
step 3.1.5, initializing the state of each cell, and marking as 0;
step 3.1.6, if the wall data point exists in the cell, changing the state of the cell to 1;
step 3.1.7, repeating step 3.1.6 to complete the marking of all the cells, as shown in fig. 5;
step 3.2, marking the space occupation condition of each unit of the two-dimensional unit complex obtained in the step 2 according to the step 3.1, wherein the space occupation condition is respectively empty and occupied, and the method specifically comprises the following steps of:
step 3.2.1, recording the number recordk of the cells with the state 0 and the number recordz of the cells with the state 1 obtained in the step 3.1 for the cells of a certain two-dimensional cell complex;
step 3.2.2, marking the unit of the two-dimensional unit complex as 0 if the recordk > recordz, indicating that the unit is empty, otherwise marking as 1, indicating that the unit is occupied;
step 3.2.3, repeating the step 3.2.2, marking all units of the two-dimensional unit complex, and completing analysis of space occupation conditions;
and 4, performing graph cutting operation on the wall data processed in the step 3, and reconstructing a complete wall structure, wherein the method specifically comprises the following steps:
step 4.1, establishing an undirected graph G according to the two-dimensional unit complex of the marked wall obtained in the step 3, wherein the undirected graph G specifically comprises the following steps:
Step 4.1.1, convert the wall units of each two-dimensional unit complex into vertices of an undirected graph G and add edge information between wall units that can be connected to each other. Each edge carries a weight, and the weight of an edge is defined by the values of the two wall units it connects, where the value of a wall unit, denoted d, is set according to the unit state marked in step 3: if the state is 1, i.e. the wall unit is occupied, the value is a random number between 400 and 500; if the state is 0, i.e. the wall unit is empty, the value is a random number between 1 and 100;
Step 4.1.2, determine the weight w_<p,q> of each edge between vertices of the undirected graph G according to formula (8),
where p and q denote two vertices of the undirected graph G, i.e. two wall units, d_p and d_q are the values of the points p and q respectively, δ^2 denotes the variance, Ex denotes the mean, and n is the number of vertices in the undirected graph G;
step 4.1.3, adding 2 terminal vertexes, namely a source point S and a sink point T, into the undirected graph G, wherein the rest vertexes are connected with the terminal vertexes;
step 4.1.4, randomly selecting two points in different states from the vertex of the undirected graph G as seed points, wherein the occupied state seed points are marked as zseed, and the empty state seed points are marked as kzeed;
Step 4.1.5, determine the weights of the edges between the seed points and the terminal vertices: the edge connecting zseed to the source point is given weight s, the edge connecting kzeed to the source point is given weight 0, the edge connecting zseed to the sink point is given weight 0, and the edge connecting kzeed to the sink point is given weight s, with s = 5000000;
Step 4.1.6, determine the weights of the edges between the remaining vertices (other than the seed points) and the terminal vertices according to formulas (11) and (12),
where w_(s-t) is the weight of the edge between a wall unit and the source point, w_(t-t) is the weight of the edge between a wall unit and the sink point, V denotes the set of vertices of the undirected graph G, β_w is a constant with value 10^4, K_H and K_L are empirical constants with values 0.9 and 0.2 respectively, l_p is the label of a wall unit, Pr(l_p) denotes the probability of each wall unit label, Q_p is the unit density equation, K is a normalization factor, B_p denotes the number of points on the unit boundary, Cv_p denotes the points of the unit excluding boundary points, S_p denotes the area of the unit, and rD is a scale factor with value 10^3.
And 4.2, carrying out energy minimization solution on the undirected graph G by using a maximum flow/minimum cut algorithm, finding an augmented path from a source point 'S' to a sink point 'T', using the augmented path as a minimum cut of the undirected graph G, finally mapping the minimum cut onto a wall, and reconstructing a complete wall structure, wherein the method comprises the following steps of:
Step 4.2.1, establish the energy equation of the undirected graph G, as shown in formula (15),
E = w_<p,q> + w_(s-t) + w_(t-t)    (15)
where w_<p,q> is the weight of the edges between wall units, w_(s-t) is the weight of the edges between wall units and the source point, and w_(t-t) is the weight of the edges between wall units and the sink point;
step 4.2.2, from the source point 'S' as a starting point, searching an augmented path which can reach the sink point 'T' in the undirected graph G by applying breadth-first search, wherein the maximum flow of the path is the minimum weight value in the edge;
step 4.2.3, subtracting the maximum flow of the path from all sides in the augmented path, updating the weight of the sides to an undirected graph G, and recording the sides with the size of 0 by using tranzero [ i ];
step 4.2.4, repeating 4.2.2-4.2.3 until there is no augmented path from source point "S" to sink point "T";
step 4.2.5, the edges recorded in tranzero [ i ] are the edges that need to be segmented;
step 4.2.6, the edges obtained in step 4.2.5 are mapped onto the wall, and the reconstruction of the wall is completed, as shown in fig. 6.
The invention studies the reconstruction of indoor-scene walls from point clouds containing only vertex coordinate information: contour points of the wall point cloud data are first extracted, wall feature lines are then extracted according to point density, the intersecting feature lines form a two-dimensional unit complex, the space occupancy of each wall unit is analyzed, and finally a Ford-Fulkerson-based graph cut algorithm determines the doors and windows, completing the reconstruction of walls in an indoor environment. The invention has a complete technical route, greatly improves effectiveness and stability, has good robustness, and enriches the methodology of computer graphics and visual intelligence.

Claims (6)

1. The wall surface reconstruction method in the indoor environment is characterized by comprising the following steps:
step 1, extracting contour points of wall point cloud data;
step 2, obtaining an initial wall characteristic line according to the contour points of the wall extracted in the step 1, filtering redundant lines of the obtained initial wall characteristic line to obtain a final wall characteristic line, and forming a two-dimensional unit complex;
step 3, analyzing the space occupation condition of each wall unit of the two-dimensional unit complex formed in the step 2 and marking the state of each wall unit; the method comprises the following steps:
step 3.1, obtaining an AABB bounding box of the wall according to the contour points of the wall point cloud data extracted in the step 1, dividing the bounding box into smaller cells, and specifically performing the following steps:
step 3.1.1, respectively determining the maximum value and the minimum value of the x coordinate, the y coordinate and the z coordinate of the wall profile point, and marking as maxx, minx, maxy, miny, maxz and minz;
step 3.1.2, determining eight vertexes of the bounding box according to the maximum value and the minimum value of the x coordinate, the y coordinate and the z coordinate of the wall data obtained in the step 3.1.1, so as to obtain the bounding box;
Step 3.1.3, determine the length, width and height of the cells according to the coordinate values of the wall point cloud data: the length of the cell is [formula], the width of the cell is [formula], and the height of the cell is maxz - minz;
step 3.1.4, dividing the AABB bounding box according to the cell data obtained in the step 3.1.3;
step 3.1.5, initializing the state of each cell, and marking as 0;
step 3.1.6, if the wall data point exists in the cell, changing the state of the cell to 1;
step 3.1.7, repeating the step 3.1.6 to finish marking all the cells;
step 3.2, marking the space occupation condition of each unit of the two-dimensional unit complex obtained in the step 2 according to the step 3.1, wherein the space occupation condition is respectively empty and occupied, and the method specifically comprises the following steps of:
step 3.2.1, recording the number recordk of the cells with the state 0 and the number recordz of the cells with the state 1 obtained in the step 3.1 for the cells of a certain two-dimensional cell complex;
step 3.2.2, marking the unit of the two-dimensional unit complex as 0 if the recordk > recordz, indicating that the unit is empty, otherwise marking as 1, indicating that the unit is occupied;
step 3.2.3, repeating the step 3.2.2, marking all units of the two-dimensional unit complex, and completing analysis of space occupation conditions;
step 4, performing graph cutting operation on the wall data processed in the step 3, and reconstructing a complete wall structure; the method comprises the following steps:
step 4.1, establishing an undirected graph G according to the two-dimensional unit complex of the marked wall obtained in the step 3; the method comprises the following steps:
Step 4.1.1, convert the wall units of each two-dimensional unit complex into vertices of an undirected graph G and add edge information between wall units that can be connected to each other. Each edge carries a weight, and the weight of an edge is defined by the values of the two wall units it connects, where the value of a wall unit, denoted d, is set according to the unit state marked in step 3: if the state is 1, i.e. the wall unit is occupied, the value is a random number between 400 and 500; if the state is 0, i.e. the wall unit is empty, the value is a random number between 1 and 100;
Step 4.1.2, determine the weight w_<p,q> of each edge between vertices of the undirected graph G according to formula (8),
where p and q denote two vertices of the undirected graph G, i.e. two wall units, d_p and d_q are the values of the points p and q respectively, δ^2 denotes the variance, Ex denotes the mean, and n is the number of vertices in the undirected graph G;
step 4.1.3, adding 2 terminal vertexes, namely a source point S and a sink point T, into the undirected graph G, wherein the rest vertexes are connected with the terminal vertexes;
step 4.1.4, randomly selecting two points in different states from the vertex of the undirected graph G as seed points, wherein the occupied state seed points are marked as zseed, and the empty state seed points are marked as kzeed;
Step 4.1.5, determine the weights of the edges between the seed points and the terminal vertices: the edge connecting zseed to the source point is given weight s, the edge connecting kzeed to the source point is given weight 0, the edge connecting zseed to the sink point is given weight 0, and the edge connecting kzeed to the sink point is given weight s, with s = 5000000;
Step 4.1.6, determine the weights of the edges between the remaining vertices (other than the seed points) and the terminal vertices according to formulas (11) and (12),
where w_(s-t) is the weight of the edge between a wall unit and the source point, w_(t-t) is the weight of the edge between a wall unit and the sink point, V denotes the set of vertices of the undirected graph G, β_w is a constant with value 10^4, K_H and K_L are empirical constants with values 0.9 and 0.2 respectively, l_p is the label of a wall unit, Pr(l_p) denotes the probability of each wall unit label, Q_p is the unit density equation, K is a normalization factor, B_p denotes the number of points on the unit boundary, Cv_p denotes the points of the unit excluding boundary points, S_p denotes the area of the unit, and rD is a scale factor with value 10^3;
Step 4.2, carrying out energy minimization solution on the undirected graph G by using a maximum flow/minimum cut algorithm, finding an augmented path from a source point S to a sink point T, using the augmented path as a minimum cut of the undirected graph G, mapping the minimum cut onto a wall, and reconstructing a complete wall structure; the method comprises the following steps:
Step 4.2.1, establish the energy equation of the undirected graph G, as shown in formula (15),
E = w_<p,q> + w_(s-t) + w_(t-t)    (15)
where w_<p,q> is the weight of the edges between wall units, w_(s-t) is the weight of the edges between wall units and the source point, and w_(t-t) is the weight of the edges between wall units and the sink point;
step 4.2.2, from the source point 'S' as a starting point, searching an augmented path which can reach the sink point 'T' in the undirected graph G by applying breadth-first search, wherein the maximum flow of the path is the minimum weight value in the edge;
step 4.2.3, subtracting the maximum flow of the path from all sides in the augmented path, updating the weight of the sides to an undirected graph G, and recording the sides with the size of 0 by using tranzero [ i ];
step 4.2.4, repeating 4.2.2-4.2.3 until there is no augmented path from source point "S" to sink point "T";
step 4.2.5, the edges recorded in tranzero[i] are the edges that need to be segmented;
and 4.2.6, mapping the edges obtained in the step 4.2.5 onto the wall to finish the reconstruction of the wall.
2. The method for reconstructing a wall surface in an indoor environment according to claim 1, wherein the step 1 specifically comprises:
step 1.1, obtaining normal vectors of each point in wall point cloud data through a PCA principal component analysis method;
and step 1.2, determining contour points of the wall point cloud data according to the normal vector of the points obtained in the step 1.1.
3. The method for reconstructing a wall surface in an indoor environment according to claim 2, wherein the step 1.1 specifically comprises:
Step 1.1.1, the wall point cloud data is P = {p_1, p_2, ..., p_n}, where p_i denotes the coordinate information (x_i, y_i, z_i) of the i-th point in the point cloud file, i = 1, 2, 3, ..., n, and n is the number of wall points in the wall point cloud data;
Step 1.1.2, for a point p_i of the wall point cloud data, query its k nearest neighboring points using a KD-tree; these k points are defined as the k-neighborhood of the sampling point;
Step 1.1.3, construct the covariances cov(X,X), cov(X,Y), cov(X,Z), cov(Y,Y), cov(Y,Z), cov(Z,Z) from the k neighboring points, where q is a neighboring point of the three-dimensional data point p_i, and q_x, q_y, q_z denote the x, y and z coordinates of that point;
Step 1.1.4, establish the covariance matrix C_i from these covariances;
Step 1.1.5, from the covariance matrix C_i compute the three eigenvalues λ_1, λ_2, λ_3 with λ_1 ≥ λ_2 ≥ λ_3 and the corresponding eigenvectors ω_1, ω_2, ω_3; the eigenvector ω_3 = (x_n, y_n, z_n) corresponding to the smallest eigenvalue λ_3 is the normal vector of point p_i, denoted n_i;
Step 1.1.6, repeat steps 1.1.2 to 1.1.5 to obtain the normal vectors of all points.
4. The method for reconstructing a wall surface in an indoor environment according to claim 3, wherein the step 1.2 specifically comprises:
Step 1.2.1, for each point p_i of the wall data, use its coordinates and normal vector n_i to obtain the local plane at that point, π_i: x_n(x - x_i) + y_n(y - y_i) + z_n(z - z_i) = 0;
Step 1.2.2, following the direction of the plane normal vector, project the neighboring points of p_i onto the local plane π_i, and record the projection of a neighboring point q_j as q'_j = (x_q, y_q, z_q);
Step 1.2.3, connect p_i with any one projection point q'_j, take the direction of this connecting line as the x-axis, obtain the y-axis from the condition that the x-axis and y-axis are mutually perpendicular, and establish a local coordinate system on the local plane;
Step 1.2.4, connect all projection points with the local coordinate origin, compute the included angles between all adjacent connecting lines, and if the maximum included angle is greater than a given threshold, mark the point p_i as a boundary point;
Step 1.2.5, all boundary points marked in step 1.2.4 are contour points, denoted a_m; they are placed in the set {a_1, a_2, ..., a_m}, where m is the number of contour points.
5. The method for reconstructing a wall surface in an indoor environment according to claim 4, wherein the obtaining the initial wall characteristic line according to the contour points of the wall extracted in step 1 in step 2 is specifically:
step 2.1.1, respectively determining the maximum value and the minimum value of the x coordinate and the y coordinate of the wall contour point, and marking as maxx, minx, maxy and miny;
Step 2.1.2, mark each wall contour point a_i and initialize the marks to 0: pointx[a_i] = 0 indicates that the i-th contour point has not yet been processed in the x-axis operation, and pointy[a_i] = 0 indicates that the i-th contour point has not yet been processed in the y-axis operation;
Step 2.1.3, for a wall contour point a_i, find its neighboring points: if pointx[a_i] = 0, go to step 2.1.4; otherwise select the next contour point and repeat the mark check;
Step 2.1.4, compute the difference between the x coordinate value of contour point a_i and that of each remaining contour point; whenever the difference is less than 0.02, add 1 to the number of neighboring points of a_i;
Step 2.1.5, if the number of neighboring points of contour point a_i is greater than the threshold σ, set the pointx mark of those neighboring points to 1 and compute the average of the x coordinate values of a_i and its neighboring points, where the threshold σ is the number of contour points divided by the number of contour categories;
Step 2.1.6, determine an initial feature line from this average x value together with maxy and miny;
Step 2.1.7, repeat steps 2.1.3 to 2.1.6 to determine the set of initial feature lines in the x-axis direction, coordinatex = {coordinatex[1], coordinatex[2], ..., coordinatex[kx]}, where kx is the number of initial feature lines in the x-axis direction and coordinatex[kx] is an initial feature line in the x-axis direction;
Step 2.1.8, for a wall contour point a_i, find its neighboring points: if pointy[a_i] = 0, go to step 2.1.9; otherwise select the next contour point and repeat the mark check;
Step 2.1.9, compute the difference between the y coordinate value of contour point a_i and that of each remaining contour point; whenever the difference is less than 0.02, add 1 to the number of neighboring points of a_i;
Step 2.1.10, if the number of neighboring points of contour point a_i is greater than the threshold σ, set the pointy mark of those neighboring points to 1 and compute the average of the y coordinate values of a_i and its neighboring points, where the threshold σ is the number of contour points divided by the number of contour categories;
Step 2.1.11, determine an initial feature line from this average y value together with maxx and minx;
Step 2.1.12, repeat steps 2.1.8 to 2.1.11 to determine the set of initial feature lines in the y-axis direction, coordinatey = {coordinatey[1], coordinatey[2], ..., coordinatey[ky]}, where ky is the number of initial feature lines in the y-axis direction and coordinatey[ky] is an initial feature line in the y-axis direction.
6. The method for reconstructing a wall surface in an indoor environment according to claim 5, wherein filtering the redundant lines of the obtained initial wall surface feature line in the step 2 to obtain a final wall surface feature line, and forming a two-dimensional unit complex specifically comprises:
Step 2.2.1, apply bubble sort to order the initial feature lines in the x-axis direction and in the y-axis direction respectively, and update the order of the initial feature lines;
Step 2.2.2, if the distance between two feature lines satisfies coordinatex[i] - coordinatex[j] <= 0.045, compute the average of the x values of the two feature lines, assign it to coordinatex[i], delete coordinatex[j], and update the initial feature lines in the x-axis direction;
Step 2.2.3, repeat step 2.2.2 until every initial feature line has been processed, obtaining the final wall feature lines in the x-axis direction;
Step 2.2.4, if the distance between two feature lines satisfies coordinatey[i] - coordinatey[j] <= 0.04, compute the average of the y values of the two feature lines, assign it to coordinatey[i], delete coordinatey[j], and update the initial feature lines in the y-axis direction;
Step 2.2.5, repeat step 2.2.4 until every initial feature line has been processed, obtaining the final wall feature lines in the y-axis direction;
Step 2.2.6, the final wall feature lines in the y-axis direction and the final wall feature lines in the x-axis direction together form the two-dimensional unit complex of the wall surface.
CN202010897546.7A 2020-08-31 2020-08-31 Wall reconstruction method in indoor environment Active CN112085834B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010897546.7A CN112085834B (en) 2020-08-31 2020-08-31 Wall reconstruction method in indoor environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010897546.7A CN112085834B (en) 2020-08-31 2020-08-31 Wall reconstruction method in indoor environment

Publications (2)

Publication Number Publication Date
CN112085834A CN112085834A (en) 2020-12-15
CN112085834B true CN112085834B (en) 2024-04-09

Family

ID=73731398

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010897546.7A Active CN112085834B (en) 2020-08-31 2020-08-31 Wall reconstruction method in indoor environment

Country Status (1)

Country Link
CN (1) CN112085834B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009143986A1 (en) * 2008-05-27 2009-12-03 The Provost, Fellows And Scholars Of The College Of The Holy And Undivided Trinity Of Queen Elizabeth Near Dublin Automated building outline detection
CN107146280A (en) * 2017-05-09 2017-09-08 西安理工大学 A kind of point cloud building method for reconstructing based on cutting

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009143986A1 (en) * 2008-05-27 2009-12-03 The Provost, Fellows And Scholars Of The College Of The Holy And Undivided Trinity Of Queen Elizabeth Near Dublin Automated building outline detection
CN107146280A (en) * 2017-05-09 2017-09-08 西安理工大学 A kind of point cloud building method for reconstructing based on cutting

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
基于基本形状及其拓扑关系的点云建筑物重建方法 (Point cloud building reconstruction method based on basic shapes and their topological relations); 郝雯; 王映辉; 宁小娟; 西安理工大学学报 (Journal of Xi'an University of Technology) (03); full text *

Also Published As

Publication number Publication date
CN112085834A (en) 2020-12-15

Similar Documents

Publication Publication Date Title
CN110189412B (en) Multi-floor indoor structured three-dimensional modeling method and system based on laser point cloud
CN111932688B (en) Indoor plane element extraction method, system and equipment based on three-dimensional point cloud
CN107146280B (en) Point cloud building reconstruction method based on segmentation
Zhang et al. Online structure analysis for real-time indoor scene reconstruction
CN107292276B (en) Vehicle-mounted point cloud clustering method and system
Xu et al. Reconstruction of scaffolds from a photogrammetric point cloud of construction sites using a novel 3D local feature descriptor
CN107767453B (en) Building LIDAR point cloud reconstruction optimization method based on rule constraint
CN111986322B (en) Point cloud indoor scene layout reconstruction method based on structural analysis
CN115761172A (en) Single building three-dimensional reconstruction method based on point cloud semantic segmentation and structure fitting
Lee et al. Automatic integration of facade textures into 3D building models with a projective geometry based line clustering
CN110222642A (en) A kind of planar architectural component point cloud contour extraction method based on global figure cluster
CN111581776B (en) Iso-geometric analysis method based on geometric reconstruction model
Galvanin et al. Extraction of building roof contours from LiDAR data using a Markov-random-field-based approach
CN105139379A (en) Airborne Lidar point cloud building top surface gradual extraction method based on classifying and laying
CN111340822B (en) Multi-scale self-adaptive airborne LiDAR point cloud building single segmentation method
CN114359226A (en) Three-dimensional model set visual area extraction method based on hierarchical superposition and region growth
Han et al. Urban scene LOD vectorized modeling from photogrammetry meshes
CN113933859A (en) Pavement and two-side retaining wall detection method for unmanned mine card driving scene
CN114463338A (en) Automatic building laser foot point extraction method based on graph cutting and post-processing
Zhao et al. A 3D modeling method for buildings based on LiDAR point cloud and DLG
CN113129443A (en) Three-dimensional building simplification method and system for maintaining topological relation
Sun et al. Automated segmentation of LiDAR point clouds for building rooftop extraction
Hao et al. Automatic building extraction from terrestrial laser scanning data
CN112085834B (en) Wall reconstruction method in indoor environment
Xie et al. Automatic indoor building reconstruction from mobile laser scanning data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant