LU503531B1 - Building outline extraction method based on laser point cloud - Google Patents
- Publication number: LU503531B1
- Authority
- LU
- Luxembourg
- Prior art keywords
- point cloud
- point
- building outline
- building
- cloud data
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/176—Urban or other man-made structures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4802—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4808—Evaluating distance, position or velocity data
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
- G06T2207/30184—Infrastructure
Abstract
Disclosed is a building outline extraction method based on a laser point cloud. The present invention projects all original point cloud data onto the xoy plane. Because the point cloud preserves building walls well, the projected wall points are much denser than the surrounding areas; filtering the projected data therefore removes the low-density regions, leaving only the building outline point cloud and other high-density noise point clouds. A boundary extraction algorithm based on normal estimation is then used to extract the boundaries of the building outline point cloud and the noise point clouds, and the resulting boundary points are fitted with straight lines and partitioned into line segments for output, so that the building outline point cloud is obtained accurately.
Description
BL-5625
BUILDING OUTLINE EXTRACTION METHOD BASED ON LASER POINT CLOUD
[01] The present invention relates to the technical field of building outline extraction, and particularly relates to a building outline extraction method based on a laser point cloud.
[02] Urban three-dimensional modeling is an important step in the construction of digital cities and real-scene models. Buildings are among the main bodies of a city and an important element of urban three-dimensional modeling. The peripheral outline of a building is important information for constructing a three-dimensional building model, and building outline information can also be applied to urban spatial analysis, map updating, etc. Therefore, it is of great significance to extract building outlines efficiently and accurately.
[03] Airborne LiDAR technology is widely used in urban planning, terrain mapping, natural disaster prevention, urban three-dimensional modeling, etc. Most existing algorithms extract the outline from a separated building point cloud: first classifying ground points and non-ground points, then extracting the building point cloud, and finally extracting the building outline. However, each step of this processing chain loses some accuracy, so the final building outline extraction result is inaccurate.
[04] In order to solve the above problem existing in the art, the present invention provides a building outline extraction method based on a laser point cloud.
[05] To implement the above objective, the present invention provides the following solution:
[06] A building outline extraction method based on a laser point cloud includes:
[07] obtaining initial point cloud data of a building to be extracted;
[08] projecting the initial point cloud data onto an xoy plane to obtain three-dimensional point cloud data;
[09] filtering the three-dimensional point cloud data to obtain filtered point cloud data;
[10] using a boundary extraction algorithm based on normal estimation to extract the filtered point cloud data to obtain a boundary point cloud;
[11] fitting the boundary point cloud by a straight line to obtain a linear point cloud; and
[12] partitioning the linear point cloud into line segments to obtain a building outline point cloud line segment result.
[13] Based on specific embodiments provided in the present invention, the present invention has the following technical effects:
[14] The present invention projects all initial point cloud data onto the xoy plane. With high precision and high density, the point cloud preserves building walls well, and after projection onto the plane, the wall parts are much denser than other areas. Filtering the point cloud data removes point clouds of lower-density areas, leaving only the building outline point cloud and other high-density noise point clouds. A boundary extraction algorithm based on normal estimation is then used to extract the boundaries of the building outline point cloud and the noise point clouds, and the resulting boundary points are fitted with straight lines and partitioned into line segments for output, so that the building outline point cloud is finally obtained accurately.
[15] In order to explain the technical solutions in examples of the present invention or in the prior art more clearly, the accompanying drawings required in the examples will be described below briefly. Apparently, the accompanying drawings in the following description show merely some examples of the present invention, and other drawings can be derived from these accompanying drawings by those of ordinary skill in the art without creative efforts.
[16] FIG. 1 is a flowchart of a building outline extraction method based on a laser point cloud according to the present invention;
[17] FIG. 2 is a flowchart of an implementation of a building outline extraction method based on a laser point cloud according to an example of the present invention;
[18] FIG. 3 is a projected point cloud data map according to an example of the present invention;
[19] FIG. 4 is a point cloud data map after statistical outlier removal (SOR) filtering according to an example of the present invention;
[20] FIG. 5 is a schematic diagram of vector included angles according to an example of the present invention;
[21] FIG. 6 shows boundary-extracted point cloud data maps according to an example of the present invention; where part (a) in FIG. 6 is an initial point cloud data map; part (b) in FIG. 6 is a boundary-extracted point cloud data map when the angle threshold is π/2; part (c) in FIG. 6 is a boundary-extracted point cloud data map when the angle threshold is π/4; and part (d) in FIG. 6 is a boundary-extracted point cloud data map when the angle threshold is π/6;
[22] FIG. 7 is a schematic diagram of applying the random sample consensus (RANSAC) algorithm to line extraction according to an example of the present invention; where part (a) in FIG. 7 shows the RANSAC algorithm applied to point cloud line extraction with less interference; and part (b) in FIG. 7 shows the RANSAC algorithm applied to point cloud line extraction with larger actual interference;
[23] FIG. 8 is a schematic diagram of a straight line containing a plurality of building outlines according to an example of the present invention;
[24] FIG. 9 is a flowchart of a line partitioning algorithm according to an example of the present invention;
[25] FIG. 10 is a schematic diagram of point cloud block partitioning according to an example of the present invention; where part (a) in FIG. 10 is a schematic diagram of correct block partitioning; and part (b) in FIG. 10 is a schematic diagram of wrong block partitioning;
[26] FIG. 11 is a rendering diagram of point cloud data according to an example of the present invention;
[27] FIG. 12 is a schematic diagram of a point cloud projected onto the xoy plane according to an example of the present invention;
[28] FIG. 13 is a schematic diagram of point cloud data obtained by using SOR filtering to filter out the ground point cloud and the point cloud inside the roof after projection according to an example of the present invention;
[29] FIG. 14 is a schematic diagram of point cloud data when the boundary extraction threshold is π/6 according to an example of the present invention;
[30] FIG. 15 is a schematic diagram of an extracted outline according to an example of the present invention; and
[31] FIG. 16 is a schematic diagram of twenty representative buildings according to an example of the present invention.
[32] The technical solutions of examples of the present invention will be described below clearly and comprehensively in conjunction with accompanying drawings of the examples of the present invention. Apparently, the examples described are merely some examples rather than all examples of the present invention. Based on the examples of the present invention, all other examples acquired by those of ordinary skill in the art without making creative efforts fall within the scope of protection of the present invention.
[33] The objective of the present invention is to provide a building outline extraction method based on a laser point cloud, so as to improve the accuracy of building outline extraction.
[34] To make the foregoing objective, features, and advantages of the present invention clearer and more comprehensible, the present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.
[35] As shown in FIGs. 1 and 2, the building outline extraction method based on a laser point cloud provided in the present invention includes:
[36] Step 100: obtain initial point cloud data of a building to be extracted. The obtained initial point cloud data is an unmanned aerial vehicle-mounted LiDAR point cloud.
[37] Step 101: project the initial point cloud data onto the xoy plane to obtain three-dimensional point cloud data.
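As a minimal illustration of this projection step (a sketch under the assumption that points are stored as an (N, 3) array; the function name is illustrative, not the patent's code), the z coordinate of every point can simply be zeroed:

```python
import numpy as np

def project_to_xoy(points):
    """Project an (N, 3) point cloud onto the xoy plane by zeroing z.

    The coordinates stay three-dimensional, matching the description
    of the projected data in the text. (Illustrative sketch only.)
    """
    projected = np.asarray(points, dtype=float).copy()
    projected[:, 2] = 0.0
    return projected

# Two wall points at different heights collapse into the same planar
# neighborhood, which is what makes walls dense after projection.
cloud = np.array([[1.0, 2.0, 0.5],
                  [1.0, 2.0, 9.5],
                  [4.0, 7.0, 3.0]])
flat = project_to_xoy(cloud)
print(flat[:, 2])  # [0. 0. 0.]
```

Points that differ only in height become coincident in the plane, so vertical walls turn into high-density planar clusters.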
[38] Step 102: filter the three-dimensional point cloud data to obtain filtered point cloud data. With high precision and high density, the point cloud preserves building walls well, and after projection onto the plane, the wall parts are much denser than other areas. Therefore, in the present invention, a statistical outlier removal (SOR) technique may be used to filter the three-dimensional point cloud data to obtain filtered point cloud data; point clouds of lower-density areas may thus be removed, and only the building outline point cloud and other high-density noise point clouds are left.
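A minimal SOR filter along these lines can be sketched as follows (an illustrative implementation; the parameter names `k` and `m` and the use of a KD-tree are assumptions, not the patent's code):

```python
import numpy as np
from scipy.spatial import cKDTree

def sor_filter(points, k=8, m=1.0):
    """Statistical outlier removal: keep points whose mean distance to
    their k nearest neighbors lies within m standard deviations of the
    global mean of those distances."""
    points = np.asarray(points, dtype=float)
    tree = cKDTree(points)
    # query k+1 neighbors because the closest "neighbor" is the point itself
    dists, _ = tree.query(points, k=k + 1)
    mean_d = dists[:, 1:].mean(axis=1)
    keep = mean_d <= mean_d.mean() + m * mean_d.std()
    return points[keep]

# A dense 5x5 planar grid plus one isolated point: the isolated point's
# neighbor distances are far above the global mean, so it is removed.
xs, ys = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
grid = np.c_[xs.ravel(), ys.ravel()]
cloud = np.vstack([grid, [[100.0, 100.0]]])
filtered = sor_filter(cloud)
print(len(cloud), "->", len(filtered))  # 26 -> 25
```

On projected data, sparse ground and roof-interior points behave like the isolated point here, while dense wall points survive the cut.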
[39] Step 103: use a boundary extraction algorithm based on normal estimation to extract the filtered point cloud data to obtain a boundary point cloud. The obtained boundary point cloud consists of the building outline and the boundaries of the noise point clouds.
[40] As shown in FIG. 4, in the point cloud data after SOR filtering, only the points of interest (the building outline point cloud) and the noise point clouds (for instance, a vegetation tree point cloud) are left, but these cannot be directly fed to random sample consensus (RANSAC) linear fitting to extract the building outline. The present invention therefore performs boundary extraction once on the point cloud data after SOR filtering. For a large vegetation point cloud, the boundary extraction removes the points inside the outline of the vegetation point cloud, so only the outline of the vegetation point cloud is left, while the building outline point cloud is unchanged. The resulting point cloud data is then fed to RANSAC linear fitting; each fitted straight line is partitioned by the partitioning algorithm into straight line segment point clouds; a length threshold and a point number threshold are set for the segments, and segments that do not meet the requirements are removed, thereby implementing building outline extraction.
[41] Current point cloud boundary extraction mainly identifies boundary points either from an average effect of the k-neighborhood point set on the sampling point, or from the included angles between the vectors formed by the sampling point and its k-neighborhood. The boundary extraction algorithm in this step of the present invention is based on the maximum included angle between the vectors, specifically:
[42] The k nearest points of the sampling point p are obtained; with p as the center, a vector to each nearest point is formed; one vector is selected as the reference direction and rotated clockwise towards the other vectors, yielding k rotation angles α_i, as shown in FIG. 5. The included angle θ_i between adjacent vectors may be computed as follows:
[43] θ_i = α_(i+1) − α_i, i = 1, 2, ..., k−1; θ_k = 2π − α_k    (1)
[44] An angle threshold t_θ is set, and when max(θ_i) > t_θ, the point is determined to be a boundary point.
[45] The angle threshold t_θ determines the effect of boundary extraction: the larger the angle threshold, the finer and clearer the extracted boundary, and the fewer points are reserved. In the present invention, the purpose of boundary extraction is to remove the points inside the tree point cloud, and the requirement for boundary accuracy is not high. A larger angle threshold may instead cause the building outline point cloud to be removed; therefore, the threshold may be set small, so as to remove most of the points inside the tree point cloud. Experiments show that when the angle threshold is set to π/6, the majority of points inside the tree point cloud can be effectively removed while the building outline point cloud is preserved, as shown in FIG. 6.
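The maximum-angle criterion above can be sketched in 2-D as follows (an illustrative implementation; sorting the `arctan2` angles of the neighbor vectors is equivalent to rotating from a reference vector):

```python
import numpy as np

def is_boundary_point(p, neighbors, t_theta=np.pi / 6):
    """Return True if the largest angular gap between the vectors from
    p to its k nearest neighbors exceeds t_theta."""
    v = np.asarray(neighbors, dtype=float) - np.asarray(p, dtype=float)
    alpha = np.sort(np.arctan2(v[:, 1], v[:, 0]))  # sorted vector angles
    gaps = np.diff(alpha)                          # gaps between adjacent vectors
    closing = 2 * np.pi - (alpha[-1] - alpha[0])   # gap that closes the circle
    return float(max(gaps.max(), closing)) > t_theta

# An interior point is surrounded on all sides (small gaps); a point on
# an edge has a large empty angular sector behind it.
ang = np.linspace(0, 2 * np.pi, 16, endpoint=False)
ring = np.c_[np.cos(ang), np.sin(ang)]     # 16 evenly spread neighbors
half = ring[ring[:, 0] >= 0]               # neighbors only on one side

print(is_boundary_point([0, 0], ring))     # False (all gaps = pi/8)
print(is_boundary_point([0, 0], half))     # True  (a gap of about pi)
```

Interior points of a dense cluster have neighbors all around them and are discarded; points on the rim of the cluster expose a large empty sector and are kept.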
[46] Step 104: fit the boundary point cloud by a straight line to obtain a linear point cloud. Based on the above description, the present invention uses the RANSAC algorithm, an effective method for linear fitting, to fit the boundary point cloud and obtain a boundary line.
[47] However, the algorithm has the following disadvantages when applied to building outline extraction from the projected point cloud:
[48] (1) The algorithm can only fit a correct linear point cloud when interference is small. The projected point cloud usually contains large noise point clouds, whose influence persists even after SOR filtering and boundary extraction, as shown in FIG. 7.
[49] (2) The algorithm fits a straight line over the whole point cloud surface; the fitted straight line usually runs through the entire point cloud, whereas a building outline is usually only a segment of that line, or one line contains several building outline point cloud segments. Therefore, the building outline point cloud cannot be effectively extracted by linear fitting alone, as shown in FIG. 8.
[50] (3) Since the RANSAC linear fitting algorithm randomly selects points from the entire point cloud region for fitting, it is inefficient and time-consuming; moreover, different regions of the point cloud adapt differently to the distance threshold, and setting one uniform threshold generally produces unbalanced results.
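For reference, the basic RANSAC line fit discussed above might be sketched as follows (an illustrative minimal implementation; iteration count, tolerance and function name are assumptions, not the patent's code):

```python
import numpy as np

def ransac_line(points, n_iter=200, tol=0.05, seed=0):
    """Minimal RANSAC line fitting on 2-D points: repeatedly pick two
    points, count points within tol of the implied line, and return
    the inlier set of the best candidate."""
    points = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    best = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        a, b = points[rng.choice(len(points), 2, replace=False)]
        d = b - a
        norm = np.hypot(d[0], d[1])
        if norm == 0.0:
            continue
        # perpendicular distance of every point to the line through a and b
        dist = np.abs(d[0] * (points[:, 1] - a[1])
                      - d[1] * (points[:, 0] - a[0])) / norm
        inliers = dist < tol
        if inliers.sum() > best.sum():
            best = inliers
    return points[best]

# 20 collinear points on y = 2x plus 5 scattered points: the consensus
# set is the collinear subset.
t = np.linspace(0, 1, 20)
cloud = np.vstack([np.c_[t, 2 * t],
                   [[0.2, 1.5], [0.8, 0.3], [0.5, 2.0],
                    [0.1, 0.9], [0.9, 0.5]]])
line = ransac_line(cloud)
print(len(line))  # 20
```

Note that the returned inliers span the whole line, which is exactly disadvantage (2): a building outline is typically only part of such a consensus set.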
[51] Step 105: partition the linear point cloud into line segments to obtain a building outline point cloud line segment result. To overcome the shortcomings described in step 104, this step of the present invention proposes a straight line partitioning algorithm, whose flow is as follows:
[52] 1) Linear fitting: use the preprocessed point cloud data P_pre to fit a linear point cloud R_line by means of the RANSAC linear fitting algorithm.
[53] 2) Compute the included angle λ_angle with the x-axis and determine the sorting mode: obtain the linear model parameters (direction vector components v_x and v_y of the straight line) of the linear point cloud R_line, and compute the included angle between the straight line and the x-axis: λ_angle = arctan(v_y / v_x). Under the condition that λ_angle ≤ π/4, sort the points in R_line in ascending order according to their x values, and under the condition that λ_angle > π/4, sort R_line in ascending order according to the y values of the points.
[54] 3) Compute a difference for partitioning the straight line: compute the x or y difference between the (i+1)-th point and the i-th point of the sorted linear point cloud R_line:
[55] dis = x_(i+1) − x_i, if λ_angle ≤ π/4; dis = y_(i+1) − y_i, if λ_angle > π/4    (2)
[56] Under the condition that dis is less than the threshold T_dis, add the i-th point to the line segment point cloud Seg_line; under the condition that it is greater than the threshold, take the current points in Seg_line as a line segment point cloud and enter step 4).
[57] 4) Set a point number threshold min_points and a line segment length threshold. Under the condition that the number of points and the length of Seg_line are both greater than the two above thresholds, output Seg_line; under the condition that either threshold is not satisfied, add Seg_line to the point cloud set SegOut_line. Then empty Seg_line to prepare the next line segment point cloud.
[58] 5) After traversing the entire linear point cloud R_line, return SegOut_line to the point cloud P_pre, fit the next straight line, and repeat steps 1)-5).
[59] 6) Set a maximum value Iter_max for an iteration counter Iter_num. In the process of line segment partitioning, under the condition that no line segment satisfying the conditions is obtained from an entire straight line, the entire straight line is returned to P_pre to participate in the next linear fitting; this means that the point cloud participating in this linear fitting is the same as the point cloud participating in the next one, and each time this occurs, 1 is added to Iter_num. If a line segment is obtained by partitioning in some iteration, Iter_num is cleared. When Iter_num = N, no qualified line segment has been obtained for N consecutive partitionings, and when Iter_num exceeds the set maximum number of iterations, the entire algorithm flow ends. The overall algorithm flow is shown in FIG. 9.
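The sorting and gap-partitioning steps above can be sketched as follows (an illustrative implementation; the threshold names and the bounding-box direction estimate are assumptions, not the patent's code):

```python
import numpy as np

def split_line_points(pts, t_dis=0.8, min_points=3, min_length=2.0):
    """Sort a fitted linear point cloud along its dominant axis, cut it
    where consecutive coordinate differences exceed t_dis, and keep the
    segments satisfying both the point-count and length thresholds."""
    pts = np.asarray(pts, dtype=float)
    extent = pts.max(axis=0) - pts.min(axis=0)      # crude direction proxy
    angle = np.arctan2(extent[1], extent[0])
    axis = 0 if abs(angle) <= np.pi / 4 else 1      # sort by x, else by y
    ordered = pts[np.argsort(pts[:, axis])]
    gaps = np.diff(ordered[:, axis])
    pieces = np.split(ordered, np.where(gaps > t_dis)[0] + 1)
    return [seg for seg in pieces
            if len(seg) >= min_points
            and np.linalg.norm(seg[-1] - seg[0]) >= min_length]

# Points along the x-axis in three clusters; the 2-point middle cluster
# fails the point-count threshold, so two segments survive.
xs = np.concatenate([np.arange(0, 3.5, 0.5),     # 7 points, length 3
                     [10.0, 10.4],               # 2 points -> rejected
                     np.arange(20, 23.5, 0.5)])  # 7 points, length 3
segments = split_line_points(np.c_[xs, np.zeros_like(xs)])
print(len(segments))  # 2
```

The double threshold (point count and length) is what discards short spurious runs, such as fragments of a vegetation outline that happen to lie near the fitted line.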
[60] Moreover, since the RANSAC linear fitting algorithm selects points randomly from the entire point cloud region for fitting, the more points there are, the lower the probability of selecting correct points each time and the more points participate in each fitting computation. The relation between point count and running time is not simply proportional: as the number of points increases, the running time grows rapidly. To solve this problem, the present invention may also partition the point cloud data into blocks so as to improve the running efficiency of the algorithm. It is to be noted that during partitioning, a short building outline point cloud should not be divided into two point cloud blocks: after such a division, the segment length or point count is further reduced, the threshold requirements are no longer satisfied, and extraction of the short building outline point cloud finally fails, as shown in FIG. 10.
[61] A traditional building outline point cloud extraction algorithm is easily interfered with by high vegetation point clouds, especially in the stage of building point cloud extraction. Based on the above description, the present invention maps the initial three-dimensional point cloud into a two-dimensional plane; the mapped point cloud falls into two cases: one is the sparse point cloud inside the building outline together with the ground point cloud, and the other is the building outline point cloud and the vegetation point cloud with higher point cloud density. In the two-dimensional plane, the point cloud inside the building outline and the ground point cloud may be effectively removed by a point cloud denoising algorithm. For the mapped vegetation point cloud, its boundary is extracted by means of a scattered point cloud boundary extraction algorithm, so as to remove the points inside the outline of the vegetation point cloud. For the remaining building outline point cloud and irregular vegetation outline point cloud, the linear fitting algorithm is improved: the fitted linear point cloud is partitioned into line segments, the length and point count of the point cloud line segments are limited by double thresholds, the segments satisfying both thresholds are output, and the building point cloud outline extraction is finally completed.
[62] In order to verify the above building outline extraction method based on a laser point cloud, an unmanned aerial vehicle-mounted LiDAR point cloud of a certain area is selected as experimental data, and the original unmanned aerial vehicle point cloud is down-sampled. After down-sampling, the average point distance is 0.7 m and the number of points is about 550,000. The point cloud data contains the ground, roads, and vegetation including low shrubbery, medium vegetation and large trees. The building structure is also complicated, comprising a building main body, an atrium and a corridor, as shown in FIG. 11.
[63] Firstly, the original point cloud is projected onto the xoy plane, as shown in FIG. 12; then SOR filtering is performed to filter out the projected ground point cloud and the point cloud inside the roof, as shown in FIG. 13; and the point cloud after boundary extraction with the boundary extraction threshold set to π/6 is shown in FIG. 14.
[64] The improved RANSAC algorithm is used to fit straight lines and extract the building outline point cloud. The present invention first partitions each fitted straight line into several line segments, the basis of partitioning being the adjacent-point distance: when the point distance is greater than a threshold, the linear point cloud is partitioned, and the partitioning distance threshold is set to 0.8 m. For each partitioned line segment, the segment length and the number of segment points are limited; the output length threshold of a segment is set to 2 m and the segment point number threshold to 60; finally the building outline is extracted, as shown in FIG. 15.
[65] In conjunction with FIG. 15 and Table 1, it can be seen that a small portion of non-building points are misclassified as building outline because of regular interferences in the survey area, such as temporary tents; and some buildings have complex roof structures whose interference is severe after point cloud projection. Some building outline points are mistaken for non-building points, mainly at the junction of the atrium and the corridor, where this portion of the outline is short and is misclassified as non-building points. Moreover, large holes in the elevation point cloud caused by shielding during unmanned aerial vehicle data collection may also cause this phenomenon. Visual comparison with satellite images shows that the general building outline in the survey area is extracted well, and the effect in regions with more serious vegetation interference is also desirable, which verifies the feasibility of the above method provided in the present invention.
[66] Table 1 Comparison table of present invention result with reference data

                                   Reference data   Extraction result in the present invention
                                                    Total     Correct    Wrong    Relative error
Building outline points/pcs        65019            62952     60982      1970     3.12%
Non-building outline points/pcs    385786           387853    383816     4037     1.04%
[67] In order to further quantitatively evaluate the accuracy of extracting a building outline point cloud by the above method provided in the present invention, a building outline point cloud in the experimental region is manually extracted as reference data for accuracy evaluation, by referring to an existing accuracy evaluation method in combination with the actual buildings in the survey area. The quality factor Q, precision rate, accurate rate, recall rate, shape similarity and position accuracy are used to evaluate the accuracy of the extraction result.
[68] (1) Evaluation index of image classification accuracy
[69] 1) Quality factor
[70] Q = A / (A + B + C)    (3)
[71] Where: A is a building outline point cloud correctly extracted in the present invention, B is a building outline point cloud misclassified in the present invention, and C is a misclassified non-building outline point cloud, that is, a point cloud which should belong to a building outline but is not extracted in the present invention.
[72] 2) Precision rate
[73] Ac = (A + D) / T    (4)
[74] Where: D is a non-building outline point cloud correctly classified, and T is the total number of point clouds.
[75] 3) Accurate rate
[76] P = A / (A + B)    (5)
[77] 4) Recall rate
[78] Ca = A / (A + C)    (6)
[79] Building outline accuracy evaluation values obtained based on Table 1 are shown in Table 2.
[80] Table 2 Building outline (image classification index) accuracy evaluation table

Evaluation index    Value
Q                   0.910
Ac                  0.986
P                   0.968
Ca                  0.937
[81] It can be seen from Table 2 that the quality factor Q reaches 0.91, indicating that the overall quality of the extraction results of the present invention is high. The precision rate reaches 0.986, indicating that the present invention can well distinguish a building outline point cloud from a non-building outline point cloud. The accurate rate reaches 0.968, indicating that the error in the extraction results is small. The recall rate reaches 0.937, indicating that the present invention can well extract the target point cloud from the original point cloud.
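The four classification indexes can be computed directly from the Table 1 counts; the following sketch (illustrative code, with A, B, C, D as defined above) reproduces the Table 2 values up to rounding:

```python
def outline_metrics(A, B, C, D):
    """Quality factor Q, precision rate Ac, accurate rate P and recall
    rate Ca, from the correctly extracted outline points A, wrongly
    extracted points B, missed outline points C, and correctly
    rejected non-building points D."""
    T = A + B + C + D  # total number of points
    return {"Q": A / (A + B + C),
            "Ac": (A + D) / T,
            "P": A / (A + B),
            "Ca": A / (A + C)}

# Counts from Table 1: 60982 correctly extracted outline points,
# 1970 wrongly extracted, 65019 - 60982 = 4037 missed,
# 383816 correctly classified non-building points.
m = outline_metrics(60982, 1970, 4037, 383816)
print({k: round(v, 3) for k, v in m.items()})
# {'Q': 0.91, 'Ac': 0.987, 'P': 0.969, 'Ca': 0.938}
```

The slight differences in the last digit against Table 2 suggest the table values were truncated rather than rounded.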
[82] (2) Shape similarity and position accuracy
[83] It can be seen from the above evaluation index of image classification accuracy that the present invention can effectively extract a building outline point cloud, but the index cannot fully reflect whether the extraction result can represent an actual building. Therefore, the present invention further uses the shape similarity and the position accuracy to evaluate the matching degree of the extraction result and the actual building.
[84] 1) The indexes for evaluating shape similarity are the area difference Ad and the perimeter difference Pd:
[85] Ad = |As − Ae|, Pd = |Ps − Pe|    (7)
[86] Where: As is the building area in the reference topographic map, Ae is the building area extracted in the present invention, Ps is the building perimeter in the reference topographic map, and Pe is the building perimeter extracted in the present invention.
[87] 2) A position accuracy factor Md is computed from the distance differences between the plane coordinates of the inflection points of the building outline:
[88] Md = (1/n) · Σ_(i=1..n) √((X_si − X_ei)² + (Y_si − Y_ei)²)    (8)
[89] Where: n is the number of inflection points of each building; (X_si, Y_si) is a reference building inflection point coordinate, the source of which is a topographic map measured by a Leica TMS50 total station (with an angular accuracy of 0.5" and a ranging accuracy of 0.6 mm + 1 ppm); and (X_ei, Y_ei) is an extracted building inflection point coordinate.
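The three matching-degree indexes can be sketched as follows (illustrative code; the corner coordinates in the example are made-up toy values, not survey data):

```python
import math

def area_perimeter_diff(As, Ae, Ps, Pe):
    """Area difference Ad and perimeter difference Pd as absolute
    differences (square meters and meters respectively)."""
    return abs(As - Ae), abs(Ps - Pe)

def position_accuracy(ref_corners, ext_corners):
    """Position accuracy Md: mean planar distance between
    corresponding building inflection (corner) points."""
    pairs = list(zip(ref_corners, ext_corners))
    return sum(math.dist(r, e) for r, e in pairs) / len(pairs)

# Toy rectangle: reference corners vs. extracted corners shifted 5 cm.
ref = [(0, 0), (10, 0), (10, 6), (0, 6)]
ext = [(0.05, 0), (10.05, 0), (10.05, 6), (0.05, 6)]
Ad, Pd = area_perimeter_diff(60.0, 58.3, 32.0, 31.8)
Md = position_accuracy(ref, ext)
print(round(Ad, 3), round(Pd, 3), round(Md, 3))  # 1.7 0.2 0.05
```

These indexes measure how closely the extracted polygons match the reference topographic map in size, perimeter, and corner placement.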
[90] As shown in FIG. 16, the present invention performs matching degree statistics on twenty main buildings in the survey area.

Table 5 Matching degree evaluation table

Evaluation index    Minimum value    Maximum value    Average value    Standard deviation
Ad                  0.0870           3.6160           1.7399           1.2443
Pd                  0.0210           0.4070           0.1806           0.1258
Md                  0.005            0.119            0.062            0.017
[91] As can be seen from Table 5, the average values of the area difference and the perimeter difference are 1.7399 m² and 0.1806 m respectively, indicating that the present invention can extract the building shape relatively completely and satisfies the plane position accuracy requirements of a 1:500 topographic map.
[92] Based on the above description, the present invention proposes a building outline extraction algorithm integrating the SOR filtering algorithm, the scattered point cloud boundary extraction algorithm and the improved RANSAC algorithm, on the basis of analyzing the advantages and disadvantages of existing algorithms. The method can directly process the original point cloud without distinguishing ground points and non-ground points in advance, thereby avoiding the errors introduced by those steps. Most sparse ground points and points inside the building roof outline can be filtered out by SOR filtering; the point cloud boundary extraction algorithm based on normal vectors can filter out points inside the outline of the vegetation point cloud; and, exploiting the disorder of the vegetation outline, the improved RANSAC algorithm can accurately and efficiently extract the building outline point cloud. Experimental results show that the quality factor of the present invention is 0.910; compared with the actual buildings, the average area difference is 1.74 m², the average perimeter difference is 0.186 m, and the average difference of building inflection point coordinates is 0.062 m, which satisfies the requirements of large-scale basic geographic information data update specifications and validates the effectiveness of the present invention.
Claims (7)
1. A building outline extraction method based on a laser point cloud, comprising: obtaining initial point cloud data of a building to be extracted, projecting the initial point cloud data onto an xoy plane to obtain three-dimensional point cloud data; filtering the three-dimensional point cloud data to obtain filtered point cloud data; using a boundary extraction algorithm based on normal estimation to extract the filtered point cloud data to obtain a boundary point cloud; fitting the boundary point cloud by a straight line to obtain a linear point cloud; and partitioning the linear point cloud into line segments to obtain a building outline point cloud line segment result.
2. The building outline extraction method based on a laser point cloud according to claim 1, wherein a statistical outlier removal technique is used to filter the three-dimensional point cloud data to obtain the filtered point cloud data.
3. The building outline extraction method based on a laser point cloud according to claim 2, wherein the process that a statistical outlier removal technique is used to filter the three-dimensional point cloud data to obtain the filtered point cloud data specifically comprises: obtaining a sampling point based on the three-dimensional point cloud data, and establishing a k-neighborhood of the sampling point; determining an average distance from the sampling point to points in the k-neighborhood, wherein the average distance obeys a Gaussian distribution; marking a point in the k-neighborhood as an outlier when a distance between the point in the k-neighborhood and the sampling point exceeds m standard deviations of the average distance; and removing the outlier to obtain the filtered point cloud data.
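For illustration only (not part of the claims), the statistical outlier removal step of claim 3 could be sketched in NumPy/SciPy as follows; the function name `sor_filter` and the defaults for k and m are illustrative assumptions:

```python
import numpy as np
from scipy.spatial import cKDTree

def sor_filter(points, k=8, m=1.0):
    """Statistical outlier removal (SOR): mark a point as an outlier when
    its mean distance to its k nearest neighbors exceeds the global mean
    of those distances by more than m standard deviations, then drop it."""
    tree = cKDTree(points)
    # query k+1 neighbors because the nearest neighbor of a point is itself
    dists, _ = tree.query(points, k=k + 1)
    mean_d = dists[:, 1:].mean(axis=1)
    mu, sigma = mean_d.mean(), mean_d.std()
    keep = mean_d <= mu + m * sigma  # Gaussian assumption from claim 3
    return points[keep]
```

In this sketch the Gaussian assumption of claim 3 appears as the mean-plus-m-standard-deviations cutoff; a distant isolated point raises its own mean neighbor distance far above the cutoff and is removed.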
4. The building outline extraction method based on a laser point cloud according to claim 3, wherein the using a boundary extraction algorithm based on normal estimation to extract the filtered point cloud data to obtain a boundary point cloud specifically comprises: obtaining neighbor points of the sampling point; taking the sampling point as a center, and connecting the sampling point to the neighbor points to obtain a plurality of vectors; taking any one of the plurality of vectors as a reference direction, and rotating to other vectors in a clockwise direction to obtain a plurality of rotation angles; determining an angle threshold based on the plurality of the rotation angles; and extracting the filtered point cloud data based on the angle threshold to obtain the boundary point cloud.
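The rotation-angle boundary criterion of claim 4 could be sketched as below; this is a minimal 2-D version under the common interpretation that a point is a boundary point when the largest angular gap between directions to its neighbors exceeds the threshold (the threshold value is left as a parameter; names and defaults are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def boundary_points(points, k=8, angle_threshold=np.pi / 2):
    """Flag a point as a boundary point when the largest angular gap
    between the directions from the point to its k nearest neighbors
    exceeds angle_threshold."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k + 1)  # first neighbor is the point itself
    flags = np.zeros(len(points), dtype=bool)
    for i, nbrs in enumerate(idx):
        vecs = points[nbrs[1:]] - points[i]
        # sorted directions to neighbors, then gaps between consecutive ones
        angles = np.sort(np.arctan2(vecs[:, 1], vecs[:, 0]))
        gaps = np.diff(np.concatenate([angles, [angles[0] + 2 * np.pi]]))
        flags[i] = gaps.max() > angle_threshold
    return flags
```

Interior points have neighbors spread all around them, so every angular gap is small; a point on the edge of the cloud has an empty sector, producing one large gap that trips the threshold.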
5. The building outline extraction method based on a laser point cloud according to claim 4, wherein the angle threshold is π/6.
6. The building outline extraction method based on a laser point cloud according to claim 1, wherein a boundary line is obtained by fitting the boundary point cloud by means of a random sample consensus (RANSAC) algorithm.
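A generic 2-D RANSAC line fit of the kind referenced in claim 6 could be sketched as follows (an illustrative baseline, not the patent's improved variant; iteration count and tolerance are assumed parameters):

```python
import numpy as np

def ransac_line(points, n_iter=200, dist_tol=0.1, seed=0):
    """RANSAC line fitting: repeatedly sample two points, form the line
    through them, and keep the model with the most inliers within dist_tol."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        dx, dy = q - p
        norm = np.hypot(dx, dy)
        if norm == 0.0:
            continue  # degenerate sample: coincident points
        # perpendicular distance of every point to the line through p and q
        dist = np.abs((points[:, 0] - p[0]) * dy
                      - (points[:, 1] - p[1]) * dx) / norm
        inliers = dist < dist_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers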
7. The building outline extraction method based on a laser point cloud according to claim 6, wherein the partitioning the linear point cloud into line segments to obtain a building outline point cloud line segment result specifically comprises: obtaining linear model parameters of the linear point cloud, wherein the linear model parameters comprise direction vectors of the linear point cloud on an x-axis and a y-axis; determining an included angle between the linear point cloud and the x-axis based on the linear model parameters; determining a sorting mode of the linear point cloud based on a relation between the included angle and a preset included angle; determining a difference value between points in the sorted linear point cloud; determining a line segment point cloud based on the difference value; obtaining a point number threshold and a line segment length threshold; and determining whether the line segment point cloud is a building outline point cloud line segment result based on the point number threshold and the line segment length threshold.
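The segment-partitioning logic of claim 7 — choose a sorting axis from the line direction, break the sorted points at large gaps, and keep only segments passing the point-count and length thresholds — could be sketched as follows (function name and threshold defaults are illustrative assumptions):

```python
import numpy as np

def partition_segments(points, direction, gap_tol=0.5, min_pts=3, min_len=1.0):
    """Split near-collinear points into line segments: sort along the
    dominant axis of the line direction, break where the gap between
    consecutive points exceeds gap_tol, and keep only segments that pass
    both the point-count and segment-length thresholds."""
    dx, dy = direction
    # the included angle with the x-axis decides the sorting coordinate:
    # sort by x when the line is closer to horizontal, else by y
    axis = 0 if abs(dx) >= abs(dy) else 1
    pts = points[np.argsort(points[:, axis])]
    gaps = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    breaks = np.flatnonzero(gaps > gap_tol) + 1
    segments = []
    for seg in np.split(pts, breaks):
        if len(seg) >= min_pts and np.linalg.norm(seg[-1] - seg[0]) >= min_len:
            segments.append(seg)
    return segments
```

A short cluster of points between two buildings would pass the point-count test but fail the length test, so both thresholds are needed to suppress spurious segments.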
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| LU503531A LU503531B1 (en) | 2023-02-23 | 2023-02-23 | Building outline extraction method based on laser point cloud |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| LU503531A LU503531B1 (en) | 2023-02-23 | 2023-02-23 | Building outline extraction method based on laser point cloud |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| LU503531B1 true LU503531B1 (en) | 2024-08-23 |
Family
ID=92424869
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| LU503531A LU503531B1 (en) | 2023-02-23 | 2023-02-23 | Building outline extraction method based on laser point cloud |
Country Status (1)
| Country | Link |
|---|---|
| LU (1) | LU503531B1 (en) |
- 2023-02-23: LU application LU503531A filed; LU503531B1 granted (active, IP Right Grant)
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN110570428B (en) | Method and system for dividing building roof sheet from large-scale image dense matching point cloud | |
| US7752018B2 (en) | Geospatial modeling system providing building roof type identification features and related methods | |
| CN106570468B (en) | A Method for Reconstructing Building Outlines from LiDAR Raw Point Clouds | |
| WO2021232463A1 (en) | Multi-source mobile measurement point cloud data air-ground integrated fusion method and storage medium | |
| WO2021143778A1 (en) | Positioning method based on laser radar | |
| Safaie et al. | Automated street tree inventory using mobile LiDAR point clouds based on Hough transform and active contours | |
| Awrangjeb et al. | Automatic building extraction from LiDAR data covering complex urban scenes | |
| CN114419085A (en) | Automatic building contour line extraction method and device, terminal device and storage medium | |
| CN106097311A (en) | The building three-dimensional rebuilding method of airborne laser radar data | |
| CN106204705A (en) | A kind of 3D point cloud segmentation method based on multi-line laser radar | |
| CN109461207A (en) | A kind of point cloud data building singulation method and device | |
| CN106970375A (en) | A kind of method that building information is automatically extracted in airborne laser radar point cloud | |
| CN108562885B (en) | High-voltage transmission line airborne LiDAR point cloud extraction method | |
| CN106127857A (en) | Synthetic data drives the on-board LiDAR data modeling method with model-driven | |
| CN113296543B (en) | Method and system for planning aerial route | |
| CN110598541A (en) | Method and equipment for extracting road edge information | |
| CN111950589B (en) | Optimal segmentation method of point cloud region growth combined with K-means clustering | |
| CN107944383A (en) | Building roof patch division method based on three-dimensional Voronoi diagram | |
| CN116258857A (en) | Outdoor tree-oriented laser point cloud segmentation and extraction method | |
| CN104050473A (en) | Road data extraction method based on rectangular neighborhood analysis | |
| CN114722944A (en) | Point cloud precision determination method, electronic device and computer storage medium | |
| Dey et al. | Building boundary extraction from LiDAR point cloud data | |
| Wu et al. | Real-time point cloud clustering algorithm based on roadside LiDAR | |
| CN112669461A (en) | Airport clearance safety detection method and device, electronic equipment and storage medium | |
| CN115760876A (en) | Progressive extraction method of street tree point cloud based on morphological features of ground objects |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | FG | Patent granted | Effective date: 20240823 |