CN112215958B - Laser radar point cloud data projection method based on distributed computation - Google Patents


Info

Publication number
CN112215958B
Authority
CN
China
Prior art keywords
point cloud
cloud data
tunnel
coordinate system
laser radar
Prior art date
Legal status
Active
Application number
CN202011076160.6A
Other languages
Chinese (zh)
Other versions
CN112215958A (en)
Inventor
赵霞
齐利军
于重重
苏维均
尤相骏
马延辉
Current Assignee
Nanjing Shuliankongjian Plotting Technology Co ltd
Beijing Technology and Business University
Original Assignee
Nanjing Shuliankongjian Plotting Technology Co ltd
Beijing Technology and Business University
Priority date
Filing date
Publication date
Application filed by Nanjing Shuliankongjian Plotting Technology Co ltd, Beijing Technology and Business University filed Critical Nanjing Shuliankongjian Plotting Technology Co ltd
Priority to CN202011076160.6A
Publication of CN112215958A
Application granted
Publication of CN112215958B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20 Finite element generation, e.g. wire-frame surface description, tessellation
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10 File systems; File servers
    • G06F16/18 File system types
    • G06F16/182 Distributed file systems
    • G06F16/50 Information retrieval of still image data
    • G06F16/51 Indexing; Data structures therefor; Storage structures
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The invention provides a laser radar point cloud data projection method based on distributed computation, which comprises the following steps: establishing a coordinate model of the projection space; acquiring laser radar point cloud data and cleaning noise point data; dividing the point cloud data into data blocks according to mileage information and distributing the data blocks to computing nodes; projecting the point cloud data of each data block to generate an orthographic projection gray image; calculating the spatial feature value Rn of each tunnel actual measurement point in the section coordinate system; and adding a spatial feature channel to the orthographic projection gray image to generate a multidimensional-feature-fused image matrix of the required scale. The method efficiently processes the point cloud data into a multidimensional feature image matrix containing surface and spatial characteristics, helps to improve the accuracy of tunnel deformation and defect detection, and has high application value and economic benefit.

Description

Laser radar point cloud data projection method based on distributed computation
Technical Field
The invention relates to laser radar point cloud data processing, in particular to a laser radar point cloud data projection method based on distributed computation, and belongs to the field of distributed computation and laser radar point cloud data processing.
Background
Three-dimensional laser radar is currently an advanced remote sensing technology at home and abroad, and an important means of directly acquiring precise three-dimensional data of a target. It greatly improves operation efficiency, reduces measurement cost, and is particularly suitable for fine measurement of large-area objects and objects with complex surfaces. It has concrete applications in engineering measurement, deformation monitoring, cultural relic protection, digital cities, autonomous driving, petrochemical pipe networks, reverse engineering, and other fields.
A laser radar scanner produces scan points of high precision and high density that occupy a large amount of storage space, and traditional single-machine serial algorithms cannot process such large-scale point clouds quickly and effectively. Big data processing techniques therefore need to be applied to point cloud processing, providing reliable, dynamically scalable storage and strong computing power for massive point cloud data.
Point cloud data are measurements in three-dimensional space obtained by laser radar scanning and cannot directly express the geometric characteristics of the inner surface of a tunnel. To detect tunnel deformation or defects from laser radar point cloud data, existing research projects the point cloud into a two-dimensional orthographic projection gray image and identifies deformation or defects with image detection techniques. This approach may lose the three-dimensional characteristics of the tunnel surface, resulting in lower accuracy of deformation or defect detection.
In order to efficiently detect deformation or defect of a tunnel based on laser radar scanning, a method for realizing distributed parallelization processing of laser radar point cloud data by using a distributed computing platform needs to be studied, and physical reflection characteristics of a tunnel surface and spatial characteristics of deformation or defect of an inner wall contained in the point cloud data can be fully utilized to identify and detect deformation and defect.
Disclosure of Invention
The invention aims to realize a laser radar point cloud data projection method based on distributed computation, which comprises the following steps: 1) Establishing a coordinate model of a projection space; 2) Acquiring laser radar point cloud data and cleaning noise point data; 3) Dividing the point cloud data into data blocks according to the mileage information, and distributing the data blocks to the computing nodes; 4) Projecting the point cloud data of each data block to generate an orthographic projection gray image; 5) Calculating a space characteristic value Rn of each tunnel actual measurement point in a section coordinate system; 6) Adding a space feature channel for the orthographic projection gray level image to generate an image matrix fused by multidimensional features with required dimensions; specifically, the method of the present invention comprises the steps of:
A. establishing a coordinate model of a projection space, wherein the coordinate model comprises a laser radar measurement three-dimensional coordinate system, an orthographic projection coordinate system, a tunnel section coordinate system and a corresponding relation among the coordinate systems;
the laser radar point cloud data refers to a set of XYZ coordinates and reflectivity Ref formed by each scanning point on the inner wall of a tunnel in a laser radar measurement three-dimensional coordinate system, and is abbreviated as point cloud data;
A1. constructing a laser radar measurement three-dimensional coordinate system;
the laser radar measurement three-dimensional coordinate system, referred to as the measurement coordinate system for short, has its origin at the tunnel center point; facing the direction of increasing tunnel mileage, the positive X axis points in the 3 o'clock direction on the cross section, the positive Y axis points in the direction of increasing tunnel mileage, and the positive Z axis points in the 12 o'clock direction on the cross section;
A2. constructing an orthographic projection coordinate system;
the method comprises the following steps of regarding a section of tunnel as a section of cylinder, establishing an orthographic projection coordinate system on a two-dimensional plane formed after the cylinder is unfolded into a plane, and unfolding the plane as follows:
a2.1, cutting the side surface of the cylinder in the 6 o'clock direction along the direction of increasing tunnel mileage to obtain a tangent line L;
a2.2, unrolling the side surface of the tunnel cylinder into a 2-dimensional plane along the tangent line L, with the left side at the bottom and the right side at the top; the X axis of the orthographic projection coordinate system corresponds to the right tangent line of the unrolled tunnel surface, and its positive direction is the direction of increasing tunnel mileage; the Y axis corresponds to the leftmost side of the unrolled surface and points downward;
A3. constructing a section coordinate system;
the origin of the tunnel section coordinate system is the tunnel center point; facing the direction of increasing line mileage, the positive X axis points in the 3 o'clock direction on the cross section and the positive Y axis points in the 12 o'clock direction;
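The coordinate relations above can be sketched in code. A minimal illustration (the helper names are ours, not the patent's, and the angle convention for the unrolling is an assumption consistent with the 6 o'clock cut of step A2.1):

```python
import math

def to_section(x_sn, y_sn, z_sn):
    # Section coordinates drop the mileage axis: X_mn = X_sn, Y_mn = Z_sn.
    return x_sn, z_sn

def unroll(x_sn, z_sn, radius, k):
    # Assumed convention: angle around the tunnel axis measured from the
    # 6 o'clock cut (the -Z direction), normalised to [0, 2*pi), then
    # converted to an arc-length pixel position via R*K.
    theta = math.atan2(x_sn, -z_sn) % (2 * math.pi)
    return theta * radius * k
```

A point at the 6 o'clock cut unrolls to position 0; a point at 12 o'clock unrolls to half the image height π·R·K.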
B. acquiring laser radar point cloud data and cleaning noise point data, with the following specific steps:
B1. storing laser radar point cloud data on a distributed file system (HDFS) according to a hierarchical directory structure, wherein the method comprises the following specific steps of:
b1.1, the first-level directory establishes different directories for different project period codes;
b1.2, the second-level directory establishes different directories for different road section codes under the same project period;
b1.3, the third-level directory establishes different directories for different area codes under the same road section;
B2. reading the laser radar point cloud data, each record containing the three-dimensional coordinates of a tunnel actual measurement point and the laser reflectivity of that point (X_sn, Y_sn, Z_sn, Ref);
B3. cleaning and data structuring the point cloud data, with the following specific steps:
b3.1 deleting records with insufficient field number;
b3.2 deleting records containing values outside the domain range;
b3.3 counting the maximum value Ref_MAX and the minimum value Ref_MIN of the laser reflectivity Ref of the tunnel actual measurement points;
B4. Taking mileage of point cloud data as a keyword, and establishing indexes for all the point cloud data;
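Steps B3 and B4 can be sketched as follows. The record layout (X_sn, Y_sn, Z_sn, Ref) follows step B2; the particular field-count and domain checks are illustrative assumptions:

```python
def clean_records(records, n_fields=4, in_domain=lambda r: all(abs(v) < 1e7 for v in r)):
    # B3.1/B3.2: drop records with too few fields or out-of-domain values.
    kept = [r for r in records if len(r) == n_fields and in_domain(r)]
    # B3.3: reflectivity extrema over the cleaned data.
    refs = [r[3] for r in kept]
    return kept, max(refs), min(refs)

def index_by_mileage(records):
    # B4: index all records by mileage (the Y_sn field, position 1).
    return sorted(records, key=lambda r: r[1])
```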
C. dividing the point cloud data into data blocks according to the mileage information, and distributing the data blocks to each computing node, wherein the specific steps are as follows:
C1. the mileage of each actual measurement point in the point cloud data set is subjected to integer division operation according to the designated mileage length M, so that mileage segment keywords are obtained;
C2. indexing and segmenting data according to the mileage segment keywords, merging point cloud data of the same mileage segment into the same data block, and taking the value of the mileage segment as a mark of the data block;
C3. distributing the data blocks with the mileage mark to each computing node for distributed parallel processing;
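Steps C1 and C2 amount to a group-by on the integer-divided mileage; a minimal sketch (record layout as in step B2):

```python
from collections import defaultdict

def block_by_mileage(records, segment_len_m):
    # C1: integer-divide each point's mileage (Y_sn) by the segment length
    # to obtain the mileage segment keyword.
    # C2: merge points of the same segment into one data block, with the
    # segment value as the block's tag.
    blocks = defaultdict(list)
    for rec in records:
        blocks[int(rec[1] // segment_len_m)].append(rec)
    return dict(blocks)
```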
D. the point cloud data of each data block is projected to generate an orthographic projection gray image, and the specific steps are as follows:
D1. creating an all-0 gray matrix W with resolution L×H on each computing node;
the length L of the orthographic projection gray image is its pixel extent along the X axis of the orthographic projection coordinate system, and the height H is its pixel extent along the Y axis; the calculation formulas are as follows:
L = [MAX(Y_sn) − MIN(Y_sn)]·K
H = 2π·R·K
wherein K is the precision parameter of the laser radar scanner, namely the number of actual measurement points within a scanning length of 1 meter, and R is the standard tunnel radius;
D2. traversing the actual measurement point data in each data block, and projecting the data points into an orthographic projection coordinate system to generate a gray matrix W of an orthographic projection gray image, wherein the specific steps are as follows:
d2.1 calculating the coordinates (X_en, Y_en) of the point in the orthographic projection coordinate system from the coordinates (X_sn, Y_sn, Z_sn) of the actual measurement point, with the following calculation formula:
X_en = (Y_sn − Y_min)·K
wherein Y_min is the minimum mileage of all actual measurement points in the data block;
Y_en = θ_n·θ_k
wherein θ_n is the angle of the actual measurement point around the tunnel axis measured from the 6 o'clock tangent line, and θ_k = R·K;
D2.2 mapping the laser radar reflectivity Ref of the point (X_sn, Y_sn, Z_sn) to an integer value in the range 0 to 255 as the gray value Pix at position (X_en, Y_en) in the gray matrix W, with the following calculation formula:
Pix = ROUND[255·(Ref − Ref_MIN)/(Ref_MAX − Ref_MIN)]
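A sketch of the step D2 projection. The Y_en angle convention is an assumption chosen to be consistent with the image height H = 2π·R·K; the gray mapping is the standard 0-to-255 normalisation the text describes:

```python
import math

def project_point(x_sn, y_sn, z_sn, y_min, radius, k):
    # D2.1: pixel column from mileage, pixel row from the unrolled angle.
    x_en = round((y_sn - y_min) * k)
    theta = math.atan2(x_sn, -z_sn) % (2 * math.pi)   # assumed 6 o'clock origin
    y_en = round(theta * radius * k)                  # theta_k = R * K
    return x_en, y_en

def gray_value(ref, ref_min, ref_max):
    # D2.2: map reflectivity onto the 0-255 gray range.
    return round(255 * (ref - ref_min) / (ref_max - ref_min))
```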
E. calculating a space characteristic value Rn of each tunnel actual measurement point in a section coordinate system, wherein the method comprises the following specific steps:
E1. projecting each tunnel actual measurement point (X_sn, Y_sn, Z_sn) into the section coordinate system, with the corresponding coordinates (X_mn, Y_mn) calculated as follows:
X_mn = X_sn
Y_mn = Z_sn
E2. calculating the spatial feature quantity R_n corresponding to each tunnel actual measurement point (X_sn, Y_sn, Z_sn), with the following calculation formula:
R_n = √(X_mn² + Y_mn²)
E3. mapping the spatial feature quantity R_n to an integer value in the range 0 to 255, recorded as the normalized feature quantity;
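Step E as a sketch, taking R_n as the radial distance in the section plane (an assumed form consistent with steps E1 and E2):

```python
import math

def spatial_feature(x_sn, y_sn, z_sn):
    # E1: section coordinates (X_mn, Y_mn) = (X_sn, Z_sn).
    # E2: assumed form of R_n -- radial distance from the tunnel centre.
    return math.hypot(x_sn, z_sn)

def normalise_feature(rn, r_min, r_max):
    # E3: map R_n onto the integer range 0-255.
    return round(255 * (rn - r_min) / (r_max - r_min))
```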
F. adding a space feature channel for a gray matrix W of an orthographic projection gray image to generate a multi-dimensional feature fused image matrix, wherein the method comprises the following specific steps of:
F1. creating an all-0 RGB three-channel image matrix WC with resolution L×H;
F2. adding normalized feature quantity corresponding to each point in a gray matrix W of an orthographic projection gray image into an R channel of a matrix WC;
F3. copying a gray matrix W of the orthographic projection gray image into a G channel of a matrix WC;
F4. the B channel of the matrix WC keeps an all-0 state, and an image matrix with multi-dimensional feature fusion is obtained;
F5. splicing the multidimensional-feature-fused image matrices into an image matrix of the required length for target detection analysis.
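Steps F1 to F5 can be sketched with plain nested lists (a NumPy version would use stacking and concatenation); gray and feat are same-shape matrices from steps D and E:

```python
def fuse(gray, feat):
    # F1-F4: R channel = normalised spatial feature, G channel = gray
    # image, B channel left all zero.
    return [[[feat[i][j], gray[i][j], 0] for j in range(len(row))]
            for i, row in enumerate(gray)]

def splice(*fused_blocks):
    # F5: concatenate fused blocks row-wise along the mileage axis.
    return [sum((m[i] for m in fused_blocks), [])
            for i in range(len(fused_blocks[0]))]
```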
The method has the advantages that, by adopting distributed processing techniques, the laser radar scanning point cloud data are efficiently processed into a multidimensional-feature-fused image matrix; the texture and spatial features of the point cloud data are extracted more completely, a large amount of storage space is saved, the computational cost of tunnel deformation and defect detection is reduced, and the method has high application value and economic benefit.
Drawings
Fig. 1: laser radar point cloud data projection method flow chart based on distributed computation
Detailed Description
The invention will now be described in detail with reference to the drawings and specific examples.
The method flow chart is shown in figure 1, the method of the invention comprises 1) establishing a coordinate model of a projection space; 2) Acquiring laser radar point cloud data and cleaning noise point data; 3) Dividing the point cloud data into data blocks according to the mileage information, and distributing the data blocks to the computing nodes; 4) Projecting the point cloud data of each data block to generate an orthographic projection gray image; 5) Calculating a space characteristic value Rn of each tunnel actual measurement point in a section coordinate system; 6) Adding a space feature channel for the orthographic projection gray level image to generate an image matrix fused by multidimensional features with required dimensions;
the invention is further described in terms of steps in connection with data instances, taking a piece of data of a section of a tunnel as an example:
1. establishing a coordinate model of a projection space;
2. the method comprises the following specific steps of obtaining laser radar point cloud data and cleaning noise point data:
2.1, storing laser radar point cloud data on a distributed file system HDFS according to a hierarchical directory structure; the hierarchical directory structure is as follows:
(figure: hierarchical directory structure example)
2.2, reading the laser radar point cloud data, each record corresponding to the three-dimensional coordinates of a tunnel actual measurement point and the laser reflectivity of that point (X_sn, Y_sn, Z_sn, Ref); part of the laser radar point cloud data is as follows:
(table: sample raw point cloud records)
2.3 cleaning and data structuring treatment are carried out on the point cloud data;
the maximum value Ref_MAX of the reflectivity field of the point cloud data over the whole interval is counted as 65535, and the minimum value Ref_MIN is likewise counted; part of the point cloud data after cleaning and data structuring is obtained as follows:
(table: cleaned and structured point cloud records)
2.4, taking mileage of point cloud data as a keyword, establishing indexes for all the point cloud data, and obtaining the point cloud data with the partially established indexes as follows:
(table: indexed point cloud records)
3. dividing the point cloud data into data blocks according to the mileage information, and distributing the data blocks to each computing node, wherein the specific steps are as follows:
3.1, performing integer division operation on mileage of each actual measurement point in the point cloud data set according to the designated mileage length M=2 to obtain mileage section keywords; the partial point cloud index data established by taking the mileage segments as keywords is as follows:
(table: point cloud index data keyed by mileage segment)
3.2, indexing and segmenting the data according to the mileage segment keywords, merging the point cloud data of the same mileage segment into the same data block, and using the value of the mileage segment as a mark of the data block; the data block segmentation results are as follows:
(table: data block segmentation results)
3.3, distributing the data block with the mileage mark to each computing node for distributed parallel processing;
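Step 3.3 then hands the tagged blocks to the compute nodes. A round-robin assignment is one simple stand-in for what a real cluster scheduler would do:

```python
def distribute(blocks, nodes):
    # Assign mileage-tagged data blocks to compute nodes round-robin
    # (illustrative; an actual cluster scheduler replaces this).
    assignment = {node: [] for node in nodes}
    for i, tag in enumerate(sorted(blocks)):
        assignment[nodes[i % len(nodes)]].append(tag)
    return assignment
```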
4. the point cloud data of each data block is projected to generate an orthographic projection gray image, and the specific steps are as follows:
4.1 creating an all-0 gray matrix W with resolution L×H on each computing node;
the tunnel radius R is 3 meters, and the accuracy parameter K of the laser radar scanner is 1000; MAX (Y) dn )-MIN(Y dn ) Has a value of 0.7;
calculating the length L=1000 and the height H= 18849 of the orthographic projection diagram
The 0 gray matrix W is as follows:
[[[0],[0],[0],...]
[[0],[0],[0],...]
...
[[0],[0],[0],...]]
4.2 traversing the actual measurement point data in each data block and projecting the data points into the orthographic projection coordinate system to generate the gray matrix W of the orthographic projection gray image;
the gray matrix results generated by the partial data blocks are as follows:
matrix 1: [[[124],[122],[132],...]
[[124],[124],[130],...]
...
[[50],[0],[0],...]]
Matrix 2: [[[115],[120],[117],...]
[[116],[116],[116],...]
...
[[0],[20],[0],...]]
5. Calculating a space characteristic value Rn of each tunnel actual measurement point in a section coordinate system, wherein the method comprises the following specific steps:
5.1 projecting the three-dimensional point coordinates (X_sn, Y_sn, Z_sn) of each actual measurement point into the section coordinate system;
5.2 calculating the spatial feature quantity R_n corresponding to each actual measurement point (X_sn, Y_sn, Z_sn);
5.3 mapping the spatial feature quantity R_n to integer values in the range 0 to 255, recorded as the normalized feature quantity;
6. adding a space feature channel for an orthographic projection gray image to generate an image matrix fused by multidimensional features with required dimensions, wherein the method comprises the following specific steps of:
6.1 creating an all-0 RGB three-channel image matrix WC with resolution L×H;
the all-0 RGB three-channel image matrix WC is as follows:
[[[0,0,0],[0,0,0],[0,0,0],...]
[[0,0,0],[0,0,0],[0,0,0],...]
...
[[0,0,0],[0,0,0],[0,0,0],...]]
6.2 adding normalized feature quantity corresponding to each point in the gray matrix W of the orthographic projection gray image into an R channel of the matrix WC as follows:
matrix 1: [[[24,0,0],[22,0,0],[21,0,0],...]
[[26,0,0],[20,0,0],[20,0,0],...]
...
[[12,0,0],[22,0,0],[21,0,0],...]]
Matrix 2: [[[9,0,0],[10,0,0],[10,0,0],...]
[[8,0,0],[8,0,0],[7,0,0],...]
...
[[12,0,0],[14,0,0],[14,0,0],...]]
6.3 copying the gray matrix W of the orthographic projection gray image into the G channel of matrix WC, resulting in the matrix as follows:
matrix 1: [[[24,124,0],[22,122,0],[21,132,0],...]
[[26,124,0],[20,124,0],[20,130,0],...]
...
[[12,50,0],[22,0,0],[21,0,0],...]]
Matrix 2: [[[9,115,0],[10,120,0],[10,117,0],...]
[[8,116,0],[8,116,0],[7,116,0],...]
...
[[12,0,0],[14,20,0],[14,0,0],...]]
6.4, maintaining the B channel of the matrix WC in an all-0 state to obtain a multidimensional feature fused image matrix;
6.5, splicing the image matrixes fused by the multidimensional features into an image matrix with a required length for target detection analysis;
the image matrix spliced into the multidimensional feature fusion of the required length is as follows:
[[[24,124,0],[22,122,0],[21,132,0],...,[9,115,0],[10,120,0],[10,117,0],...]
[[26,124,0],[20,124,0],[20,130,0],...,[8,116,0],[8,116,0],[7,116,0],...]
...
[[12,50,0],[22,0,0],[21,0,0],...,[12,0,0],[14,20,0],[14,0,0],...]]
According to the invention, the laser radar scanning point cloud data are efficiently processed into a multidimensional-feature-fused image matrix; the texture and spatial features of the point cloud data are extracted more completely, a large amount of storage space is saved, the computational cost of subsequent tunnel deformation and defect detection is reduced, and the method has high application value and economic benefit.
Finally, it should be noted that the examples are disclosed for the purpose of aiding in the further understanding of the present invention, but those skilled in the art will appreciate that: various alternatives and modifications are possible without departing from the spirit and scope of the invention and the appended claims. Therefore, the invention should not be limited to the disclosed embodiments, but rather the scope of the invention is defined by the appended claims.

Claims (5)

1. A laser radar point cloud data projection method based on distributed computation comprises the following steps:
A. establishing a coordinate model of a projection space, wherein the coordinate model comprises a laser radar measurement three-dimensional coordinate system, an orthographic projection coordinate system, a tunnel section coordinate system and a corresponding relation among the coordinate systems;
the laser radar point cloud data refers to a set of XYZ coordinates and reflectivity Ref formed by each scanning point on the inner wall of a tunnel in a laser radar measurement three-dimensional coordinate system, and is abbreviated as point cloud data;
A1. constructing a laser radar measurement three-dimensional coordinate system;
the laser radar measurement three-dimensional coordinate system, referred to as the measurement coordinate system for short, has its origin at the tunnel center point; facing the direction of increasing tunnel mileage, the positive X axis points in the 3 o'clock direction on the cross section, the positive Y axis points in the direction of increasing tunnel mileage, and the positive Z axis points in the 12 o'clock direction on the cross section;
A2. constructing an orthographic projection coordinate system;
taking a section of tunnel as a section of cylinder, and establishing an orthographic projection coordinate system on a two-dimensional plane formed by expanding the cylinder into a plane;
A3. constructing a section coordinate system;
the origin of the tunnel section coordinate system is the tunnel center point; facing the direction of increasing line mileage, the positive X axis points in the 3 o'clock direction on the cross section and the positive Y axis points in the 12 o'clock direction;
B. acquiring laser radar point cloud data and cleaning noise point data, with the following specific steps:
B1. storing the laser radar point cloud data on a distributed file system (HDFS) according to a hierarchical directory structure;
B2. reading the laser radar point cloud data, each record containing the three-dimensional coordinates of a tunnel actual measurement point and the laser reflectivity of that point (X_sn, Y_sn, Z_sn, Ref);
B3. cleaning and structuring the point cloud data;
B4. taking mileage of point cloud data as a keyword, and establishing indexes for all the point cloud data;
C. dividing the point cloud data into data blocks according to the mileage information, and distributing the data blocks to each computing node, wherein the specific steps are as follows:
C1. the mileage of each actual measurement point in the point cloud data set is subjected to integer division operation according to the designated mileage length M, so that mileage segment keywords are obtained;
C2. indexing and segmenting data according to the mileage segment keywords, merging point cloud data of the same mileage segment into the same data block, and taking the value of the mileage segment as a mark of the data block;
C3. distributing the data blocks with the mileage mark to each computing node for distributed parallel processing;
D. the point cloud data of each data block is projected to generate an orthographic projection gray image, and the specific steps are as follows:
D1. creating an all-0 gray matrix W with resolution L×H on each computing node;
the length L of the orthographic projection gray image is its pixel extent along the X axis of the orthographic projection coordinate system, and the height H is its pixel extent along the Y axis; the calculation formulas are as follows:
L = [MAX(Y_sn) − MIN(Y_sn)]·K
H = 2π·R·K
wherein K is the precision parameter of the laser radar scanner, namely the number of actual measurement points within a scanning length of 1 meter, and R is the standard tunnel radius;
D2. traversing the actual measurement point data in each data block, and projecting the data points into an orthographic projection coordinate system to generate a gray matrix W of an orthographic projection gray image;
E. calculating the spatial feature value Rn of each tunnel actual measurement point in the section coordinate system, with the following specific steps:
E1. projecting the three-dimensional point coordinates (X_sn, Y_sn, Z_sn) of each tunnel actual measurement point into the section coordinate system, with the corresponding coordinates (X_mn, Y_mn) calculated as follows:
X_mn = X_sn
Y_mn = Z_sn
E2. calculating the spatial feature quantity R_n corresponding to each tunnel actual measurement point (X_sn, Y_sn, Z_sn), with the following calculation formula:
R_n = √(X_mn² + Y_mn²)
E3. mapping the space feature quantity Rn into integer values ranging from 0 to 255, and recording the integer values as normalized feature quantities;
F. adding a space feature channel for a gray matrix W of an orthographic projection gray image to generate a multi-dimensional feature fused image matrix, wherein the method comprises the following specific steps of:
F1. creating an all-0 RGB three-channel image matrix WC with resolution L×H;
F2. adding normalized feature quantity corresponding to each point in a gray matrix W of an orthographic projection gray image into an R channel of a matrix WC;
F3. copying a gray matrix W of the orthographic projection gray image into a G channel of a matrix WC;
F4. the B channel of the matrix WC keeps an all-0 state, and an image matrix with multi-dimensional feature fusion is obtained;
F5. splicing the multidimensional-feature-fused image matrices into an image matrix of the required length for target detection analysis.
2. The method for projecting laser radar point cloud data based on distributed computation according to claim 1, wherein when an orthographic projection coordinate system is constructed, a section of tunnel is regarded as a section of cylinder, the orthographic projection coordinate system is established on a two-dimensional plane formed after the cylinder is unfolded into a plane, and the unfolding steps are as follows:
a2.1, cutting the side surface of the cylinder in the 6 o'clock direction along the direction of increasing tunnel mileage to obtain a tangent line L;
a2.2, unrolling the side surface of the tunnel cylinder into a 2-dimensional plane along the tangent line L, with the left side at the bottom and the right side at the top; the X axis of the orthographic projection coordinate system corresponds to the right tangent line of the unrolled tunnel surface, and its positive direction is the direction of increasing tunnel mileage; the Y axis corresponds to the leftmost side of the unrolled surface and points downward.
3. The method for projecting laser radar point cloud data based on distributed computing as claimed in claim 1, wherein the steps of storing the laser radar point cloud data on the distributed file system HDFS according to a hierarchical directory structure are as follows:
B1.1, in the first-level directory, establishing a separate directory for each project period code;
B1.2, in the second-level directory, establishing a separate directory for each road section code under the same project period;
B1.3, in the third-level directory, establishing a separate directory for each region code under the same road section.
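The three-level layout of B1.1-B1.3 reduces to composing an HDFS path from the three codes. A minimal sketch (function and argument names are illustrative, not from the patent):

```python
def hdfs_point_cloud_dir(base, project_period, road_section, region):
    """Build the three-level directory of claim 3:
    level 1: project-period code, level 2: road-section code,
    level 3: region code."""
    return "/".join([base.rstrip("/"), project_period, road_section, region])
```

The resulting path string would then be passed to whatever HDFS client the system uses (e.g. a `mkdir`-style call) to create the directory before writing point cloud blocks into it.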
4. The method for projecting laser radar point cloud data based on distributed computing as claimed in claim 1, wherein the steps of cleaning and structuring the point cloud data are as follows:
B3.1, deleting records with an insufficient number of fields;
B3.2, deleting records containing values outside the domain range;
B3.3, counting the maximum value Ref_MAX and the minimum value Ref_MIN of the laser reflectivity Ref over the actually measured tunnel points.
5. The method for projecting laser radar point cloud data based on distributed computation according to claim 1, wherein the steps of traversing the actually measured point data in each data block and projecting the data points into the orthographic projection coordinate system to generate the gray matrix W of the orthographic projection gray image are as follows:
D2.1, calculating the coordinates (X_en, Y_en) of each point in the orthographic projection coordinate system from the coordinates (X_sn, Y_sn, Z_sn) of the actually measured point, with the following calculation formula:
X_en = (Y_sn - Y_min) · K
wherein Y_min is the minimum mileage value among all actually measured points in the data block;
[the formula for Y_en is given only as an image (FDA0002716887210000031) and is not reproduced in the text]
wherein θ_k = R · K;
D2.2, mapping the laser radar reflectivity Ref of the point (X_sn, Y_sn, Z_sn) to an integer value in the range 0-255 as the gray value Pix at position (X_en, Y_en) in the gray matrix W, with the following calculation formula:
Pix = round(255 · (Ref - Ref_MIN) / (Ref_MAX - Ref_MIN))
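A minimal sketch of the D2.1 and D2.2 calculations (the linear min-max scaling in `reflectivity_to_gray` is an assumption consistent with the Ref_MAX / Ref_MIN statistics of claim 4; the Y_en formula is given only as an image in the source and is not covered here):

```python
def project_x(y_sn, y_min, k):
    """D2.1: mileage coordinate in the orthographic projection,
    X_en = (Y_sn - Y_min) * K, with K the assumed pixels-per-unit scale."""
    return (y_sn - y_min) * k

def reflectivity_to_gray(ref, ref_min, ref_max):
    """D2.2: map the laser reflectivity Ref linearly onto 0-255.
    The rounding convention is an assumption; the claim only states
    that Ref is mapped to an integer in [0, 255]."""
    if ref_max == ref_min:
        return 0
    return int(round(255.0 * (ref - ref_min) / (ref_max - ref_min)))
```

Points whose reflectivity equals Ref_MIN land on gray 0 and those at Ref_MAX on gray 255, so each data block uses the full gray range.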
CN202011076160.6A 2020-10-10 2020-10-10 Laser radar point cloud data projection method based on distributed computation Active CN112215958B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011076160.6A CN112215958B (en) 2020-10-10 2020-10-10 Laser radar point cloud data projection method based on distributed computation


Publications (2)

Publication Number Publication Date
CN112215958A CN112215958A (en) 2021-01-12
CN112215958B true CN112215958B (en) 2023-05-02

Family

ID=74053004

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011076160.6A Active CN112215958B (en) 2020-10-10 2020-10-10 Laser radar point cloud data projection method based on distributed computation

Country Status (1)

Country Link
CN (1) CN112215958B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113223180B (en) * 2021-05-12 2022-09-20 武汉中仪物联技术股份有限公司 Pipeline three-dimensional modeling method and system based on multi-sensor fusion
CN113920191B (en) * 2021-07-30 2024-06-04 北京工商大学 6D data set construction method based on depth camera
CN113837996B (en) * 2021-08-17 2023-09-29 北京工商大学 Automatic subway tunnel defect detection method supporting manual verification
CN114372916B (en) * 2021-12-31 2024-05-31 易思维(杭州)科技股份有限公司 Automatic point cloud splicing method
CN115422387B (en) * 2022-11-04 2023-02-24 山东矩阵软件工程股份有限公司 Point cloud data processing method and system based on multi-dimensional point cloud fusion data

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106127771A (en) * 2016-06-28 2016-11-16 上海数联空间科技有限公司 Tunnel orthography system and method is obtained based on laser radar LIDAR cloud data
WO2018205119A1 (en) * 2017-05-09 2018-11-15 深圳市速腾聚创科技有限公司 Roadside detection method and system based on laser radar scanning
CN110853037A (en) * 2019-09-26 2020-02-28 西安交通大学 Lightweight color point cloud segmentation method based on spherical projection

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109064506B (en) * 2018-07-04 2020-03-13 百度在线网络技术(北京)有限公司 High-precision map generation method and device and storage medium



Similar Documents

Publication Publication Date Title
CN112215958B (en) Laser radar point cloud data projection method based on distributed computation
Yu et al. Automatic 3D building reconstruction from multi-view aerial images with deep learning
Ma et al. A review of 3D reconstruction techniques in civil engineering and their applications
Xu et al. Reconstruction of scaffolds from a photogrammetric point cloud of construction sites using a novel 3D local feature descriptor
Shi et al. Adaptive simplification of point cloud using k-means clustering
CN113432600A (en) Robot instant positioning and map construction method and system based on multiple information sources
CN115372989A (en) Laser radar-based long-distance real-time positioning system and method for cross-country automatic trolley
CN113916130B (en) Building position measuring method based on least square method
CN111797836A (en) Extraterrestrial celestial body patrolling device obstacle segmentation method based on deep learning
Xu et al. A 3D reconstruction method for buildings based on monocular vision
CN112197773B (en) Visual and laser positioning mapping method based on plane information
CN116518864A (en) Engineering structure full-field deformation detection method based on three-dimensional point cloud comparison analysis
Lu et al. A lightweight real-time 3D LiDAR SLAM for autonomous vehicles in large-scale urban environment
Zhang et al. Point cloud registration methods for long‐span bridge spatial deformation monitoring using terrestrial laser scanning
Quan et al. Filtering LiDAR data based on adjacent triangle of triangulated irregular network
Demir Automated detection of 3D roof planes from Lidar data
CN117523403A (en) Method, system, equipment and medium for detecting spot change of residence map
Elkhrachy Feature extraction of laser scan data based on geometric properties
Dong et al. Lightweight and edge-preserving speckle matching network for precise single-shot 3D shape measurement
CN116012737A (en) High-speed construction monitoring method and system based on unmanned aerial vehicle laser and vision fusion
CN114563000B (en) Indoor and outdoor SLAM method based on improved laser radar odometer
CN115713548A (en) Automatic registration method for multi-stage live-action three-dimensional model
Lee et al. Determination of building model key points using multidirectional shaded relief images generated from airborne LiDAR data
CN114387488A (en) Road extraction system and method based on Potree point cloud image fusion
Griffioen A voxel-based methodology to detect (clustered) outliers in aerial lidar point clouds

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant