CN114494287A - Long-distance laser radar point cloud data processing method - Google Patents

Long-distance laser radar point cloud data processing method

Info

Publication number
CN114494287A
CN114494287A
Authority
CN
China
Prior art keywords
point cloud
data
dimensional
cloud data
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111671088.6A
Other languages
Chinese (zh)
Inventor
陈钱
丁玲慧
何伟基
张闻文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology filed Critical Nanjing University of Science and Technology
Priority to CN202111671088.6A
Publication of CN114494287A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2135 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on approximation criteria, e.g. principal component analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/23 Clustering techniques
    • G06F18/232 Non-hierarchical techniques
    • G06F18/2321 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Probability & Statistics with Applications (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a long-distance laser radar point cloud data processing method. A target environment is scanned with an opto-mechanical scanning system, the photon flight time of each pixel is acquired, and the collected data are exported and extracted; filter parameters are calculated with a point cloud data segmentation method to achieve parameter self-adaptation; principal component analysis is applied to reduce the dimensionality of the data; point cloud clustering is performed with a density-based spatial clustering denoising (DBSCAN) algorithm and outliers, namely noise points, are deleted; finally, the resulting two-dimensional point cloud is raised back to three-dimensional point cloud data, completing the denoising and filtering of the original point cloud data. The invention effectively reduces algorithm complexity while retaining environmental features.

Description

Long-distance laser radar point cloud data processing method
Technical Field
The invention belongs to the technical field of laser radars, and particularly relates to a method for processing point cloud data of a long-distance laser radar.
Background Art
A laser radar is a system that uses laser light as the signal carrier to detect the distance, azimuth, altitude, speed, intensity and other attributes of a target. When a laser is used to detect remote non-cooperative targets, the method offers strong resistance to electromagnetic interference, good suppression of ground clutter and background noise, strong anti-stealth capability, good range, angle and velocity resolution, and the ability to acquire the distance, velocity and three-dimensional image information of a target simultaneously. By exploiting these advantages of laser light, a laser radar can achieve higher imaging precision than a traditional microwave radar. With the development of laser radar, the demand for weak-signal detection keeps growing. When the detection distance is long, or when the measurement is affected by environmental factors such as rain, fog and dust, the energy of the optical echo received by the detector is extremely weak, often only on the order of a few photons. Such weak signals fall below the detection limit of conventional photoelectric devices, so traditional photodetectors can hardly meet the requirements of modern applications. Against this background, laser radar based on the Geiger-mode avalanche photodiode (Gm-APD) effectively addresses this problem. The Gm-APD operates at a bias voltage slightly above the critical breakdown voltage of its PN junction, so that a single incoming photon triggers an avalanche effect and the arrival time of the photon can be recorded. Compared with other detectors, the Gm-APD has the advantages of small volume, high integration and high reliability. A laser radar built on it can reach single-photon detection sensitivity and offers long ranging distance and high imaging speed. It is mainly applied to long-range laser ranging, underwater target detection, atmospheric pollution measurement, astronomical observation, biological waveguide detection, weak-light wavefront sensing, particle physics, fluorescence medical imaging and other fields.
With the application of new detector materials and the development of manufacturing techniques, laser radar is continuously being improved and developed. Pulsed photon laser radar, pulse-modulated photon laser radar and chirp-modulated photon laser radar have appeared in succession. At present, most research on laser radar focuses on system structure design, manufacturing and detector improvement, while results on system performance analysis and post-acquisition data processing are relatively scarce. Photon-counting laser radar adopts the Gm-APD as its detector, and because of its extremely high sensitivity it is very susceptible to noise. Research on post-acquisition data processing methods can therefore broaden the application range of laser radar and yield more accurate imaging results.
Among existing point cloud filtering algorithms, the Statistical Outlier Removal (SOR) filter distinguishes signal from noise by computing the average distance between each point and its K nearest neighbours, while the radius outlier removal (ROR) filter constructs a sphere centred on each point and removes noise points according to the number of points inside the sphere. The statistical outlier removal filter cannot identify groups of outliers, while the radius outlier removal filter may erroneously remove a large number of useful environmental features. Wang et al. proposed an adaptive ellipsoid searching filter based on the existing radius filter, which takes the centre of an ellipsoid as the target point to be examined and identifies noise according to the number of neighbouring points inside the ellipsoid; however, this algorithm is complex and its real-time performance is not ideal.
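For reference, the following is a minimal NumPy/SciPy sketch of the two prior-art filters summarised above; the function names, parameter values and thresholding rule are illustrative assumptions, not taken from the cited works.

```python
# Illustrative sketches of the prior-art SOR and radius outlier removal filters.
import numpy as np
from scipy.spatial import cKDTree

def sor_filter(points, k=8, std_ratio=1.0):
    """Statistical Outlier Removal: keep points whose mean distance to their
    K nearest neighbours stays within mean + std_ratio * std of the whole cloud."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)      # k+1: each point's nearest neighbour is itself
    mean_knn = dists[:, 1:].mean(axis=1)
    threshold = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn <= threshold]

def ror_filter(points, radius=0.5, min_neighbors=4):
    """Radius Outlier Removal: keep points with at least min_neighbors other
    points inside a sphere of the given radius."""
    tree = cKDTree(points)
    counts = np.array([len(tree.query_ball_point(p, radius)) - 1 for p in points])
    return points[counts >= min_neighbors]
```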
Disclosure of Invention
To address these problems, the invention provides a long-distance laser radar point cloud data processing method that reduces dimensionality with principal component analysis (PCA) and reduces noise with a density-based spatial clustering denoising (DBSCAN) algorithm, effectively reducing algorithm complexity while retaining environmental features.
The technical scheme for realizing the purpose of the invention is as follows: a long-distance laser radar point cloud data processing method comprises the following steps:
scanning a target environment of data to be acquired by using a remote laser radar point cloud acquisition system, acquiring the photon flight time of each pixel point, and exporting and extracting the acquired data;
carrying out region segmentation on the collected data by applying a point cloud data segmentation method, and calculating a clustering parameter corresponding to each region data;
applying a principal component analysis technique to the data of each data block to reduce the dimensionality of the data;
for the data of each data block, performing point cloud clustering with a density-based spatial clustering denoising algorithm, deleting outliers, and raising the resulting two-dimensional point cloud to three-dimensional point cloud data;
and splicing the new three-dimensional data obtained by processing each region to obtain complete new point cloud data subjected to filtering and denoising, and completing denoising and filtering of the original point cloud data.
Preferably, the collected point cloud is three-dimensional data composed of the coordinates of each target pixel and the flight time of the photons returned from that pixel.
Preferably, the collected data are segmented with a spatial segmentation method based on a cylindrical structure, specifically:
dividing the whole space into a plurality of cylindrical blocks by taking a vertical line of the laser radar sensor as a central axis, wherein the space between every two cylinders is a point cloud dividing area;
in the cylinder model, each region is guaranteed to have the same volume, and the annular cross-sections between every two adjacent cylinders satisfy

π r_1² h = π (r_2² − r_1²) h = … = π (r_t² − r_{t-1}²) h,

namely

r_i = √i · r_1,

where r_i (i = 1, 2, …, t) is the radius of the ith cylinder, i.e. the distance from a point on that cylinder to the central axis of the sensor; h is the height of the cylinder and t is the total number of regions into which the space is divided.
Preferably, the specific process of segmenting the collected data with the cylindrical-structure spatial segmentation method is as follows:
step 2.1: determining the total number t of segmentation regions according to the condition that the boundary radius of each region must satisfy, and calculating the radius r_1 of the region boundary closest to the detector, r_1 being determined by the point farthest from the sensor:

r_1 = r_max / √t,

where r_max is the distance from the farthest point to the central axis;
step 2.2: according to the fact that the cross-sectional radius of the ith cylinder is √i times that of the first cylinder, i.e.

r_i = √i · r_1,

and that the volumes of all regions are equal, calculating the boundary radius of each region and completing the region segmentation of the detection space;
step 2.3: dividing the collected original point cloud data into a plurality of data blocks according to the equal-volume division of the spatial regions: calculating, from the flight time of each point, the distance between the spatial point and the vertical line through the laser radar sensor; if the distance lies between r_{i-1} and r_i, the point is assigned to the ith data block;
step 2.4: when denoising and filtering the point cloud data of each region, defining the neighbourhood radius of each region as

Eps_i = k · r_i,

where Eps_i (i = 1, 2, …, t) denotes the neighbourhood radius of the ith region and r_i denotes the distance between the point cloud of region i and the detector; since

r_i = √i · r_1,

it follows that

Eps_i = √i · k · r_1 = √i · Eps_1.
preferably, the specific method for performing principal component analysis on the data of each data block to perform dimensionality reduction on the data is as follows:
step 3.1: arranging the three-dimensional data points into a matrix of 3 rows and m columns, m being the number of three-dimensional data points, and subtracting the mean x̄ of each row so that every row of the resulting matrix ξ is centred at zero; the covariance matrix C is obtained by

C = (1/m) · ξ ξᵀ;
step 3.2: arranging the three eigenvectors into E = (e_1, e_2, e_3), the matrix E satisfying EᵀCE = Λ, where Λ is a diagonal matrix;
obtaining the three eigenvectors and their corresponding eigenvalues, arranging the eigenvectors into a matrix by rows in order of eigenvalue, taking the first two eigenvectors to generate the matrix P, and constructing the two-dimensional matrix according to the formula

Y_{2×m} = P_{2×3} · ξ_{3×m}.
preferably, for the data of each data block, a density-based spatial clustering denoising algorithm is applied to perform point cloud clustering, and a specific method for deleting outliers is as follows:
step 4.1: for each unvisited point p_i in the data set, determining all points in its Eps neighbourhood using the RANGE_QUERY function and collecting them into the subset N_i, the number of points in N_i being |N_i|;
step 4.2: if |N_i| < MinPts, labelling p_i as an outlier; if |N_i| ≥ MinPts, adding all points of N_i to the set S (S ← N_i), MinPts being the density threshold parameter;
step 4.3: for every unvisited point in N_i, collecting all points in its Eps neighbourhood into the subset N_j; if |N_j| ≥ MinPts, adding all points of N_j to the set S (S ← S ∪ N_j);
step 4.4: repeating steps 4.1 to 4.3 until every point in the data has been visited, the data set being divided into k clusters {S_1, S_2, …, S_k}, where the number of points in each cluster S_i (i = 1, 2, …, k) is |S_i|; comparing |S_i| with a threshold ψ: if |S_i| > ψ, S_i is a signal cluster; otherwise it is identified as a noise cluster and removed.
Preferably, the specific formula for raising the obtained two-dimensional point cloud to three-dimensional point cloud data is

ξ'_{3×n} = (P_{2×3})ᵀ · Y'_{2×n} + x̄,

where Y' is the point cloud data set obtained after filtering in the two-dimensional space, x̄ is the mean subtracted from the original matrix, and P is the matrix generated by the first two eigenvectors.
Compared with the prior art, the invention has the following remarkable advantages. First, the PCA technique is used to reduce the dimensionality of the original point cloud data: two-dimensional data are generated by extracting the first and second principal components of the original data, with little loss of information; because the main signal processing is then performed on two-dimensional data, the overall complexity is significantly reduced while noise is effectively removed and environmental features are retained. Second, the invention performs adaptive clustering on the two-dimensional point cloud before it is restored to a three-dimensional point cloud; adaptive parameter setting in the two-dimensional space of the two principal components significantly improves the accuracy of the restored target point cloud image. Third, to address the near-far effect caused by the difference in distance between the viewpoint and the scanned points, the invention adopts a point cloud data segmentation method so that point cloud data with different regional characteristics are processed separately, thereby obtaining a better noise reduction effect.
Drawings
Fig. 1 is a schematic overall framework diagram of a method for processing remote lidar point cloud data according to the present invention.
Fig. 2 is a schematic diagram of a long-distance lidar point cloud acquisition system according to the present invention.
FIG. 3 is a schematic diagram of the principle of the density-based spatial clustering Denoising (DBSCAN) algorithm provided by the present invention.
FIG. 4 is a flow chart of the density-based spatial clustering Denoising (DBSCAN) algorithm proposed by the present invention.
Fig. 5 is a schematic diagram of a method for segmenting point cloud data of a long-distance lidar according to the present invention.
Detailed Description
The following detailed description of the preferred embodiments of the present invention, taken in conjunction with the accompanying drawings, will make the advantages and features of the invention easier for those skilled in the art to understand and will thus define the scope of the invention more clearly.
As shown in fig. 1 to 5, a method for processing point cloud data of a long-distance lidar according to an embodiment of the present invention includes:
step 1: and scanning a target environment of data to be acquired by using a remote laser radar point cloud acquisition system, acquiring the flight time of each pixel point, and exporting and extracting the acquired data. The collected point cloud is three-dimensional data formed by the coordinates of a target pixel point and the flight time of return photons of the pixel point target.
Fig. 2 is a schematic diagram of the long-distance lidar point cloud acquisition system according to the present invention. By function it is divided into six units, including an optical transmitting unit, a beam splitting and expanding unit, a scanning unit, an optical receiving unit and a control unit. The laser has two output ports, a standard output port and a signal monitoring port; a photodetector receives the monitoring-port signal and feeds it into channel 0 of the TCSPC, which records it and marks it as the start time. The pulsed laser from the standard output port is collimated by a collimating mirror and passes through a small hole in the middle of the beam splitter into the optical beam-expanding antenna, where it is further focused and shaped before striking the reflecting surface of the fast steering mirror; after mirror reflection the pulse reaches the target surface, and the fast steering mirror changes the transmit optical path through biaxial oscillation to scan the target. The target diffusely reflects the pulsed laser back along the original optical path; after passing through the fast steering mirror and the optical beam-expanding antenna, the echo pulse is reflected by the reflecting surface of the beam splitting unit, passes in turn through a mirror, a filter and a focusing lens, and is coupled into the single-photon detector through an optical fiber. An incident photon triggers an avalanche in the single-photon detector; the quantized response signal is input into channel 1 of the TCSPC, which records the channel-1 photon response time and marks it as the stop time. The FPGA controller reads the data of channels 0 and 1 of the TCSPC, the host computer calculates the time difference between the two channels to obtain the photon flight time data, and the original target point cloud data are obtained after subsequent processing.
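As an illustration of this acquisition step, the sketch below assembles the raw point cloud from hypothetical per-pixel TCSPC timestamps; the variable names and array layout are assumptions, the only point being that each raw point is (pixel row, pixel column, photon flight time) as described above.

```python
# Illustrative sketch (hypothetical variable names): the difference between the
# TCSPC stop time (channel 1) and start time (channel 0) gives the photon flight
# time of each scanned pixel; each raw point is (pixel row, pixel column, flight time).
import numpy as np

def build_raw_point_cloud(start_times, stop_times, pixel_rows, pixel_cols):
    """All inputs are 1-D arrays with one entry per detected pixel event."""
    tof = stop_times - start_times                            # photon flight time per pixel
    return np.stack([pixel_rows, pixel_cols, tof], axis=1)    # (m, 3) raw point cloud
```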
Step 2: the point cloud density is strongly correlated with the spatial distribution of the point cloud and depends on the distance from the lidar sensor: the farther from the sensor, the sparser the signal points obtained by clustering become, and the larger the corresponding clustering parameters should be. Therefore, before the point cloud data are processed, they must be divided into regions and the clustering parameter corresponding to each region must be calculated. The invention provides a spatial segmentation method with a cylindrical structure: as shown in Fig. 5 (left), the whole space is divided into a plurality of cylindrical blocks with the vertical line through the lidar sensor as the central axis, and the space between every two adjacent cylinders is one segmentation region of the point cloud. In the cylinder model, to ensure that each region has the same volume, the annular cross-sections between every two adjacent cylinders (shown in Fig. 5 (right)) should satisfy

π r_1² h = π (r_2² − r_1²) h = … = π (r_t² − r_{t-1}²) h,

where r_i (i = 1, 2, …, t) is the radius of the ith cylinder, i.e. the distance from a point on that cylinder to the central axis of the sensor, and h is the height of the cylinder.

Step 2.1: determine the total number t of segmentation regions according to the condition that the boundary radius of each region must satisfy, and calculate the radius r_1 of the region boundary closest to the detector; r_1 is determined by the point farthest from the sensor:

r_1 = r_max / √t,

where r_max is the distance from the farthest point to the central axis.

Step 2.2: according to the fact that the cross-sectional radius of the ith cylinder is √i times that of the first cylinder, i.e.

r_i = √i · r_1,

and that the volumes of all regions are equal, calculate the boundary radius of each region and complete the region segmentation of the detection space.

Step 2.3: divide the collected original point cloud data into a plurality of data blocks according to the equal-volume division of the spatial regions: using the flight time of each point, calculate the distance between the spatial point and the vertical line through the lidar sensor; if the distance lies between r_{i-1} and r_i, the point is assigned to the ith data block.

Step 2.4: when the point cloud data of each region are denoised and filtered, the Eps of each region is defined as

Eps_i = k · r_i,

where Eps_i (i = 1, 2, …, t) denotes the neighbourhood radius of the ith region and r_i denotes the distance between the point cloud of region i and the detector. Since

r_i = √i · r_1,

it follows that

Eps_i = √i · k · r_1 = √i · Eps_1.
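A minimal sketch of this region segmentation and of the per-region Eps_i follows. It assumes the point cloud has already been converted into an (m, 3) Cartesian array whose z-axis is the sensor's vertical line (the patent derives the axial distance from the photon flight time instead); the helper name and defaults are illustrative.

```python
# Illustrative sketch of the cylindrical region segmentation of step 2.
import numpy as np

def segment_regions(points, t, k):
    """Split an (m, 3) point array into t equal-volume cylindrical shells and
    return the per-region blocks together with their neighbourhood radii Eps_i."""
    dist = np.hypot(points[:, 0], points[:, 1])       # distance to the central axis
    r_max = dist.max()                                # farthest point from the axis
    r1 = r_max / np.sqrt(t)                           # step 2.1
    radii = np.sqrt(np.arange(1, t + 1)) * r1         # step 2.2: r_i = sqrt(i) * r_1
    eps = k * radii                                   # step 2.4: Eps_i = k * r_i
    region = np.clip(np.searchsorted(radii, dist), 0, t - 1)   # step 2.3: enclosing shell
    blocks = [points[region == i] for i in range(t)]
    return blocks, eps
```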
Step 3: before the two-dimensional density-based spatial clustering (DBSCAN) algorithm is used to cluster the point cloud, the raw three-dimensional data need to be preprocessed. After the point cloud data segmentation is completed, a principal component analysis technique is applied to the data of each data block to extract the principal components and convert the three-dimensional data into two-dimensional data. The dimensionality reduction technique based on principal component analysis adopted by the invention extracts the main features of the data, and the eigenvector corresponding to the smallest eigenvalue is usually related to noise. By extracting the first and second principal components of the original data and discarding the noise-related eigenvector to generate two-dimensional data, the complexity of the algorithm can be reduced while the target features needed for reconstruction are retained.
Step 3.1: assuming the number of three-dimensional data points is m, the data points are first arranged into a matrix of 3 rows and m columns. The mean x̄ of each row is subtracted to ensure that every row of the resulting matrix ξ is centred at zero, i.e. the average value of each row is zero. The covariance matrix C is then obtained by

C = (1/m) · ξ ξᵀ.
step 3.2: arranging the three feature vectors into E ═ E (E)1,e2,e3) The matrix E needs to satisfy ETCE ═ Λ, where Λ is the diagonal matrix. The three eigenvectors and their corresponding eigenvalues can be obtained from the above equation. Since the variance contribution rate of the first principal component and the second principal component exceeds 95%, the third principal component contains less information and is deleted, and the first two eigenvectors are reserved. The eigenvectors are arranged into a matrix according to the rows of the eigenvalues, and the first two eigenvectors are taken to generate a matrix P. Constructing a two-dimensional matrix by operating the following equation:
Y2×m=P2×3×ξ3×m
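Steps 3.1 and 3.2 can be sketched with NumPy as follows, assuming each data block is an (m, 3) array; the helper name pca_reduce and the returned values are illustrative, and the mean and projection matrix are kept for the later dimension-raising step.

```python
# Illustrative sketch of the PCA dimension reduction of step 3.
import numpy as np

def pca_reduce(block):
    """Reduce an (m, 3) data block to its first two principal components."""
    xi = block.T                               # 3 x m matrix
    mean = xi.mean(axis=1, keepdims=True)      # column mean vector x_bar
    xi_c = xi - mean                           # centre every row at zero
    C = xi_c @ xi_c.T / xi_c.shape[1]          # 3 x 3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)       # eigen-decomposition (ascending order)
    order = np.argsort(eigvals)[::-1]          # sort by descending eigenvalue
    P = eigvecs[:, order[:2]].T                # 2 x 3: first two principal directions
    Y = P @ xi_c                               # 2 x m reduced data
    return Y, mean, P
```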
Step 4: based on the long-distance laser radar point cloud acquisition method and the dimensionality reduction theory above, the invention applies a density-based spatial clustering denoising (DBSCAN) algorithm to filter the two-dimensional point cloud in the plane. Assume that the point cloud data sample set is D = (x_1, x_2, …, x_n). For a sample x_j ∈ D, its E-neighbourhood set is the set of samples in D whose distance from x_j is not greater than E, i.e. N_E(x_j) = {x_i ∈ D | d(x_i, x_j) ≤ E}, and the number of samples in this set is |N_E(x_j)|:

for a sample x_j ∈ D, if the number of samples contained in its E-neighbourhood set is not less than P_min, i.e. |N_E(x_j)| ≥ P_min, then x_j is a core object;

for a sample x_j ∈ D, if the number of samples contained in its E-neighbourhood set is less than P_min, i.e. |N_E(x_j)| < P_min, then x_j is a boundary point;

for a sample x_j ∈ D, if the number of samples contained in its E-neighbourhood set is less than P_min and N_E(x_j) contains no core object, then x_j is an isolated point.
Fig. 3 is a schematic diagram of the DBSCAN algorithm, in which point A is a core point, points B and C are boundary points, and point N is an isolated point. The core points and the boundary points together form the signal points, and the isolated points form the outliers. The DBSCAN algorithm can find all dense areas of sample points and treat these dense areas as clusters. By counting the number of points within each cluster and comparing it with a threshold, noise can be identified: point cloud clusters containing more points than the threshold are identified as signal clusters; otherwise they are identified as noise clusters.
The original three-dimensional point cloud is represented by a set of unordered points:

ξ = {(x_i, y_i, z_i)ᵀ, i = 1, 2, …, m},

where (x_i, y_i, z_i)ᵀ ∈ R³ is a coordinate vector and m is the total number of points in the point cloud.
According to the dimensionality reduction technique based on principal component analysis provided by the invention, the original three-dimensional point cloud data can be converted into two-dimensional data. Each two-dimensional point is composed of a first principal component and a second principal component, denoted by f and s respectively. The point cloud input into the DBSCAN clustering can thus be represented as

Y = {(f_i, s_i)ᵀ, i = 1, 2, …, m},

where (f_i, s_i)ᵀ ∈ R² is a coordinate vector in the two-dimensional space formed by the two principal components. Fig. 4 is a flow chart of the density-based spatial clustering (DBSCAN) algorithm proposed by the invention; Data refers to the data set consisting of the first principal component f and the second principal component s. In the two-dimensional plane, the distance between two points is calculated as

d(p_i, p_j) = √((f_i − f_j)² + (s_i − s_j)²).
step 4.1: for each non-access point p in the datasetiDetermine all its points in the Eps field using the RANGE _ QUERY function and collect these points into the subset NiWherein the subset NiThe number of points in is | Ni|。
Step 4.2: if | NiIf | is less than min pts, then piLabeling as outliers; | Ni| is more than or equal to min pts, and the subset NiAdding all points in (S ← N) to the set Si) In (1).
Step 4.3: for subset NiAll points in the Eps field of the points which are not visited are collected into a subset NjIn (1). If | Nj| is more than or equal to min pts, and the subset NjAdding all points in the set S (S ← S ≧ N ^ N)i) In (1).
Step 4.4: repeating steps 4.1 to 4.3 until each point in the data is traversed, the data set being divided into k clusters S1,S2,…,Sk}. Wherein each cluster SiThe number of points in (i ═ 1,2, …, k) is | SiBy mixing | SiCompare | with a threshold ψ if | SiIf | > psi, SiIs a signal cluster; otherwise it is identified as a noise cluster and removed.
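A possible implementation of step 4 is sketched below using scikit-learn's DBSCAN in place of the RANGE_QUERY-based pseudocode above; eps corresponds to the per-region Eps_i, min_pts to MinPts, and psi to the signal-cluster threshold ψ. Clusters smaller than ψ, together with the points DBSCAN itself labels as noise, are discarded.

```python
# Illustrative sketch of the point cloud clustering and noise removal of step 4.
import numpy as np
from sklearn.cluster import DBSCAN

def dbscan_denoise(Y, eps, min_pts, psi):
    """Y is the 2 x m matrix of principal components; returns the filtered 2 x n data."""
    pts = Y.T                                               # m x 2 samples (f_i, s_i)
    labels = DBSCAN(eps=eps, min_samples=min_pts).fit_predict(pts)
    keep = np.zeros(len(pts), dtype=bool)
    for lab in np.unique(labels):
        if lab == -1:
            continue                                        # DBSCAN outliers are discarded
        members = labels == lab
        if members.sum() > psi:                             # |S_i| > psi -> signal cluster
            keep |= members
    return Y[:, keep]
```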
Step 5: an inverse PCA dimension-raising calculation is performed on the new two-dimensional data obtained after denoising and filtering. The filtered two-dimensional point cloud can be represented as Y' = {(f_i, s_i)ᵀ, i = 1, 2, …, n}, the set of filtered coordinate vectors in the two-dimensional space, where n is the number of points remaining after filtering. After the noise is removed, the two-dimensional data need to be restored to three-dimensional data, which can be obtained by the following formula:
ξ'_{3×n} = (P_{2×3})ᵀ · Y'_{2×n} + x̄,

where Y' is the point cloud data set obtained after filtering in the two-dimensional space, x̄ is the mean subtracted from the original matrix, and P is the matrix generated by the first two eigenvectors.
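Step 5 is then the inverse of the PCA projection, reusing the mean and projection matrix returned by the hypothetical pca_reduce helper above:

```python
# Illustrative sketch of the dimension-raising calculation of step 5.
def pca_restore(Y_filtered, mean, P):
    """Restore the filtered 2 x n data to an (n, 3) three-dimensional point array."""
    xi_new = P.T @ Y_filtered + mean    # (3x2)(2xn) + x_bar  ->  3 x n
    return xi_new.T
```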
Step 6: repeat steps 3 to 5, i.e. the processes of reducing the dimensionality of the point cloud data of every region with the principal component analysis technique, denoising with the density-based spatial clustering algorithm and performing the inverse PCA dimension-raising calculation, and then splice the new three-dimensional data obtained for each region to obtain the complete, filtered and denoised new point cloud data.
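Tying the steps together, the sketch below processes each region independently and splices the results as described in step 6. It relies on the hypothetical helpers sketched above, and the default parameter values are placeholders, not values prescribed by the patent.

```python
# Illustrative end-to-end sketch: segment, reduce, denoise, restore, splice.
import numpy as np

def process_point_cloud(points, t=8, k=0.05, min_pts=6, psi=30):
    blocks, eps = segment_regions(points, t, k)           # step 2
    restored = []
    for block, eps_i in zip(blocks, eps):
        if len(block) < 3:
            continue                                      # skip near-empty regions
        Y, mean, P = pca_reduce(block)                    # step 3
        Y_f = dbscan_denoise(Y, eps_i, min_pts, psi)      # step 4
        if Y_f.shape[1] > 0:
            restored.append(pca_restore(Y_f, mean, P))    # step 5
    # step 6: splice all filtered regions into the final point cloud
    return np.vstack(restored) if restored else np.empty((0, 3))
```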

Claims (7)

1. A long-distance laser radar point cloud data processing method is characterized by comprising the following steps:
scanning a target environment of data to be acquired by using a remote laser radar point cloud acquisition system, acquiring the photon flight time of each pixel point, and exporting and extracting the acquired data;
carrying out region segmentation on the collected data by applying a point cloud data segmentation method, and calculating a clustering parameter corresponding to each region data;
applying a principal component analysis technique to the data of each data block to reduce the dimensionality of the data;
for the data of each data block, performing point cloud clustering with a density-based spatial clustering denoising algorithm, deleting outliers, and raising the resulting two-dimensional point cloud to three-dimensional point cloud data;
and splicing the new three-dimensional data obtained by processing each region to obtain complete new point cloud data subjected to filtering and denoising, and completing denoising and filtering of the original point cloud data.
2. The method of claim 1, wherein the collected point cloud is three-dimensional data consisting of the coordinates of each target pixel and the flight time of the photons returned from that pixel.
3. A remote lidar point cloud data processing method according to claim 1, characterized in that the collected data is segmented using a spatial segmentation method of a cylindrical structure, specifically:
dividing the whole space into a plurality of cylindrical blocks by taking a vertical line of the laser radar sensor as a central axis, wherein the space between every two cylinders is a point cloud dividing area;
in the cylinder model, each region is guaranteed to have the same volume, and the annular cross-sections between every two adjacent cylinders satisfy

π r_1² h = π (r_2² − r_1²) h = … = π (r_t² − r_{t-1}²) h,

namely

r_i = √i · r_1,

where r_i (i = 1, 2, …, t) is the radius of the ith cylinder, i.e. the distance from a point on that cylinder to the central axis of the sensor; h is the height of the cylinder and t is the total number of regions into which the space is divided.
4. A long-range lidar point cloud data processing method as claimed in claim 3, wherein the specific process of segmenting the collected data with the cylindrical-structure spatial segmentation method is as follows:
step 2.1: determining the total number t of segmentation regions according to the condition that the boundary radius of each region must satisfy, and calculating the radius r_1 of the region boundary closest to the detector, r_1 being determined by the point farthest from the sensor:

r_1 = r_max / √t,

where r_max is the distance from the farthest point to the central axis;
step 2.2: according to the fact that the cross-sectional radius of the ith cylinder is √i times that of the first cylinder, i.e.

r_i = √i · r_1,

and that the volumes of all regions are equal, calculating the boundary radius of each region and completing the region segmentation of the detection space;
step 2.3: dividing the collected original point cloud data into a plurality of data blocks according to the equal-volume division of the spatial regions: calculating, from the flight time of each point, the distance between the spatial point and the vertical line through the laser radar sensor; if the distance lies between r_{i-1} and r_i, the point is assigned to the ith data block;
step 2.4: when denoising and filtering the point cloud data of each region, defining the neighbourhood radius of each region as

Eps_i = k · r_i,

where Eps_i (i = 1, 2, …, t) denotes the neighbourhood radius of the ith region and r_i denotes the distance between the point cloud of region i and the detector; since

r_i = √i · r_1,

it follows that

Eps_i = √i · k · r_1 = √i · Eps_1.
5. the long-distance lidar point cloud data processing method of claim 1, wherein the specific method for performing principal component analysis on the data of each data block to reduce the dimension of the data is as follows:
step 3.1: arranging the three-dimensional data points into a matrix of 3 rows and m columns, m being the number of three-dimensional data points, and subtracting the mean x̄ of each row so that every row of the resulting matrix ξ is centred at zero; the covariance matrix C is obtained by

C = (1/m) · ξ ξᵀ;
step 3.2: arranging the three eigenvectors into E = (e_1, e_2, e_3), the matrix E satisfying EᵀCE = Λ, where Λ is a diagonal matrix;
obtaining the three eigenvectors and their corresponding eigenvalues, arranging the eigenvectors into a matrix by rows in order of eigenvalue, taking the first two eigenvectors to generate the matrix P, and constructing the two-dimensional matrix according to the formula

Y_{2×m} = P_{2×3} · ξ_{3×m}.
6. the long-distance lidar point cloud data processing method of claim 1, wherein for the data of each data block, a density-based spatial clustering denoising algorithm is applied to perform point cloud clustering, and the specific method for deleting outliers is as follows:
step 4.1: for each unvisited point p_i in the data set, determining all points in its Eps neighbourhood using the RANGE_QUERY function and collecting them into the subset N_i, the number of points in N_i being |N_i|;
step 4.2: if |N_i| < MinPts, labelling p_i as an outlier; if |N_i| ≥ MinPts, adding all points of N_i to the set S (S ← N_i), MinPts being the density threshold parameter;
step 4.3: for every unvisited point in N_i, collecting all points in its Eps neighbourhood into the subset N_j; if |N_j| ≥ MinPts, adding all points of N_j to the set S (S ← S ∪ N_j);
step 4.4: repeating steps 4.1 to 4.3 until every point in the data has been visited, the data set being divided into k clusters {S_1, S_2, …, S_k}, where the number of points in each cluster S_i (i = 1, 2, …, k) is |S_i|; comparing |S_i| with a threshold ψ: if |S_i| > ψ, S_i is a signal cluster; otherwise it is identified as a noise cluster and removed.
7. The method for processing the point cloud data of the long-distance laser radar as claimed in claim 1, wherein the specific formula for raising the obtained two-dimensional point cloud to three-dimensional point cloud data is

ξ'_{3×n} = (P_{2×3})ᵀ · Y'_{2×n} + x̄,

where Y' is the point cloud data set obtained after filtering in the two-dimensional space, x̄ is the mean subtracted from the original matrix, and P is the matrix generated by the first two eigenvectors.
CN202111671088.6A 2021-12-31 2021-12-31 Long-distance laser radar point cloud data processing method Pending CN114494287A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111671088.6A CN114494287A (en) 2021-12-31 2021-12-31 Long-distance laser radar point cloud data processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111671088.6A CN114494287A (en) 2021-12-31 2021-12-31 Long-distance laser radar point cloud data processing method

Publications (1)

Publication Number Publication Date
CN114494287A true CN114494287A (en) 2022-05-13

Family

ID=81508184

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111671088.6A Pending CN114494287A (en) 2021-12-31 2021-12-31 Long-distance laser radar point cloud data processing method

Country Status (1)

Country Link
CN (1) CN114494287A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114897026A (en) * 2022-05-24 2022-08-12 上海枢光科技有限公司 Point cloud filtering method
CN115512099A (en) * 2022-06-10 2022-12-23 探维科技(北京)有限公司 Laser point cloud data processing method and device
CN116105694A (en) * 2022-12-09 2023-05-12 中国科学院上海技术物理研究所 Multi-means optical load composite space target three-dimensional vision measurement method
CN116105694B (en) * 2022-12-09 2024-03-12 中国科学院上海技术物理研究所 Multi-means optical load composite space target three-dimensional vision measurement method
CN115656982A (en) * 2022-12-13 2023-01-31 中国南方电网有限责任公司超高压输电公司广州局 Water surface clutter removal method and device, computer equipment and storage medium
CN115641553B (en) * 2022-12-26 2023-03-10 太原理工大学 Online detection device and method for invaders in heading machine working environment
CN117148279A (en) * 2023-09-05 2023-12-01 中国电子科技集团公司第三十八研究所 Single-photon laser radar real-time signal detection method, system and electronic equipment
CN117272086A (en) * 2023-11-22 2023-12-22 中国电子科技集团公司第二十九研究所 Radar signal scanning envelope segmentation method based on DBSCAN
CN117272086B (en) * 2023-11-22 2024-02-13 中国电子科技集团公司第二十九研究所 Radar signal scanning envelope segmentation method based on DBSCAN

Similar Documents

Publication Publication Date Title
CN114494287A (en) Long-distance laser radar point cloud data processing method
US20240036207A1 (en) Multiple Resolution, Simultaneous Localization And Mapping Based On 3-D Lidar Measurements
Duan et al. Low-complexity point cloud denoising for LiDAR by PCA-based dimension reduction
CN112526513B (en) Millimeter wave radar environment map construction method and device based on clustering algorithm
CN111665517B (en) Density statistics-based single photon laser height finding data denoising method and device
CN111123212B (en) Signal processing method of scene surveillance radar based on complex clutter background
CN111781608A (en) Moving target detection method and system based on FMCW laser radar
CN105160649A (en) Multi-target tracking method and system based on kernel function unsupervised clustering
CN111913177A (en) Method and device for detecting target object and storage medium
Wang et al. An adaptive ellipsoid searching filter for airborne single-photon lidar
CN112130142A (en) Micro Doppler feature extraction method and system for complex moving target
Xu et al. Plane segmentation and fitting method of point clouds based on improved density clustering algorithm for laser radar
Stolz et al. High resolution automotive radar data clustering with novel cluster method
Chen et al. A graph-based track-before-detect algorithm for automotive radar target detection
CN115761534A (en) Method for detecting and tracking small target of infrared unmanned aerial vehicle under air background
CN115685185A (en) 4D millimeter wave radar and vision fusion perception method
CN112183330A (en) Target detection method based on point cloud
CN111311640B (en) Unmanned aerial vehicle identification and tracking method based on motion estimation
Zaletnyik et al. LIDAR waveform classification using self-organizing map
CN115453570A (en) Multi-feature fusion mining area dust filtering method
CN113177966B (en) Three-dimensional scanning coherent laser radar point cloud processing method based on velocity clustering statistics
Ni et al. Research on 3D image reconstruction of sparse power lines by array gm-apd lidar
Wang et al. Point cloud classification and accuracy analysis based on feature fusion
Wang et al. DBSCAN clustering algorithm of millimeter wave radar based on multi frame joint
Zhao et al. 3D target detection of Geiger mode APD array lidar image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination