CN110969624A - Laser radar three-dimensional point cloud segmentation method - Google Patents
- Publication number
- CN110969624A (application CN201911082345.5A)
- Authority
- CN
- China
- Prior art keywords
- point cloud
- segmentation
- dimensional
- point
- neighborhood
- Prior art date
- Legal status
- Granted
Classifications
- G06T7/10—Segmentation; Edge detection
- G06T7/187—Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
- G06T7/66—Analysis of geometric attributes of image moments or centre of gravity
- G06T17/005—Tree description, e.g. octree, quadtree
- G06T2207/10028—Range image; Depth image; 3D point clouds
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10044—Radar image
- G06T2207/20101—Interactive definition of point of interest, landmark or seed
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Landscapes
- Engineering & Computer Science; Physics & Mathematics; General Physics & Mathematics; Theoretical Computer Science; Computer Vision & Pattern Recognition; Geometry; Computer Graphics; Software Systems; Optical Radar Systems And Details Thereof
Abstract
The invention discloses a laser radar three-dimensional point cloud segmentation method. First, original three-dimensional point cloud data acquired by a laser radar is extracted. The original point cloud is then preprocessed, including denoising, simplification and coordinate transformation, constructing basic point cloud data in a three-dimensional Cartesian coordinate system and storing the three-dimensional data in the form of a two-dimensional array. Next, a variable neighborhood dispersed search strategy is adopted: the region growing method dynamically adjusts the neighborhood range around the seed and the resolution of the search matching range to perform preliminary point cloud segmentation. On this basis, a point cloud segmentation envelope diffusion strategy is designed to further search the periphery of each point cloud segmentation set and fuse multiple sets, obtaining the final point cloud segmentation sets. Finally, a visualization function for the point cloud segmentation result is designed for checking the point cloud segmentation effect. The invention effectively improves the segmentation rate, effectively suppresses over-segmentation, preserves the integrity of each target, and makes it convenient to observe the scanning result of each segmented target.
Description
Technical Field
The invention relates to the technical field of computer vision, and in particular to a laser radar three-dimensional point cloud segmentation method that can rapidly segment three-dimensional point cloud data into independent point cloud sets.
Background
With the development of science and technology, the acquisition of surrounding information while a ship is underway has gradually expanded from two-dimensional images based mainly on optical imaging to three-dimensional images based mainly on laser scanning. Thanks to its high precision, interference resistance, small volume and light weight, the marine laser radar is widely used as a sensor for acquiring three-dimensional point cloud data and is increasingly installed high on ships' masts.
Most common three-dimensional point cloud segmentation methods are based on the traditional region growing method, which can segment connected regions sharing the same characteristics while maintaining good boundary performance. For example, in the document 'Fast 3D point cloud segmentation combining voxel geometry and color information', Huang Zhen proposes a 3D point cloud segmentation method that uses a new measurement variable capturing the geometry and color information of the scanned target to obtain a hierarchical structure of the retained segmented regions, achieving good segmentation efficiency. Similarly, in 'An improved region growing color 3D point cloud segmentation algorithm', Wang Wen introduces a fusion clustering idea into the region growing method, setting a clustering threshold with reference to the normal angle and color difference in the local neighborhood to accomplish the three-dimensional point cloud segmentation task; this method improves point cloud segmentation speed and stability. In the three-dimensional point cloud segmentation process, most of these documents target segmentation speed or stability, but segmentation precision remains unsatisfactory. In particular, for marine laser radar point cloud segmentation, the point cloud under-segmentation caused by multiple ships occluding each other and the point cloud over-segmentation caused by a large ship close to the laser radar have not been effectively addressed, so a method for rapidly and accurately segmenting marine laser radar three-dimensional point clouds is needed.
Disclosure of Invention
In view of the prior art, the invention aims to provide a laser radar three-dimensional point cloud segmentation method that effectively improves segmentation precision and speed.
In order to solve this technical problem, the laser radar three-dimensional point cloud segmentation method provided by the invention comprises the following steps:
step 1: acquiring laser radar three-dimensional point cloud data:
acquiring original three-dimensional point cloud data through a laser radar, the point cloud data being represented by the triple (r, θ, ψ), where r is the distance of the scanning point, θ the horizontal direction angle, and ψ the vertical direction angle;
step 2: the method comprises the following steps of preprocessing an original three-dimensional point cloud:
step 2.1: denoising of the original three-dimensional point cloud: calculate the average distance R_ave from each point to all points in its neighborhood; assuming the resulting distribution is a Gaussian distribution with a mean and a standard deviation, set an outlier judgment boundary radius R_set and compare R_ave with R_set: when R_ave ≤ R_set, the point is retained; when R_ave > R_set, the point is judged to be an outlier and deleted from the original point cloud data set;
step 2.2: simplifying down-sampling of an original three-dimensional point cloud:
utilizing voxel grid filtering to down-sample the point cloud, specifically: divide the point cloud into cubic voxels using an octree, calculate the center of gravity of the points in each cubic voxel, and replace all points within a voxel by the center-of-gravity point;
step 2.3: two-dimensional representation of a three-dimensional point cloud:
converting the three-dimensional vector coordinates into three-dimensional rectangular coordinates; taking θ as the angle in the horizontal plane and ψ as the elevation angle, the conversion formula is:
x = r·cosψ·cosθ, y = r·cosψ·sinθ, z = r·sinψ
where (x, y, z) in the above formula are the coordinates of each point in the point cloud data in the three-dimensional rectangular coordinate system;
the relationship between a two-dimensional array coordinate (x, y), the distance data r_xy stored at that position, and the corresponding original three-dimensional point cloud vector coordinates (r, θ, ψ) can be represented by the following formula:
x = θ/α_h, y = ψ/α_v, r_xy = r
(x, y, r_xy) is the preprocessed point cloud data, where α_h is the horizontal angular resolution of the laser radar and α_v is the vertical angular resolution;
step 3: point cloud preliminary segmentation based on a variable neighborhood dispersed search strategy:
the variable neighborhood dispersed search strategy is specifically as follows:
further expanding the eight-neighborhood outwards to form a larger neighborhood, with the number of outward expansion rings set dynamically: letting ω be the outward detection distance of the laser radar in the horizontal direction, h the outward detection distance in the vertical direction, and r the scanning distance of the seed point, the expansion cell numbers N_x_exd and N_y_exd in the horizontal and vertical directions satisfy:
N_x_exd = ⌈ω/(r·α_h)⌉, N_y_exd = ⌈h/(r·α_v)⌉
the traversal range is simplified, namely a non-search range is appropriately selected within the variable neighborhood range: after the eight-neighborhood of the selected seed is expanded, a non-search range of width W_Unser in the horizontal direction and height H_Unser in the vertical direction is defined, so that no point cloud search is carried out in the band between the eight-neighborhood of the selected seed and the outermost ring of the expanded neighborhood;
performing preliminary point cloud segmentation according to the variable neighborhood dispersed search strategy;
step 4: point cloud re-segmentation based on a preliminary segmentation envelope diffusion strategy:
obtaining a point cloud set according to the region growing method in step 3, extracting the segmentation envelope boundary of the set and expanding it outwards by the distance σ along the normal direction: calculate the center of gravity O of the set, select any boundary point N of the set, and calculate the distance d_ON between the two points; the expansion distances σ_xec in the horizontal direction and σ_yec in the vertical direction are then determined from σ, O, N and d_ON;
then compare whether the expanded range intersects the other sets: if there is an intersection, the two sets can be fused into one set; if there is no intersection, they remain two different sets.
The invention also includes:
1. visualizing the point cloud segmentation result, specifically: and displaying the point cloud segmentation result in a three-dimensional scene by adopting an OpenGL three-dimensional scene display technology.
2. The preliminary point cloud segmentation of step 3 specifically comprises the following steps:
step 3.1: sequentially scan the point cloud data, find the 1st data point without an attributed set as a seed, and record this point as S(x0, y0);
step 3.2: with S(x0, y0) as the center, determine the neighborhood and the non-search range according to the coordinate position of the seed point; for a point S_new(x*, y*) within the effective search range, calculate the spatial distance d_S between S_new(x*, y*) and S(x0, y0) as follows:
d_S = √((x* − x0)² + (y* − y0)² + (z* − z0)²)
In the above formula, (x*, y*, z*) are the spatial coordinates of the point and (x0, y0, z0) those of the seed. Set the judgment distance to d meters; when d_S < d, the region growing criterion is satisfied, meaning S_new(x*, y*) and S(x0, y0) belong to the same set, and S_new(x*, y*) is pushed onto a stack;
step 3.3: take a point out of the stack, use it as the new seed point, repeat step 3.2, and continue searching for points belonging to the same set;
step 3.4: when the stack is empty, return to step 3.1;
step 3.5: repeat step 3.1 to step 3.4 until every point in the three-dimensional point cloud has an attributed set, at which point region growing is finished.
The invention has the beneficial effects that: aiming at the problems of segmentation precision and speed in the laser radar three-dimensional point cloud segmentation process, the invention improves the traditional region growing method by dynamically changing the neighborhood scanning range according to the distance from the laser radar, and proposes a dispersed search strategy that avoids repeated judgment of part of the point cloud and solves the over-segmentation problem caused by the differing arc lengths between adjacent points at different distances. On the basis of the preliminary segmentation, an envelope diffusion strategy is proposed to further match and associate the point cloud sets, solving the three-dimensional point cloud under-segmentation problem; together these realize the task of improving point cloud segmentation precision and speed while guaranteeing target integrity.
On the basis of the original point cloud, filtering and down-sampling denoise and simplify the three-dimensional point cloud data, and the three-dimensional point cloud is stored in a two-dimensional planar form, facilitating subsequent point cloud processing. The variable neighborhood dispersed search strategy preserves the invariance of the point cloud's geometric characteristics in three-dimensional space while improving the segmentation rate and effectively suppressing over-segmentation. The preliminary segmentation envelope diffusion strategy performs a re-diffusion search on the preliminarily segmented sets, effectively suppressing under-segmentation and preserving the integrity of each target. The segmentation effect of the algorithm is displayed visually, making it convenient to observe the scanning result of each segmented target ship. In conclusion, the invention realizes rapid and accurate segmentation of laser radar three-dimensional point clouds.
Drawings
FIG. 1 is a flow chart of the three-dimensional point cloud segmentation process of the present invention;
FIG. 2 is a schematic view of a laser radar scan;
FIG. 3a is a schematic diagram of the point cloud denoising case for point A of the invention;
FIG. 3b is a schematic diagram of the point cloud denoising case for point B of the invention;
FIG. 4 is a schematic diagram of a down-sampling principle of a three-dimensional point cloud;
FIG. 5 is a schematic diagram of a laser radar point cloud data expansion;
FIG. 6 is a diagram of a variable neighborhood range;
FIG. 7 is a simplified traversal range diagram;
FIG. 8 is a schematic diagram of the preliminary segmentation envelope diffusion strategy;
FIG. 9 shows the effect of point cloud segmentation;
FIG. 10 is a graph of the effect of over-segmentation in the conventional region growing method;
FIG. 11 is a graph of the effect of suppressing over-segmentation in accordance with the present invention;
FIG. 12 is a diagram of the effect of under-segmentation in the conventional region growing method;
FIG. 13 is a graph of the effect of suppressing under-segmentation according to the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
With reference to Fig. 1, the laser radar three-dimensional point cloud segmentation process of the invention proceeds as follows. First, the original three-dimensional point cloud data acquired by a 360° laser radar is extracted. The original point cloud is then preprocessed, including denoising, simplification and coordinate transformation, constructing the basic point cloud data in a three-dimensional Cartesian coordinate system and storing the three-dimensional data in the form of a two-dimensional array. Next, a variable neighborhood dispersed search strategy is adopted: the region growing method dynamically adjusts the neighborhood range around the seed and the resolution of the search matching range to perform preliminary point cloud segmentation. On this basis, a point cloud segmentation envelope diffusion strategy is designed to further search the periphery of each point cloud segmentation set and fuse multiple sets, yielding the final point cloud segmentation sets. Finally, a visualization function for the point cloud segmentation result is designed, dedicated to viewing the point cloud segmentation effect.
1. Acquisition of laser radar three-dimensional point cloud data
The three-dimensional point cloud data of the invention comes from a marine laser radar installed high on the ship's mast above the deck, which rotates 360° in the horizontal direction to scan the other ships around; a laser radar scanning schematic is shown in Fig. 2.
The raw point cloud data acquired by the laser radar is represented by the triple (r, θ, ψ), where r is the distance of the scanning point, θ the horizontal direction angle, and ψ the vertical direction angle.
2. Preprocessing of raw three-dimensional point clouds
(1) Denoising of original three-dimensional point cloud
In the invention, the measurement error of the marine laser radar hardware is taken into account: the original point cloud always contains noise interference producing a small number of outliers, so before point cloud segmentation the original point cloud data is denoised. The method is based on the distribution of distances from each point to its neighborhood: through statistical analysis of each point's neighborhood, the average distance from the point to all points in the neighborhood is calculated; assuming the resulting distribution is Gaussian with a mean and a standard deviation, whether a laser radar scan point is retained is decided by comparing this average distance with the set boundary radius.
With reference to Fig. 3a and Fig. 3b, to determine whether point A and point B (the center points in the figures) are retained, the outlier judgment boundary radius is set to R_set = 0.4 m. The average distance R_ave from point A and from point B to all points in their respective neighborhoods (indicated by the arrows in the figures) is compared with R_set to decide whether each point is retained.
According to this comparison, point A is retained, while point B, whose distance to the other points is too large, is judged to be an outlier and is deleted from the original point cloud data set.
(2) Original three-dimensional point cloud down-sampling simplification
Considering that the original point cloud is large in quantity and high in density, which seriously affects the point cloud segmentation rate and thus the practicability of the segmentation method, the invention uses voxel grid filtering to down-sample the 3D point cloud, as shown in the schematic of Fig. 4;
in Fig. 4 the solid points are the points before down-sampling: the point cloud is divided into cubic voxels using an octree, the center of gravity of each voxel is calculated to obtain the hollow points, and all points within a voxel are replaced by the center-of-gravity point.
In addition, considering that the ships scannable by the marine laser radar are tall, point cloud data higher than 0.5 m above the horizontal plane is filtered. This reduces the number of points without changing the geometric characteristics and lays a data foundation for improving point cloud segmentation speed.
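A minimal sketch of the voxel-grid down-sampling: points are binned into cubic voxels and each voxel is replaced by the center of gravity of its points. The 0.5 m voxel edge length is an illustrative assumption, and the octree of the method is replaced here by simple integer binning:

```python
import numpy as np

def voxel_downsample(points, voxel=0.5):
    """Replace all points inside each cubic voxel by their center of gravity."""
    points = np.asarray(points, dtype=float)
    keys = np.floor(points / voxel).astype(int)   # integer voxel index per point
    buckets = {}
    for key, p in zip(map(tuple, keys), points):
        buckets.setdefault(key, []).append(p)
    return np.array([np.mean(ps, axis=0) for ps in buckets.values()])
```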
(3) Two-dimensional representation of a three-dimensional point cloud
In order to facilitate subsequent calculation, the three-dimensional vector coordinates are converted into three-dimensional rectangular coordinates; taking θ as the angle in the horizontal plane and ψ as the elevation angle, the conversion formula is:
x = r·cosψ·cosθ, y = r·cosψ·sinθ, z = r·sinψ
In the above formula, (x, y, z) are the coordinates of each point of the point cloud data in the three-dimensional rectangular coordinate system.
In order to improve the traditional region growing method, the three-dimensional point cloud data is stored in the form of a two-dimensional array, which facilitates subsequent data access and extraction. In the stored two-dimensional array, the x-axis coordinate is the number of the laser beam in the horizontal direction, the y-axis coordinate is the number of the laser beam in the vertical direction, and r_xy is the distance at which the laser beam scans the target.
Because the amount of point cloud data is large, storing the original data acquired by the laser radar in a two-dimensional array makes subsequent data access and extraction convenient; the expansion of the point cloud into two-dimensional array form is shown in Fig. 5.
In Fig. 5, the x-axis coordinate is the number of the laser beam in the horizontal direction, the y-axis coordinate is the number of the laser beam in the vertical direction, and r_xy is the distance from the laser beam to the target it encounters. Let the angular resolution of the laser radar in the horizontal direction be α_h and in the vertical direction be α_v. The relationship between a two-dimensional array coordinate (x, y), the distance data r_xy stored at that position, and the corresponding original three-dimensional point cloud vector coordinates (r, θ, ψ) can be represented by:
x = θ/α_h, y = ψ/α_v, r_xy = r
The invention takes (x, y, r_xy) as the preprocessed point cloud data and introduces the subsequent processing methods on this basis.
3. Point cloud preliminary segmentation based on variable neighborhood dispersion search strategy
On the basis of the eight-neighborhood of the traditional region growing method, the invention proposes a variable neighborhood dispersed search strategy, whose implementation is shown in Fig. 6: the neighborhood searched in the traditional region growing method is the eight-neighborhood, e.g. grid region 1; the invention proposes a variable neighborhood, e.g. grid region 2, in which the traditional eight-neighborhood is expanded outwards by a further ring to form a larger neighborhood, so that parts A and C are connected and classified into one class, suppressing over-segmentation to a certain extent.
In the above manner, for a target vessel point cloud at a relatively short distance, where the point cloud density is relatively high, over-segmentation cannot be avoided if the number of expansion rings is small; for a target vessel point cloud at a longer distance, if the number of expansion rings is large, two adjacent vessels may be merged into one vessel, causing point cloud under-segmentation. In order to accommodate both short-range and long-range three-dimensional point clouds, the number of outward expansion rings is set dynamically: with ω the outward detection distance of the laser radar in the horizontal direction, h the outward detection distance in the vertical direction, and r the scanning distance of the seed point, the expansion cell numbers N_x_exd and N_y_exd in the horizontal and vertical directions are calculated as:
N_x_exd = ⌈ω/(r·α_h)⌉, N_y_exd = ⌈h/(r·α_v)⌉
Neighborhood expansion lengthens the point cloud segmentation time; in particular, for ships close to the laser radar, the corresponding three-dimensional point cloud is large in quantity and high in density, so the neighborhood range to be traversed is larger. In order to determine whether point cloud cells slightly farther from the selected seed belong to the same set as the seed, the traversal range is simplified, namely a non-search range is appropriately selected within the variable neighborhood range, as shown in Fig. 7.
With reference to Fig. 7, after expanding outwards from the eight-neighborhood of the selected seed, a non-search range of width W_Unser in the horizontal direction and height H_Unser in the vertical direction is defined between the eight-neighborhood and the outermost ring.
In this way, no point cloud search is carried out in the band between the eight-neighborhood of the selected seed and the outermost ring, which increases point cloud segmentation speed compared with the traditional region growing method: the eight-neighborhood guarantees the continuity of growing, while each cell of the outermost ring checks whether more distant cells belong to the same class as the seed, solving the over-segmentation problem caused by overly dense point clouds.
On the basis of the variable neighborhood dispersed search strategy, the preliminary point cloud segmentation step is as follows:
step 1: sequentially scan the point cloud data, find the 1st data point without an attributed set as a seed, and record this point as S(x0, y0);
step 2: with S(x0, y0) as the center, determine the neighborhood and the non-search range according to the coordinate position of the seed point; for a point S_new(x*, y*) within the effective search range, calculate the spatial distance d_S between S_new(x*, y*) and S(x0, y0) as:
d_S = √((x* − x0)² + (y* − y0)² + (z* − z0)²)
In the above formula, (x*, y*, z*) are the spatial coordinates of the point and (x0, y0, z0) those of the seed. The invention sets the judgment distance to 5 m; when d_S < 5, the region growing criterion is satisfied, meaning S_new(x*, y*) and S(x0, y0) belong to the same set, and S_new(x*, y*) is pushed onto a stack;
step 3: take a point out of the stack, use it as the new seed point, repeat step 2, and continue searching for points belonging to the same set;
step 4: when the stack is empty, return to step 1;
step 5: repeat step 1 to step 4 until every point in the three-dimensional point cloud has an attributed set, at which point region growing is finished.
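The stack-based growing loop of steps 1 to 5 can be sketched as follows; `cloud` maps a two-dimensional array cell (x, y) to the spatial coordinates of its point, and for brevity a plain eight-neighborhood stands in for the variable neighborhood and non-search range of the method:

```python
import math

def region_grow(cloud, d_max=5.0):
    """Label every cell; neighbors closer than d_max in space share a set."""
    labels, next_label = {}, 0
    for seed in cloud:                    # step 1: first unlabeled point
        if seed in labels:
            continue
        labels[seed] = next_label
        stack = [seed]
        while stack:                      # steps 2-4
            sx, sy = stack.pop()
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    n = (sx + dx, sy + dy)
                    if n == (sx, sy) or n not in cloud or n in labels:
                        continue
                    if math.dist(cloud[(sx, sy)], cloud[n]) < d_max:
                        labels[n] = next_label    # growing criterion met
                        stack.append(n)
        next_label += 1                   # step 5: start the next set
    return labels
```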
4. Point cloud re-segmentation based on preliminary segmentation envelope diffusion strategy
After obtaining the preliminary segmentation result of the three-dimensional point cloud, the invention further considers the under-segmentation that may still occur in the preliminary result, namely that when two ships occlude each other, one ship is easily split into two parts by the other. The invention therefore proposes a preliminary segmentation envelope diffusion strategy, whose principle is shown in Fig. 8.
With reference to Fig. 8, suppose the improved region growing method described above yields point cloud sets A, B and C. Taking set A as an example, extract its segmentation envelope boundary and expand it outwards by the distance σ along the normal direction: calculate the center of gravity O of set A, select any boundary point N of set A, and calculate the distance d_ON between the two points; the expansion distances σ_xec in the horizontal direction and σ_yec in the vertical direction are then determined from σ, O, N and d_ON.
Then compare whether the expanded range intersects the other sets: if there is an intersection, the two sets can be fused into one set; if there is no intersection, they remain two different sets. Through the above steps, point cloud under-segmentation can be further effectively suppressed. As shown in Fig. 8, detecting the relationship between the envelope diffusion boundary of set A and the other sets shows an intersection with set C and none with set B, so sets A and C represent the same ship while set B represents a ship on its own.
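The merge test can be sketched by approximating each set's envelope with its axis-aligned bounding box and expanding set A's box by (σ_xec, σ_yec) before checking for intersection; the bounding-box approximation is a simplification of the method's true envelope boundary:

```python
def expanded_box(points, sigma_x=0.0, sigma_y=0.0):
    """Axis-aligned bounding box of a 2D point set, grown by (sigma_x, sigma_y)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs) - sigma_x, min(ys) - sigma_y,
            max(xs) + sigma_x, max(ys) + sigma_y)

def boxes_intersect(a, b):
    """Overlap test for two (xmin, ymin, xmax, ymax) boxes."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def should_merge(set_a, set_b, sigma_x, sigma_y):
    """Fuse two sets when the diffused envelope of A touches B."""
    return boxes_intersect(expanded_box(set_a, sigma_x, sigma_y),
                           expanded_box(set_b))
```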
5. Point cloud segmentation result visualization
The invention adopts OpenGL three-dimensional scene display technology to display the point cloud segmentation result in a three-dimensional scene. The display uses white or black as the background, and different colors represent the individual sets after point cloud segmentation. In order to highlight and compare the segmentation result, a filtering function is added: when the point cloud set of a ship is smaller than a certain threshold, set here to 500 points, that set is not displayed. The visualized point cloud segmentation result is shown in Fig. 9, where the different sets are rendered in different gray scales representing different colors.
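The display-side filtering can be sketched as follows: each remaining set is paired with a color, and sets below the 500-point threshold are hidden. The palette is an illustrative assumption; the invention itself renders with OpenGL:

```python
PALETTE = ["red", "green", "blue", "yellow", "cyan", "magenta"]  # assumed colors

def displayable_sets(segments, min_points=500):
    """Drop sets below the display threshold and assign each a color."""
    kept = [s for s in segments if len(s) >= min_points]
    return [(PALETTE[i % len(PALETTE)], s) for i, s in enumerate(kept)]
```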
6. Comparison of test results between the method of the invention and the traditional method
In order to verify the effectiveness of the point cloud segmentation method of the invention, a comparison with the traditional region growing method was carried out; the results are as follows:
(1) comparison of processing results for over-segmentation cases
When the simulation working condition is that one large ship is close to the laser radar, the point cloud segmentation effect of the traditional area growing method is as shown in fig. 10, and it can be seen that the three-dimensional point cloud of the ship is divided and then consists of colors 1 to 4, and the three-dimensional point cloud is displayed as different gray scales in fig. 10, which illustrates that the traditional area growing method divides one ship into a plurality of ships, so that the obvious over-segmentation condition is caused.
Segmenting the same three-dimensional ship point cloud data with the method of the invention gives the result shown in fig. 11. Although the ship still consists of several small blocks with gaps between them, all of the blocks are red (displayed in the same gray scale in fig. 11); that is, the segmentation algorithm assigns them to one class and no over-segmentation occurs.
(2) Comparison of processing results for under-segmentation cases
In the simulated working condition of two ships occluding each other, the point cloud segmentation effect of the traditional region growing method is shown in fig. 12. The point clouds of the two ships are segmented into colors 1 to 3, i.e. three different colors, shown as three different gray scales in fig. 12; in other words, the traditional method divides the scene into three ships, which is incorrect.
Using the method of the invention to segment the point clouds of the two ships under the same working condition gives the result shown in fig. 13. The segmentation result contains only color 1 and color 2, shown as two different gray scales in fig. 13, which demonstrates that the point cloud segmentation is completed accurately and under-segmentation is effectively suppressed.
(3) Segmentation time result comparison
A sample containing 400,000 points is selected from the three-dimensional point cloud data. Measured running times show that the traditional method takes about 10 seconds to complete the segmentation calculation, while the point cloud segmentation method of the invention completes it in about 1.0 second, i.e. 1/10 of the traditional segmentation time. The method therefore markedly shortens the point cloud segmentation time and runs much faster.
Claims (3)
1. A laser radar three-dimensional point cloud segmentation method is characterized by comprising the following steps:
step 1: acquiring laser radar three-dimensional point cloud data:
acquiring original three-dimensional point cloud data through a laser radar, wherein the point cloud data is represented by the triple (r, θ, ψ), where r represents the distance of a scanning point, θ represents the horizontal direction angle, and ψ represents the vertical direction angle;
step 2: preprocessing the original three-dimensional point cloud, comprising the following steps:
step 2.1: denoising the original three-dimensional point cloud: the average distance R_ave from each point to all points in its neighborhood is calculated; assuming the resulting distances follow a Gaussian distribution with a given mean and standard deviation, an outlier decision boundary radius R_set is set and R_ave is compared with R_set: when R_ave ≤ R_set, the point is kept; when R_ave > R_set, the point is judged to be an outlier and is deleted from the original point cloud data set;
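Step 2.1 can be sketched as below. This is a hedged illustration rather than the patented implementation: the neighborhood is taken as the k nearest neighbours (the value of k is an assumption), and the boundary radius R_set is derived from the Gaussian assumption as mean + std_ratio·std of the average distances, a common choice that the text does not fix; a brute-force distance matrix is used for clarity.

```python
import numpy as np

def remove_outliers(points, k=8, std_ratio=2.0):
    """Statistical denoising sketch for step 2.1.

    R_ave: average distance from each point to its k nearest neighbours.
    R_set: decision boundary radius, here mean + std_ratio * std of all
    R_ave values (an assumed concrete form of the Gaussian criterion).
    Points with R_ave > R_set are treated as outliers and removed.
    """
    pts = np.asarray(points, dtype=float)
    # Pairwise distance matrix: O(n^2), fine for a small illustration.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    d_knn = np.sort(d, axis=1)[:, 1:k + 1]   # drop the zero self-distance
    r_ave = d_knn.mean(axis=1)
    r_set = r_ave.mean() + std_ratio * r_ave.std()
    return pts[r_ave <= r_set]
```

For example, adding a single point far from a compact cluster yields a large R_ave for that point, so it is deleted while the cluster survives.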
step 2.2: simplifying down-sampling of an original three-dimensional point cloud:
utilizing voxel grid filtering to carry out down-sampling on the point cloud, specifically comprising the following steps: dividing the point cloud into cubic voxels by using an octree, calculating the gravity centers of all points in the cubic voxels, and replacing all point clouds in the voxels by using the gravity center points;
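Step 2.2 can be illustrated as follows. A hash map over integer voxel indices stands in for the octree partition mentioned in the text (an implementation convenience of this sketch, not the patent's structure); each voxel's points are replaced by their centre of gravity.

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Voxel-grid down-sampling sketch for step 2.2.

    Points are binned into cubic voxels of edge length voxel_size; all
    points inside a voxel are replaced by their centre of gravity.
    """
    pts = np.asarray(points, dtype=float)
    keys = np.floor(pts / voxel_size).astype(int)
    buckets = {}
    for key, p in zip(map(tuple, keys), pts):
        buckets.setdefault(key, []).append(p)
    return np.array([np.mean(v, axis=0) for v in buckets.values()])

# The first two points share voxel (0, 0, 0) and are merged into one
# centroid; the third point occupies its own voxel.
cloud = [[0.1, 0.1, 0.1], [0.2, 0.2, 0.2], [1.5, 1.5, 1.5]]
reduced = voxel_downsample(cloud, 1.0)   # 3 points -> 2 points
```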
step 2.3: two-dimensional representation of a three-dimensional point cloud:
converting the three-dimensional vector coordinate into a three-dimensional rectangular coordinate, wherein the conversion formula is as follows:
wherein (x, y, z) in the above formula is the coordinate of each point in the point cloud data under the three-dimensional rectangular coordinate system;
the relationship between the position of the two-dimensional array coordinate (x, y), the distance data r_xy stored at that position, and the corresponding original three-dimensional point cloud vector coordinates (r, θ, ψ) can be represented by the following formula:
(x, y, r_xy) is the preprocessed point cloud data, where the formula involves the horizontal angular resolution and the vertical angular resolution of the laser radar;
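The two conversions of step 2.3 can be sketched as follows. The patent's conversion formulas are rendered as images in the source and do not survive in this text, so the standard spherical convention below is an assumption, as is rounding to the nearest grid cell:

```python
import numpy as np

def to_cartesian(r, theta, psi):
    """Spherical-to-rectangular conversion for a scan point (r, theta, psi).

    theta is the horizontal direction angle, psi the vertical direction
    angle; the axis convention here is assumed, since the patent's
    formula is not reproduced in this text.
    """
    x = r * np.cos(psi) * np.cos(theta)
    y = r * np.cos(psi) * np.sin(theta)
    z = r * np.sin(psi)
    return x, y, z

def to_grid_index(theta, psi, h_res, v_res):
    """Map the angles to 2-D array coordinates (x, y).

    h_res and v_res are the lidar's horizontal and vertical angular
    resolutions; the range r of the point is then stored at cell (x, y)
    as r_xy.
    """
    return int(round(theta / h_res)), int(round(psi / v_res))
```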
step 3: preliminary point cloud segmentation based on a variable neighborhood dispersed search strategy:
the variable neighborhood decentralized searching strategy specifically comprises the following steps:
the eight-neighborhood is further expanded outward to form a larger neighborhood, with a dynamic number of expansion rings; with the outward detection distance of the laser radar in the horizontal direction set to ω and the outward detection distance in the vertical direction set to h, the numbers of expansion cells in the horizontal and vertical directions, N_x_exd and N_y_exd, satisfy:
the traversal range is simplified, namely, a non-search range is appropriately selected within the variable neighborhood range;
when the eight-neighborhood of the selected seed is expanded, the width W_Unser and the height H_Unser of the non-search range in the horizontal and vertical directions after the neighborhood change satisfy:
according to the above formula, no point cloud search is carried out in the region between the eight-neighborhood of the selected seed and the outer layer;
performing preliminary point cloud segmentation according to the variable neighborhood dispersed search strategy;
step 4: re-segmenting the point cloud based on a preliminary-segmentation envelope diffusion strategy:
a point cloud set is obtained by the region growing method of step 3; the segmentation envelope boundary of the set is extracted and expanded outward by a distance σ along the normal direction; the center of gravity O of the set is calculated, a boundary point N is selected, and the distance d_ON between the two points is computed; the expansion distances in the horizontal and vertical directions, σ_xec and σ_yec, then satisfy:
and comparing whether the expanded range intersects any other set: if there is an intersection, the two sets can be fused into one set; if not, they remain two different sets.
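The envelope-diffusion merge of step 4 can be sketched on the 2-D grid as follows. Axis-aligned bounding boxes stand in for the segmentation envelope, and the expansion distances σ_xec and σ_yec are passed in directly rather than derived from O, N and d_ON; both simplifications are assumptions made for brevity.

```python
import numpy as np

def merge_by_envelope(point_sets, sigma_x, sigma_y):
    """Fuse preliminary segments whose expanded envelopes intersect.

    Each set's envelope is approximated by its axis-aligned bounding box,
    expanded by (sigma_x, sigma_y). Intersecting sets are fused with a
    small union-find; returns one representative label per input set.
    """
    boxes = [(np.min(p, axis=0), np.max(p, axis=0)) for p in point_sets]
    parent = list(range(len(point_sets)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    ext = np.array([sigma_x, sigma_y])
    for i, (lo_i, hi_i) in enumerate(boxes):
        lo_e, hi_e = lo_i - ext, hi_i + ext
        for j, (lo_j, hi_j) in enumerate(boxes):
            # Expanded box of set i intersects box of set j -> fuse.
            if i != j and np.all(lo_e <= hi_j) and np.all(lo_j <= hi_e):
                parent[find(i)] = find(j)
    return [find(i) for i in range(len(point_sets))]

# Set A's expanded envelope reaches set C, so they are fused into one
# ship; set B stays separate (compare the discussion of fig. 8).
A = np.array([[0.0, 0.0], [1.0, 1.0]])
B = np.array([[10.0, 10.0], [11.0, 11.0]])
C = np.array([[1.5, 0.0], [2.5, 1.0]])
roots = merge_by_envelope([A, B, C], 1.0, 1.0)
```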
2. The lidar three-dimensional point cloud segmentation method according to claim 1, wherein: visualizing the point cloud segmentation result, specifically: and displaying the point cloud segmentation result in a three-dimensional scene by adopting an OpenGL three-dimensional scene display technology.
3. The lidar three-dimensional point cloud segmentation method according to claim 1, wherein: step 3, the preliminary point cloud segmentation specifically comprises the following steps:
step 3.1: sequentially scanning the point cloud data, finding the first data point not yet assigned to a set as a seed, and recording this point as S(x0, y0);
step 3.2: with S(x0, y0) as the center, determining the neighborhood and the non-search range according to the coordinate position of the seed point; for a point S_new(x*, y*) within the effective search range, the spatial distance d_S between S_new(x*, y*) and S(x0, y0) is calculated as follows:
in the above formula, (x*, y*, z*) are the spatial coordinates of the point; the judgment distance is set to d meters; when d_S < d, the region growing criterion is satisfied, indicating that S_new(x*, y*) and S(x0, y0) belong to the same set, and S_new(x*, y*) is pushed onto a stack;
step 3.3: popping a point from the stack, taking it as a new seed point, repeating step 3.2, and continuing to search for points belonging to the same set;
step 3.4: when the stack is empty, returning to the step 3.1;
step 3.5: repeating steps 3.1 to 3.4 until every point in the three-dimensional point cloud has been assigned to a set, at which point the region growing is finished.
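Steps 3.1 to 3.5 can be sketched as stack-based region growing over the 2-D grid. For brevity the search window here is the plain eight-neighborhood; the variable-neighborhood expansion and non-search range of step 3.2 are omitted, and the grid is modelled as a dict from cell (x, y) to the point's 3-D coordinates (both assumptions of this sketch).

```python
import numpy as np

def region_grow(grid, d):
    """Stack-based region growing (steps 3.1 to 3.5).

    grid: dict mapping occupied cells (x, y) -> 3-D point coordinates.
    d:    judgment distance in meters; d_S < d is the growing criterion.
    Returns a dict mapping each cell to the label of its set.
    """
    labels = {}
    next_label = 0
    for cell in grid:                       # step 3.1: next unassigned seed
        if cell in labels:
            continue
        labels[cell] = next_label
        stack = [cell]
        while stack:                        # steps 3.2 to 3.4
            cx, cy = stack.pop()
            for nx in (cx - 1, cx, cx + 1):
                for ny in (cy - 1, cy, cy + 1):
                    n = (nx, ny)
                    if n in grid and n not in labels:
                        d_s = np.linalg.norm(np.subtract(grid[n], grid[(cx, cy)]))
                        if d_s < d:         # same set: label and push
                            labels[n] = next_label
                            stack.append(n)
        next_label += 1                     # step 3.5: start a new set
    return labels

# Two nearby cells join one set; the distant cell forms its own set.
grid = {(0, 0): (0.0, 0.0, 0.0), (0, 1): (0.0, 0.5, 0.0), (5, 5): (10.0, 10.0, 0.0)}
result = region_grow(grid, 1.0)
```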
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911082345.5A CN110969624B (en) | 2019-11-07 | 2019-11-07 | Laser radar three-dimensional point cloud segmentation method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911082345.5A CN110969624B (en) | 2019-11-07 | 2019-11-07 | Laser radar three-dimensional point cloud segmentation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110969624A true CN110969624A (en) | 2020-04-07 |
CN110969624B CN110969624B (en) | 2023-08-01 |
Family
ID=70030357
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911082345.5A Active CN110969624B (en) | 2019-11-07 | 2019-11-07 | Laser radar three-dimensional point cloud segmentation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110969624B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106056614A (en) * | 2016-06-03 | 2016-10-26 | 武汉大学 | Building segmentation and contour line extraction method of ground laser point cloud data |
US20190197311A1 (en) * | 2017-12-26 | 2019-06-27 | Harbin Institute Of Technology | Evaluation Method of Solar Energy Utilization Potential in Urban High-density Areas Based on Low-altitude Photogrammetry |
CN109961440A (en) * | 2019-03-11 | 2019-07-02 | 重庆邮电大学 | A kind of three-dimensional laser radar point cloud Target Segmentation method based on depth map |
- 2019-11-07 CN CN201911082345.5A patent/CN110969624B/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106056614A (en) * | 2016-06-03 | 2016-10-26 | 武汉大学 | Building segmentation and contour line extraction method of ground laser point cloud data |
US20190197311A1 (en) * | 2017-12-26 | 2019-06-27 | Harbin Institute Of Technology | Evaluation Method of Solar Energy Utilization Potential in Urban High-density Areas Based on Low-altitude Photogrammetry |
CN109961440A (en) * | 2019-03-11 | 2019-07-02 | 重庆邮电大学 | A kind of three-dimensional laser radar point cloud Target Segmentation method based on depth map |
Non-Patent Citations (1)
Title |
---|
Sheng Zhongbiao; Han Huiyan: "Research and Implementation of Scattered Point Cloud Segmentation Technology" (散乱点云分割技术研究与实现) *
Cited By (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111507919B (en) * | 2020-04-16 | 2023-07-14 | 北京深测科技有限公司 | Denoising processing method for three-dimensional point cloud data |
CN111507919A (en) * | 2020-04-16 | 2020-08-07 | 北京深测科技有限公司 | Denoising processing method for three-dimensional point cloud data |
CN111859612A (en) * | 2020-06-08 | 2020-10-30 | 北京经纬恒润科技有限公司 | Laser radar simulation method and device |
CN111724323B (en) * | 2020-06-19 | 2024-01-26 | 武汉海达数云技术有限公司 | Method and device for removing sunlight noise of laser radar point cloud |
CN111724323A (en) * | 2020-06-19 | 2020-09-29 | 武汉海达数云技术有限公司 | Laser radar point cloud sunlight noise removing method and device |
CN111722201A (en) * | 2020-06-23 | 2020-09-29 | 常州市贝叶斯智能科技有限公司 | Data denoising method for indoor robot laser radar |
CN111951197A (en) * | 2020-08-14 | 2020-11-17 | 中国科学院自动化研究所苏州研究院 | Point cloud segmentation method based on structured light |
CN111951197B (en) * | 2020-08-14 | 2023-10-20 | 中国科学院自动化研究所苏州研究院 | Point cloud segmentation method based on structured light |
CN111929657A (en) * | 2020-08-26 | 2020-11-13 | 北京布科思科技有限公司 | Laser radar noise filtering method, device and equipment |
CN111929657B (en) * | 2020-08-26 | 2023-09-19 | 北京布科思科技有限公司 | Noise filtering method, device and equipment for laser radar |
CN112199991B (en) * | 2020-08-27 | 2024-04-30 | 广州中国科学院软件应用技术研究所 | Simulation point cloud filtering method and system applied to vehicle-road cooperation road side perception |
CN112199991A (en) * | 2020-08-27 | 2021-01-08 | 广州中国科学院软件应用技术研究所 | Simulation point cloud filtering method and system applied to vehicle-road cooperative roadside sensing |
CN112184867A (en) * | 2020-09-23 | 2021-01-05 | 中国第一汽车股份有限公司 | Point cloud feature extraction method, device, equipment and storage medium |
WO2022061604A1 (en) * | 2020-09-23 | 2022-03-31 | 株式会社Ntt都科摩 | Electronic device and signal envelop obtaining method |
CN112465829A (en) * | 2020-10-26 | 2021-03-09 | 南京理工大学 | Interactive point cloud segmentation method based on feedback control |
CN112465829B (en) * | 2020-10-26 | 2022-09-27 | 南京理工大学 | Interactive point cloud segmentation method based on feedback control |
CN112560747A (en) * | 2020-12-23 | 2021-03-26 | 苏州工业园区测绘地理信息有限公司 | Vehicle-mounted point cloud data-based lane boundary interactive extraction method |
CN112991435B (en) * | 2021-02-09 | 2023-09-15 | 中国农业大学 | Orchard end-of-line and head-of-line identification method based on 3D LiDAR |
CN112991435A (en) * | 2021-02-09 | 2021-06-18 | 中国农业大学 | Orchard end-of-row and head-of-row identification method based on 3D LiDAR |
CN113011113B (en) * | 2021-03-23 | 2022-04-05 | 中国空气动力研究与发展中心计算空气动力研究所 | Method for rapidly searching discrete point cloud repetition points |
CN113011113A (en) * | 2021-03-23 | 2021-06-22 | 中国空气动力研究与发展中心计算空气动力研究所 | Method for rapidly searching discrete point cloud repetition points |
CN113034682A (en) * | 2021-04-13 | 2021-06-25 | 阿波罗智联(北京)科技有限公司 | Point cloud data processing method, device, equipment and storage medium |
EP4036861A3 (en) * | 2021-04-13 | 2022-10-19 | Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. | Method and apparatus for processing point cloud data, electronic device, storage medium, computer program product |
CN113177477A (en) * | 2021-04-29 | 2021-07-27 | 湖南大学 | Target detection and identification method based on three-dimensional point cloud analysis |
CN114049448B (en) * | 2021-11-16 | 2024-10-01 | 武汉中海庭数据技术有限公司 | POI matching and system based on area interpolation |
CN114049448A (en) * | 2021-11-16 | 2022-02-15 | 武汉中海庭数据技术有限公司 | POI matching and system based on area interpolation |
CN113866743A (en) * | 2021-12-06 | 2021-12-31 | 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) | Roadside laser point cloud simplification method and system for cooperative vehicle and road sensing |
CN114111628A (en) * | 2021-12-07 | 2022-03-01 | 西安理工大学 | Three-dimensional reconstruction algorithm for underwater target laser point cloud data |
CN114820986A (en) * | 2022-05-13 | 2022-07-29 | 广西微车检智能科技有限公司 | Trailer outline parameter measuring method based on laser radar |
CN114820986B (en) * | 2022-05-13 | 2024-04-09 | 广西微车检智能科技有限公司 | Laser radar-based trailer outline parameter measurement method |
CN116012613A (en) * | 2023-01-04 | 2023-04-25 | 北京数字绿土科技股份有限公司 | Method and system for measuring and calculating earthwork variation of strip mine based on laser point cloud |
CN116012613B (en) * | 2023-01-04 | 2024-01-16 | 北京数字绿土科技股份有限公司 | Method and system for measuring and calculating earthwork variation of strip mine based on laser point cloud |
CN116168386A (en) * | 2023-03-06 | 2023-05-26 | 东南大学 | Bridge construction progress identification method based on laser radar scanning |
CN116523414B (en) * | 2023-06-29 | 2023-09-05 | 深圳市鑫冠亚科技有限公司 | Production management method and system for composite nickel-copper heat dissipation bottom plate |
CN116523414A (en) * | 2023-06-29 | 2023-08-01 | 深圳市鑫冠亚科技有限公司 | Production management method and system for composite nickel-copper heat dissipation bottom plate |
CN116797704B (en) * | 2023-08-24 | 2024-01-23 | 山东云海国创云计算装备产业创新中心有限公司 | Point cloud data processing method, system, device, electronic equipment and storage medium |
CN116797704A (en) * | 2023-08-24 | 2023-09-22 | 山东云海国创云计算装备产业创新中心有限公司 | Point cloud data processing method, system, device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN110969624B (en) | 2023-08-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110969624B (en) | Laser radar three-dimensional point cloud segmentation method | |
Cheng et al. | 3D building model reconstruction from multi-view aerial imagery and lidar data | |
Gross et al. | Extraction of lines from laser point clouds | |
CN110443201B (en) | Target identification method based on multi-source image joint shape analysis and multi-attribute fusion | |
CN107451982A (en) | A kind of high canopy density standing forest tree crown area acquisition methods based on unmanned plane image | |
CN108074232B (en) | Voxel segmentation-based airborne LIDAR building detection method | |
CN111047698B (en) | Real projection image acquisition method | |
Cheng et al. | Building boundary extraction from high resolution imagery and lidar data | |
CN111784840B (en) | LOD (line-of-sight) level three-dimensional data singulation method and system based on vector data automatic segmentation | |
Ma | Building model reconstruction from LiDAR data and aerial photographs | |
CN112099046B (en) | Airborne LIDAR three-dimensional plane detection method based on multi-value voxel model | |
CN114926739B (en) | Unmanned collaborative acquisition processing method for geographical space information on water and under water of inland waterway | |
CN111325138A (en) | Road boundary real-time detection method based on point cloud local concave-convex characteristics | |
CN116433672B (en) | Silicon wafer surface quality detection method based on image processing | |
CN102201125A (en) | Method for visualizing three-dimensional imaging sonar data | |
CN111932669A (en) | Deformation monitoring method based on slope rock mass characteristic object | |
CN113177593A (en) | Fusion method of radar point cloud and image data in water traffic environment | |
CN111127622A (en) | Three-dimensional point cloud outlier rejection method based on image segmentation | |
Kim et al. | Tree and building detection in dense urban environments using automated processing of IKONOS image and LiDAR data | |
CN107993242B (en) | Method for extracting boundary of missing area based on airborne LiDAR point cloud data | |
CN113807238A (en) | Visual measurement method for area of river surface floater | |
US11348261B2 (en) | Method for processing three-dimensional point cloud data | |
Omidalizarandi et al. | Segmentation and classification of point clouds from dense aerial image matching | |
Ni et al. | Applications of 3d-edge detection for als point cloud | |
CN109118565B (en) | Electric power corridor three-dimensional model texture mapping method considering shielding of pole tower power line |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||