CN117974741A - 360-Degree point cloud depth zone triangulation composition method, device and system - Google Patents

Publication number: CN117974741A (granted as CN117974741B)
Authority: CN (China)
Application number: CN202410381411.3A
Other languages: Chinese (zh)
Inventors: 张旭东, 樊杰, 邹渊, 李圆圆
Assignee (original and current): Yangtze River Delta Research Institute Of Beijing University Of Technology Jiaxing
Legal status: Active (application granted)
Classification: Image Processing (AREA)

Abstract

The invention discloses a 360-degree point cloud depth zone triangulation composition method, device and system, and belongs to the field of point cloud composition. A 3D original point cloud of a traffic scene is projected onto a cylindrical surface; the adjacency relations of the projected point cloud in the neighborhood of the azimuth angle π/−π are unchanged, the point cloud remains spatially continuous, and the element-changing operation during coordinate transformation alters neither the adjacency relations nor the spatial continuity. Because all depth zone points lie on the same curved surface, the generated normals are perpendicular to that surface, ensuring continuity across the π/−π azimuth.

Description

360-Degree point cloud depth zone triangulation composition method, device and system
Technical Field
The invention relates to the field of point cloud composition, in particular to a 360-degree point cloud depth zone triangulation composition method, device and system.
Background
When a 3D point cloud is represented in a picture-like manner for deep learning, the 3D original point cloud is projected onto a 2D depth-plane space and 2D Delaunay triangulation is then performed (the effect is shown in fig. 3) to obtain a composition (graph) of the point cloud. Fig. 3 shows the long edges, the π/−π azimuth plane, the spatial discontinuity, and a car split into two pieces. As seen in fig. 3, the triangulation graph obtained this way is discontinuous at the azimuth angle π/−π; if such a composition is used for graph neural network tasks such as classification, positioning and 3D target detection, information is lost and the results are inaccurate.
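The discontinuity described above can be illustrated numerically. The following is a minimal sketch (the coordinates are illustrative, not from the patent): two points that are close neighbors in 3D, sitting just on either side of the azimuth π/−π seam, end up at opposite ends of the unwrapped 2D projection plane, so a 2D Delaunay triangulation will not connect them.

```python
import numpy as np

# Two 3D points on the same object, just either side of the azimuth pi/-pi seam.
p1 = np.array([-10.0, 0.1, 0.0])   # azimuth just below +pi
p2 = np.array([-10.0, -0.1, 0.0])  # azimuth just above -pi

# In 3D the points are close neighbors.
d3d = np.linalg.norm(p1 - p2)      # 0.2

# On the unwrapped 2D (azimuth, elevation) plane they land at opposite ends,
# so 2D Delaunay triangulation never joins them.
az1 = np.arctan2(p1[1], p1[0])     # close to +pi
az2 = np.arctan2(p2[1], p2[0])     # close to -pi
d2d = abs(az1 - az2)               # close to 2*pi
print(d3d, d2d)
```

This is exactly the gap the cylindrical depth-band projection closes, since the cylinder has no seam.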
Disclosure of Invention
The invention aims to provide a 360-degree point cloud depth band triangulation composition method, device and system, which can solve the problem of discontinuity of 2D triangulation when the azimuth angle is pi/-pi.
In order to achieve the above object, the present invention provides the following.
A 360-degree point cloud depth zone triangulation patterning method, comprising: projecting a 3D original point cloud of a traffic scene onto a cylindrical surface in a projection mode, the projection mode being the intersection of the cylindrical surface with the ray from its origin toward each point of the 3D original point cloud; changing elements of the projection points on the cylindrical surface to obtain a depth band point cloud; determining a normal of each point in the depth band point cloud; performing Delaunay triangulation on the depth band point cloud according to the normals of its points to obtain a triangulation graph of the depth band point cloud; back-projecting the triangulation graph to the 3D original point cloud to obtain a primary graph of the traffic scene; and performing graph modification on the primary graph to obtain a final composition of the traffic scene.
Optionally, the cylindrical surface uses the z-axis as the rotation axis.
Optionally, changing elements of the projection points on the cylindrical surface to obtain the depth band point cloud specifically comprises: replacing the z-axis coordinate of each point of the 3D original point cloud with $\varphi/\sigma_\varphi$ and using the resulting coordinates $(x', y', z')$ to represent the projection points on the cylindrical surface; wherein $\varphi$ is the elevation angle, $\varphi = \arctan\!\bigl(z/\sqrt{x^2+y^2}\bigr)$, and $(x, y, z)$ are the three-dimensional coordinates of one point in the 3D original point cloud, $x$, $y$ and $z$ respectively representing its x-axis, y-axis and z-axis coordinates; determining the coordinates of each projection point according to the formula $(x', y') = s\,(x, y)$; wherein $(x', y', z')$ are the coordinates of a point in the depth band point cloud, $(x, y)$ the coordinates of the point in the 3D original point cloud on the xy plane, $s$ the scaling factor, $s = r_u/r_{xy}$, with $r_{xy} = \sqrt{x^2+y^2}$ the radius of the point on the xy plane and $r_u$ the unit radius, $r_u = 1/\sigma_\theta$; $\sigma_\theta$ represents the equivalent azimuth resolution of the point cloud sensor and $\sigma_\varphi$ its equivalent elevation resolution, $\sigma_i = \mathrm{FOV}_i/N_i$, where $i$ denotes $\theta$ or $\varphi$; $\mathrm{FOV}_i$ represents the horizontal or vertical field-of-view range of the point cloud sensor; and $N_i$ represents the equivalent number of laser beams of the point cloud sensor in azimuth or elevation.
Optionally, determining a normal of each point in the depth band point cloud specifically includes: searching nearest neighbors of each point in the depth band point cloud using a k-d tree of mixed k-NN and RNN; determining the normal line of each point in the depth band point cloud by adopting a principal component analysis method based on the nearest neighbor point of each point in the depth band point cloud; the directions of all normal lines are the directions pointing to the origin direction or the opposite directions pointing to the origin direction.
Optionally, determining a normal of each point in the depth band point cloud specifically includes: directly prescribing the normal of each point in the depth band point cloud as $(x', y', 0)$ or $-(x', y', 0)$; wherein $(x', y')$ are the coordinates of the point of the depth band point cloud in the xy plane.
Optionally, back-projecting the triangulation graph onto the 3D original point cloud to obtain the primary graph of the traffic scene specifically includes: changing $V_t$ in the triangulation graph $G_t = (V_t, E)$ to $V$ to obtain the primary graph $G_1 = (V, E)$; wherein $V_t$ is the node set of the triangulation graph, $E$ the edge set of the triangulation graph, and $V$ the node set of the primary graph, $V = \{(x, y, z, F)\}$, where $(x, y, z)$ are the coordinates of a point of the 3D original point cloud in the 3D Cartesian coordinate system and $F$ denotes the features of the point other than $(x, y, z)$.
Optionally, performing graph modification on the primary graph to obtain the final composition of the traffic scene specifically includes: if long edges exist in the primary graph, removing them with the formula $m_k = \mathbb{1}\bigl(|e_k| \le d_{\max}\bigr)$ to obtain the edge set with long edges removed, $E_1 = \{e_k \mid m_k = 1,\ k = 1, \dots, K\}$; a long edge is an edge of the primary graph whose length exceeds a length threshold; $m_k$ is the edge mask, $d_{\max}$ the upper distance threshold, a positive real number, $E$ the edge set of the triangulation graph and $e_k$ one edge of that set; $E_1$ is the edge set after removing long edges, $k$ the subscript of an edge of the primary graph, $K$ the number of edges, $e_k$ the $k$-th edge of the primary graph and $m_k$ the mask indicating whether the $k$-th edge is long: $m_k = 0$ indicates the $k$-th edge of the primary graph is a long edge, $m_k = 1$ that it is not; constructing an undirected graph by adding reverse edges, the reverse edge set obtained being $E_2 = \{(v_j, v_i) \mid \forall (v_i, v_j) \in E_1\}$; wherein $E_2$ is the reverse edge set, $V$ the node set of the primary graph, $v_i$ and $v_j$ respectively the $i$-th and $j$-th nodes of the node set of the primary graph, and $\forall$ the universal quantifier; adding self-loops, the set of self-loop edges obtained being $E_3 = \{(v_i, v_i) \mid \forall v_i \in V\}$; wherein $E_3$ is the set of self-loop edges; determining the edge set $E'$ in the final composition of the traffic scene from the long-edge-removed edge set, the reverse edge set and the self-loop edge set by the formula $E' = E_1 \cup E_2 \cup E_3$; and obtaining the final composition $G' = (V, E')$ of the traffic scene from the edge set $E'$ and the node set $V$ of the primary graph.
A 360 degree point cloud depth zone triangulation patterning device, comprising: a point cloud sensor and a computer; the point cloud sensor is used for collecting a 3D original point cloud of a traffic scene and transmitting the 3D original point cloud to the computer; the computer is used for obtaining the final composition of the traffic scene by adopting the 360-degree point cloud depth zone triangulation composition method.
A 360-degree point cloud depth band triangulation patterning system, comprising: a projection module for projecting the 3D original point cloud of a traffic scene onto a cylindrical surface in a projection mode, the projection mode being the intersection of the cylindrical surface with the ray from its origin toward each point of the 3D original point cloud; a coordinate transformation module for changing elements of the projection points on the cylindrical surface to obtain the depth band point cloud; a normal determination module for determining the normal of each point in the depth band point cloud; a triangulation module for performing Delaunay triangulation on the depth band point cloud according to the normals of its points to obtain the triangulation graph of the depth band point cloud; a back projection module for back-projecting the triangulation graph onto the 3D original point cloud to obtain the primary graph of the traffic scene; and a graph modification module for performing graph modification on the primary graph to obtain the final composition of the traffic scene.
According to the specific embodiments provided by the invention, the following technical effects are disclosed.
According to the 360-degree point cloud depth band triangulation composition method, device and system, the 3D original point cloud of the traffic scene is projected onto a cylindrical surface; the adjacency relations of the projected point cloud in the neighborhood of the azimuth angle π/−π are unchanged, the point cloud remains spatially continuous, and the element-changing operation during coordinate transformation alters neither the adjacency relations nor the spatial continuity. Because all depth band points lie on the same curved surface, the generated normals are perpendicular to that surface, ensuring continuity across the π/−π azimuth.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings that are needed in the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a 360-degree point cloud depth band triangulation composition method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a 360-degree point cloud depth band triangulation composition method according to an embodiment of the present invention.
Fig. 3 is a 2D triangulation effect diagram according to an embodiment of the present invention.
Fig. 4 is a sub-flowchart of acquiring a depth band point cloud normal according to an embodiment of the present invention.
Fig. 5 is a 3D depth band spatial point cloud distribution diagram according to an embodiment of the present invention.
FIG. 6 is a schematic diagram of a modification sub-process according to an embodiment of the present invention.
Fig. 7 is a schematic structural diagram of a 360-degree point cloud depth band triangulation patterning device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
Example 1
As shown in fig. 1, the embodiment of the invention provides a 360-degree point cloud depth band triangulation composition method, comprising the following steps.
Step 1: projecting the 3D original point cloud of a traffic scene onto a cylindrical surface in a projection mode; the projection mode is the intersection of the cylindrical surface with the ray from its origin toward each point of the 3D original point cloud.
Referring to fig. 2, the specific implementation procedure of step 1 includes substep 1.1 and substep 1.2.
1.1: Input the 3D original point cloud. Let $P$ be the 3D original point cloud; then $P = \{p\}$, where $p = (x, y, z)$ are the coordinates of a point of the 3D original point cloud in the 3D Cartesian coordinate system. Here the point coordinates comprise only the three Cartesian components $(x, y, z)$ and exclude reflectivity $I$, color $(R, G, B)$, etc., because these do not affect the geometric projection. $x$, $y$ and $z$ respectively represent the x-axis, y-axis and z-axis coordinates of one point in the 3D original point cloud; $R$, $G$ and $B$ represent the red, green and blue color channels.
1.2: Depth zone projection. Project the 3D original point cloud onto a cylindrical surface whose rotation axis is the z axis; the projection of each point is the intersection of the cylindrical surface with the ray from the origin toward that point.
Step 2: and changing the elements of the projection points on the cylindrical surface to obtain the depth band point cloud.
The z-axis coordinate of the intersection point is replaced by $\varphi/\sigma_\varphi$; its x-axis and y-axis coordinates are unchanged. As shown in part (a) of fig. 5, the result is named a depth band. The depth band point cloud $P'$ can be represented by formula (1):

$$p' = (x', y', z') = \bigl(s\,x,\ s\,y,\ \varphi/\sigma_\varphi\bigr) \tag{1}$$

wherein $\varphi$ is the elevation angle, $\varphi = \arctan\!\bigl(z/\sqrt{x^2+y^2}\bigr)$, $(x, y, z)$ are the three-dimensional coordinates of one point in the 3D original point cloud, with $x$, $y$ and $z$ respectively its x-axis, y-axis and z-axis coordinates; $p'$ is a point in the depth band point cloud, $(x', y', z')$ its coordinates, with $x'$, $y'$ and $z'$ respectively its x-axis, y-axis and z-axis coordinates; and $(x, y)$ are the coordinates of the point of the 3D original point cloud on the xy plane.

In formula (1), $r_{xy} = \sqrt{x^2+y^2}$ is the radius of the point of the 3D original point cloud on the xy plane; $r_u = 1/\sigma_\theta$ is a unit radius chosen so that the Euclidean distance increment between left-right adjacent projection points on the xy plane is 1; $s = r_u/r_{xy}$ is the scaling factor ensuring that the point cloud falls on the cylindrical surface of unit radius $r_u$; and the normalization $z' = \varphi/\sigma_\varphi$ makes the increment between two adjacent scan lines along the z axis of the cylinder approximately 1. $\sigma_\varphi$ represents the equivalent elevation (angular) resolution of the point cloud sensor, and $\sigma_\theta$ the equivalent azimuth resolution of a point cloud sensor such as LiDAR; both can be obtained by formula (2):

$$\sigma_i = \mathrm{FOV}_i / N_i \tag{2}$$

wherein $i$ denotes $\theta$ or $\varphi$; $\mathrm{FOV}_i$ represents the horizontal or vertical field-of-view (FOV) range of the point cloud sensor; and $N_i$ represents the equivalent number of laser beams of the point cloud sensor in azimuth or elevation.
Step 2 corresponds to the "z-axis conversion element" in fig. 2.
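Steps 1 and 2 can be sketched together in a few lines of numpy. This is a minimal sketch assuming the formulas reconstructed above ($z' = \varphi/\sigma_\varphi$, $s = r_u/r_{xy}$, $r_u = 1/\sigma_\theta$, $\sigma_i = \mathrm{FOV}_i/N_i$); the sensor parameters (field of view, beam counts) are illustrative assumptions, not values from the patent.

```python
import numpy as np

def depth_band_projection(points, fov_theta=2 * np.pi, n_theta=2048,
                          fov_phi=np.radians(40.0), n_phi=64):
    """Project a 3D point cloud onto the unit-radius cylinder and replace
    z by the normalized elevation angle (hypothetical parameter values)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    sigma_theta = fov_theta / n_theta      # equivalent azimuth resolution (2)
    sigma_phi = fov_phi / n_phi            # equivalent elevation resolution (2)
    r_xy = np.sqrt(x**2 + y**2)            # radius on the xy plane
    r_u = 1.0 / sigma_theta                # unit radius of the cylinder
    s = r_u / r_xy                         # per-point scaling factor
    phi = np.arctan2(z, r_xy)              # elevation angle
    return np.stack([s * x, s * y, phi / sigma_phi], axis=1)  # formula (1)

pts = np.array([[3.0, 4.0, 1.0], [0.0, -2.0, 0.5]])
band = depth_band_projection(pts)
# Every projected point lies on the cylinder of radius r_u = n_theta / fov_theta.
print(np.hypot(band[:, 0], band[:, 1]))
```

Because the mapping keeps each point's azimuth, neighbors across the π/−π seam stay adjacent on the cylinder.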
Step 3: a normal to each point in the depth band point cloud is determined.
Exemplary, the method for determining the normal line of each point in the depth band point cloud is as follows: searching nearest neighbors of each point in the depth band point cloud using a k-d tree of mixed k-NN and RNN; determining the normal line of each point in the depth band point cloud by adopting a principal component analysis method based on the nearest neighbor point of each point in the depth band point cloud; the directions of all normal lines are the directions pointing to the origin direction or the opposite directions pointing to the origin direction.
Fig. 4 shows the sub-flow for acquiring the normals of the depth band point cloud. Referring to fig. 4, the detailed procedure for determining the normal of each point in the depth band point cloud is as follows.
(1) Select a normal acquisition mode. Several criteria can guide this choice, for example: directly prescribing the normal of each point is faster to compute, while obtaining the normal by the neighborhood principal component analysis method takes the distribution of adjacent points into account. If the neighborhood principal component analysis method is selected, execute substep (2); otherwise execute substep (4).
(2) Obtain the normal by neighborhood principal component analysis. The depth band point cloud searches points in the neighborhood using a k-d tree (KDTree) mixing k-NN and RNN, and obtains normals by principal component analysis (PCA). In detail: for a point of the point cloud, first search its nearest neighbors with the mixed k-NN and RNN KDTree, then compute the covariance matrix of these neighbors; the eigenvector corresponding to the smallest eigenvalue of the covariance matrix is the normal.
(3) Specify the normal direction. Since the direction of the normal obtained in substep (2) is ambiguous, this step unifies it directly: all normals point toward the origin, or all point in the opposite direction.
(4) Prescribe the normal of each point. Directly define the normal of each point as $(x', y', 0)$ or the opposite direction $-(x', y', 0)$.
Step 3 corresponds to the "get normal sub-process" in fig. 2.
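The PCA branch of the normal sub-flow can be sketched as follows. This is a minimal sketch: for simplicity it replaces the k-d tree with a brute-force query that intersects the k nearest neighbors with a radius ball (the mixed k-NN/RNN criterion), and unifies the direction toward the origin; the parameters `k` and `radius` are illustrative assumptions.

```python
import numpy as np

def pca_normal(points, idx, k=8, radius=2.0):
    """Normal of points[idx] via neighborhood PCA: mixed k-NN/RNN neighbor
    query (brute force here instead of a KDTree), then the eigenvector of
    the smallest covariance eigenvalue, oriented toward the origin."""
    d = np.linalg.norm(points - points[idx], axis=1)
    nn = np.argsort(d)[:k]                 # k nearest neighbors ...
    nn = nn[d[nn] <= radius]               # ... intersected with the RNN ball
    nbrs = points[nn]
    cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
    w, v = np.linalg.eigh(cov)             # eigenvalues in ascending order
    n = v[:, 0]                            # smallest-eigenvalue eigenvector
    if np.dot(n, points[idx]) > 0:         # unify: point toward the origin
        n = -n
    return n

# Points on the plane z = 5: the PCA normal is parallel to the z axis,
# flipped to point toward the origin (negative z component).
rng = np.random.default_rng(0)
plane = np.column_stack([rng.uniform(-1, 1, (30, 2)), np.full(30, 5.0)])
print(pca_normal(plane, 0))
```

On the depth band all points sit on one cylinder, so these normals come out radial, which is what makes the subsequent triangulation seamless at π/−π.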
Step 4: and performing Delaunay triangulation on the depth band point cloud according to the normal line of each point in the depth band point cloud, and obtaining a triangulation graph of the depth band point cloud.
With each point and its normal, this step can perform Delaunay triangulation on the depth band point cloud, with the effect shown in part (b) of fig. 5; applicable algorithms include but are not limited to the Ball-Pivoting Algorithm (BPA). The triangulation graph at this point is $G_t = (V_t, E)$, wherein $V_t$ is the node set of the triangulation graph, bijective with the depth band point cloud $P'$, and $E$ is the edge set of the triangulation graph.
Part (a) of fig. 5 shows the depth band (labeled Range Belt in the figure) and part (b) the triangulation effect on the depth band point cloud; the figure also labels the point cloud holes (Point Cloud Holes), the points (Point/Vertex) and the edges (Edge).
Step 4 corresponds to "Delaunay triangulation" in fig. 2.
Step 5: and back-projecting the triangular split map to a 3D original point cloud to obtain a primary map of the traffic scene.
The back projection is essentially a bijection: change $V_t$ in $G_t = (V_t, E)$ to $V$ to obtain the primary graph $G_1 = (V, E)$. Here $V$ is the node set of the primary graph, bijective with the 3D original point cloud, i.e. $V = \{(x, y, z, F)\}$, where $(x, y, z)$ are the coordinates of a point of the 3D original point cloud in the 3D Cartesian coordinate system and $F$ denotes the point's features other than $(x, y, z)$, such as reflectivity $I$ and color $(R, G, B)$.
Step 5 corresponds to "back projection of the triangulation graph to the 3D original point cloud" in fig. 2.
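Since the back projection only relabels nodes while keeping the edge set, it can be sketched in a few lines. This is a hypothetical minimal form; it assumes node $i$ of the triangulation graph corresponds to point $i$ of the original cloud, which holds because the projection is a bijection. The sample coordinates and features are illustrative.

```python
def back_project(edges, original_points, features):
    """Primary graph G1 = (V, E): swap the depth-band nodes for the original
    3D points plus their extra features F, keeping the triangulation edges."""
    V = [(tuple(p), tuple(f)) for p, f in zip(original_points, features)]
    return V, edges

pts = [(1.0, 2.0, 0.5), (1.1, 2.1, 0.4), (0.9, 1.8, 0.6)]  # original 3D points
feats = [(0.7,), (0.3,), (0.9,)]          # e.g. reflectivity I per point
E = [(0, 1), (1, 2), (0, 2)]              # edges found on the depth band
V, E1 = back_project(E, pts, feats)
print(len(V), E1)
```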
Step 6: and carrying out graph modification on the primary graph to obtain a final composition of the traffic scene.
The graph modification sub-flow for modifying the primary graph is shown in fig. 6, and the final composition can be expressed as $G' = (V, E')$. Referring to fig. 6, the specific process of the graph modification sub-flow is as follows.
If long edges exist in the primary graph, remove them with the formula $m_k = \mathbb{1}\bigl(|e_k| \le d_{\max}\bigr)$, obtaining the edge set with long edges removed, $E_1 = \{e_k \mid m_k = 1,\ k = 1, \dots, K\}$. A long edge is an edge of the primary graph whose length exceeds a length threshold; $m_k$ is the edge mask, $d_{\max}$ the upper distance threshold, a positive real number, $E$ the edge set of the triangulation graph and $e_k$ one edge of that set; $E_1$ is the edge set after removing long edges, $K$ the number of edges, $e_k$ the $k$-th edge of the primary graph and $m_k$ the mask indicating whether the $k$-th edge is long: $m_k = 0$ indicates the $k$-th edge of the primary graph is a long edge, $m_k = 1$ that it is not.
Construct an undirected graph by adding reverse edges; the reverse edge set obtained is $E_2 = \{(v_j, v_i) \mid \forall (v_i, v_j) \in E_1\}$, wherein $E_2$ is the reverse edge set, $V$ the node set of the primary graph, $v_i$ and $v_j$ respectively the $i$-th and $j$-th nodes of the node set of the primary graph, and $\forall$ the universal quantifier.
Add self-loops; the set of self-loop edges obtained is $E_3 = \{(v_i, v_i) \mid \forall v_i \in V\}$, wherein $E_3$ is the set of self-loop edges.
From the long-edge-removed edge set, the reverse edge set and the self-loop edge set, determine the edge set $E'$ in the final composition of the traffic scene by the formula $E' = E_1 \cup E_2 \cup E_3$.
From the edge set $E'$ in the final composition of the traffic scene and the node set $V$ of the primary graph, obtain the final composition $G' = (V, E')$ of the traffic scene.
Step 6 corresponds to the "diagram modification sub-process" in fig. 2.
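The graph modification sub-flow can be sketched directly from the set formulas as reconstructed above (long-edge mask, reverse edges, self-loops, union). This is a minimal sketch with illustrative sample data; the threshold value is an assumption.

```python
import math

def modify_graph(nodes, edges, d_max):
    """Graph modification: drop long edges (mask m_k), add reverse edges E2
    and self-loops E3, and return the union E' = E1 | E2 | E3."""
    def length(e):
        i, j = e
        return math.dist(nodes[i], nodes[j])
    E1 = {e for e in edges if length(e) <= d_max}   # edges with m_k = 1
    E2 = {(j, i) for (i, j) in E1}                  # reverse edges
    E3 = {(i, i) for i in range(len(nodes))}        # self-loops
    return E1 | E2 | E3                             # final edge set E'

nodes = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
edges = [(0, 1), (1, 2)]                            # (1, 2) is a long edge
E_final = modify_graph(nodes, edges, d_max=2.0)
print(sorted(E_final))
```

The long edge (1, 2) is removed, its endpoints keep their self-loops, and the surviving edge (0, 1) gains the reverse edge (1, 0), giving an undirected graph with self-loops as required.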
The 360-degree point cloud depth band triangulation composition method is a triangulation algorithm that maps the point cloud into a 360° ring (Loop) space, thereby solving the problem of 2D triangulation discontinuity at the azimuth angle π/−π.
The final composition of the traffic scene can be input into the graph neural network for classification, positioning, 3D target detection and other tasks, such as: inputting the final composition of the traffic scene into a graph neural network model, identifying the classification of the targets in the traffic scene, and positioning the targets; the object includes: vehicles, walkers and cyclists.
The invention has the advantage that it solves the problem of 2D triangulation discontinuity at the azimuth angle π/−π.
This advantage arises from steps 1, 2, 3 and 4, most importantly step 1.
The reason is as follows: with the depth band projection of step 1, the adjacency relations of the projected point cloud in the π/−π azimuth neighborhood are unchanged and the cloud remains spatially continuous, and the element-changing operation of step 2 alters neither the adjacency relations nor the spatial continuity. Because all depth band points lie on the same curved surface, the normals generated in step 3 are perpendicular to that surface, which further benefits the triangulation produced in step 4 and ensures continuity across the π/−π azimuth.
Example two
As shown in fig. 7, an embodiment of the present invention provides a 360 degree point cloud depth band triangulation patterning device, including: a point cloud sensor and a computer.
The point cloud sensor is used for collecting a 3D original point cloud of the traffic scene and transmitting the 3D original point cloud to the computer. The computer is used for obtaining the final composition of the traffic scene by adopting the 360-degree point cloud depth zone triangulation composition method of the first embodiment.
The 3D original point cloud is obtained by a point cloud sensor, including but not limited to LiDAR, millimeter-wave radar, stereo camera, etc. The proposed algorithm runs on the computer described above, which can therefore also be referred to as a computing unit. The computer (or computing unit) includes, but is not limited to, various computers, industrial computers, single-chip microcomputers, DSPs (Digital Signal Processors), FPGAs (Field-Programmable Gate Arrays), CPUs, GPUs and other devices with computing capability.
Example III
The embodiment provides a 360-degree point cloud depth zone triangulation composition system, which comprises: the system comprises a projection module, a coordinate transformation module, a normal determination module, a triangulation module, a back projection module and a graph modification module.
The projection module is used for projecting the 3D original point cloud of the traffic scene onto a cylindrical surface in a projection mode; the projection mode is the intersection of the cylindrical surface with the ray from its origin toward each point of the 3D original point cloud.
The coordinate transformation module is used for changing elements of the projection points on the cylindrical surface to obtain the depth band point cloud.
And the normal determining module is used for determining the normal of each point in the depth band point cloud.
And the triangulation module is used for performing Delaunay triangulation on the depth band point cloud according to the normal line of each point in the depth band point cloud to obtain a triangulation diagram of the depth band point cloud.
The back projection module is used for back-projecting the triangulation graph onto the 3D original point cloud to obtain the primary graph of the traffic scene.
And the diagram modification module is used for carrying out diagram modification on the primary diagram to obtain a final composition of the traffic scene.
The 360-degree point cloud depth zone triangulation composition system provided in the embodiment is similar to the 360-degree point cloud depth zone triangulation composition method described in the first embodiment, and therefore, the working principle and the beneficial effects are similar, and specific details will not be described herein.
In the present specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, and identical and similar parts between the embodiments are all enough to refer to each other.
The principles and embodiments of the present invention have been described herein with reference to specific examples, whose description is intended only to assist in understanding the method of the present invention and its core ideas; meanwhile, a person of ordinary skill in the art may, based on the ideas of the present invention, make modifications to the specific embodiments and the scope of application. In summary, the content of this description should not be construed as limiting the invention.

Claims (9)

1. A360-degree point cloud depth zone triangulation composition method is characterized by comprising the following steps:
Projecting a 3D original point cloud of a traffic scene onto a cylindrical surface in a projection mode; the projection mode is the intersection of the cylindrical surface with the ray from its origin toward each point of the 3D original point cloud;
changing elements of projection points on the cylindrical surface to obtain depth band point clouds;
determining a normal line of each point in the depth band point cloud;
Performing Delaunay triangulation on the depth band point cloud according to the normal line of each point in the depth band point cloud to obtain a triangulation graph of the depth band point cloud;
Back-projecting the triangulation graph to the 3D original point cloud to obtain a primary graph of the traffic scene;
and carrying out graph modification on the primary graph to obtain a final composition of the traffic scene.
2. The 360 degree point cloud depth zone triangulation patterning method of claim 1, wherein said cylinder has a z-axis as a rotational axis.
3. The method for forming the triangulation of the depth zone of the 360-degree point cloud according to claim 1, wherein the method for obtaining the depth zone point cloud by changing elements of projection points on a cylindrical surface comprises the following steps:
Replacing the z-axis coordinate of each point of the 3D original point cloud with $\varphi/\sigma_\varphi$ and using the resulting coordinates $(x', y', z')$ to represent the projection points on the cylindrical surface; wherein $\varphi$ is the elevation angle, $\varphi = \arctan\!\bigl(z/\sqrt{x^2+y^2}\bigr)$, and $(x, y, z)$ are the three-dimensional coordinates of one point in the 3D original point cloud; $x$, $y$ and $z$ respectively represent its x-axis, y-axis and z-axis coordinates;
determining the coordinates of each projection point according to the formula $(x', y') = s\,(x, y)$; wherein $(x', y', z')$ are the coordinates of a point in the depth band point cloud, $(x, y)$ the coordinates of the point in the 3D original point cloud on the xy plane, $s$ the scaling factor, $s = r_u/r_{xy}$, with $r_{xy} = \sqrt{x^2+y^2}$ the radius of the point on the xy plane; $r_u$ the unit radius, $r_u = 1/\sigma_\theta$; $\sigma_\theta$ representing the equivalent azimuth resolution of the point cloud sensor, $\sigma_\varphi$ the equivalent elevation resolution of the point cloud sensor, $\sigma_i = \mathrm{FOV}_i/N_i$, $i$ denoting $\theta$ or $\varphi$; $\mathrm{FOV}_i$ representing the horizontal or vertical field-of-view range of the point cloud sensor; and $N_i$ representing the equivalent number of laser beams of the point cloud sensor in azimuth or elevation.
4. The 360-degree point cloud depth zone triangulation patterning method of claim 1, wherein determining the normal of each point in the depth zone point cloud comprises:
searching for the nearest neighbors of each point in the depth band point cloud using a k-d tree with a mixed k-NN and RNN query;
determining the normal of each point in the depth band point cloud by principal component analysis based on its nearest neighbors;
the directions of all normals either point toward the origin or are all opposite to the direction toward the origin.
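The normal estimation of claim 4 can be sketched as follows, assuming SciPy is available. The mixed k-NN/RNN query is approximated by clipping the k nearest neighbours to a radius; the values of k and the radius, and the choice of the inward orientation rule, are assumptions rather than parameters stated in the patent:

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=8, radius=0.5):
    """PCA normal estimation over k-d-tree neighborhoods (sketch of claim 4)."""
    tree = cKDTree(points)
    dists, idx = tree.query(points, k=k)        # k-NN query (includes the point itself)
    normals = np.empty_like(points)
    for i in range(len(points)):
        nb = idx[i][dists[i] <= radius]         # RNN part: clip neighbours to the radius
        centered = points[nb] - points[nb].mean(axis=0)
        # PCA: the normal is the eigenvector of the smallest covariance eigenvalue,
        # i.e. the last right-singular vector of the centered neighbourhood.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        n = vt[-1]
        if np.dot(n, points[i]) > 0:            # orient every normal toward the origin
            n = -n
        normals[i] = n
    return normals

# A small planar patch at z = 1: all normals should come out as (0, 0, -1).
pts = np.array([[i * 0.1, j * 0.1, 1.0] for i in range(4) for j in range(4)])
normals = estimate_normals(pts)
```

Flipping the sign test in the orientation step yields the "opposite to the origin direction" convention that the claim also permits.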
5. The 360-degree point cloud depth zone triangulation composition method according to claim 1, wherein determining the normal of each point in the depth band point cloud specifically comprises:
directly prescribing the normal of each point in the depth band point cloud as (x_c, y_c, 0) or −(x_c, y_c, 0); wherein (x_c, y_c) are the coordinates of the point in the xy plane of the depth band point cloud.
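Claim 5's prescribed radial normals need no neighbourhood search at all. A sketch under the sign convention reconstructed above; the function name and the `inward` flag are illustrative assumptions:

```python
import numpy as np

def radial_normals(band_points, inward=True):
    """Prescribe each normal as the (signed, normalized) xy direction of the
    point, skipping PCA entirely (sketch of claim 5)."""
    n = np.zeros_like(band_points)
    n[:, :2] = band_points[:, :2]                   # normal lies in the xy plane
    n /= np.linalg.norm(n, axis=1, keepdims=True)   # unit length
    return -n if inward else n                      # toward the origin, or away from it

normals = radial_normals(np.array([[1.0, 0.0, 0.5], [0.0, 2.0, 0.1]]))
```

Because the depth-band points sit on a cylinder around the z-axis, this radial direction is a cheap stand-in for the PCA estimate of claim 4.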
6. The 360-degree point cloud depth zone triangulation composition method according to claim 1, wherein back-projecting the triangulation graph onto the 3D original point cloud to obtain a primary graph of the traffic scene specifically comprises:
replacing the node set V_t in the triangulation graph G_t = (V_t, E_t) with V to obtain the primary graph G = (V, E_t); wherein V_t is the node set of the triangulation graph, E_t is the edge set of the triangulation graph, V is the node set of the primary graph, V = {(x, y, z, f)}, (x, y, z) are the coordinates of a point of the 3D original point cloud in the 3D Cartesian coordinate system, and f denotes the features of that point other than (x, y, z).
7. The 360-degree point cloud depth zone triangulation composition method according to claim 1, wherein performing graph modification on the primary graph to obtain the final composition of the traffic scene specifically comprises:
if long edges exist in the primary graph, removing them by means of an edge mask: m_k = 1 if ‖e_k‖ ≤ d_max and m_k = 0 otherwise, so that the edge set after long-edge removal is E′ = {e_k | m_k = 1, k = 1, 2, …, K}; a long edge is an edge of the primary graph whose length exceeds the length threshold; d_max is the upper distance threshold, a positive real number, E_t is the edge set of the triangulation graph, e_k is the k-th edge of the primary graph, K is the number of edges, and m_k is the mask indicating whether the k-th edge is a long edge, m_k = 0 denoting that the k-th edge of the primary graph is a long edge and m_k = 1 that it is not;
constructing an undirected graph by adding reverse edges, the reverse edge set being E_rev = {(v_j, v_i) | ∀(v_i, v_j) ∈ E′}; wherein E_rev is the reverse edge set, V is the node set of the primary graph, v_i and v_j are respectively the i-th and j-th nodes of V, and ∀ is the universal quantifier;
adding self-loops, the self-loop edge set being E_self = {(v_i, v_i) | ∀ v_i ∈ V}; wherein E_self is the self-loop edge set;
determining the edge set E_final of the final composition of the traffic scene from the edge set after long-edge removal, the reverse edge set and the self-loop edge set according to the formula E_final = E′ ∪ E_rev ∪ E_self;
obtaining the final composition G_final = (V, E_final) of the traffic scene from the edge set E_final of the final composition and the node set V of the primary graph.
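The graph modification of claim 7 (edge mask, reverse edges, self-loops) can be sketched directly from the set definitions above; the threshold value d_max and the function name are assumptions:

```python
import numpy as np

def modify_graph(nodes, edges, d_max=2.0):
    """Sketch of claim 7: drop edges longer than d_max, then add reverse
    edges and self-loops so the result is a valid undirected graph."""
    nodes = np.asarray(nodes)
    kept = []
    for (i, j) in edges:
        # Edge mask m_k: keep the edge only if its length is at most d_max.
        if np.linalg.norm(nodes[i] - nodes[j]) <= d_max:
            kept.append((i, j))
    reverse = [(j, i) for (i, j) in kept]             # E_rev
    self_loops = [(i, i) for i in range(len(nodes))]  # E_self
    return kept + reverse + self_loops                # E_final = E' u E_rev u E_self

nodes = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [5.0, 0.0, 0.0]]
edges = [(0, 1), (1, 2)]          # (1, 2) has length 4 and will be masked out
e_final = modify_graph(nodes, edges, d_max=2.0)
```

Reverse edges and self-loops are the usual preprocessing for feeding a triangulation into a graph neural network, which is consistent with the claim's undirected-graph construction.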
8. A 360-degree point cloud depth zone triangulation composition device, comprising a point cloud sensor and a computer;
the point cloud sensor is used for collecting a 3D original point cloud of a traffic scene and transmitting the 3D original point cloud to the computer;
the computer is used for obtaining a final composition of a traffic scene by adopting the 360-degree point cloud depth zone triangulation composition method as claimed in any one of claims 1 to 7.
9. A 360-degree point cloud depth zone triangulation composition system, comprising:
the projection module, configured to project the 3D original point cloud of the traffic scene onto a cylindrical surface, the projection of each point being the intersection of the cylindrical surface with the ray cast from the origin of the cylindrical surface in the direction of that point of the 3D original point cloud;
the coordinate transformation module, configured to change elements of the projection points on the cylindrical surface to obtain the depth band point cloud;
the normal determination module, configured to determine the normal of each point in the depth band point cloud;
the triangulation module, configured to perform Delaunay triangulation on the depth band point cloud according to the normal of each point in the depth band point cloud to obtain the triangulation graph of the depth band point cloud;
the back-projection module, configured to back-project the triangulation graph onto the 3D original point cloud to obtain a primary graph of the traffic scene;
and the graph modification module, configured to perform graph modification on the primary graph to obtain the final composition of the traffic scene.
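Taken together, the triangulation module could be exercised with an off-the-shelf Delaunay triangulation. The sketch below uses SciPy's planar Delaunay on the (x, y) footprint of the depth-band points as an illustrative stand-in; the patent's normal-guided 360-degree variant is not reproduced here:

```python
import numpy as np
from scipy.spatial import Delaunay

def triangulate_band(band_points):
    """Build the edge set of a 2D Delaunay triangulation over the (x, y)
    footprint of depth-band points (a stand-in for the patented module)."""
    tri = Delaunay(band_points[:, :2])
    edges = set()
    for simplex in tri.simplices:       # each simplex is a triangle (i, j, k)
        for a in range(3):
            i, j = sorted((simplex[a], simplex[(a + 1) % 3]))
            edges.add((i, j))           # store each undirected edge once
    return edges

# Unit square plus its center: a fan of 4 triangles, i.e. 8 unique edges.
band = np.array([[0.0, 0.0, 0.1], [1.0, 0.0, 0.2], [1.0, 1.0, 0.3],
                 [0.0, 1.0, 0.4], [0.5, 0.5, 0.5]])
edges = triangulate_band(band)
```

The resulting edge set plays the role of E_t in claims 6 and 7: its nodes can be swapped back to the original 3D points and the graph then passed through long-edge removal, reverse edges and self-loops.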
CN202410381411.3A 2024-04-01 360-Degree point cloud depth zone triangulation composition method, device and system Active CN117974741B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410381411.3A CN117974741B (en) 2024-04-01 360-Degree point cloud depth zone triangulation composition method, device and system

Publications (2)

Publication Number Publication Date
CN117974741A true CN117974741A (en) 2024-05-03
CN117974741B CN117974741B (en) 2024-07-09

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001243500A (en) * 2000-03-01 2001-09-07 Hitachi Eng Co Ltd Three-dimensional dot group data processor
CN105809615A (en) * 2016-03-10 2016-07-27 广州欧科信息技术股份有限公司 Point cloud data imaging method
US20160314564A1 (en) * 2015-04-22 2016-10-27 Esight Corp. Methods and devices for optical aberration correction
CN107533771A (en) * 2015-04-17 2018-01-02 微软技术许可有限责任公司 Carried out by 3D Model Reconstructions lattice simplified
CN107767442A (en) * 2017-10-16 2018-03-06 浙江工业大学 A kind of foot type three-dimensional reconstruction and measuring method based on Kinect and binocular vision
US20200074652A1 (en) * 2018-08-30 2020-03-05 Baidu Online Network Technology (Beijing) Co., Ltd. Method for generating simulated point cloud data, device, and storage medium
CN112365592A (en) * 2020-11-10 2021-02-12 大连理工大学 Local environment feature description method based on bidirectional elevation model
CN113362463A (en) * 2021-05-17 2021-09-07 浙江工业大学 Workpiece three-dimensional reconstruction method based on Gaussian mixture model
US20220051475A1 (en) * 2020-08-12 2022-02-17 Datasight Data compression algorithm for processing of point cloud data for digital terrain models (dtm)
WO2022207946A1 (en) * 2021-03-31 2022-10-06 Enusa Industrias Avanzadas, S.A., S.M.E System and procedure for inspection of the surface of a nuclear fuel rod for the automatic detection, location and characterization of defects
EP4156109A1 (en) * 2021-09-28 2023-03-29 Nokia Technologies Oy Apparatus and method for establishing a three-dimensional conversational service
CN116402963A (en) * 2023-04-04 2023-07-07 重庆长安汽车股份有限公司 Lane line vector model construction method and device, electronic equipment and storage medium
CN116559838A (en) * 2023-07-06 2023-08-08 深圳赋能光达科技有限公司 Acousto-optic deflection module based on cylindrical lens beam expansion, photoelectric device and electronic equipment

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
JIATENG GUO et al.: "3D modeling for mine roadway from laser scanning point cloud", 2016 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 3 November 2016 (2016-11-03) *
ZHE LI et al.: "3D Point Cloud Attribute Compression Based on Cylindrical Projection", 2019 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), 30 January 2020 (2020-01-30) *
ZHANG Xudong et al.: "Fast relocalization method for unmanned ground vehicle point cloud maps in dynamic environments", Acta Armamentarii, 27 July 2020 (2020-07-27) *
JIANG Meng; CAI Yong; ZHANG Jiansheng: "Projection-based point cloud registration algorithm", Process Automation Instrumentation, no. 04, 20 April 2019 (2019-04-20) *
JIANG Jizhou; GUO Jiateng; WU Lixin; YANG Yizhou; ZHOU Wenhui; ZHANG Peina: "Research on 3D modeling method of mine roadways based on 3D laser scanning point clouds", Coal Mining Technology, no. 02, 15 April 2016 (2016-04-15) *
CAI Xianjie: "Fine registration algorithm for point clouds based on normal projection in local coordinate systems", Modern Computer (Professional Edition), no. 26, 15 September 2016 (2016-09-15) *

Similar Documents

Publication Publication Date Title
KR102125959B1 (en) Method and apparatus for determining a matching relationship between point cloud data
CN109598765B (en) Monocular camera and millimeter wave radar external parameter combined calibration method based on spherical calibration object
Zhou et al. T-LOAM: Truncated least squares LiDAR-only odometry and mapping in real time
JP6456141B2 (en) Generating map data
CN103729872B (en) A kind of some cloud Enhancement Method based on segmentation resampling and surface triangulation
CN112418245A (en) Electromagnetic emission point positioning method based on urban environment physical model
Taylor et al. Automatic calibration of multi-modal sensor systems using a gradient orientation measure
Zhang et al. Three-dimensional cooperative mapping for connected and automated vehicles
Shan et al. LiDAR-based stable navigable region detection for unmanned surface vehicles
CN115619780B (en) Laser scanning image quality evaluation method and system
CN115032648A (en) Three-dimensional target identification and positioning method based on laser radar dense point cloud
Xu et al. Dynamic vehicle pose estimation and tracking based on motion feedback for LiDARs
Kaushik et al. Accelerated patch-based planar clustering of noisy range images in indoor environments for robot mapping
Lu et al. A lightweight real-time 3D LiDAR SLAM for autonomous vehicles in large-scale urban environment
Liu et al. Comparison of 2D image models in segmentation performance for 3D laser point clouds
CN117974741B (en) 360-Degree point cloud depth zone triangulation composition method, device and system
Zhang et al. Accurate real-time SLAM based on two-step registration and multimodal loop detection
CN117974741A (en) 360-Degree point cloud depth zone triangulation composition method, device and system
CN116642490A (en) Visual positioning navigation method based on hybrid map, robot and storage medium
CN115937449A (en) High-precision map generation method and device, electronic equipment and storage medium
CN114648561A (en) Point cloud matching method and system for rotary three-dimensional laser radar
US6456741B1 (en) Structure-guided image measurement method
CN114384486A (en) Data processing method and device
Gou et al. A Visual SLAM With Tightly-Coupled Integration of Multi-Object Tracking for Production Workshop
CN117974747B (en) 360-Degree point cloud 2D depth disk triangulation composition method, device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant