CN114781140A - Laser radar point cloud simulation method and device and computer equipment - Google Patents
- Publication number
- CN114781140A (application number CN202210371428.1A)
- Authority
- CN
- China
- Prior art keywords
- target
- point cloud
- data
- target object
- laser beam
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2119/00—Details relating to the type or aim of the analysis or the optimisation
- G06F2119/02—Reliability analysis or reliability optimisation; Failure analysis, e.g. worst case scenario performance, failure mode and effects analysis [FMEA]
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Evolutionary Computation (AREA)
- Geometry (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The application relates to a laser radar point cloud simulation method and device and computer equipment. The method comprises the following steps: a first processor acquires the position and posture of a target object; a second processor determines first mesh data of the target object based on the position and the posture; the second processor determines second grid data from the first grid data based on a target scanning range of the lidar; the second processor determines, in the second grid data, a target polygon intersecting a laser beam in the laser beam data within the target scanning range, together with the coordinates of the intersection point with that beam; the second processor calculates the reflected point cloud intensity based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection point coordinates, and determines the point cloud data corresponding to the target object within the target scanning range based on the point cloud intensity and the intersection point coordinates. This method improves the simulation efficiency of laser radar point clouds.
Description
Technical Field
The application relates to the technical field of sensor simulation, in particular to a laser radar point cloud simulation method, a laser radar point cloud simulation device and computer equipment.
Background
Lidar can provide accurate environmental depth measurements and is therefore widely used for environmental sensing and positioning tasks. Lidar is an important component of an automatic driving sensor system, and the simulation of point cloud data is an indispensable link in the data closed loop of a lidar perception algorithm.
Existing lidar point cloud simulation techniques fall into two main categories: GPU rendering methods based on game engines, and CPU-based simulation methods. The first typically relies on a mature, large game engine to render the point cloud; the second simulates the point cloud on a CPU. Neither method accounts for the fact that a lidar, and in particular the most widely used mechanical lidar, emits many beams simultaneously (up to 64 or even 128 lines), so both suffer from low point cloud simulation efficiency and high cost.
Disclosure of Invention
In view of the above, there is a need to provide a lidar point cloud simulation method, device and computer equipment capable of improving the simulation efficiency of lidar point cloud and reducing the cost.
In a first aspect, the application provides a laser radar point cloud simulation method. The method comprises the following steps:
the first processor acquires the position and the posture of a target object;
a second processor determines first mesh data of the target object based on the position and the pose;
the second processor determines second grid data from the first grid data based on a target scan range of the lidar;
the second processor determines, in the second grid data, a target polygon that intersects a laser beam in the laser beam data within the target scanning range, and intersection coordinates that intersect the laser beam;
the second processor calculates a reflected point cloud intensity based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection point coordinates; and
The second processor determines point cloud data corresponding to a target object within the target scanning range based on the point cloud intensity and the intersection point coordinates.
In one embodiment, before the second processor determines the second grid data from the first grid data based on the target scanning range of the lidar, the method further comprises:
dividing the preset frame time by the preset division number to obtain the target scanning time, the preset frame time being the time required for the lidar to scan one full revolution;
determining a scanning range sequence according to the preset dividing number and the target scanning time;
and selecting, in order, a scanning range from the scanning range sequence as the target scanning range.
In one embodiment, the determining point cloud data corresponding to a target object within the target scanning range based on the point cloud intensity and the intersection coordinates comprises:
when a target scanning range is selected from the scanning range sequence, point cloud data corresponding to a target object in the selected target scanning range are determined based on the point cloud intensity and the intersection point coordinates until point cloud data corresponding to the target object in each scanning range in the scanning range sequence are obtained;
the method further comprises the following steps: and combining the point cloud data corresponding to the target object in each scanning range in the scanning range sequence into point cloud simulation data of the target object.
In one embodiment, before determining the target polygon in the second grid data that intersects the laser beam in the laser beam data within the target scanning range and the intersection coordinates of the intersection with the laser beam, the method further comprises:
acquiring the information of the starting point and the direction vector of the laser beam in the target scanning range and the information of the normal vector of the target polygon;
calculating the distance from the starting point to the intersection point based on the starting point and the direction vector information and the normal vector information of the target polygon; the intersection point is a point where the target polygon intersects with the laser beam;
the determining, in the second grid data, a target polygon that intersects a laser line beam in the laser line beam data, and intersection coordinates that intersect the laser line beam, includes:
when the distance is smaller than the effective detection distance of the laser radar, determining that the laser beams in the laser beam data are intersected with a target polygon;
and calculating the intersection point coordinate of the intersection point based on the initial coordinate, the distance and the direction vector information.
In one embodiment, the calculating the reflected point cloud intensity based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection coordinates comprises:
determining an included angle based on the direction vector information of the laser beam and the normal vector information of the target polygon;
and calculating the reflected point cloud intensity based on the included angle, the intersection point coordinate and the starting point.
In one embodiment, the method further comprises:
determining the capacity of a calculation block according to the number of preset wire harnesses and the number of preset target polygons;
determining the number of calculation blocks according to the number of laser beams, the number of target polygons and the capacity of the calculation blocks;
determining the number of threads in each calculation block according to the capacity of the calculation block;
the calculating reflected point cloud intensity based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection point coordinates includes:
and according to the number of the calculation blocks and the number of the threads, performing parallel calculation on the direction vector information of the laser beam, the normal vector information of the target polygon and the intersection point coordinates to obtain the reflected point cloud intensity.
In a second aspect, the application further provides a laser radar point cloud simulation device. The device comprises a memory storing a computer program and a processor comprising a first processor and a second processor; when executing the computer program, the processor implements a laser radar point cloud simulation method comprising the following steps:
the first processor acquires the position and the posture of a target object;
a second processor determines first mesh data of the target object based on the position and the pose;
the second processor determines second grid data from the first grid data based on a target scan range of the lidar;
the second processor determines, in the second grid data, a target polygon intersecting a laser line beam in the laser line beam data within the target scanning range and intersection coordinates of the intersection with the laser line beam;
the second processor calculates a reflected point cloud intensity based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection point coordinates; and
The second processor determines point cloud data corresponding to a target object within the target scanning range based on the point cloud intensity and the intersection point coordinates.
In a third aspect, the application also provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the steps of the above method when executing the computer program.
In a fourth aspect, the present application further provides a computer-readable storage medium. The computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method.
In a fifth aspect, the present application further provides a computer program product. The computer program product comprises a computer program which, when executed by a processor, carries out the steps of the above-mentioned method.
According to the laser radar point cloud simulation method, device and computer equipment described above, the corresponding second grid data and laser beam data are determined from a divided target scanning range; the target polygon intersecting a laser beam in the laser beam data, together with the coordinates of the intersection point, is determined in the second grid data; and the reflected point cloud intensity is calculated based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection point coordinates, thereby obtaining the point cloud data of the target object. The scheme performs targeted screening of grid data and laser beams according to the beam characteristics of a mechanical lidar and determines the reflected point cloud intensity using the parallel computing architecture of the second processor, thereby effectively improving the simulation efficiency of lidar point clouds and reducing the simulation cost.
Drawings
FIG. 1 is a diagram of an exemplary environment in which a laser radar point cloud simulation method may be implemented;
FIG. 2 is a schematic flow chart of a laser radar point cloud simulation method according to an embodiment;
FIG. 3 is a diagram of a compute block in one embodiment;
FIG. 4 is a diagram of threads in one embodiment;
FIG. 5 is a schematic flow chart of a lidar point cloud simulation method in another embodiment;
FIG. 6 is a schematic flow chart diagram illustrating the step of calculating coordinates of an intersection in one embodiment;
FIG. 7 is a block diagram of an exemplary lidar point cloud simulation apparatus;
FIG. 8 is a block diagram of another exemplary embodiment of a lidar point cloud simulation apparatus;
FIG. 9 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more clearly understood, the present application is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The laser radar point cloud simulation method provided by the embodiment of the application can be applied to the application environment shown in fig. 1. Wherein the terminal 102 communicates with the server 104 via a network. The data storage system may store data that the server 104 needs to process. The data storage system may be integrated on the server 104, or may be located on the cloud or other network server.
The method can be applied to the terminal 102 or the server 104; it is described here taking the terminal 102 as an example. The terminal 102 comprises a first processor and a second processor. The first processor acquires the position and the posture of a target object; the second processor determines first mesh data of the target object based on the position and the posture; the second processor determines second grid data from the first grid data based on a target scanning range of the lidar; the second processor determines, in the second grid data, a target polygon intersecting a laser beam in the laser beam data within the target scanning range and the coordinates of the intersection point with the laser beam; the second processor calculates the reflected point cloud intensity based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection point coordinates; and the second processor determines the point cloud data corresponding to the target object within the target scanning range based on the point cloud intensity and the intersection point coordinates.
CUDA (Compute Unified Device Architecture) is a parallel computing platform and programming model. It harnesses the processing capability of the graphics processing unit (GPU) to greatly improve computing performance.
The terminal 102 may be, but not limited to, various personal computers, notebook computers, smart phones, tablet computers, internet of things devices, and portable wearable devices, and the internet of things devices may be smart speakers, smart televisions, smart air conditioners, smart car-mounted devices, and the like. The portable wearable device can be a smart watch, a smart bracelet, a head-mounted device, and the like. The terminal 102 is equipped with a mechanical lidar sensor system. The server 104 may be implemented as a stand-alone server or a server cluster comprised of multiple servers.
In one embodiment, as shown in fig. 2, a lidar point cloud simulation method is provided, which may be applied to the terminal 102 or the server 104, and is described by taking the method as an example applied to the terminal 102 in fig. 1, including the following steps:
s202, the first processor acquires the position and the posture of the target object.
The first processor may be a central processing unit (CPU). The target object may refer to an object scanned by the mechanical lidar and may be moving or stationary; for example, the target object may be a pedestrian, a vehicle, a building, and the like. The position may refer to the coordinates of the target object; for example, the position of target object A may be (10.0, 20.0, 0.3). The posture may refer to the orientation of the target object and may be expressed in the form (roll angle, pitch angle, yaw angle) with an angle range of 0 to 2π; for example, the posture of target object A may be (0.01, 0.04, 3π/2). It is noted that the lidar may be mounted on top of the terminal, e.g. on the roof of an autonomous vehicle, i.e. as part of the autonomous driving sensor suite.
In one embodiment, before S202, the first processor may acquire historical mesh data of the target object in the environment map, and a motion trajectory of the terminal and the target object.
Here, the historical mesh data may refer to mesh data from before a preset time. The historical mesh data may be composed of polygons, each vertex of a polygon may be represented by coordinates, and one target object may be composed of one or more polygons. The preset time may refer to the user-specified time at which the lidar point cloud simulation starts. For example, in the mesh corresponding to target object A, the target object may be composed of 300 triangles. The motion trajectory may refer to the marker, position, and posture corresponding to the terminal or the target object at different times. For example, at 14:51, target object B corresponds to marker 001, position (20.1, 31.3, 0.5), and posture (0.03, 0.0, 3π/2).
Specifically, the first processor can acquire the position and the posture of the target object at a preset moment, and the first processor can also receive a laser radar point cloud simulation instruction and take the position and the posture stored in the terminal as the position and the posture of the target object. For example, a user may generate a lidar point cloud simulation instruction through a trigger operation, and when the first processor receives the lidar point cloud simulation instruction, the first processor may use a position and a posture corresponding to the user identifier stored in the terminal as a position and a posture of the target object.
In one embodiment, S202 includes determining a first position and a first pose corresponding to a first time, and a second position and a second pose corresponding to a second time in the motion trajectory; the first moment and the second moment are moments adjacent to a preset moment in the moving process of the target object; determining a time parameter based on a preset time, a first time and a second time; determining a position of the target object based on the time parameter, the first position and the second position; the pose of the target object is determined based on the time of day parameter, the first pose, and the second pose.
The first time may be a time before a preset time, and the first time is included in the motion trajectory of the target object. The first position may refer to a position to which the target object corresponds at a first time. The first pose may refer to a pose of the target object at a first time. The second time may be a time subsequent to the preset time, and the second time is included in the motion trajectory of the target object. The second position may refer to a position of the target object corresponding to the second time. The second pose may refer to a pose corresponding to the target object at the second time. The time parameter may refer to a parameter determined based on a preset time, a first time, and a second time, and the time parameter may be used to determine the position and the posture of the target object.
For example, the preset time is t, and in the motion trajectory of the target object the first time adjacent to the preset time is s1.timestamp and the second time is s2.timestamp. The calculation formula of the time parameter ratio may be: ratio = (t - s1.timestamp) / (s2.timestamp - s1.timestamp).
the first position corresponding to the first moment s1.timestamp is (s1.x, s1.y, s1.z) and the first pose is (s1.roll, s1.pitch, s1. yaw). The second position corresponding to the second time instant s2.timestamp is (s2.x, s2.y, s2.z) and the second posture is (s2.roll, s2.pitch, s2. yaw).
The calculation formula for the position (x, y, z) of the target object may be:
x=(1-ratio)*s1.x+ratio*s2.x
y=(1-ratio)*s1.y+ratio*s2.y
z=(1-ratio)*s1.z+ratio*s2.z
the calculation formula of the target object's posture (roll, pitch, yaw) may be:
roll=(1-ratio)*s1.roll+ratio*s2.roll
pitch=(1-ratio)*s1.pitch+ratio*s2.pitch
yaw=(1-ratio)*s1.yaw+ratio*s2.yaw
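The interpolation above can be sketched in Python; the function name and the trajectory-sample layout are illustrative assumptions, not part of the patent:

```python
def interpolate_pose(t, s1, s2):
    """Linearly interpolate the position and posture of a target object at
    preset time t between two adjacent trajectory samples s1 and s2.
    Each sample is a dict with 'timestamp', 'x', 'y', 'z', 'roll', 'pitch', 'yaw'."""
    # Time parameter: how far t lies between the two sample timestamps.
    ratio = (t - s1["timestamp"]) / (s2["timestamp"] - s1["timestamp"])

    def lerp(a, b):
        # Weighted blend matching the (1-ratio)*s1 + ratio*s2 formulas above.
        return (1 - ratio) * a + ratio * b

    position = tuple(lerp(s1[k], s2[k]) for k in ("x", "y", "z"))
    posture = tuple(lerp(s1[k], s2[k]) for k in ("roll", "pitch", "yaw"))
    return position, posture
```

Note that linear interpolation of roll/pitch/yaw is a sketch choice; it matches the patent's formulas but does not handle angle wrap-around at 2π.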
For example, user A logs in to the automatic driving sensor system through a page, and the system can acquire the positions and postures of all target objects uploaded by user A according to the user identification. Meanwhile, user A can select the position and posture that meet the requirements from a plurality of preset moments displayed on the page, and the first processor can respond to user A's preset-moment selection operation and screen out the position and posture of the target object corresponding to the selected preset moment.
S204, the second processor determines first grid data of the target object based on the position and the posture.
The second processor may be a CUDA-based GPU. The first mesh data may refer to the data of one or more polygons corresponding to the target object; for example, the first mesh data corresponding to target object A and target object B may be {(10.2, 20.1, 0.3), (11.1, 19.0, 0.4), (12.0, 19.5, 0.2); … (30.0, 50.0, 0.60), (31.0, 51.0, 0.9), (28.0, 45.0, 1.0)} and {(4.0, 6.0, 0.3), (6.0, 3.0, 0.5), (5.0, 4.0, 0.3); … (6.0, 4.0, 0.7), (8.0, 5.7, 0.8), (9.3, 6.0, 1.8)}, respectively.
Specifically, the second processor may obtain historical grid data from the first processor; first mesh data of the target object is determined based on the position, the pose, and the historical mesh data.
For example, the historical mesh data is mesh_h, the position of the target object is (x, y, z), and the posture is (α, β, γ). The calculation formula of the first mesh data new1_mesh may be:
new1_mesh=R(α,β,γ)*mesh_h+(x,y,z)
wherein the attitude parameter R(α, β, γ) may be a rotation matrix, for example the standard roll-pitch-yaw composition R(α, β, γ) = Rz(γ) · Ry(β) · Rx(α) of elementary rotations about the x-, y-, and z-axes.
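A sketch of the mesh transformation new1_mesh = R(α, β, γ) * mesh_h + (x, y, z), assuming the standard roll-pitch-yaw (Z-Y-X) rotation convention; the function names are hypothetical:

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Standard Z-Y-X rotation matrix R(alpha, beta, gamma) = Rz(yaw)*Ry(pitch)*Rx(roll)."""
    ca, sa = math.cos(roll), math.sin(roll)
    cb, sb = math.cos(pitch), math.sin(pitch)
    cg, sg = math.cos(yaw), math.sin(yaw)
    return [
        [cg * cb, cg * sb * sa - sg * ca, cg * sb * ca + sg * sa],
        [sg * cb, sg * sb * sa + cg * ca, sg * sb * ca - cg * sa],
        [-sb,     cb * sa,                cb * ca],
    ]

def transform_mesh(vertices, posture, position):
    """Apply new1_mesh = R(alpha, beta, gamma) * mesh_h + (x, y, z) per vertex."""
    R = rotation_matrix(*posture)
    x, y, z = position
    out = []
    for vx, vy, vz in vertices:
        out.append((
            R[0][0] * vx + R[0][1] * vy + R[0][2] * vz + x,
            R[1][0] * vx + R[1][1] * vy + R[1][2] * vz + y,
            R[2][0] * vx + R[2][1] * vy + R[2][2] * vz + z,
        ))
    return out
```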
S206, the second processor determines second grid data from the first grid data based on the target scanning range of the laser radar.
The lidar may be a mechanical lidar. The target scanning range may refer to a range, selected from the full range formed by one lidar scanning cycle, for which point cloud data is to be obtained; for example, the target scanning range may be the scanning range formed by a lidar scanning angle of 2 degrees. The second mesh data may refer to the data in the first mesh data that belongs to the target scanning range.
In one embodiment, before S206, the second processor divides the preset frame time into target scanning times according to a preset division number; determining a scanning range sequence according to the preset division number and the target scanning time; and selecting the scanning range from the scanning range sequence as a target scanning range according to the sequence.
The preset division number may refer to a preset number of divisions of the preset frame time; for example, the preset division number may be 300. The preset frame time is the time required for the lidar to scan one full revolution; for example, the preset frame time may be 100 milliseconds. The target scan time may refer to the duration of each division, i.e. the preset frame time divided by the preset division number; for example, with a preset division number of 300 and a preset frame time of 100 milliseconds, the target scan time is 100/300, i.e. about 0.333 milliseconds. The scanning range sequence is a sequence of scanning ranges whose length is determined by the preset division number, and the target scan time is the lidar scanning time corresponding to each scanning range in the sequence. For example, if the preset division number is 300 and the target scan time is 0.333 milliseconds, the corresponding scanning range sequence may be {scan range 1, scan range 2, …, scan range 299, scan range 300}, and the scan time of each scanning range is 0.333 milliseconds.
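The division of one frame into a scanning range sequence can be sketched as follows; the function name and the degree-based range representation are assumptions made for illustration:

```python
def build_scan_ranges(frame_time_ms, num_divisions):
    """Divide one lidar revolution (frame_time_ms) into num_divisions scan ranges.

    Returns (target_scan_time, ranges), where each range is a
    (start_angle_deg, end_angle_deg) pair covering 360/num_divisions degrees."""
    target_scan_time = frame_time_ms / num_divisions
    step = 360.0 / num_divisions  # horizontal angle swept per target scan time
    ranges = [(i * step, (i + 1) * step) for i in range(num_divisions)]
    return target_scan_time, ranges
```

With the example values from the text (100 ms frame time, 300 divisions), each scanning range covers 1.2 degrees and about 0.333 ms of scan time.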
Specifically, the second processor may screen the first mesh data according to a target scanning range of the laser radar, to obtain second mesh data belonging to the target scanning range in the first mesh data.
In one embodiment, the second processor establishes a spherical coordinate system with the center of the laser radar as an origin, filters all triangles of the first grid data in the whole environment according to the horizontal angle (target scanning range) of the beam, ignores triangles exceeding the horizontal angle range of the beam, and stores triangles within the horizontal angle range to obtain the second grid data.
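The horizontal-angle filtering step can be sketched as follows; testing the azimuth of each triangle's centroid about the lidar at the origin is a simplification chosen here for illustration (the text does not specify whether centroids or all vertices are tested):

```python
import math

def filter_triangles_by_azimuth(triangles, az_min_deg, az_max_deg):
    """Keep triangles whose centroid azimuth (about the lidar at the origin)
    lies inside [az_min_deg, az_max_deg); triangles outside the horizontal
    angle range of the beams are ignored."""
    kept = []
    for tri in triangles:
        cx = sum(v[0] for v in tri) / 3.0
        cy = sum(v[1] for v in tri) / 3.0
        az = math.degrees(math.atan2(cy, cx)) % 360.0
        if az_min_deg <= az < az_max_deg:
            kept.append(tri)
    return kept
```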
S208, the second processor determines, in the second grid data, a target polygon that intersects the laser beam in the laser beam data within the target scanning range, and intersection coordinates that intersect the laser beam.
The laser beam data may refer to data describing the laser beams, such as the starting point of a laser beam, its direction vector information, and the like.
Specifically, the second processor can acquire the laser beam data within the target scanning range; for example, upon receiving a lidar point cloud simulation instruction, it can use the laser beam data stored in the terminal as the laser beam data within the target scanning range.
The laser beam may refer to a beam emitted by the lidar; a mechanical lidar may emit 64 laser beams at a time and emit 6 times in succession as it rotates. The target polygon may refer to a polygon in the second mesh data that intersects the laser beam; for example, the target polygon may be a triangle, a rectangle, etc. The intersection coordinates may refer to the coordinates of the intersection point between the laser beam and the target polygon; for example, the intersection coordinates may be (10.0, 20.0, 0.3).
In one embodiment, before S208, the second processor acquires start point and direction vector information of the laser beam and normal vector information of the target polygon; and calculating the distance from the starting point to the intersection point based on the information of the starting point and the direction vector and the information of the normal vector of the target polygon.
In one embodiment, S208 includes the second processor determining that the laser-line beam in the laser-line-beam data intersects the target polygon when the distance is less than the effective detection distance of the lidar; and calculating to obtain the intersection point coordinate of the intersection point based on the initial coordinate, the distance and the direction vector information.
S210, the second processor calculates the reflected point cloud intensity based on the direction vector information of the laser beam, the normal vector information of the target polygon and the intersection point coordinates, and determines point cloud data corresponding to the target object in the target scanning range based on the point cloud intensity and the intersection point coordinates.
The reflected point cloud intensity may refer to the reflected point cloud intensity corresponding to the intersection point coordinates. The point cloud data may refer to data in which the coordinates of the intersection point and the intensity of the point cloud correspond to each other.
In one embodiment, S210 includes the second processor determining the included angle based on the direction vector information of the laser beam and the normal vector information of the target polygon; and calculating the reflected point cloud intensity based on the included angle, the intersection point coordinate and the initial point.
The included angle may be an included angle between a direction vector of the laser beam and a normal vector of the target polygon.
For example, the coordinates of the intersection point are P = O + t × D, where O is the start coordinate of the laser beam, t is the distance (a constant value) from the start point of the laser beam to the intersection point, and D is the direction vector of the corresponding laser beam. The normal vector of the target polygon is N = (A, B, C). The included angle is θ, and its cosine is cos θ = D · N. The point cloud intensity is then intensity = reflectivity × cos θ^0.5 × exp(-1.0 × t), where the reflectivity may be 1.
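The intersection and intensity formulas above can be sketched in Python; unit-length direction and normal vectors are assumed, and the function name is hypothetical:

```python
import math

def intersect_and_intensity(origin, direction, plane_point, normal, reflectivity=1.0):
    """Ray-plane intersection P = O + t*D and reflected point cloud intensity
    intensity = reflectivity * cos(theta)**0.5 * exp(-1.0 * t),
    with cos(theta) = D . N (taking the absolute value so the front and back
    faces of the polygon are treated alike, a sketch choice)."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(direction, normal)
    if abs(denom) < 1e-12:
        return None  # beam parallel to the polygon's plane
    t = dot([p - o for p, o in zip(plane_point, origin)], normal) / denom
    if t <= 0.0:
        return None  # intersection lies behind the beam's start point
    point = tuple(o + t * d for o, d in zip(origin, direction))
    cos_theta = abs(denom)
    intensity = reflectivity * cos_theta ** 0.5 * math.exp(-1.0 * t)
    return t, point, intensity
```

In a full implementation t would also be compared against the lidar's effective detection distance, and the point tested for containment in the target polygon, as described in the embodiments above.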
In one embodiment, before S210, the second processor determines a calculation block capacity according to a preset harness number and a preset target polygon number; determining the number of calculation blocks according to the number of laser beams, the number of target polygons and the capacity of the calculation blocks; and determining the number of threads in each calculation block according to the capacity of the calculation block.
The preset number of beams is used to represent the computing capacity of a calculation block for laser beams; for example, the preset number of beams may be 16. The preset number of target polygons is used to represent the computing capacity of a calculation block for target polygons; for example, the preset number of target polygons may be 256. The calculation block capacity represents the computing capacity of a calculation block for both laser beams and target polygons; for example, the capacity may be 16 laser beams and 256 target polygons. The number of laser beams refers to the number of laser beams within the target scanning range, and the number of target polygons refers to the number of target polygons within the target scanning range. A calculation block (see FIG. 3) is used to determine the target polygons that intersect the laser beams and to calculate the reflected point cloud intensity; the number of calculation blocks refers to how many such blocks are used. The number of threads refers to the number of threads contained in each calculation block (see FIG. 4); for example, one calculation block may contain 256 threads, each of which determines whether 1 triangle intersects with each of 16 laser beams.
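A sketch of deriving the calculation block count and threads per block from the preset capacities; this is a simplified host-side plan, and the names and exact grid layout are assumptions:

```python
def plan_block_launch(num_beams, num_polygons, beams_per_block=16, polys_per_block=256):
    """Compute how many calculation blocks cover all (beam, polygon) pairs and
    how many threads each block holds, one thread per polygon slot, each thread
    testing its polygon against beams_per_block beams."""
    ceil_div = lambda a, b: -(-a // b)  # integer ceiling division
    beam_groups = ceil_div(num_beams, beams_per_block)
    poly_groups = ceil_div(num_polygons, polys_per_block)
    num_blocks = beam_groups * poly_groups
    threads_per_block = polys_per_block
    return num_blocks, threads_per_block
```

For example, 384 beams (64 beams × 6 emissions) and 1000 target polygons yield 24 beam groups × 4 polygon groups = 96 calculation blocks of 256 threads each.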
In one embodiment, S210 includes the second processor performing parallel calculation on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection coordinates according to the number of calculation blocks and the number of threads to obtain the reflected point cloud intensity.
In one embodiment, S210 includes the second processor determining, each time a target scanning range is selected from the scanning range sequence, point cloud data corresponding to a target object within the selected target scanning range based on the point cloud intensity and the intersection coordinates until point cloud data corresponding to the target object within each scanning range in the scanning range sequence is obtained.
In one embodiment, after S210, the second processor combines the point cloud data corresponding to the target object in each scanning range in the scanning range sequence into the point cloud simulation data of the target object.
For example, fig. 5 is a schematic flowchart of a laser radar point cloud simulation method in another embodiment. As shown in the figure, the method first judges whether one frame is finished, i.e., whether the laser radar has scanned a full circle. If not, the positions and postures of the host vehicle and all objects are updated, and rotation and translation transformations are applied to the mesh data of the objects. The laser radar then emits 64 × 6 laser radar beams, triangles in the mesh data are filtered according to the average horizontal angle of the beams, intersection calculation is performed between the laser radar beams and the filtered triangles, and the point cloud coordinates (intersection point coordinates) and point cloud intensity are calculated. After the data is stored, the time is advanced by 1/300 of a frame. When one frame is finished, the frame count is incremented by 1; when the frame count is greater than a preset number of frames m, the flow ends.
In the laser radar point cloud simulation method, the corresponding second mesh data and laser beam data are determined according to the divided target scanning range; the triangles that intersect the laser beams in the laser beam data, together with the intersection point coordinates, are determined in the second mesh data; and the reflected point cloud intensity is calculated based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection point coordinates, thereby obtaining the point cloud data of the target object. The method screens the mesh data and laser beams in a targeted manner according to the characteristics of the mechanical laser radar beams and determines the reflected point cloud intensity using the parallel computing architecture of the GPU, which effectively improves the simulation efficiency of the laser radar point cloud and reduces its simulation cost.
In one embodiment, as shown in FIG. 6, the step of calculating the intersection coordinates includes:
S602, acquiring the starting point and direction vector information of the laser beam in the target scanning range and the normal vector information of the target polygon.
Specifically, the second processor may acquire the starting point and direction vector information of the laser beam within the target scanning range, and the normal vector information of the target polygon. The second processor may also receive a laser radar point cloud simulation instruction and use the starting point, direction vector information, and normal vector information stored in the terminal as the starting point and direction vector information of the laser beam and the normal vector information of the target polygon.
S604, calculating the distance from the starting point to the intersection point based on the information of the starting point and the direction vector and the normal vector information of the target polygon; the intersection point is the point where the target polygon intersects the laser beam.
Wherein the starting point of the laser line beam may refer to the starting point of the laser line beam emission, which may be (4.0, 5.0, 2.0), for example. The direction vector information of the laser beam may be used to represent the direction of the laser beam, for example, the direction vector information of the laser beam may be (0.612, 0.612, 0.56). The normal vector information of the target polygon may refer to vector information perpendicular to the plane of the target polygon, for example, the normal vector information of the target polygon may be (0.745, 0.579, 0.331). The distance may refer to a distance from a starting point of the laser beam to the intersection point. The intersection point is the point where the target polygon intersects the laser beam.
For example, the laser beam R1 may be denoted as R1 = O + t × D, where O is the start coordinate of the laser beam, t is the distance from the start point of the laser beam to the intersection point, and D is the direction vector information of the corresponding laser beam. The vertex coordinates of the polygon are A(A1, A2, A3), B(B1, B2, B3), and C(C1, C2, C3).
The formula for calculating the normal vector information of the polygon may be:
N(A、B、C)=(B-A)×(C-A)/(|B-A|*|C-A|)
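A sketch of this normal computation (the helper name is illustrative):

```python
def triangle_normal(A, B, C):
    # Edge vectors B - A and C - A
    u = [b - a for a, b in zip(A, B)]
    v = [c - a for a, c in zip(A, C)]
    # Cross product (B - A) x (C - A)
    cross = (u[1] * v[2] - u[2] * v[1],
             u[2] * v[0] - u[0] * v[2],
             u[0] * v[1] - u[1] * v[0])
    # Divide by |B - A| * |C - A| as in the formula above (note this
    # yields a unit normal only when the edges at A are perpendicular)
    norm_u = sum(x * x for x in u) ** 0.5
    norm_v = sum(x * x for x in v) ** 0.5
    return tuple(x / (norm_u * norm_v) for x in cross)
```

For the right triangle with vertices (0, 0, 0), (1, 0, 0), (0, 1, 0), this gives the unit z-axis normal (0, 0, 1).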
S606, when the distance is smaller than the effective detection distance of the laser radar, determining that the laser beam in the laser beam data intersects the target polygon.
Wherein, lidar's effective detection distance may refer to the distance that lidar transmission can effectively detect the environment, e.g., effective detection distance may be 200 meters. The intersection coordinates may refer to the intersection coordinates of the laser beam and the target polygon.
In one embodiment, the laser beam does not intersect the polygon if the distance is calculated to be less than zero.
For example, the second processor determines that the laser beam in the laser beam data intersects the target polygon if the distance is less than 200 meters.
S608, calculating the intersection point coordinates of the intersection point based on the start coordinate, the distance, and the direction vector information.
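Steps S604 through S608 can be sketched as a ray-plane test. Writing the plane of the target polygon through vertex A with normal N, the distance form t = ((A - O) · N) / (D · N) is an assumption consistent with S604; the further check that the hit point lies inside the triangle is omitted here:

```python
def beam_polygon_hit(O, D, A, N, max_range=200.0):
    # t solves (O + t*D - A) . N = 0, i.e. t = ((A - O) . N) / (D . N)
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    denom = dot(D, N)
    if abs(denom) < 1e-12:        # beam parallel to the polygon's plane
        return None
    t = dot([a - o for a, o in zip(A, O)], N) / denom
    # No valid intersection when t < 0 (behind the start point) or when t
    # reaches the effective detection distance (S606)
    if t < 0 or t >= max_range:
        return None
    # S608: intersection coordinates P = O + t * D
    return t, tuple(o + t * d for o, d in zip(O, D))
```

A beam from the origin along +z against a plane at z = 5 with normal +z yields t = 5 and P = (0, 0, 5); reversing the beam direction yields t < 0 and no intersection.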
In this embodiment, the distance from the starting point to the intersection point is calculated based on the information of the starting point and the direction vector and the information of the normal vector of the target polygon, when the distance is smaller than the effective detection distance of the laser radar, it is determined that the laser beam in the laser beam data intersects with the target polygon, and the intersection point coordinate of the intersection point is calculated based on the starting coordinate, the distance and the information of the direction vector. The simulation efficiency of the laser radar point cloud can be improved, and the simulation cost of the laser radar point cloud is reduced.
It should be understood that, although the steps in the flowcharts of the above embodiments are shown in sequence as indicated by the arrows, they are not necessarily executed in that sequence. Unless explicitly stated otherwise, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least a part of the steps in these flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times; the order of performing these sub-steps or stages is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least a part of the sub-steps or stages of other steps.
Based on the same inventive concept, the embodiment of the application also provides a laser radar point cloud simulation device for realizing the laser radar point cloud simulation method. The implementation scheme for solving the problem provided by the device is similar to the implementation scheme recorded in the method, so the specific limitations in one or more laser radar point cloud simulation device embodiments provided below can be referred to the limitations of the laser radar point cloud simulation method in the above, and details are not repeated here.
In one embodiment, as shown in fig. 7, there is provided a laser radar point cloud simulation apparatus, including: a first obtaining module 702, a first determination module 704, a second determination module 706, a third determination module 708, and a calculation and determination module 710, wherein:
a first obtaining module 702, configured to obtain, by a first processor, a position and a posture of a target object;
a first determination module 704 for the second processor determining first mesh data of the target object based on the position and the pose;
a second determination module 706 for the second processor to determine second grid data from the first grid data based on the target scanning range of the lidar;
a third determination module 708 for the second processor to determine, in the second grid data, a target polygon that intersects the laser line beam in the laser line beam data within the target scanning range and intersection coordinates of the intersection with the laser line beam;
and a calculation and determination module 710 for calculating the reflected point cloud intensity by the second processor based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection coordinates, and determining the point cloud data corresponding to the target object within the target scanning range by the second processor based on the point cloud intensity and the intersection coordinates.
In one embodiment, the first obtaining module 702 is further configured to obtain a motion trajectory of the target object; determining a first position and a first gesture corresponding to a first moment, and a second position and a second gesture corresponding to a second moment in the motion trail; the first moment and the second moment are moments when the target object is adjacent to the preset moment in the moving process; determining a time parameter based on a preset time, a first time and a second time; determining a position of the target object based on the time parameter, the first position and the second position; the pose of the target object is determined based on the time of day parameter, the first pose, and the second pose.
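The interpolation described above can be sketched as follows; componentwise linear interpolation of the posture (rather than, e.g., quaternion slerp) and the function signature are assumptions:

```python
def interpolate_pose(t, t1, position1, attitude1, t2, position2, attitude2):
    # Time parameter in [0, 1] derived from the preset time t and the
    # two adjacent trajectory times t1 and t2
    alpha = (t - t1) / (t2 - t1)
    # Componentwise linear blend between the two samples
    lerp = lambda a, b: tuple(x + alpha * (y - x) for x, y in zip(a, b))
    return lerp(position1, position2), lerp(attitude1, attitude2)
```

Halfway between the two sampled times, the result is the midpoint of both the positions and the postures.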
In one embodiment, the first determination module 704 is further configured to obtain historical mesh data, and to determine the first mesh data of the target object based on the position, the posture, and the historical mesh data.
In one embodiment, the calculation and determination module 710 is further configured to determine point cloud data corresponding to a target object in the selected target scanning range based on the point cloud intensity and the intersection coordinates each time a target scanning range is selected from the scanning range sequence until point cloud data corresponding to the target object in each scanning range in the scanning range sequence is obtained; and combining the point cloud data corresponding to the target object in each scanning range in the scanning range sequence into point cloud simulation data of the target object.
In one embodiment, the third determining module 708 is further configured to obtain the starting point and direction vector information of the laser beam and the normal vector information of the target polygon; calculating the distance from the starting point to the intersection point based on the information of the starting point and the direction vector and the normal vector information of the target polygon; the intersection point is the point where the target polygon intersects with the laser beam; when the distance is smaller than the effective detection distance of the laser radar, determining that the laser beam in the laser beam data intersects with the target polygon; and calculating the intersection point coordinate of the intersection point based on the initial coordinate, the distance and the direction vector information.
In one embodiment, the calculation and determination module 710 is further configured to determine an included angle based on the direction vector information of the laser beam and the normal vector information of the target polygon; and calculating the reflected point cloud intensity based on the included angle, the intersection point coordinate and the initial point.
In one embodiment, the calculation and determination module 710 is further configured to determine the calculation block capacity according to a preset number of beams and a preset number of target polygons; determine the number of calculation blocks according to the number of laser beams, the number of target polygons, and the calculation block capacity; determine the number of threads in each calculation block according to the calculation block capacity; and perform parallel calculation on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection point coordinates according to the number of calculation blocks and the number of threads to obtain the reflected point cloud intensity.
In one embodiment, as shown in fig. 8, the apparatus further comprises a selection module 712, wherein:
a selecting module 712, configured to divide a preset frame time according to a preset division number to obtain a target scanning time; presetting frame time as the time required by scanning the laser radar for one week; determining a scanning range sequence according to the preset dividing quantity and the target scanning time; and selecting the scanning range from the scanning range sequence as a target scanning range according to the sequence.
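A sketch of the division into a scanning range sequence, assuming each range covers an equal angular sector of one revolution with an equal share of the frame time (the function name and tuple layout are illustrative):

```python
def scan_range_sequence(frame_time, num_divisions):
    # Target scanning time: an equal share of one full revolution
    target_scan_time = frame_time / num_divisions
    # Equal angular sectors over 360 degrees (an assumption)
    sector = 360.0 / num_divisions
    return [(i * sector, (i + 1) * sector, target_scan_time)
            for i in range(num_divisions)]
```

The ranges are then consumed in order as successive target scanning ranges; e.g., dividing a 100 ms revolution into 4 gives four 90° sectors of 25 ms each.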
In the above embodiment, the corresponding second mesh data and laser beam data are determined according to the divided target scanning range; the triangles that intersect the laser beams in the laser beam data, together with the intersection point coordinates, are determined in the second mesh data; and the reflected point cloud intensity is calculated based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection point coordinates, thereby obtaining the point cloud data of the target object. The apparatus screens the mesh data and laser beams in a targeted manner according to the characteristics of the mechanical laser radar beams and determines the reflected point cloud intensity using the parallel computing architecture of the GPU, which effectively improves the simulation efficiency of the laser radar point cloud and reduces its simulation cost.
All modules in the laser radar point cloud simulation device can be completely or partially realized through software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent of a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, there is provided still another lidar point cloud simulation apparatus, the apparatus including a memory and a processor, the memory storing a computer program, and the processor including a first processor and a second processor, wherein the processor when executing the computer program implements a lidar point cloud simulation method, the method including:
the first processor acquires the position and the posture of a target object;
the second processor determines first mesh data of the target object based on the position and the posture;
the second processor determines second grid data from the first grid data based on a target scanning range of the lidar;
the second processor determines a target polygon intersecting the laser line beam in the laser line beam data within the target scanning range and intersection coordinates of the intersection with the laser line beam in the second grid data;
the second processor calculates reflected point cloud intensity based on direction vector information of the laser beam, normal vector information of the target polygon, and intersection point coordinates, and determines point cloud data corresponding to the target object within the target scanning range based on the point cloud intensity and the intersection point coordinates.
In an embodiment, a computer device is provided, where the computer device may be a terminal or a server, and this embodiment is described by taking the computer device as a terminal as an example, and an internal structure diagram of the computer device may be as shown in fig. 9. The computer apparatus includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input device. The processor, the memory and the input/output interface are connected by a system bus, and the communication interface, the display unit and the input device are connected by the input/output interface to the system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The input/output interface of the computer device is used for exchanging information between the processor and an external device. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless communication can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement a lidar point cloud simulation method. 
The display unit of the computer equipment is used to form a visual picture and may be a display screen, a projection device, or a virtual reality imaging device. The display screen may be a liquid crystal display screen or an electronic ink display screen. The input device of the computer equipment may be a touch layer covering the display screen, a key, trackball, or touch pad arranged on the housing of the computer equipment, or an external keyboard, touch pad, mouse, or the like.
It will be appreciated by those skilled in the art that the configuration shown in fig. 9 is a block diagram of only a portion of the configuration associated with the present application, and is not intended to limit the computing device to which the present application may be applied, and that a particular computing device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory in which a computer program is stored and a processor, which when executing the computer program implements the embodiments described above.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which when executed by a processor, implements the embodiments described above.
In an embodiment, a computer program product is provided comprising a computer program which, when executed by a processor, implements the embodiments described above.
It should be noted that the user information (including but not limited to user device information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, displayed data, etc.) referred to in the present application are information and data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with the relevant laws and regulations and standards of the relevant countries and regions.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above may be implemented by hardware that is instructed by a computer program, and the computer program may be stored in a non-volatile computer-readable storage medium, and when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, databases, or other media used in the embodiments provided herein can include at least one of non-volatile and volatile memory. The nonvolatile Memory may include a Read-Only Memory (ROM), a magnetic tape, a floppy disk, a flash Memory, an optical Memory, a high-density embedded nonvolatile Memory, a resistive Random Access Memory (ReRAM), a Magnetic Random Access Memory (MRAM), a Ferroelectric Random Access Memory (FRAM), a Phase Change Memory (PCM), a graphene Memory, and the like. Volatile Memory can include Random Access Memory (RAM), external cache Memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM), for example. The databases referred to in various embodiments provided herein may include at least one of relational and non-relational databases. The non-relational database may include, but is not limited to, a block chain based distributed database, and the like. The processors referred to in the various embodiments provided herein may be, without limitation, general purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, quantum computing-based data processing logic devices, or the like.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above examples only express several embodiments of the present application, and the description thereof is more specific and detailed, but not to be construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.
Claims (10)
1. A laser radar point cloud simulation method is characterized by comprising the following steps:
the first processor acquires the position and the posture of a target object;
a second processor determines first mesh data of the target object based on the position and the pose;
the second processor determines second grid data from the first grid data based on a target scan range of the lidar;
the second processor determines, in the second grid data, a target polygon that intersects a laser beam in the laser beam data within the target scanning range, and intersection coordinates that intersect the laser beam;
the second processor calculates a reflected point cloud intensity based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection point coordinates, an
The second processor determines point cloud data corresponding to a target object within the target scanning range based on the point cloud intensity and the intersection point coordinates.
2. The method of claim 1, wherein prior to obtaining the position and pose of the target object, the method further comprises:
acquiring a motion track of a target object;
the acquiring the position and the posture of the target object comprises:
determining a first position and a first gesture corresponding to a first moment in the motion trail, and a second position and a second gesture corresponding to a second moment; the first time and the second time are adjacent to a preset time in the moving process of the target object;
determining a time parameter based on the preset time, the first time and the second time;
determining a position of a target object based on the time of day parameter, the first position, and the second position;
determining a pose of the target object based on the time of day parameter, the first pose, and the second pose.
3. The method of claim 1, wherein the determining first mesh data for the target object based on the position and the pose comprises:
acquiring historical grid data;
determining first mesh data of the target object based on the position, the pose, and the historical mesh data.
4. The method of claim 1, wherein prior to determining second grid data from the first grid data based on the lidar-based target scan range, the method further comprises:
dividing preset frame time according to the preset dividing number to obtain target scanning time; the preset frame time is the time required by scanning the laser radar for one week;
determining a scanning range sequence according to the preset division number and the target scanning time;
and selecting the scanning range from the scanning range sequence as a target scanning range according to the sequence.
5. The method of claim 4, wherein the determining point cloud data corresponding to a target object within the target scan range based on the point cloud intensity and the intersection coordinates comprises:
when a target scanning range is selected from the scanning range sequence, point cloud data corresponding to a target object in the selected target scanning range are determined based on the point cloud intensity and the intersection point coordinates until point cloud data corresponding to the target object in each scanning range in the scanning range sequence are obtained;
the method further comprises the following steps: and combining the point cloud data corresponding to the target object in each scanning range in the scanning range sequence into point cloud simulation data of the target object.
6. The method of claim 1, wherein prior to determining a target polygon in the second grid data that intersects a laser line beam in the laser line beam data within the target scan range, and intersection coordinates of the intersection with the laser line beam, the method further comprises:
acquiring the information of the starting point and the direction vector of the laser beam in the target scanning range and the information of the normal vector of the target polygon;
calculating the distance from the starting point to the intersection point based on the information of the starting point and the direction vector and the information of the normal vector of the target polygon; the intersection point is a point where the target polygon intersects with the laser beam;
the determining, in the second grid data, a target polygon that intersects a laser line beam in the laser line beam data, and intersection coordinates that intersect the laser line beam, includes:
when the distance is smaller than the effective detection distance of the laser radar, determining that the laser beams in the laser beam data are intersected with a target polygon;
and calculating to obtain the intersection point coordinate of the intersection point based on the initial coordinate, the distance and the direction vector information.
7. The method of claim 6, wherein the calculating a reflected point cloud intensity based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection coordinates comprises:
determining an included angle based on the direction vector information of the laser beam and the normal vector information of the target polygon;
and calculating the reflected point cloud intensity based on the included angle, the intersection point coordinate and the initial point.
8. The method of claim 7, further comprising:
determining the capacity of a calculation block according to the number of preset wire harnesses and the number of preset target polygons;
determining the number of calculation blocks according to the number of laser beams, the number of target polygons and the capacity of the calculation blocks;
determining the number of threads in each calculation block according to the capacity of the calculation block;
the calculating reflected point cloud intensity based on the direction vector information of the laser beam, the normal vector information of the target polygon, and the intersection point coordinates includes:
and according to the number of the calculation blocks and the number of the threads, performing parallel calculation on the direction vector information of the laser beam, the normal vector information of the target polygon and the intersection point coordinates to obtain the reflected point cloud intensity.
9. A lidar point cloud simulation apparatus, the apparatus comprising a memory and a processor, the memory storing a computer program, the processor comprising a first processor and a second processor, wherein the processor, when executing the computer program, implements a lidar point cloud simulation method, the method comprising:
the first processor acquires the position and the posture of a target object;
a second processor determines first mesh data of the target object based on the position and the pose;
the second processor determines second grid data from the first grid data based on a target scan range of the lidar; the second processor determines, in the second grid data, a target polygon that intersects a laser beam in the laser beam data within the target scanning range, and intersection coordinates that intersect the laser beam;
the second processor calculates reflected point cloud intensity based on direction vector information of the laser beam, normal vector information of the target polygon, and the intersection point coordinates, and the second processor determines point cloud data corresponding to a target object within the target scanning range based on the point cloud intensity and the intersection point coordinates.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210371428.1A CN114781140A (en) | 2022-04-11 | 2022-04-11 | Laser radar point cloud simulation method and device and computer equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210371428.1A CN114781140A (en) | 2022-04-11 | 2022-04-11 | Laser radar point cloud simulation method and device and computer equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114781140A true CN114781140A (en) | 2022-07-22 |
Family
ID=82430068
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210371428.1A Pending CN114781140A (en) | 2022-04-11 | 2022-04-11 | Laser radar point cloud simulation method and device and computer equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114781140A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115393496A (en) * | 2022-10-25 | 2022-11-25 | 之江实验室 | Method and device for rapidly drawing multi-laser-radar simulation point cloud |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230053462A1 (en) | Image rendering method and apparatus, device, medium, and computer program product | |
CN114419240B (en) | Illumination rendering method and device, computer equipment and storage medium | |
CN110088808B (en) | System and method for program-generated object distribution in region of three-dimensional virtual environment | |
CN112169324A (en) | Rendering method, device and equipment of game scene | |
CN114419099B (en) | Method for capturing motion trail of virtual object to be rendered | |
CN116051713B (en) | Rendering method, electronic device, and computer-readable storage medium | |
US20140267229A1 (en) | System And Method For Classification Of Three-Dimensional Models In A Virtual Environment | |
CN114596423A (en) | Model rendering method and device based on virtual scene gridding and computer equipment | |
CN114781140A (en) | Laser radar point cloud simulation method and device and computer equipment | |
CN114820980A (en) | Three-dimensional reconstruction method and device, electronic equipment and readable storage medium | |
WO2024148898A1 (en) | Image denoising method and apparatus, and computer device and storage medium | |
Masood et al. | High‐performance virtual globe GPU terrain rendering using game engine | |
CN115631320B (en) | Pre-calculation cell display method, pre-calculation cell generation method and device | |
WO2023231793A9 (en) | Method for virtualizing physical scene, and electronic device, computer-readable storage medium and computer program product | |
CN115984440B (en) | Object rendering method, device, computer equipment and storage medium | |
CN117557703A (en) | Rendering optimization method, electronic device and computer readable storage medium | |
CN116030221A (en) | Processing method and device of augmented reality picture, electronic equipment and storage medium | |
CN117392358B (en) | Collision detection method, collision detection device, computer device and storage medium | |
CN117830587B (en) | Map annotation drawing method and device, computer equipment and storage medium | |
CN117576645B (en) | Parking space detection method and device based on BEV visual angle and computer equipment | |
CN117115382B (en) | Map road drawing method, device, computer equipment and storage medium | |
CN116824082B (en) | Virtual terrain rendering method, device, equipment, storage medium and program product | |
US10553025B2 (en) | Method and device for efficient building footprint determination | |
CN116778049A (en) | Image rendering method, device, computer equipment and storage medium | |
Chen et al. | Large-Scale 3D Terrain Reconstruction Using 3D Gaussian Splatting for Visualization and Simulation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||