CN115081195A - Laser radar simulation method and device, electronic equipment and storage medium


Info

Publication number: CN115081195A
Application number: CN202210631656.8A
Authority: CN (China)
Legal status: Pending
Prior art keywords: ray, laser radar, simulation, laser, scene
Other languages: Chinese (zh)
Inventors: 陈泓宇, 张煜东, 范圣印, 王颖, 郑林飞
Current assignee: Beijing Yihang Yuanzhi Technology Co Ltd
Original assignee: Beijing Yihang Yuanzhi Technology Co Ltd

Classifications

    • G06F 30/20 - Design optimisation, verification or simulation (G PHYSICS / G06 COMPUTING; CALCULATING OR COUNTING / G06F ELECTRIC DIGITAL DATA PROCESSING / G06F 30/00 Computer-aided design [CAD])
    • G06T 17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects (G PHYSICS / G06 COMPUTING; CALCULATING OR COUNTING / G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL)
    • G06F 2119/10 - Noise analysis or noise optimisation (G PHYSICS / G06 COMPUTING; CALCULATING OR COUNTING / G06F ELECTRIC DIGITAL DATA PROCESSING / G06F 2119/00 Details relating to the type or aim of the analysis or the optimisation)

Abstract

The application provides a laser radar simulation method, a laser radar simulation device, electronic equipment and a storage medium, wherein the method comprises the following steps: constructing a three-dimensional simulation scene, and reading parameters for laser radar simulation, wherein the parameters for laser radar simulation comprise the incidence angle of each laser radar ray; carrying out motion distortion processing on the incidence angle of each laser radar ray to obtain the distortion-processed incidence angle; and then carrying out laser radar simulation based on the three-dimensional simulation scene, the distortion-processed incidence angle and the other parameters except the incidence angle of the laser radar rays. In this way, the laser radar simulation is closer to the real situation, so that more accurate simulation of the laser radar can be realized on a simulation platform.

Description

Laser radar simulation method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of laser radar simulation technologies, and in particular, to a method and an apparatus for laser radar simulation, an electronic device, and a storage medium.
Background
The laser radar simulation technology is widely applied to the fields of automatic driving, unmanned aerial vehicles, robots and the like. On the simulation platform, the laser radar simulation technology can provide simulation data of a laser radar sensor, and the simulation data can be used as input of mobile robot perception, path planning, control, positioning navigation and the like.
In a real scene, the working performance of the laser radar is easily affected by various factors, so that how to perform more accurate simulation on the laser radar on a simulation platform becomes a key problem.
Disclosure of Invention
The present application aims to provide a method, an apparatus, an electronic device and a storage medium for laser radar simulation, which address at least one of the technical problems described above.
The above object of the present invention is achieved by the following technical solutions:
in a first aspect, a method for laser radar simulation is provided, including:
constructing a three-dimensional simulation scene;
reading parameters for lidar simulation, the parameters for lidar simulation including: the angle of incidence of each lidar ray;
carrying out motion distortion processing on the incident angle of each laser radar ray to obtain the distorted incident angle;
and carrying out laser radar simulation based on the three-dimensional simulation scene, the incidence angle after the distortion processing and other parameters except the incidence angle of the laser radar ray.
In a possible implementation manner, the building of the three-dimensional simulation scene includes any one of:
acquiring sensor data, and constructing the three-dimensional simulation scene based on the sensor data;
constructing the three-dimensional simulation scene through a physical engine;
wherein the three-dimensional simulation scene is described by a triangular mesh or a square grid.
In another possible implementation manner, the parameters for lidar simulation include: field angle and angular resolution;
wherein, after the parameters for lidar simulation are read, the method further comprises:
and establishing a three-dimensional incident angle table based on the angle of view and the angular resolution.
In another possible implementation manner, before the incident angle of the lidar ray is read, the method further includes:
acquiring a multi-frame measurement result;
removing noise points from the multi-frame measurement results;
calculating an incidence angle for an effective point corresponding to each ray in each frame;
and determining the incidence angle corresponding to the ith ray based on the incidence angle corresponding to the ith ray in each frame, wherein i belongs to [1, n ], and n is the number of rays contained in each frame.
In another possible implementation manner, the performing motion distortion processing on the incident angle of each lidar ray includes:
acquiring current pose information;
determining a relative transformation matrix generated by motion distortion based on the current pose information;
and performing motion distortion processing on the incidence angle of each laser radar ray based on the relative transformation matrix.
In another possible implementation manner, the performing motion distortion processing on the incident angle of each lidar ray based on the relative transformation matrix to obtain a distorted incident angle includes:
determining a unit direction vector of the incidence angle of each laser radar ray in a specific coordinate system to obtain a unit direction vector corresponding to each ray;
determining a unit direction vector after each ray is distorted based on the unit direction vector corresponding to each ray and the relative transformation matrix;
and determining the distorted incidence angle of each ray based on the unit direction vector of each ray after motion distortion.
In another possible implementation manner, performing lidar simulation based on the three-dimensional simulation scene, the incident angle after any distortion processing, and other parameters except the incident angle of the lidar ray includes:
constructing a sphere three-dimensional simulation sub-scene based on the three-dimensional simulation scene;
performing collision detection based on the sphere three-dimensional simulation sub-scene, the incidence angle after any distortion processing and other parameters except the incidence angle of the laser radar ray;
and carrying out laser radar simulation based on the collision detection result and the beam model.
In another possible implementation, the other parameters than the incident angle of the lidar ray include: an effective detection range of the lidar, the vertical angular resolution, the horizontal angular resolution, and the diameter resolution;
wherein the performing collision detection based on the sphere three-dimensional simulation sub-scene, the incidence angle after any distortion processing, and other parameters except the incidence angle of the lidar ray comprises:
performing spherical element rasterization on the sphere three-dimensional simulation sub-scene according to the vertical angular resolution, the horizontal angular resolution and the diameter resolution to obtain a spherical element rasterized three-dimensional simulation sub-scene;
and performing collision detection based on the three-dimensional simulation sub-scene rasterized by the spherical elements and the incidence angle after any distortion processing.
In another possible implementation manner, the spherical element rasterized three-dimensional simulation sub-scene contains a plurality of spherical element grid voxels corresponding to each ray; the method further comprises the following steps:
determining, based on the distortion-processed incidence angle of each ray, the propagation distance of each ray, the effective detection range of the lidar, the vertical angular resolution and the horizontal angular resolution, the index values corresponding to the plurality of spherical element grid voxels corresponding to each ray;
and creating a three-dimensional data group based on the index values corresponding to the plurality of spherical element grid voxels corresponding to each ray.
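By way of illustration only, the following Python sketch shows one way the index values described above could be computed, mapping a distortion-processed incidence angle and a propagation distance to a spherical element grid voxel index (m, n, k); the function name, the rounding scheme and the use of the minimum field angles and minimum range as offsets are assumptions and are not taken from the application.

```python
import math

def spherical_voxel_index(theta, phi, distance,
                          theta_min, phi_min, r_min,
                          sigma_theta, sigma_phi, sigma_r):
    """Map a ray (theta, phi) with propagation distance `distance` to a spherical
    element grid voxel index (m, n, k).  Angles are in radians, distances in metres;
    sigma_r is the range (radial) bin size derived from the effective detection range."""
    m = int(round((phi - phi_min) / sigma_phi))        # horizontal-angle bin
    n = int(round((theta - theta_min) / sigma_theta))  # vertical-angle bin
    k = int((distance - r_min) // sigma_r)             # range bin inside the detection range
    return m, n, k

# Hypothetical lidar: 0.2 deg angular resolution, 0.5 m range bins, detection range starting at 0.5 m
print(spherical_voxel_index(math.radians(91.0), math.radians(10.3), 12.7,
                            math.radians(75.0), 0.0, 0.5,
                            math.radians(0.2), math.radians(0.2), 0.5))
```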
In another possible implementation manner, the spherical element rasterized three-dimensional simulation sub-scene contains a plurality of spherical element grid voxels;
wherein the performing collision detection based on the spherical element rasterized three-dimensional simulation sub-scene and the incidence angle after any distortion processing comprises:
acquiring semantic information in the three-dimensional simulation sub-scene with the rasterized spherical elements;
carrying out bounding box division on the obstacles in each spherical element grid voxel based on the semantic information in the spherical element rasterized three-dimensional simulation sub-scene;
determining the coordinate of any ray under a laser radar spherical coordinate system based on the incidence angle after any distortion processing, the pose of the laser radar in a simulation scene and the spherical three-dimensional simulation sub-scene;
determining the distance from any ray to the surface of an obstacle based on the coordinate of the any ray in a laser radar spherical coordinate system and the plurality of spherical element grid voxels;
determining the size of a light spot generated by any ray to the surface of the obstacle based on the distance from the any ray to the surface of the obstacle and the light beam model;
determining an incidence angle after ray distortion processing meeting a first preset condition, wherein the incidence angle after the ray distortion processing meeting the first preset condition is a distorted incidence angle corresponding to a ray with the light spot size as a radius and the direction vector of any ray as a center;
determining the coordinate of each ray meeting the first preset condition under a laser radar spherical coordinate system based on the incidence angle after distortion processing of each ray meeting the first preset condition, the pose of the laser radar in a simulation scene and the spherical three-dimensional simulation sub-scene;
and determining the distance from each ray meeting the first preset condition to the surface of the obstacle based on the coordinate of each ray meeting the first preset condition in the laser radar spherical coordinate system and the plurality of spherical element grid voxels.
In another possible implementation manner, the determining the incident angle after the ray distortion processing that satisfies the first preset condition further includes:
determining the number of sampling points;
dividing a sampling range which is formed by taking any ray direction vector as a center and taking the spot radius as a radius on the basis of the number of the sampling points;
sampling the laser radar ray set based on each division range to obtain a sampled laser radar ray set;
determining the coordinate of each ray meeting the first preset condition under a laser radar spherical coordinate system based on each incidence angle meeting the first preset condition after distortion processing, the pose of the laser radar in a simulation scene and the spherical three-dimensional simulation sub-scene, wherein the determining comprises the following steps:
and determining the coordinate of each ray meeting the first preset condition under the spherical coordinate system of the laser radar based on the incidence angle of the sampled laser radar ray set after the distortion processing, the pose of the laser radar in the simulation scene and the spherical three-dimensional simulation sub-scene.
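As a sketch of the sampling step just described (all names and the ring-shaped sampling pattern are illustrative assumptions, not the application's own scheme), the following Python code divides the spot-sized range around a ray's direction vector into a given number of sampling points and returns the sampled ray directions:

```python
import numpy as np

def sample_rays_in_spot(center_dir, spot_radius, distance, num_samples):
    """Return unit direction vectors sampled inside the light spot.

    center_dir  : unit direction vector of the original ray
    spot_radius : spot radius on the obstacle surface
    distance    : distance from the lidar to the obstacle surface
    num_samples : number of sampling points dividing the spot range
    """
    center_dir = np.asarray(center_dir, dtype=float)
    center_dir = center_dir / np.linalg.norm(center_dir)
    # Two vectors orthogonal to the centre direction span the spot plane.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(helper, center_dir)) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(center_dir, helper)
    u /= np.linalg.norm(u)
    v = np.cross(center_dir, u)
    half_angle = np.arctan2(spot_radius, distance)   # angular half-width of the spot cone
    dirs = [center_dir]
    for j in range(num_samples):
        ang = 2.0 * np.pi * j / num_samples          # evenly divide the spot range
        offset = np.tan(half_angle) * (np.cos(ang) * u + np.sin(ang) * v)
        d = center_dir + offset
        dirs.append(d / np.linalg.norm(d))
    return np.array(dirs)

rays = sample_rays_in_spot([0.0, 1.0, 0.0], spot_radius=0.05, distance=10.0, num_samples=8)
print(rays.shape)  # (9, 3): the centre ray plus 8 sampled rays
```

The distorted incidence angles of the sampled rays are then the ones used for the subsequent collision detection.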
In another possible implementation manner, the coordinate of any ray in the laser radar spherical coordinate system is (m, n);
the determining the distance from any ray to the surface of the obstacle based on the coordinate of any ray in the spherical laser radar coordinate system and the plurality of spherical element grid voxels comprises:
traversing, from k = k_min to k = k_max, the spherical element grid voxel with index (m, n, k) in the three-dimensional data group, and carrying out bounding box detection on each traversed voxel;
and if an intersection exists, stopping the traversal and calculating the distance from the ray to the surface of the obstacle based on ray detection.
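The traversal described above can be pictured with the following Python sketch, which scans the range bins k_min to k_max of the voxel with angular index (m, n), tests the bounding boxes stored in each visited voxel and stops at the first intersection; the axis-aligned bounding-box class, the dictionary-based voxel storage and the use of the box entry distance as the returned value are assumptions made only for this illustration (a full implementation would follow the bounding-box hit with an exact ray test against the obstacle surface).

```python
import numpy as np

class Aabb:
    """Axis-aligned bounding box standing in for a per-voxel obstacle bounding box."""
    def __init__(self, lo, hi):
        self.lo, self.hi = np.asarray(lo, float), np.asarray(hi, float)

    def ray_hit_distance(self, origin, direction):
        """Slab test: return the ray's entry distance, or None if the ray misses the box."""
        origin, direction = np.asarray(origin, float), np.asarray(direction, float)
        inv = 1.0 / np.where(direction == 0.0, 1e-12, direction)
        t1, t2 = (self.lo - origin) * inv, (self.hi - origin) * inv
        t_near, t_far = np.max(np.minimum(t1, t2)), np.min(np.maximum(t1, t2))
        return t_near if t_far >= max(t_near, 0.0) else None

def distance_to_obstacle(m, n, k_min, k_max, voxel_boxes, origin, direction):
    """Traverse the voxels with index (m, n, k) from k = k_min to k = k_max and return
    the hit distance of the first bounding-box intersection, or None if there is none."""
    for k in range(k_min, k_max + 1):
        for box in voxel_boxes.get((m, n, k), []):
            d = box.ray_hit_distance(origin, direction)
            if d is not None:
                return d          # intersection found: stop traversing
    return None                   # no obstacle within the effective detection range

# Hypothetical scene: one bounding box stored in voxel (m, n, k) = (3, 5, 2)
boxes = {(3, 5, 2): [Aabb([4.0, -1.0, -1.0], [6.0, 1.0, 1.0])]}
print(distance_to_obstacle(3, 5, 0, 10, boxes, [0.0, 0.0, 0.0], [1.0, 0.0, 0.0]))  # 4.0
```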
In another possible implementation, the collision detection result of any ray includes: the distance of any ray to the surface of the obstacle; the beam model comprises a beam waist value of the light beam and the wavelength of the light beam;
the performing laser radar simulation based on the collision detection result and the beam model includes:
determining a spot radius based on a distance of the any ray to the surface of the obstacle, a beam waist value of the light beam and a wavelength of the light beam;
determining a laser radar ray set which takes any ray direction vector as a center and takes the spot radius as a radius;
and simulating the divergence phenomenon of the rays in the propagation process based on the laser radar ray set.
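The dependence of the spot radius on the travelled distance, the beam waist and the wavelength suggests a Gaussian beam model; the sketch below uses the standard Gaussian-beam divergence formula, which is an assumption rather than a formula stated in the application:

```python
import math

def spot_radius(distance, beam_waist, wavelength):
    """Gaussian-beam spot radius after propagating `distance`
    (beam_waist and wavelength in metres; the Gaussian model is assumed)."""
    rayleigh_range = math.pi * beam_waist ** 2 / wavelength
    return beam_waist * math.sqrt(1.0 + (distance / rayleigh_range) ** 2)

# Hypothetical 905 nm lidar with a 5 mm beam waist hitting a surface 30 m away
print(spot_radius(30.0, 5e-3, 905e-9))  # spot radius in metres, roughly 5.3 mm
```

The laser radar ray set is then taken as the rays whose directions lie within this radius around the original ray's direction vector, which is what simulates the divergence of the beam during propagation.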
In another possible implementation, the lidar simulation result includes: a distance value of each ray to the obstacle; the method further comprises the following steps:
processing the distance value from each ray to the obstacle by combining a noise model to obtain a distance value corresponding to each ray after noise is added;
processing the distance value corresponding to each ray added with the noise by combining a weather model to obtain the laser intensity corresponding to each ray under each weather type;
simulating by an echo mode and a Dropoff mechanism based on the corresponding laser intensity of each ray in the target weather type;
and determining, based on the distortion-processed incidence angle of each laser radar ray and the simulation result, the three-dimensional point cloud data of each laser radar ray and the normalized intensity value through three-dimensional point calculation processing and normalization processing.
In another possible implementation manner, the lidar simulation result includes: a distance value of each ray to the obstacle;
wherein, before the laser radar simulation result is processed in combination with the noise model, the method further includes:
acquiring the variance of the measured distance, and determining the expectation and the variance of a noise model based on the variance of the measured distance;
the method comprises the following steps of combining a noise model to process a laser radar simulation result, wherein the method comprises the following steps:
and adding noise to the distance value of each ray from the obstacle based on the expectation and the variance of the noise model to obtain a distance value after the noise is added.
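A minimal sketch of this noise step, assuming zero-mean Gaussian noise whose variance is the calibrated measured-distance variance (the helper name and the Gaussian assumption are illustrative):

```python
import numpy as np

def add_range_noise(distances, noise_mean, noise_variance, rng=None):
    """Add Gaussian noise with the given expectation and variance to each ray's distance."""
    rng = np.random.default_rng() if rng is None else rng
    distances = np.asarray(distances, dtype=float)
    noise = rng.normal(loc=noise_mean, scale=np.sqrt(noise_variance), size=distances.shape)
    return distances + noise

print(add_range_noise([10.0, 25.3, 4.7], noise_mean=0.0, noise_variance=0.02 ** 2))
```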
In another possible implementation manner, the weather model includes a corresponding relationship between a weather type, a measurement range, and an attenuation degree;
combining with a weather model, processing the distance value corresponding to each ray added with noise to obtain the laser intensity corresponding to each ray under the target weather type, including:
determining whether the distance value corresponding to each ray added with the noise belongs to the measurement range corresponding to the target weather type;
if the ray belonging to the measurement range corresponding to the target weather type exists, acquiring the attenuation rate corresponding to the target weather type;
and determining the laser intensity corresponding to the ray meeting the measurement range under the target weather type based on the distance value corresponding to the ray meeting the measurement range and the attenuation rate corresponding to the target weather type.
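One way to read the weather model described above is as a lookup from weather type to a measurable range and an attenuation rate; the sketch below applies an exponential attenuation to rays whose noisy distance falls inside that range. The table values and the exponential form are illustrative assumptions, not parameters given in the application.

```python
import math

# Hypothetical weather model: weather type -> (measurable range in metres, attenuation rate per metre)
WEATHER_MODEL = {
    "clear": (200.0, 0.001),
    "rain":  (120.0, 0.010),
    "fog":   (60.0,  0.050),
}

def intensity_under_weather(distance, base_intensity, weather_type):
    """Return the laser intensity of one ray under `weather_type`,
    or None if the ray's distance lies outside the measurable range."""
    max_range, attenuation = WEATHER_MODEL[weather_type]
    if distance > max_range:
        return None                                   # not measurable in this weather
    return base_intensity * math.exp(-attenuation * distance)

print(intensity_under_weather(40.0, 1.0, "fog"))      # attenuated intensity
print(intensity_under_weather(80.0, 1.0, "fog"))      # None: beyond the fog measurement range
```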
In another possible implementation, the simulating by the echo mode based on the laser intensity corresponding to any ray in the target weather type includes:
determining an echo pattern of the any ray;
if the echo mode of any ray is a single echo mode, returning a first laser intensity and a corresponding first distance value, wherein the first laser intensity is the laser intensity corresponding to any ray under the target weather type, and the first distance value is the distance value from the ray corresponding to the first laser intensity to the obstacle;
if the echo mode of any ray is a multi-echo mode, returning the first laser intensity, the first distance value, the laser intensity meeting a second preset condition and the distance value corresponding to the laser intensity meeting the second preset condition, wherein the laser intensity meeting the second preset condition is the first N laser intensities selected from the laser intensities corresponding to the rays meeting the first preset condition from large to small; the distance values corresponding to the laser intensity meeting the second preset condition are the first N distance values selected from the distance values corresponding to the rays meeting the first preset condition from large to small.
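As a sketch of the echo-mode logic (the data layout, the pairing of each intensity with its own distance, and the ranking by intensity are assumptions made for illustration):

```python
def simulate_echoes(primary, spot_returns, multi_echo, n_extra):
    """primary      : (intensity, distance) returned by the ray itself
    spot_returns : list of (intensity, distance) for the rays sampled within the spot
    multi_echo   : True for multi-echo mode, False for single-echo mode
    n_extra      : N, the number of additional strongest returns to report"""
    if not multi_echo:
        return [primary]                               # single-echo: only the first return
    strongest = sorted(spot_returns, key=lambda r: r[0], reverse=True)[:n_extra]
    return [primary] + strongest                       # multi-echo: first return plus top N

print(simulate_echoes((0.8, 12.0),
                      [(0.5, 12.1), (0.9, 11.8), (0.2, 30.0)],
                      multi_echo=True, n_extra=2))
```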
In another possible implementation, the simulation is performed by a Dropoff mechanism based on the laser intensity corresponding to any ray in the target weather type, and includes:
determining whether a second laser intensity is greater than a first intensity threshold, wherein the second laser intensity is the laser intensity corresponding to any ray in a target weather type, and the first intensity threshold is the laser intensity threshold generating a Dropoff phenomenon;
if the intensity is larger than the first intensity threshold value, outputting the second laser intensity and a corresponding second distance value;
if the intensity is not larger than the first intensity threshold value, calculating an attenuation value based on the second laser intensity;
and if the attenuation value is larger than a second intensity threshold value, outputting the second laser intensity and the first distance value, wherein the second intensity threshold value is a random value.
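A sketch of the Dropoff mechanism as described above; the attenuation function applied to weak returns and the way the random second threshold is drawn are assumptions:

```python
import random

def dropoff(intensity, distance, first_threshold, attenuation_factor=0.5, rng=random):
    """Return (intensity, distance) if the point survives the Dropoff check, otherwise None."""
    if intensity > first_threshold:
        return intensity, distance                     # strong enough: always output
    attenuation_value = intensity * attenuation_factor # assumed attenuation of the weak return
    second_threshold = rng.random() * first_threshold  # the second threshold is a random value
    if attenuation_value > second_threshold:
        return intensity, distance                     # the weak return is kept this time
    return None                                        # dropped, i.e. misjudged as noise

print(dropoff(0.9, 15.0, first_threshold=0.3))         # always kept
print(dropoff(0.1, 15.0, first_threshold=0.3))         # randomly kept or dropped
```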
In another possible implementation manner, the simulation result includes: the laser intensity of the laser radar is the laser intensity corresponding to any laser radar ray, and the distance value of the laser radar is the distance value of the ray corresponding to the laser intensity of the laser radar;
based on the incidence angle, the second laser intensity and the second distance value after the distortion processing of any laser radar ray, the three-dimensional point cloud data of any laser radar ray and the intensity value after the normalization processing are determined through the three-dimensional point calculation processing and the normalization processing, and the method comprises the following steps:
calculating a three-dimensional point coordinate measured by any laser radar ray under a laser radar Cartesian coordinate system based on the incidence angle, the third laser intensity and the third distance value after any laser radar ray is subjected to distortion processing;
and determining the intensity value after normalization processing based on the third laser intensity, the maximum laser intensity and the minimum laser intensity, wherein the maximum laser intensity and the minimum laser intensity are the maximum value and the minimum value in the laser intensity values respectively corresponding to all the rays.
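A sketch of this final step, converting a distortion-processed incidence angle and a measured distance into a Cartesian point in the lidar frame and min-max normalising the intensity over all rays; the spherical convention (θ measured from the positive z axis, φ in the x-y plane) follows the earlier definition of the incidence angles, and the rest is assumed for illustration:

```python
import numpy as np

def to_cartesian(theta, phi, distance):
    """Spherical (theta from +z, phi in the x-y plane) to Cartesian in the lidar frame."""
    return np.array([distance * np.sin(theta) * np.cos(phi),
                     distance * np.sin(theta) * np.sin(phi),
                     distance * np.cos(theta)])

def normalize_intensity(intensity, all_intensities):
    """Min-max normalisation of one ray's intensity over the intensities of all rays."""
    i_min, i_max = float(np.min(all_intensities)), float(np.max(all_intensities))
    return 0.0 if i_max == i_min else (intensity - i_min) / (i_max - i_min)

intensities = np.array([0.12, 0.80, 0.47])
print(to_cartesian(np.radians(80.0), np.radians(30.0), 12.5))
print(normalize_intensity(0.47, intensities))
```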
In a second aspect, an apparatus for lidar simulation is provided, including:
the building module is used for building a three-dimensional simulation scene;
a reading module, configured to read parameters for lidar simulation, where the parameters for lidar simulation include: the angle of incidence of each lidar ray;
the motion distortion processing module is used for carrying out motion distortion processing on the incidence angle of each laser radar ray to obtain the incidence angle after distortion processing;
and the laser radar simulation module is used for carrying out laser radar simulation based on the three-dimensional simulation scene, the incidence angle after the distortion processing and other parameters except the incidence angle of the laser radar ray.
In a possible implementation manner, when the building module is used to build a three-dimensional simulation scene, the building module is specifically configured to any one of the following:
acquiring sensor data, and constructing the three-dimensional simulation scene based on the sensor data;
constructing the three-dimensional simulation scene through a physical engine;
wherein the three-dimensional simulation scene is described by a triangular mesh or a square grid.
In another possible implementation manner, the parameters for lidar simulation include: field angle and angular resolution;
wherein the apparatus further comprises: an establishing module, wherein,
the establishing module is used for establishing a three-dimensional incident angle table based on the field angle and the angular resolution.
In another possible implementation manner, the apparatus further includes: a first obtaining module, a noise point removing module, a calculating module and a first determining module, wherein,
the first acquisition module is used for acquiring multi-frame measurement results;
the noise point removing module is used for removing noise points from the multi-frame measurement results;
the calculation module is used for calculating an incidence angle for an effective point corresponding to each ray in each frame;
and the first determining module is used for determining the incidence angle corresponding to the ith ray based on the incidence angle corresponding to the ith ray in each frame, wherein i belongs to [1, n ], and n is the number of rays contained in each frame.
In another possible implementation manner, when performing motion distortion processing on the incident angle of each lidar ray, the motion distortion processing module is specifically configured to:
acquiring current pose information;
determining a relative transformation matrix generated by motion distortion based on the current pose information;
and performing motion distortion processing on the incidence angle of each laser radar ray based on the relative transformation matrix.
In another possible implementation manner, the motion distortion processing module, when performing motion distortion processing on the incident angle of each lidar ray based on the relative transformation matrix to obtain a distorted incident angle, is specifically configured to:
determining a unit direction vector of the incidence angle of each laser radar ray in a specific coordinate system to obtain a unit direction vector corresponding to each ray;
determining a unit direction vector after each ray is distorted based on the unit direction vector corresponding to each ray and the relative transformation matrix;
and determining the distorted incidence angle of each ray based on the unit direction vector of each ray after motion distortion.
In another possible implementation manner, when performing lidar simulation based on the three-dimensional simulation scene, the incident angle after any distortion processing, and other parameters except the incident angle of the lidar ray, the lidar simulation module is specifically configured to:
constructing a sphere three-dimensional simulation sub-scene based on the three-dimensional simulation scene;
performing collision detection based on the sphere three-dimensional simulation sub-scene, the incidence angle after any distortion processing and other parameters except the incidence angle of the laser radar ray;
and carrying out laser radar simulation based on the collision detection result and the beam model.
In another possible implementation, the other parameters than the incident angle of the lidar ray include: an effective detection range of the lidar, the vertical angular resolution, the horizontal angular resolution, and the diameter resolution;
the lidar simulation module is specifically configured to, when performing collision detection based on the spherical three-dimensional simulation sub-scene, the incident angle after any distortion processing, and other parameters except the incident angle of the lidar ray:
performing spherical element rasterization on the sphere three-dimensional simulation sub-scene according to the vertical angular resolution, the horizontal angular resolution and the diameter resolution to obtain a spherical element rasterized three-dimensional simulation sub-scene;
and performing collision detection based on the three-dimensional simulation sub-scene rasterized by the spherical elements and the incidence angle processed by any distortion.
In another possible implementation manner, the three-dimensional simulation sub-scene in which the spherical elements are rasterized includes a plurality of element grid voxels corresponding to each ray; the device further comprises: a second determination module and a creation module, wherein,
the second determining module is configured to determine, based on the distortion-processed incidence angle of each ray, the propagation distance of each ray, the effective detection range of the lidar, the vertical angular resolution and the horizontal angular resolution, the index values corresponding to the plurality of spherical element grid voxels corresponding to each ray;
and the creating module is used for creating a three-dimensional data group based on the index values corresponding to the plurality of spherical element grid voxels corresponding to each ray.
In another possible implementation manner, the three-dimensional simulation sub-scene with the rasterized spherical elements comprises a plurality of spherical element grid voxels;
the laser radar simulation module is specifically used for performing collision detection on the three-dimensional simulation sub-scene based on the spherical element rasterization and the incidence angle after any distortion processing:
acquiring semantic information in the three-dimensional simulation sub-scene with the rasterized spherical elements;
carrying out bounding box division on the obstacles in each spherical element grid voxel based on the semantic information in the spherical element rasterized three-dimensional simulation sub-scene;
determining the coordinate of any ray under a laser radar spherical coordinate system based on the incidence angle after any distortion processing, the pose of the laser radar in a simulation scene and the spherical three-dimensional simulation sub-scene;
determining the distance from any ray to the surface of an obstacle based on the coordinate of the any ray in a laser radar spherical coordinate system and the plurality of spherical element grid voxels;
determining the size of a light spot generated by any ray to the surface of the obstacle based on the distance from the any ray to the surface of the obstacle and the light beam model;
determining an incidence angle after ray distortion processing meeting a first preset condition, wherein the incidence angle after the ray distortion processing meeting the first preset condition is a distorted incidence angle corresponding to a ray with the light spot size as a radius and the direction vector of any ray as a center;
determining the coordinate of each ray meeting the first preset condition under a laser radar spherical coordinate system based on the incidence angle after distortion processing of each ray meeting the first preset condition, the pose of the laser radar in a simulation scene and the spherical three-dimensional simulation sub-scene;
and determining the distance from each ray meeting the first preset condition to the surface of the obstacle based on the coordinate of each ray meeting the first preset condition in the laser radar spherical coordinate system and the plurality of spherical element grid voxels.
In another possible implementation manner, the apparatus further includes: a third determination module, a partitioning module, and a sampling module, wherein,
the third determining module is used for determining the number of sampling points;
the dividing module is used for dividing a sampling range which is formed by taking any ray direction vector as a center and taking the spot radius as a radius based on the number of the sampling points;
the sampling module is used for sampling the laser radar ray set based on each division range to obtain a sampled laser radar ray set;
the laser radar simulation module is specifically used for determining a coordinate of each ray meeting a first preset condition under a laser radar spherical coordinate system based on each incidence angle meeting the first preset condition after distortion processing, the pose of the laser radar in a simulation scene and the three-dimensional simulation sub-scene of the sphere:
and determining the coordinate of each ray meeting the first preset condition under the spherical coordinate system of the laser radar based on the incidence angle of the sampled laser radar ray set after the distortion processing, the pose of the laser radar in the simulation scene and the spherical three-dimensional simulation sub-scene.
In another possible implementation manner, the coordinate of any ray in the laser radar spherical coordinate system is (m, n);
the lidar simulation module is specifically configured to, when determining a distance from any ray to a surface of an obstacle based on the coordinate of any ray in the lidar spherical coordinate system and the plurality of spherical element grid voxels:
traversing, from k = k_min to k = k_max, the spherical element grid voxel with index (m, n, k) in the three-dimensional data group, and carrying out bounding box detection on each traversed voxel;
and if an intersection exists, stopping the traversal and calculating the distance from the ray to the surface of the obstacle based on ray detection.
In another possible implementation, the collision detection result of any ray includes: the distance of any ray to the surface of the obstacle; the beam model comprises a beam waist value of the light beam and the wavelength of the light beam;
when performing laser radar simulation based on the collision detection result and the beam model, the laser radar simulation module is specifically configured to:
determining a spot radius based on a distance of the any ray to the surface of the obstacle, a beam waist value of the light beam and a wavelength of the light beam;
determining a laser radar ray set which takes any ray direction vector as a center and takes the spot radius as a radius;
and simulating the divergence phenomenon of the rays in the propagation process based on the laser radar ray set.
In another possible implementation, the lidar simulation result includes: a distance value of each ray to the obstacle; the device further comprises: a first processing module, a second processing module, a simulation module, and a fourth determination module, wherein,
the first processing module is used for processing the distance value from each ray to the obstacle by combining a noise model to obtain a distance value corresponding to each ray after noise is added;
the second processing module is used for processing the distance value corresponding to each ray added with the noise by combining with the weather model to obtain the laser intensity corresponding to each ray under each weather type;
the simulation module is used for simulating the laser intensity corresponding to each ray in the target weather type through an echo mode and a Dropoff mechanism;
and the fourth determining module is used for determining the three-dimensional point cloud data of each laser radar ray and the intensity value after normalization processing through three-dimensional point calculation processing and normalization processing based on the incidence angle and the simulation result after distortion processing of each laser radar ray.
In another possible implementation manner, the lidar simulation result includes: a distance value of each ray to the obstacle;
wherein the apparatus further comprises: a second obtaining module and a fifth determining module, wherein,
the second obtaining module is used for obtaining the variance of the measured distance;
the fifth determining module is used for determining expectation and variance of a noise model based on the variance of the measured distance;
when the first processing module is used for processing the laser radar simulation result in combination with the noise model, the first processing module is specifically configured to:
and adding noise to the distance value of each ray from the obstacle based on the expectation and the variance of the noise model to obtain a distance value after the noise is added.
In another possible implementation manner, the weather model includes a corresponding relationship between a weather type, a measurement range, and an attenuation degree;
the second processing module is used for processing the distance value corresponding to each ray after the noise is added in combination with the weather model, and specifically used for:
determining whether the distance value corresponding to each ray added with the noise belongs to the measurement range corresponding to the target weather type;
if the ray belonging to the measurement range corresponding to the target weather type exists, acquiring the attenuation rate corresponding to the target weather type;
and determining the laser intensity corresponding to the rays meeting the measurement range under the target weather type based on the distance value corresponding to the rays meeting the measurement range and the attenuation rate corresponding to the target weather type.
In another possible implementation manner, when the simulation module performs the simulation based on the laser intensity corresponding to any ray in the target weather type and in the echo mode, the simulation module is specifically configured to:
determining an echo pattern of the any ray;
if the echo mode of any ray is a single echo mode, returning a first laser intensity and a corresponding first distance value, wherein the first laser intensity is the laser intensity corresponding to any ray under the target weather type, and the first distance value is the distance value from the ray corresponding to the first laser intensity to the obstacle;
if the echo mode of any ray is a multi-echo mode, returning the first laser intensity, the first distance value, the laser intensity meeting a second preset condition and the distance value corresponding to the laser intensity meeting the second preset condition, wherein the laser intensity meeting the second preset condition is the first N laser intensities selected from the laser intensities corresponding to the rays meeting the first preset condition from large to small; the distance values corresponding to the laser intensity meeting the second preset condition are the first N distance values selected from the distance values corresponding to the rays meeting the first preset condition from large to small.
In another possible implementation manner, when the simulation module performs the simulation based on the laser intensity corresponding to any ray in the target weather type and through a Dropoff mechanism, the simulation module is specifically configured to:
determining whether a second laser intensity is greater than a first intensity threshold value, wherein the second laser intensity is the laser intensity corresponding to any ray in a target weather type, and the first intensity threshold value is the laser intensity threshold value for generating a Dropoff phenomenon;
if the intensity is larger than the first intensity threshold value, outputting the second laser intensity and a corresponding second distance value;
if the intensity is not larger than the first intensity threshold value, calculating an attenuation value based on the second laser intensity;
and if the attenuation value is larger than a second intensity threshold value, outputting the second laser intensity and the first distance value, wherein the second intensity threshold value is a random value.
In another possible implementation, the simulation result includes: the laser intensity of the laser radar is the laser intensity corresponding to any laser radar ray, and the distance value of the laser radar is the distance value of the ray corresponding to the laser intensity of the laser radar;
the fourth determining module is specifically configured to, when determining three-dimensional point cloud data of any one of the laser radar rays and an intensity value after normalization processing based on the incident angle, the second laser intensity, and the second distance value after distortion processing of any one of the laser radar rays through three-dimensional point calculation processing and normalization processing:
calculating a three-dimensional point coordinate measured by any laser radar ray under a laser radar Cartesian coordinate system based on the incidence angle, the third laser intensity and the third distance value after any laser radar ray is subjected to distortion processing;
and determining the intensity value after normalization processing based on the third laser intensity, the maximum laser intensity and the minimum laser intensity, wherein the maximum laser intensity and the minimum laser intensity are the maximum value and the minimum value in the laser intensity values respectively corresponding to all the rays.
In a third aspect, an electronic device is provided, which includes:
one or more processors;
a memory;
one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs being configured to perform the operations corresponding to the laser radar simulation method according to any one of the possible implementations of the first aspect.
In a fourth aspect, there is provided a computer readable storage medium having stored thereon at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the method of lidar simulation as shown in any of the possible implementations of the first aspect.
In a fifth aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method for lidar simulation provided in the above-described alternative implementation.
In summary, the present application includes at least one of the following beneficial technical effects:
the application provides a laser radar simulation method, a laser radar simulation device, electronic equipment and a storage medium, by constructing a three-dimensional simulation scenario and reading the lidar simulation parameters including the angle of incidence of each lidar ray in the present application, and the incident angle of each laser radar ray is processed by motion distortion to obtain the distorted incident angle, the laser radar simulation is carried out according to the three-dimensional simulation scene, the incidence angle after the distortion treatment and other parameters required by the laser radar simulation, namely when the laser radar simulation is carried out, the incident angle of each laser radar ray is subjected to motion distortion processing to simulate the influence of motion distortion on the laser radar rays in a real scene, therefore, the simulation of the laser radar can be closer to the real situation, and the laser radar can be more accurately simulated on the simulation platform.
Drawings
Fig. 1 is a schematic flowchart of a method for laser radar simulation according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an incident angle of a laser radar beam provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of a sphere three-dimensional simulation sub-scene provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a sphere element rasterization provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a simulation performed by adding a beam model according to an embodiment of the present application;
FIG. 6 is a schematic diagram of sampling a set of incident angles in a range of light spots provided by an embodiment of the present application;
fig. 7 is a schematic structural diagram of a lidar simulation apparatus according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The present application is described in further detail below with reference to the accompanying drawings.
The embodiments are intended only to explain the present application and do not limit it. After reading this specification, those skilled in the art may make modifications to the embodiments as needed without inventive contribution, and all such modifications are protected by patent law within the scope of the claims of the present application.
In a real scene, the working performance of the laser radar (e.g., the measured distance and intensity) is susceptible to complex test conditions (highly reflective objects, nearby obstacles, sunlight, interference between multiple radars), so reproducing the working performance of the laser radar on a simulation platform is often a great challenge. Meanwhile, the fidelity and efficiency of laser radar simulation have a great influence on the simulation-based test and evaluation of high-standard mobile robots. For example, laser radar simulation data often do not take motion distortion into account, the laser radar parameters are susceptible to environmental factors (e.g., weather, smoke, etc.), and when the echo intensity of the laser radar is too low there is a possibility of its being randomly misjudged as noise.
In order to solve the technical problems, the embodiment of the application provides high-fidelity laser radar simulation, environmental factors in a simulation scene and functional performance of each laser radar in the current industry are fully considered, and high-fidelity laser radar simulation data can be provided. For example, a motion distortion model is added in laser radar simulation, so that laser radar simulation data caused by motion distortion can be generated, and the simulation data is closer to laser radar data measured in a real scene; according to the embodiment of the application, environmental factors of a simulation scene are fully considered, a weather model and a noise model are introduced, and more real laser radar simulation data (measured distance and intensity) are realized; the embodiment of the application provides a simple and effective echo model, which can realize multi-echo or single-echo laser radar simulation; the embodiment of the application provides a Dropoff mechanism which can simulate the phenomenon that random noise misjudgment exists when the laser intensity is too small; according to the method and the device, the parameters required by the simulation model are calibrated by using the real laser radar measurement data, so that the simulation result is closer to the real measurement; the embodiment of the application constructs the sphere three-dimensional simulation sub-scene, and stores the sub-scene by using the sphere element grid, so that the retrieval speed of collision detection is accelerated, and the retrieval speed is higher than that of the common grid.
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In addition, the term "and/or" herein is only one kind of association relationship describing an associated object, and means that there may be three kinds of relationships, for example, a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship, unless otherwise specified.
The embodiments of the present application will be described in further detail with reference to the drawings attached hereto.
The embodiment of the application provides a laser radar simulation method, which can be executed by electronic equipment, wherein the electronic equipment can be a server or terminal equipment, the server can be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, and a cloud server for providing cloud computing service. The terminal device may be a smart phone, a tablet computer, a laptop, a desktop computer, or the like, but is not limited thereto, and the terminal device and the server may be directly or indirectly connected in a wired or wireless communication manner.
As shown in fig. 1, the method may include:
and S101, constructing a three-dimensional simulation scene.
The three-dimensional simulation scene is a vivid virtual environment which is generated by using a computer technology and has multiple perceptions such as sight, hearing, touch, taste and the like, and a user can use various sensing devices to interact with entities in the virtual environment through natural skills of the user.
Specifically, the method for constructing the three-dimensional simulation scene comprises the following steps: acquiring sensor data, and constructing a three-dimensional simulation scene based on the sensor data; or, building a three-dimensional simulation scene through a physical engine. In an embodiment of the present application, the sensor data may include: an inertial measurement unit, point cloud data of a laser radar, image information acquired by a camera, and the like.
The three-dimensional simulation scene is described by a triangular mesh or a square grid.
And step S102, reading parameters for laser radar simulation.
For the embodiment of the application, the parameters for the laser radar simulation are read from the outside. The parameters for the lidar simulation comprise the incidence angle of each lidar ray; in addition, the parameters for lidar simulation may include: the measurement range, echo parameters, working frequency, and the like.
It should be noted that step S101 may be executed before step S102, after step S102, or simultaneously with step S102, which is not limited in the embodiment of the present application, and fig. 1 is only one possible execution manner, and is not limited in the embodiment of the present application.
And step S103, performing motion distortion processing on the incident angle of each laser radar ray to obtain the distorted incident angle.
Because the laser radar moves together with its carrier, each laser point is generated at a different reference pose, so the obtained point cloud data are not all referenced to the same pose; especially when the scanning frequency of the laser radar is low, the motion error of a laser frame caused by the motion of the laser radar carrier cannot be ignored.
In order to simulate the lidar more truly, a motion distortion model is added to the lidar simulation in the embodiment of the application. That is, the incidence angle of each laser radar ray is processed by the motion distortion model to obtain the distortion-processed incidence angle.
It should be noted that step S101 may be executed before step S103, after step S103, or simultaneously with step S103, and is not limited in this embodiment, where fig. 1 is only one possible execution manner, and is not limited in this embodiment.
And step S104, performing laser radar simulation based on the three-dimensional simulation scene, the incidence angle after distortion processing and other parameters except the incidence angle of the laser radar ray.
Further, laser radar simulation is performed based on the constructed three-dimensional simulation scene, the distortion-processed incidence angles, and the other parameters for laser radar simulation such as the measurement range, the echo parameters and the working frequency. Specifically, the laser radar simulation may be performed on a corresponding simulation platform based on the three-dimensional simulation scene, the distortion-processed incidence angles and the other parameters except the incidence angle of the laser radar rays.
The embodiment of the application provides a laser radar simulation method. In the embodiment of the application, a three-dimensional simulation scene is constructed, the laser radar simulation parameters including the incidence angle of each laser radar ray are read, and motion distortion processing is carried out on the incidence angle of each laser radar ray to obtain the distortion-processed incidence angle; laser radar simulation is then carried out according to the three-dimensional simulation scene, the distortion-processed incidence angles and the other parameters required for laser radar simulation. In other words, when laser radar simulation is carried out, motion distortion processing is applied to the incidence angle of each laser radar ray to simulate the influence of motion distortion on laser radar rays in a real scene, so that the laser radar simulation is closer to the real situation and accurate simulation of the laser radar can be realized on the simulation platform.
Further, in step S101, when the three-dimensional scene is constructed based on vision, common techniques include the Structure From Motion (SFM) method, the Shape From Texture (SFT) method, and the like. For three-dimensional scene construction based on multi-sensor fusion, common techniques include the fusion of vision with a laser radar and an inertial measurement unit (R3Live), laser-radar-enhanced SFM, and the fusion of vision with an inertial measurement unit and a global positioning system based on extended Kalman filtering (ethzasl sensor fusion). In addition, the three-dimensional simulation scene can also be designed manually with a physics engine, such as Blender or Unreal Engine. Meanwhile, semantic information can be added to the obstacles in the simulation scene through data annotation or semantic segmentation, and a three-dimensional simulation scene with semantic information is output.
Further, the parameters for laser radar simulation are read and the simulated laser radar model is configured. Firstly, the laser radar working parameters (measurement range, echo parameters, field angle, angular resolution, number of output points, working frequency, laser wavelength and measurement noise) are read from the laser radar working manual. The pose of the lidar in the three-dimensional simulation scene and the incidence angle (θ, φ) of each lidar ray also need to be provided. The incidence angle (θ, φ) of a lidar ray may be calculated by combining the field angle, the angular resolution and the number of output points, or it may be measured.
It should be noted that, in a real environment, the laser radar performs multi-frame measurement in a static and motionless state and guarantees the validity of each measurement point. The incident angle of the laser radar ray read in step S102 may be the incident angle after the denoising process. That is, reading the incident angle of the laser radar ray in step S102 may be preceded by: acquiring a multi-frame measurement result; removing noise points from the multi-frame measurement results; calculating an incidence angle for an effective point corresponding to each ray in each frame; and determining the incidence angle corresponding to the ith ray based on the incidence angle corresponding to the ith ray in each frame. Where i ∈ [1, n ], n is the number of rays contained in each frame.
And removing noise points (outliers) by using a RAndom SAmple Consensus (RANSAC) algorithm on the multi-frame measurement results, calculating an incident angle for an effective point corresponding to each ray of each frame, and finally taking the average value of the incident angles calculated by multiple frames as the incident angle of the current laser radar.
Specifically, the incidence angle of the ith ray is determined by the following formula (1):

θ_i = (1/k) Σ_{m=1..k} θ_{i,m},   φ_i = (1/k) Σ_{m=1..k} φ_{i,m}      formula (1);

wherein (θ_i, φ_i) is the finally obtained incidence angle of the ith ray, and

θ_{i,m} = arccos( z_{i,m} / sqrt(x_{i,m}^2 + y_{i,m}^2 + z_{i,m}^2) ),   φ_{i,m} = arctan2( y_{i,m}, x_{i,m} );

θ_{i,m} is the vertical incidence angle of the ith ray in the m-th frame of measurements, i.e., the angle between the lidar ray and the positive z-axis; φ_{i,m} is the horizontal incidence angle of the ith ray in the m-th frame of measurements; (x_{i,m}, y_{i,m}, z_{i,m}) is the measurement of the ith ray in the m-th frame. k is the number of valid points after noise points or invalid points are filtered out using the RANSAC algorithm.
Besides, the measurement noise can be calibrated in the same way. σ_x, σ_y and σ_z are the variances of the measurement noise, and σ_d is the variance of the distance noise; these variances can be used as the noise model in the simulation model:

σ_x^2 = (1/k) Σ_{m=1..k} (x_{i,m} - x̄_i)^2,   σ_y^2 = (1/k) Σ_{m=1..k} (y_{i,m} - ȳ_i)^2,
σ_z^2 = (1/k) Σ_{m=1..k} (z_{i,m} - z̄_i)^2,   σ_d^2 = (1/k) Σ_{m=1..k} (d_{i,m} - d̄_i)^2;

wherein d_{i,m} = sqrt(x_{i,m}^2 + y_{i,m}^2 + z_{i,m}^2), and x̄_i, ȳ_i, z̄_i and d̄_i are the mean values of the valid points after filtering the noise points using the RANSAC algorithm.
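A minimal Python sketch of this calibration, assuming the per-frame measurements of one ray are given as Cartesian points; a simple median-absolute-deviation filter stands in for RANSAC, and the angle conversion follows the definitions above:

```python
import numpy as np

def calibrate_ray(points, outlier_factor=3.0):
    """points: (M, 3) array of the ith ray's measurements over M frames.
    Returns (theta_i, phi_i, sigma_d): averaged incidence angles and distance-noise std.
    A median-absolute-deviation filter replaces RANSAC here for brevity."""
    points = np.asarray(points, dtype=float)
    d = np.linalg.norm(points, axis=1)
    dev = np.abs(d - np.median(d))
    mad = np.median(dev) + 1e-9
    valid = points[dev < outlier_factor * mad]     # the k valid points
    x, y, z = valid[:, 0], valid[:, 1], valid[:, 2]
    r = np.linalg.norm(valid, axis=1)
    theta = np.arccos(z / r)                       # vertical incidence angle per frame
    phi = np.arctan2(y, x)                         # horizontal incidence angle per frame
    return theta.mean(), phi.mean(), np.std(r)     # averaged angles and distance-noise std

frames = np.array([[9.98, 0.05, 1.01], [10.02, 0.04, 0.99],
                   [10.00, 0.06, 1.00], [14.00, 0.00, 1.20]])
print(calibrate_ray(frames))   # the far-off fourth measurement is filtered out
```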
further, the parameters for lidar simulation include: field angle and angular resolution; in step S102, reading parameters for lidar simulation, which may further include: based on the angle of view and the angular resolution, a three-dimensional table of angles of incidence is built.
Specifically, a three-dimensional incident angle table is built through the angle of view and the angular resolution, and the dimension is (m, n, k). Theta is the vertical incident angle, phi is the horizontal incident angle, and k is the number of exit points. The table is used for storing three-dimensional incident angles to be simulated, and simulation output results can be obtained by traversing the array according to a certain sequence, so that ordered point cloud data are generated.
Specifically, m and n in the dimension (m, n, k) are calculated as shown in formula (2):

$$m=\frac{\phi_{\max}-\phi_{\min}}{\sigma_\phi},\qquad n=\frac{\theta_{\max}-\theta_{\min}}{\sigma_\theta} \qquad\text{formula (2)}$$
Wherein the vertical field angle range is (θ_min, θ_max) and the horizontal field angle range is (φ_min, φ_max). The vertical and horizontal field angles are shown in FIG. 2: adjacent horizontal field angles differ by one horizontal angular resolution, and adjacent vertical field angles differ by one vertical angular resolution. In FIG. 2, the vertical upward axis is the φ axis, along which φ varies from φ_min to φ_max (i.e., the value range of m), and the horizontal rightward axis is the θ axis, along which θ varies from θ_min to θ_max (i.e., the value range of n). Because the structure is three-dimensional, FIG. 2 also contains a k axis in addition to the φ and θ axes: FIG. 2 shows a plurality of tables from near to far, representing different k values, with k ranging from k_min for the nearest table to k_max for the farthest table. Each table is a two-dimensional data structure, and stacking the tables along k from k_min to k_max yields the three-dimensional data structure, i.e., the three-dimensional incident angle table referred to above, which is obtained and stored in the manner shown in FIG. 2.
Wherein the vertical angular resolution is σ_θ and the horizontal angular resolution is σ_φ. The index (m_j, n_i) of the laser radar ray (θ_i, φ_j) is given by

$$m_j=\frac{\phi_j-\phi_{\min}}{\sigma_\phi},\qquad n_i=\frac{\theta_i-\theta_{\min}}{\sigma_\theta}$$
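To make the indexing concrete, the following Python sketch (an illustration only, with assumed parameter names) builds the three-dimensional incident angle table from the field angle, angular resolution and echo dimension, and converts a ray's incidence angle into its (m, n) index.

```python
import numpy as np

def build_incident_angle_table(theta_range, phi_range, sigma_theta, sigma_phi, k_dim):
    """Build an (m, n, k, 2) array storing the (theta, phi) incident angles to simulate."""
    theta_min, theta_max = theta_range
    phi_min, phi_max = phi_range
    thetas = np.arange(theta_min, theta_max, sigma_theta)   # n values along the theta axis
    phis = np.arange(phi_min, phi_max, sigma_phi)           # m values along the phi axis
    table = np.zeros((len(phis), len(thetas), k_dim, 2))
    for m, phi in enumerate(phis):
        for n, theta in enumerate(thetas):
            table[m, n, :, 0] = theta
            table[m, n, :, 1] = phi
    return table

def ray_index(theta, phi, theta_min, phi_min, sigma_theta, sigma_phi):
    """Index (m, n) of the ray (theta, phi) in the incident angle table."""
    m = int(round((phi - phi_min) / sigma_phi))
    n = int(round((theta - theta_min) / sigma_theta))
    return m, n
```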
Further, after the incident angle of each laser radar ray is read through the above embodiment, in order to simulate the laser radar more realistically, the read incident angle of each laser radar ray is passed through a motion distortion model to obtain the distorted incident angle.
Specifically, in step S103, performing motion distortion processing on the incident angle of each laser radar ray may specifically include: acquiring current pose information; determining the relative transformation matrix generated by the motion distortion based on the current pose information; and performing motion distortion processing on the incident angle of each laser radar ray based on the relative transformation matrix. In the embodiment of the present application, the positioning pose may be provided by other sensors, such as an IMU or a fisheye camera; after the positioning pose is obtained, the relative transformation matrix T_i may be obtained by interpolation. Alternatively, the laser radar may be assumed to move uniformly during the motion distortion period, i.e., the current pose is multiplied by a small random number to obtain a small pose offset as T_i. The incident angle of each laser radar ray is then motion-distorted according to T_i to obtain the distorted incident angle.
Specifically, the performing motion distortion processing on the incident angle of each lidar ray based on the relative transformation matrix to obtain the distorted incident angle may specifically include: determining a unit direction vector of an incident angle of each laser radar ray in a specific coordinate system to obtain a unit direction vector corresponding to each ray; determining a unit direction vector after each ray is distorted based on the unit direction vector corresponding to each ray and the relative transformation matrix; and determining the distorted incidence angle of each ray based on the unit direction vector of each ray after motion distortion.
With the above embodiment, the calibrated incident angle of the ith laser radar ray is known and denoted (θ_i, φ_i). In the Cartesian coordinate system of the laser radar, θ_i is the angle between the laser radar ray and the positive z-axis, φ_i is the azimuth angle between the projection of the laser radar ray on the x-y plane and the positive x-axis, and the starting point of the laser radar ray is the origin of the laser radar coordinate system. The unit direction vector of the laser radar ray in the laser radar Cartesian coordinate system is (x_i, y_i, z_i). The relationship between the incident angle (θ_i, φ_i) and the unit direction vector (x_i, y_i, z_i) is shown in formula (3). Combining this unit direction vector with the relative transformation matrix T_i generated by the motion distortion obtained in the above embodiment gives the relation shown in formula (4); that is, the unit direction vector (x'_i, y'_i, z'_i) of the laser radar ray after motion distortion is obtained through formula (4), and the incidence angle after motion distortion processing is then obtained from formula (5).
Wherein,

$$x_i=\sin\theta_i\cos\phi_i,\qquad y_i=\sin\theta_i\sin\phi_i,\qquad z_i=\cos\theta_i \qquad\text{formula (3)}$$

$$\begin{bmatrix} x'_i \\ y'_i \\ z'_i \\ 1 \end{bmatrix}=T_i\begin{bmatrix} x_i \\ y_i \\ z_i \\ 1 \end{bmatrix} \qquad\text{formula (4)}$$

$$\theta'_i=\arccos\left(z'_i\right),\qquad \phi'_i=\arctan\left(\frac{y'_i}{x'_i}\right) \qquad\text{formula (5)}$$
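As a hedged illustration of formulas (3)–(5) (the function name and the homogeneous form of T_i below are assumptions, not the patent's implementation), the incidence angle can be converted to a unit direction vector, transformed by the relative transformation matrix, and converted back:

```python
import numpy as np

def distort_incident_angle(theta, phi, T_i):
    """Apply the relative transform T_i (4x4 homogeneous matrix) to one incidence angle."""
    # Formula (3): incidence angle -> unit direction vector in the lidar Cartesian frame.
    v = np.array([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.cos(theta), 1.0])
    # Formula (4): transform by the relative pose change accumulated during the sweep.
    vx, vy, vz, _ = T_i @ v
    # Formula (5): transformed direction -> distorted incidence angle
    # (dividing by the norm keeps the result valid if T_i is not a pure rotation).
    r = np.linalg.norm([vx, vy, vz])
    theta_d = np.arccos(vz / r)
    phi_d = np.arctan2(vy, vx)
    return theta_d, phi_d
```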
After the distortion-processed incident angle is obtained by the above-described embodiment, the lidar simulation is performed based on the three-dimensional simulation scene, the distortion-processed incident angle, and other parameters except the incident angle of the lidar ray.
Specifically, performing lidar simulation based on the three-dimensional simulation scene, the incident angle after any distortion processing, and other parameters except the incident angle of the lidar ray may specifically include: step Sa (not shown), step Sb (not shown), and step Sc (not shown), wherein,
and step Sa, constructing a sphere three-dimensional simulation sub-scene based on the three-dimensional simulation scene.
In order to increase the simulation speed, a sub-scene is constructed from the three-dimensional simulation scene built in step S101, which removes redundant scene content and accelerates the retrieval speed of collision detection. Because the incident angles of the laser radar rays are expressed in the laser radar spherical coordinate system, retrieving the sub-scene with a voxel grid is more convenient. Specifically, according to the three-dimensional simulation scene, a sphere three-dimensional simulation sub-scene is constructed with the origin of the current laser radar coordinate system {Lidar} as the origin and the effective detection range [r_min, r_max] of the laser radar as the radius, as shown in FIG. 3.
And Sb, carrying out collision detection based on the sphere three-dimensional simulation sub-scene, the incidence angle after any distortion treatment and other parameters except the incidence angle of the laser radar ray.
In particular, other parameters than the angle of incidence of the lidar rays may include: the effective detection range, the vertical angle resolution, the horizontal angle resolution and the diameter resolution of the laser radar; in the step Sb, collision detection is performed based on the sphere three-dimensional simulation sub-scene, the incident angle after any distortion processing, and other parameters except the incident angle of the laser radar ray, which may specifically include: step Sb1 (not shown), and step Sb2 (not shown), wherein,
and step Sb1, performing spheric element rasterization on the spheric element sub-scene according to the vertical angular resolution, the horizontal angular resolution and the diameter resolution to obtain a three-dimensional simulation sub-scene with the spheric element rasterization. In the embodiment of the application, the vertical angular resolution sigma in the simulation parameters of the laser radar is used θ Horizontal angular resolution σ φ And additionally provided diameter resolution r d The sphere sub-scene is rasterized in terms of sphere elements, as shown in fig. 4. Because the parameters are fixed parameters, a three-dimensional array can be established before simulation begins to store the sub-scenes.
The diameter resolution r_d is not a working parameter of the laser radar but a parameter required by the laser radar simulation; it can be set according to the desired operating efficiency or the available computing power in the simulation: the higher the resolution, the more finely the sub-scene is divided and the higher the memory occupation.
Further, the spherical-element-rasterized three-dimensional simulation sub-scene comprises a plurality of voxel grid voxels corresponding to each ray. In order to search for the corresponding voxel grid voxel quickly, a three-dimensional data group can be created that stores each voxel grid voxel and determines its corresponding index value, so that the corresponding voxel grid voxel can be looked up quickly from the index value. Creating the three-dimensional data group may specifically include: determining the index values of the plurality of voxel grid voxels corresponding to each ray based on the distorted incidence angle of each ray, the propagation distance of each ray, the effective detection range of the laser radar, the vertical angular resolution and the horizontal angular resolution; and creating the three-dimensional data group based on the index values corresponding to the plurality of voxel grid voxels of each ray.
For example, the index of the voxel grid voxel through which the ith laser radar ray (θ_i, φ_i) passes is determined by the following formulas:

$$m=\operatorname{round}\!\left(\frac{\phi_i-\phi_{\min}}{\sigma_\phi}\right),\qquad n=\operatorname{round}\!\left(\frac{\theta_i-\theta_{\min}}{\sigma_\theta}\right),\qquad k=\left\lfloor\frac{r-r_{\min}}{r_d}\right\rfloor+1$$

where (m, n, k) is the index of the voxel grid voxel passed through by the ith laser radar ray, r is the propagation distance of the laser radar ray, m, n, k are integers, σ_θ is the vertical angular resolution, σ_φ is the horizontal angular resolution and r_d is the diameter resolution.
Under the laser radar spherical coordinate system, a point (θ, φ, r) in the spherical voxel grid voxel with index (m, n, k) needs to satisfy the following conditions:
1. θ ∈ [θ_i − σ_θ/2, θ_i + σ_θ/2)
2. φ ∈ [φ_i − σ_φ/2, φ_i + σ_φ/2)
3. r ∈ [(k−1)·r_d + r_min, k·r_d + r_min)
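A minimal Python sketch of the voxel indexing just described is given below; it assumes the reconstructed index formulas above and a parameter dictionary whose keys are illustrative names, not the patent's.

```python
import math

def voxel_index(theta, phi, r, params):
    """Index (m, n, k) of the spherical voxel containing the point (theta, phi, r).

    `params` is an assumed dict with theta_min, phi_min, sigma_theta, sigma_phi,
    r_min and r_d taken from the lidar simulation parameters."""
    m = int(round((phi - params["phi_min"]) / params["sigma_phi"]))
    n = int(round((theta - params["theta_min"]) / params["sigma_theta"]))
    k = int(math.floor((r - params["r_min"]) / params["r_d"])) + 1
    return m, n, k
```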
Further, after the spherical-element-rasterized three-dimensional simulation sub-scene is obtained by the above embodiment (step Sb1), collision detection simulation is performed. Collision detection generally has two meanings: collision detection in a physical sense and collision detection in a mathematical sense. In the embodiment of the application, collision detection refers to the collision detection between laser radar rays and obstacles in the simulation scene, i.e., purely mathematical collision detection: judging whether objects intersect (or contain, or coincide) and calculating the intersection points. In a general 3D physics engine, boxes, spheres, and rays/line segments are often used instead of intersection detection between complex shapes. If higher accuracy is required, triangles or meshes are used for collision detection; that is, a simplified bounding volume is used in place of the body for collision detection. The two most commonly used shapes are the Bounding Box and the Bounding Sphere, for which collision detection is fastest. Through a bounding box detection algorithm, the obstacles colliding with laser radar rays in the three-dimensional simulation scene can be quickly retrieved. In the embodiment of the application, the simulation scene established by three-dimensional reconstruction or by a physics engine can add semantic information to each obstacle in the scene through data annotation or semantic segmentation, and from this semantic information bounding volumes can be quickly established for the obstacles. The distance from the laser radar ray to the surface of the obstacle is then accurately calculated by ray detection (ray cast) on the obstacles retrieved by the bounding box detection algorithm; the specific collision manner is described in the following embodiments.
And step Sb2, performing collision detection based on the three-dimensional simulation sub-scene rasterized by the spherical elements and the incidence angle after any distortion processing.
The rasterized three-dimensional simulation sub-scene of the spherical elements comprises a plurality of spherical element raster voxels;
specifically, the collision detection is performed based on the three-dimensional simulated sub-scene rasterized by the spherical elements and any distorted incident angle, which may specifically include: step Sb21 (not shown), step Sb22 (not shown), step Sb23 (not shown), step Sb24 (not shown), step Sb25 (not shown), step Sb26 (not shown), step Sb27 (not shown), and step Sb28 (not shown), wherein,
and step Sb21, acquiring semantic information in the three-dimensional simulation sub-scene with the spherical elements rasterized.
For the embodiment of the application, in the above embodiment, it is pointed out that after the three-dimensional simulation sub-scene is constructed, semantic information may be added to the obstacle in the simulation scene based on data annotation or semantic segmentation, so as to obtain the three-dimensional simulation scene with the semantic information. Based on the method, after the three-dimensional simulation sub-scene with the rasterized spherical elements is obtained, semantic information in the three-dimensional simulation sub-scene with the rasterized spherical elements can also be obtained.
Step Sb22, bounding box partitioning the obstacle in each voxel grid voxel based on semantic information in the three-dimensional simulation sub-scene of the voxel rasterization.
The bounding box is an algorithm for finding the optimal bounding space of a discrete point set; the basic idea is to approximately replace a complex geometric object with a simple geometric body of slightly larger volume (called the bounding box). Common bounding box types are the Axis-Aligned Bounding Box (AABB), the bounding sphere, the Oriented Bounding Box (OBB), and the Fixed Direction Hull (FDH).
And step Sb23, determining the coordinate of any ray in the spherical coordinate system of the laser radar based on the incidence angle after any distortion processing, the pose of the laser radar in the simulation scene and the spherical three-dimensional simulation sub-scene.
For the embodiment of the present application, for the incidence angle (θ'_i, φ'_i) of any motion-distorted laser radar ray, the coordinates (m, n) of the ray in the laser radar spherical coordinate system are determined according to the pose of the laser radar in the simulation scene and the sphere three-dimensional simulation sub-scene obtained in step Sa; the index (m, n) can be determined from the incidence angle (θ'_i, φ'_i) of the laser radar ray.
And step Sb24, determining the distance from any ray to the surface of the obstacle based on the coordinates of any ray in the spherical coordinate system of the laser radar and a plurality of spherical element grid voxels.
For the embodiment of the application, the coordinates of any ray in the laser radar spherical coordinate system are (m, n). Determining the distance from the ray to the surface of the obstacle based on these coordinates and the plurality of spherical voxel grid voxels may specifically include: traversing, from k = k_min to k = k_max, the voxel grid voxels with index (m, n, k) in the three-dimensional data group, and performing bounding box detection on each traversed voxel grid voxel; if an intersection exists, stopping the traversal and calculating the distance from the ray to the surface of the obstacle based on ray detection. That is, in the embodiment of the present application, the indices of the voxel grid voxels are (m, n, k), and the voxel grid voxels with index (m, n, k) in the sub-scene are traversed from k = k_min to k = k_max. For example, if the measurement range of the laser radar is 200 meters and the resolution is 1 meter, the measurement range is divided into 200 segments, i.e., k ∈ [0, 199], k_min = 0, k_max = 199. Bounding box detection is performed on each traversed voxel grid voxel to judge whether there is an intersection point; if there is no intersection point, the voxel is skipped; if there is an intersection point, the traversal is stopped, and the distance from the laser radar ray to the surface of the obstacle is then accurately calculated through ray detection (ray cast).
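A simplified Python sketch of this traversal is shown below. The `voxel_grid` dictionary, the `bounding_box_hit` test and the `ray_cast_distance` helper are hypothetical stand-ins for the data structures and detection routines of the simulation platform.

```python
def trace_ray(m, n, voxel_grid, ray_origin, ray_dir, k_min, k_max,
              bounding_box_hit, ray_cast_distance):
    """Walk the spherical voxels (m, n, k) outward and return the hit distance, or None."""
    for k in range(k_min, k_max + 1):
        voxel = voxel_grid.get((m, n, k))
        if voxel is None or not voxel.obstacles:
            continue  # empty voxel: nothing to test
        for obstacle in voxel.obstacles:
            # Cheap bounding-box test first; only ray-cast when the box is hit.
            if bounding_box_hit(obstacle, ray_origin, ray_dir):
                return ray_cast_distance(obstacle, ray_origin, ray_dir)
    return None  # no obstacle within the effective detection range
```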
Further, in the embodiment of the present application, the distance from any ray to the surface of the obstacle is determined based on the coordinate of the ray in the laser radar spherical coordinate system, and if there are multiple rays, the determination manner corresponding to the distance from each ray to the surface of the obstacle may be according to the above embodiment, or traversal may be performed first based on θ from small to large, and then traversal is performed based on Φ from small to large, and the specific traversal manner is not described in detail in the embodiment of the present application.
And step Sb25, determining the size of a light spot generated by any ray to the surface of the obstacle based on the distance between any ray and the surface of the obstacle and the light beam model.
Specifically, the beam model in the embodiment of the present application is shown in formula (6), that is, the size of the light spot generated by any ray to the surface of the obstacle is determined based on the distance from any ray to the surface of the obstacle determined in step Sb24 and formula (6).
Wherein,

$$r = r_0\sqrt{1+\left(\frac{m\lambda}{\pi r_0^{2}}\right)^{2}} \qquad\text{formula (6)}$$

where the distance from the laser radar ray to the obstacle is used as the beam propagation distance m, and the beam waist r_0 of the Gaussian beam and the wavelength λ are laser radar parameters.
And step Sb26, determining the incidence angle after the ray distortion treatment meeting the first preset condition.
The incidence angles after ray distortion processing that satisfy the first preset condition are the distorted incidence angles corresponding to the rays lying within the circle centered on the direction vector of the ray, with the spot size as the radius.
For example, the incidence angles after ray distortion processing that satisfy the first preset condition may form a laser radar ray set (denoted Ω here) centered on the direction vector of the laser radar ray and with the spot radius as the radius. In an exemplary embodiment of the application, the laser radar ray set Ω may be stored in the three-dimensional incident angle table according to the above embodiment, and an index corresponding to each incident angle may be generated.
And step Sb27, determining the coordinates of each ray meeting the first preset condition in a laser radar spherical coordinate system based on the incidence angle after the distortion processing of each ray meeting the first preset condition, the pose of the laser radar in the simulation scene and the spherical three-dimensional simulation sub-scene.
In the embodiment of the present application, based on each incident angle in the laser radar ray set Ω, the pose of the laser radar in the simulation scene and the sphere three-dimensional simulation sub-scene, the coordinates of each incident angle in the laser radar spherical coordinate system are determined, which is described in detail in the implementation manner corresponding to step Sb23 and is not repeated here.
And step Sb28, determining the distance from each ray meeting the first preset condition to the surface of the obstacle based on the coordinates of each ray meeting the first preset condition in the laser radar spherical coordinate system and a plurality of spherical element grid voxels.
Further, each ray meeting the first preset condition in the embodiment of the present application refers to the ray corresponding to each incident angle in the laser radar ray set Ω. In the embodiment of the present application, determining the distance from each ray satisfying the first preset condition to the surface of the obstacle is described in detail in the implementation manner corresponding to step Sb24 and is not repeated here.
And step Sc, carrying out laser radar simulation based on the collision detection result and the beam model.
For the embodiment of the present application, the collision detection result of any ray includes the distance from the ray to the surface of the obstacle, and the beam model includes the beam waist value of the beam and the wavelength of the beam. In the embodiment of the application, in order to realistically simulate the propagation of laser radar rays in a medium, a beam model is added to the laser radar simulation to reproduce the light spot effect generated by the divergence of the laser beam during propagation, and the generated light spot is used for the echo mode. In order not to affect the real-time performance of the simulation, a Gaussian beam model may be used. In optics, a Gaussian beam is an electromagnetic beam whose transverse electric field and irradiance distribution approximately satisfy a Gaussian function.
Specifically, in step Sc, performing laser radar simulation based on the collision detection result and the beam model, which may specifically include: step Sc1 (not shown), step Sc2 (not shown), and step Sc3 (not shown), wherein,
step Sc1, determining the spot radius based on the distance from the ray to the surface of the obstacle, the beam waist value of the beam and the wavelength of the beam. In the present embodiment, the spot radius is determined by formula (7):

$$r = r_0\sqrt{1+\left(\frac{m\lambda}{\pi r_0^{2}}\right)^{2}} \qquad\text{formula (7)}$$

where m denotes the propagation distance of the beam, r denotes the spot radius, r_0 denotes the beam waist of the Gaussian beam, and λ denotes the wavelength.
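As a small illustrative helper (assuming the standard Gaussian-beam divergence formula reconstructed above), the spot radius at the obstacle can be computed as follows; the example values are arbitrary.

```python
import math

def spot_radius(distance_m, beam_waist_m, wavelength_m):
    """Spot radius of a Gaussian beam after propagating `distance_m` (formula (7))."""
    return beam_waist_m * math.sqrt(1.0 + (distance_m * wavelength_m /
                                           (math.pi * beam_waist_m ** 2)) ** 2)

# Example: a 905 nm beam with a 5 mm waist after 100 m of propagation.
r = spot_radius(100.0, 5e-3, 905e-9)
```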
And step Sc2, determining a laser radar ray set which takes any ray direction vector as a center and takes the spot radius as a radius.
For the embodiment of the present application, the light spot is a light spot with a radius r and a direction vector of the laser radar ray as a center, and a distance from the laser radar ray to the obstacle is used as a beam propagation distance m, which is specifically shown in fig. 5.
According to the size of the spot radius, a laser radar ray set Ω centered on the laser radar ray direction vector and with the spot radius as the radius can be obtained.
And step Sc3, simulating a divergence phenomenon of the rays in the propagation process based on the laser radar ray set.
Specifically, the laser radar ray set Ω obtained by the above embodiment serves as the simulated divergence of the laser radar ray during propagation. In addition, the laser radar ray set Ω can be used in the echo mode to return multiple echo measurements; the way the set is used in the echo mode is detailed in the following embodiments and is not described here.
Further, in order to improve the simulation speed and reduce the computation cost, the incidence angle set Ω within the light spot range is sampled. In the embodiment of the application, the sampling range refers to the light spot: to simulate the echo mode, one laser ray may generate a plurality of sampled values, so that the simulated ray produces a light spot when it strikes an obstacle. A range of incidence angles (i.e., the incidence angles after ray distortion processing that satisfy the first preset condition) is generated within the light spot range; in order to obtain a plurality of sampled values, sampling is performed within the light spot range, the incidence angles of several rays are selected, and the intersections of these rays with the obstacle are then simulated, so that a plurality of values can be obtained. Specifically, determining the incidence angles after ray distortion processing that satisfy the first preset condition in step Sb26 may further include: determining the number of sampling points; dividing, based on the number of sampling points, the sampling range centered on the direction vector of the ray and with the spot radius as the radius; and sampling the laser radar ray set within each divided range to obtain the sampled laser radar ray set. That is, in the embodiment of the present application, the determined number of sampling points n may be set to twice the maximum echo number in the laser radar parameters; then, within the light spot, the sampling range is divided equally according to the spot center and the spot radius, the number of divided ranges being matched to the number of sampling points so that the same number of points can be collected in each range, as shown in FIG. 6; finally, random sampling is performed within each divided range, and the set Ω_s of incidence angles of all sampling points is output, as sketched in the example following this paragraph. The number of incidence angles in the sampled set Ω_s is not greater than that of the laser radar ray set Ω obtained in the above embodiment.
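The following Python sketch illustrates one way such a sampled set Ω_s could be drawn; treating the spot radius as an angular radius seen from the laser radar and the ring-based partition are assumptions made for illustration.

```python
import math
import random

def sample_spot_angles(center_theta, center_phi, spot_angular_radius, n_samples, n_rings=4):
    """Randomly sample incidence angles inside the spot around (center_theta, center_phi).

    The spot is split into `n_rings` equal-width annular ranges and the same number of
    points is drawn in each, so the samples cover the whole spot."""
    per_ring = max(1, n_samples // n_rings)
    samples = []
    for ring in range(n_rings):
        r_lo = spot_angular_radius * ring / n_rings
        r_hi = spot_angular_radius * (ring + 1) / n_rings
        for _ in range(per_ring):
            rho = random.uniform(r_lo, r_hi)            # radial offset within this ring
            alpha = random.uniform(0.0, 2.0 * math.pi)  # direction of the offset
            samples.append((center_theta + rho * math.cos(alpha),
                            center_phi + rho * math.sin(alpha)))
    return samples
```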
In step Sb27, determining coordinates of each ray satisfying the first preset condition in the laser radar spherical coordinate system based on the angle of incidence after the distortion processing satisfying the first preset condition, the pose of the laser radar in the simulation scene, and the three-dimensional sphere simulation sub-scene may specifically include: and determining the coordinate of each ray meeting the first preset condition in a laser radar spherical coordinate system based on the incidence angle of the sampled laser radar ray set after the distortion processing of each ray, the pose of the laser radar in the simulation scene and the spherical three-dimensional simulation sub-scene. In the embodiment of the present application, a manner of determining the coordinate of each ray satisfying the first preset condition in the laser radar spherical coordinate system is described in detail in the above embodiment, and details are not described herein again.
In another possible implementation manner of the embodiment of the present application, the laser radar simulation result includes: a distance value of each ray to the obstacle; the method further comprises the following steps: step Sl (not shown), step Sm (not shown), step Sn (not shown), and step So (not shown), wherein step Sl-step So may be performed after obtaining the distance value of each ray to the obstacle, for example, step Sl-step So may be performed after step Sb, wherein,
and step Sl, combining the noise model, processing the distance value from each ray to the obstacle, and obtaining the distance value corresponding to each ray after the noise is added.
In order to simulate the measurement noise of the laser radar in the measurement process, noise is added to the distance value from each ray to the obstacle obtained in the above embodiment, and the noise generally includes gaussian noise or shot noise, etc. It is worth noting that since the distance values are used in the laser intensity value calculation, there is no need to add an additional noise model at the intensity values.
Processing the simulation result of the laser radar by combining with the noise model, wherein the processing method may further include: a variance of the measured distances is obtained, and an expectation and variance of the noise model are determined based on the variance of the measured distances. In the embodiment of the present application, the variance of the measured distance belongs to the parameters read in the above embodiment for the lidar simulation, and then the expected μ, variance σ of the noise model is calculated from the parameters.
After obtaining the expectation μ and the variance σ of the noise model, processing the laser radar simulation result in combination with the noise model may specifically include: adding noise to the distance value from each ray to the obstacle based on the expectation and the variance of the noise model to obtain the distance value after noise is added. In the embodiment of the present application, the distance value after adding noise is determined by formula (8):

$$\hat{z} = z + n,\qquad n\sim\mathcal{N}(\mu,\sigma^{2}) \qquad\text{formula (8)}$$

where $\hat{z}$ characterizes the distance value after noise is added, z characterizes the distance value from each ray to the obstacle, and n characterizes the noise.
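A minimal sketch of this noise injection, assuming Gaussian noise with the expectation and variance of the noise model, could look as follows.

```python
import numpy as np

def add_range_noise(distances, mu, sigma, rng=None):
    """Add Gaussian measurement noise n ~ N(mu, sigma^2) to the ray distances (formula (8))."""
    rng = np.random.default_rng() if rng is None else rng
    distances = np.asarray(distances, dtype=float)
    return distances + rng.normal(mu, sigma, size=distances.shape)
```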
Further, the performance of the laser radar is also related to the weather: in fog, snow, rain and other weather, the measurement range of the laser radar is affected; in addition, the weather affects the attenuation rate of light in the air, which greatly affects the calculation of the laser intensity. Therefore, the influence of different weather types on the working performance of the laser radar is simulated in combination with a weather model, which is described in detail in the following embodiments. And step Sm, in combination with the weather model, processing the noise-added distance value corresponding to each ray to obtain the laser intensity corresponding to each ray under the target weather type.
Specifically, the weather model stores the corresponding relationship among the weather type, the measurement range and the attenuation degree; in the embodiment of the present application, the measurement ranges and attenuation degrees corresponding to four weather types, namely rain, snow, fog, and sunny, are given, and are specifically shown in table one.
Table 1

Weather | Measurement range (m) | Attenuation rate α (1/m)
Rain    | 110 | 0.032
Snow    | 80  | 0.0412
Fog     | 50  | 0.005
Sunny   | 120 | 0.0001
Specifically, after obtaining the measurement range and the attenuation degree respectively corresponding to each weather type, processing the distance value corresponding to each ray to which the noise is added in combination with the weather model, to obtain the laser intensity corresponding to each ray in the target weather type, which may specifically include: determining whether the distance value corresponding to each ray added with the noise belongs to the measurement range corresponding to the target weather type; if the ray belonging to the measurement range corresponding to the target weather type exists, acquiring the attenuation rate corresponding to the target weather type; and determining the laser intensity corresponding to the ray meeting the measurement range under the target weather type based on the distance value corresponding to the ray meeting the measurement range and the attenuation rate corresponding to the target weather type.
Specifically, based on the distance value corresponding to the ray satisfying the measurement range and the attenuation degree corresponding to the target weather type, the laser intensity corresponding to the ray satisfying the measurement range under the target weather type is determined by the following formula (9).
$$\text{Intensity} = E_0\,e^{-\alpha\hat{z}} \qquad\text{formula (9)}$$

where E_0 is the pulse energy emitted by the laser, α is the attenuation rate of the laser during propagation, and $\hat{z}$ is the distance value after noise is added. The pulse energy E_0 can be obtained from the laser radar working parameters read above, or roughly calculated as E = hc/λ, where h is the Planck constant, c is the speed of light in vacuum, and λ is the laser wavelength (typically obtained from the laser radar data sheet).
In this embodiment of the present application, the target weather type may be set by a user, or may be randomly selected by the simulation platform, for example, the target weather type may be rain, which is not limited in this embodiment of the present application.
For example, when the target weather type is rain, the measurement range and the attenuation rate are 110 m and 0.032 1/m, respectively. Suppose further that the noise-added distance value is $\hat{z}$ = 100 m and E_0 = 1. Since 100 m is not greater than 110 m, the attenuation rate for rain, i.e., 0.032, is used, and based on the noise-added distance value $\hat{z}$ and the attenuation rate 0.032, the laser intensity corresponding to the ray in a rainy environment is determined as: Intensity = e^{−3.2}.
Further, if the distance value after adding the noise obtained in the above embodiment is larger than the measurement range corresponding to the target type, for example, larger than the measurement range corresponding to the weather type being rain, the distance value is ignored.
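The weather handling described above can be summarized in a short sketch; the dictionary below simply mirrors Table 1 and the dictionary and function names are assumptions.

```python
import math

# Assumed weather model mirroring Table 1: weather type -> (measurement range in m, attenuation rate in 1/m).
WEATHER_MODEL = {
    "rain":  (110.0, 0.032),
    "snow":  (80.0,  0.0412),
    "fog":   (50.0,  0.005),
    "sunny": (120.0, 0.0001),
}

def weather_intensity(distance_m, weather, pulse_energy=1.0):
    """Laser intensity after atmospheric attenuation (formula (9)); None if out of range."""
    max_range, alpha = WEATHER_MODEL[weather]
    if distance_m > max_range:
        return None  # the measurement is ignored beyond the weather-limited range
    return pulse_energy * math.exp(-alpha * distance_m)

# Example from the text: 100 m in rain with E0 = 1 gives exp(-3.2).
assert abs(weather_intensity(100.0, "rain") - math.exp(-3.2)) < 1e-12
```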
It should be noted that, when determining the laser intensity corresponding to the target weather type, each ray may be simulated in the above manner in combination with the weather type, which is not limited in the embodiment of the present application.
Further, after the laser intensity of each ray under the target weather type has been simulated, echo mode and Dropoff mechanism simulation may also be performed based on the laser intensity corresponding to each ray under the target weather type, which is described in detail in the following embodiments. And step Sn, performing simulation through the echo mode and the Dropoff mechanism based on the laser intensity corresponding to each ray under the target weather type.
The echo modes may include the Strongest Return, the Last Return, the single echo, and the double echo (Dual Return) modes. When set to the double echo mode, the detail of the target object increases and the data volume is twice that of the single echo; two echoes are produced only when the distance between two separate objects is more than 1 meter. Due to beam divergence, any one emitted laser pulse may produce multiple returns: the light spot gradually becomes larger after the laser pulse is emitted, and if the spot is large enough to hit multiple targets, multiple reflections are produced. Generally, the farther away the target, the weaker the energy at the receiver, although the opposite may hold for a bright or highly reflective surface; the echo mode therefore needs to be simulated.
Further, the Dropoff mechanism is used to simulate the possibility that, when the laser intensity is too low, the return is mistaken for measurement noise and filtered out. When the laser intensity is below a certain threshold, it may indeed be measurement noise, or the intensity may simply be small because the propagation distance is long, so the return may either be filtered out or returned normally. This random possibility is therefore realized through a Dropoff mechanism so as to reduce the misjudgment of laser radar noise.
Based on the above echo mode and the function of Dropoff mechanism, the following embodiment describes a specific implementation manner of laser radar simulation through echo mode and Dropoff mechanism.
Specifically, the simulating based on the laser intensity corresponding to any ray in the target weather type and through the echo mode may specifically include: determining an echo mode of any ray; if the echo mode of any ray is a single echo mode, returning the first laser intensity and the corresponding first distance value; and if the echo mode of any ray is a multi-echo mode, returning the first laser intensity, the first distance value, the laser intensity meeting the second preset condition and the distance value corresponding to the laser intensity meeting the second preset condition.
Specifically, the parameters read in the above embodiment for lidar simulation may include an echo mode corresponding to each laser ray. The first laser intensity is the laser intensity corresponding to any ray under the target weather type, and the first distance value is the distance value from the ray corresponding to the first laser intensity to the obstacle. The laser intensity meeting the second preset condition is the first N laser intensities selected from the laser intensities corresponding to the rays meeting the first preset condition from big to small; the distance values corresponding to the laser intensity meeting the second preset condition are the first N distance values selected from the distance values corresponding to the rays meeting the first preset condition from large to small.
In general, the echo mode makes its selection according to the intensity values: if the mode is the single echo mode, the intensity value and distance value of the ray are returned directly; if the mode is a multi-echo mode, the measurement of the ray itself is returned, and, in addition, a corresponding number of measurements with larger intensity values are selected as the other echoes from the laser radar ray set Ω centered on the laser radar ray direction vector and with the spot radius as the radius. For example, in the dual echo mode one ray needs to return two measurements: besides the measurement of the ray itself, the measurement with the highest intensity must also be returned. According to the beam model, a ray hitting an obstacle produces a certain light spot; a ray set can then be obtained with the spot as the radius, a smaller ray set can be further obtained by sampling, collision detection is performed on all rays in the set (i.e., the incidence angle set, each incidence angle being the direction vector of one ray, and these rays do not intersect one another), the intensity values are calculated, and the measurement with the largest intensity value in the ray set is selected as the second measurement. A triple echo mode would return three echoes, but a typical laser radar only has the single echo and double echo modes.
It should be noted that, if the echo mode simulation is performed on a plurality of laser radar rays, any one of the laser radar rays may perform the echo mode simulation in the above manner, so as to obtain an echo mode simulation result corresponding to each ray.
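A hedged sketch of this echo-mode selection is shown below; the data layout (distance, intensity) tuples and the handling of modes other than single/dual are assumptions for illustration.

```python
def simulate_echoes(primary, spot_measurements, mode="dual"):
    """Select the returned echoes for one ray.

    `primary` is the (distance, intensity) measurement of the ray itself and
    `spot_measurements` is the list of (distance, intensity) results obtained from the
    sampled rays inside the spot, as produced by the collision-detection and intensity
    steps described above."""
    if mode == "single":
        return [primary]
    # Multi-echo mode: return the ray's own measurement plus the strongest of the others.
    extras = sorted(spot_measurements, key=lambda m: m[1], reverse=True)
    n_extra = {"dual": 1}.get(mode, 1)  # dual echo adds one extra return
    return [primary] + extras[:n_extra]
```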
Further, based on the laser intensity corresponding to any ray in the target weather type, and performing simulation through a Dropoff mechanism, the method specifically may include: determining whether the second laser intensity is greater than a first intensity threshold; if the laser intensity is greater than the first intensity threshold, outputting a second laser intensity and a corresponding second distance value; if the intensity is not larger than the first intensity threshold value, calculating an attenuation value based on the second laser intensity; and if the attenuation value is larger than the second intensity threshold value, outputting the second laser intensity and the first distance value.
The second laser intensity is the laser intensity corresponding to any ray under the target weather type, and the first intensity threshold is the laser intensity threshold for generating a Dropoff phenomenon; the second intensity threshold is a random value. In the embodiment of the present application, the first intensity threshold may be characterized by D1, that is, after obtaining the laser intensity of a certain ray in the target weather type through the above-mentioned embodiment, determining whether the laser intensity is greater than D1, if so, considering that the ray intensity value and the distance value are not noise and cannot be ignored, outputting the intensity value, if the ray intensity value is not greater than D1, calculating Dropoff based on the laser intensity value of the ray in the target weather type, and calculating through formula (10), and if the Dropoff value is greater than a Random value (that is, the above-mentioned second intensity threshold), considering that the ray intensity value and the distance value are not noise and cannot be ignored, and outputting the intensity value and the distance value. Otherwise, the ray measurement is ignored.
$$\text{Dropoff} = D_o + \left(1 - D_o\right)\cdot\frac{\text{Intensity}}{D_l} \qquad\text{formula (10)}$$
where D_o is the probability that a return value can be obtained when the Intensity is 0; for example, if D_o is 70%, there is a 70% probability of obtaining the measured distance when the intensity value is 0. This value is generally empirical; if it is set below 50%, the measured data with an intensity value of 0 may be filtered out, because a typical laser radar has an internal filtering function. D_l is the intensity threshold below which the Dropoff phenomenon occurs, and Intensity is the intensity value of the laser ray under the target weather type. The value of D_l may be chosen empirically by the user; for example, D_l may be taken as the resolution of the intensity values, so if the resolution of the intensity values is 0.1, D_l may be 0.1.
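The sketch below illustrates the Dropoff decision. It assumes the linear interpolation form reconstructed above for formula (10) and illustrative default values for D_o and D_l; none of this is taken verbatim from the patent.

```python
import random

def keep_measurement(intensity, d_zero=0.7, d_limit=0.1, rng=random):
    """Dropoff mechanism: decide whether a low-intensity return is kept.

    Assumes the keep probability rises linearly from `d_zero` at zero intensity
    to 1 at the threshold `d_limit` (reconstructed formula (10))."""
    if intensity > d_limit:
        return True  # strong enough: never treated as noise
    dropoff = d_zero + (1.0 - d_zero) * intensity / d_limit
    return dropoff > rng.random()  # keep the measurement with probability `dropoff`
```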
The distance measured by the laser radar needs to be converted into the coordinates of a point in the laser radar coordinate system, and, because different types of laser radar have different measurement ranges, the ranges of the intensity values in the simulation model may differ. Therefore, in order to convert distances into coordinates and obtain uniform intensity values, three-dimensional point calculation processing and normalization processing are performed, as specifically described in step So and the corresponding implementation.
And So, determining three-dimensional point cloud data of each laser radar ray and an intensity value after normalization processing through three-dimensional point calculation processing and normalization processing based on the incidence angle and the simulation result after distortion processing of each laser radar ray.
Specifically, the simulation result includes a third laser intensity and a third distance value, where the third laser intensity is the laser intensity corresponding to any laser radar ray and the third distance value is the distance value of the ray corresponding to the third laser intensity.
Specifically, determining the three-dimensional point cloud data of any laser radar ray and the normalized intensity value through three-dimensional point calculation processing and normalization processing, based on the distorted incidence angle, the third laser intensity and the third distance value of the ray, may specifically include: calculating the three-dimensional point coordinates measured by the ray in the laser radar Cartesian coordinate system based on its distorted incidence angle, the third laser intensity and the third distance value; and determining the normalized intensity value based on the third laser intensity, the maximum laser intensity and the minimum laser intensity. In the embodiment of the present application, the maximum laser intensity and the minimum laser intensity are the maximum and minimum of the laser intensity values corresponding to the respective rays.
It is worth mentioning that: based on the angle of incidence, the third laser intensity, and the third distance value after any lidar ray distortion processing, the step of calculating the three-dimensional point coordinate measured by any lidar ray in the lidar cartesian coordinate system may be performed before the step of determining the normalized intensity value based on the third laser intensity, the maximum laser intensity, and the minimum laser intensity, or may be performed after the step of determining the normalized intensity value based on the third laser intensity, the maximum laser intensity, and the minimum laser intensity, or may be performed simultaneously with the step of determining the normalized intensity value based on the third laser intensity, the maximum laser intensity, and the minimum laser intensity, which is not limited in the embodiment of the present application.
Specifically, calculating the three-dimensional point coordinates measured by any laser radar ray in the laser radar Cartesian coordinate system based on the distorted incidence angle, the third laser intensity and the third distance value of the ray may specifically include: calculating, according to the incidence angle corresponding to the laser radar ray and the measured distance value, the three-dimensional point (x_{l,i}, y_{l,i}, z_{l,i}) measured by the laser radar ray in the laser radar Cartesian coordinate system through formula (11):

$$x_{l,i}=k_i\sin\theta'_i\cos\phi'_i,\qquad y_{l,i}=k_i\sin\theta'_i\sin\phi'_i,\qquad z_{l,i}=k_i\cos\theta'_i \qquad\text{formula (11)}$$

where (x_{l,i}, y_{l,i}, z_{l,i}) is the three-dimensional point measured by the laser radar ray in the laser radar Cartesian coordinate system, (θ'_i, φ'_i) is the motion-distorted incidence angle of the laser radar ray (i.e., the incidence angle after distortion processing), and k_i is the range measurement of the laser radar ray obtained from the above embodiment.
Specifically, determining the normalized intensity value based on the third laser intensity, the maximum laser intensity and the minimum laser intensity may specifically include: calculating the intensity values of all rays and normalizing them based on formula (12), the general range being [0, 255]:

$$I_g = 255\cdot\frac{I - I_{\min}}{I_{\max} - I_{\min}} \qquad\text{formula (12)}$$

where I_max is the maximum of all intensities, I_min is the minimum of all intensities, I_g is the normalized intensity value, and I is the intensity value from the above embodiment.
Further, the three-dimensional point cloud corresponding to the ray and the corresponding intensity value can be obtained through the above embodiment.
Further, for the ray corresponding to each incident angle in the three-dimensional incident angle table established in the above embodiment, the three-dimensional point coordinates measured in the laser radar Cartesian coordinate system may be calculated based on the distorted incidence angle, the third laser intensity and the third distance value of the ray, the normalized intensity value may be determined based on the third laser intensity, the maximum laser intensity and the minimum laser intensity, and the corresponding three-dimensional point cloud and intensity value {x_{l,i}, y_{l,i}, z_{l,i}, I_i} may be returned, where i characterizes the ray corresponding to each incident angle in the three-dimensional incident angle table.
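As a final illustrative sketch (function and argument names are assumptions), formulas (11) and (12) can be applied over all rays at once to produce the ordered point cloud with normalized intensities.

```python
import numpy as np

def to_point_cloud(angles, ranges, intensities):
    """Convert distorted incidence angles, range measurements and intensities into an
    ordered point cloud {x, y, z, I} (formulas (11) and (12)).

    `angles` is an (N, 2) array of (theta, phi); `ranges` and `intensities` are length-N arrays."""
    theta, phi = np.asarray(angles, dtype=float).T
    k = np.asarray(ranges, dtype=float)
    # Formula (11): spherical measurement -> Cartesian point in the lidar frame.
    x = k * np.sin(theta) * np.cos(phi)
    y = k * np.sin(theta) * np.sin(phi)
    z = k * np.cos(theta)
    # Formula (12): normalize intensities to [0, 255].
    I = np.asarray(intensities, dtype=float)
    span = I.max() - I.min()
    I_g = 255.0 * (I - I.min()) / span if span > 0 else np.zeros_like(I)
    return np.stack([x, y, z, I_g], axis=1)
```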
The above embodiments describe a method for laser radar simulation from the perspective of a method flow, and the following embodiments describe a device for laser radar simulation from the perspective of a module, which are described in detail in the following embodiments.
The embodiment of the present application provides a lidar simulation apparatus, as shown in fig. 7, the lidar simulation apparatus 70 may specifically include: a build module 71, a read module 72, a motion distortion processing module 73, and a lidar simulation module 74, wherein,
a construction module 71, configured to construct a three-dimensional simulation scene;
a reading module 72, configured to read parameters for lidar simulation, where the parameters for lidar simulation include: the angle of incidence of each lidar ray;
the motion distortion processing module 73 is configured to perform motion distortion processing on the incident angle of each laser radar ray to obtain a distorted incident angle;
and the laser radar simulation module 74 is used for carrying out laser radar simulation based on the three-dimensional simulation scene, the incidence angle after distortion processing and other parameters except the incidence angle of laser radar rays.
In a possible implementation manner of the embodiment of the present application, when constructing the three-dimensional simulation scene, the construction module 71 is specifically configured to any one of the following:
acquiring sensor data, and constructing a three-dimensional simulation scene based on the sensor data;
constructing a three-dimensional simulation scene through a physical engine;
wherein the three-dimensional simulation scene is described by a triangular mesh or a square mesh.
In another possible implementation manner of the embodiment of the present application, the parameters for laser radar simulation include: field angle and angular resolution; wherein the device 70 further comprises: a module is established in which, among other things,
and the establishing module is used for establishing a three-dimensional incident angle table based on the angle of view and the angular resolution.
In another possible implementation manner of the embodiment of the present application, the apparatus 70 further includes: a first obtaining module, a noise point removing module, a calculating module and a first determining module, wherein,
the first acquisition module is used for acquiring multi-frame measurement results;
the noise point removing module is used for removing noise points from the multi-frame measurement results;
the calculation module is used for calculating an incidence angle for an effective point corresponding to each ray in each frame;
and the first determining module is used for determining the incidence angle corresponding to the ith ray based on the incidence angle corresponding to the ith ray in each frame, wherein i belongs to [1, n ], and n is the number of rays contained in each frame.
In another possible implementation manner of the embodiment of the present application, when performing motion distortion processing on the incident angle of each lidar ray, the motion distortion processing module 73 is specifically configured to:
acquiring current pose information;
determining a relative transformation matrix generated by motion distortion based on the current pose information;
and performing motion distortion processing on the incidence angle of each laser radar ray based on the relative transformation matrix.
In another possible implementation manner of the embodiment of the present application, the motion distortion processing module 73 is specifically configured to, when performing motion distortion processing on the incident angle of each lidar ray based on the relative transformation matrix to obtain the incident angle after the distortion processing:
determining a unit direction vector of an incidence angle of each laser radar ray in a specific coordinate system to obtain a unit direction vector corresponding to each ray;
determining a unit direction vector after each ray is distorted based on the unit direction vector corresponding to each ray and a relative conversion matrix;
and determining the distorted incidence angle of each ray based on the unit direction vector of each ray after motion distortion.
In another possible implementation manner of the embodiment of the present application, the lidar simulation module 74 is specifically configured to, when performing lidar simulation based on a three-dimensional simulation scene, an incident angle after any distortion processing, and other parameters except the incident angle of the lidar ray:
constructing a sphere three-dimensional simulation sub-scene based on the three-dimensional simulation scene;
performing collision detection based on the sphere three-dimensional simulation sub-scene, the incidence angle after any distortion treatment and other parameters except the incidence angle of the laser radar ray;
and carrying out laser radar simulation based on the collision detection result and the beam model.
In another possible implementation manner of the embodiment of the present application, the other parameters except the incident angle of the laser radar ray include: the effective detection range, the vertical angle resolution, the horizontal angle resolution and the diameter resolution of the laser radar;
when performing collision detection based on the sphere three-dimensional simulation sub-scene, the incident angle after any distortion processing, and other parameters except the incident angle of the laser radar ray, the laser radar simulation module 74 is specifically configured to:
performing spherical-element rasterization on the sphere sub-scene according to the vertical angular resolution, the horizontal angular resolution and the diameter resolution to obtain a spherical-element-rasterized three-dimensional simulation sub-scene;
and performing collision detection based on the three-dimensional simulation sub-scene rasterized by the spherical elements and the incidence angle processed by any distortion.
In another possible implementation manner of the embodiment of the application, the three-dimensional simulation sub-scene with the rasterized spherical elements includes a plurality of element grid voxels corresponding to each ray; the apparatus 70 further comprises: a second determination module and a creation module, wherein,
a second determining module, configured to determine, based on the angle of incidence after distortion processing of each ray, the propagation distance of each ray, an effective detection range of a laser radar, a vertical angular resolution, and a horizontal angular resolution, index values corresponding to each of a plurality of element grid voxels corresponding to each ray;
and the creating module is used for creating a three-dimensional data group based on the index values corresponding to the plurality of voxel grid voxels corresponding to each ray.
In another possible implementation manner of the embodiment of the application, the three-dimensional simulation sub-scene with the rasterized spherical elements comprises a plurality of spherical element raster voxels;
when performing collision detection based on the three-dimensional simulated sub-scene rasterized by the voxel and any distorted incident angle, the lidar simulation module 74 is specifically configured to:
acquiring semantic information in a three-dimensional simulation sub-scene with a rasterized spherical element;
carrying out bounding box division on the barrier in each voxel grid voxel based on semantic information in the three-dimensional simulation sub-scene with the voxel rasterized;
determining the coordinate of any ray under a laser radar spherical coordinate system based on the incident angle after any distortion processing, the pose of the laser radar in the simulation scene and the spherical three-dimensional simulation sub-scene;
determining the distance from any ray to the surface of the obstacle based on the coordinate of any ray in a laser radar spherical coordinate system and a plurality of spherical element grid voxels;
determining the size of the light spot produced on the surface of the obstacle by any ray based on the distance from the ray to the surface of the obstacle and the light beam model;
determining the distortion-processed incidence angles of the rays meeting a first preset condition, wherein these are the distortion-processed incidence angles corresponding to the rays falling within the circle that takes the direction vector of the ray as the center and the light spot size as the radius;
determining the coordinate of each ray meeting the first preset condition in a laser radar spherical coordinate system based on the incidence angle after distortion processing meeting the first preset condition, the pose of the laser radar in the simulation scene and the spherical three-dimensional simulation sub-scene;
and determining the distance from each ray meeting the first preset condition to the surface of the obstacle based on the coordinates of each ray meeting the first preset condition in a laser radar spherical coordinate system and a plurality of spherical element grid voxels.
In another possible implementation manner of the embodiment of the present application, the apparatus 70 further includes: a third determination module, a partitioning module, and a sampling module, wherein,
the third determining module is used for determining the number of sampling points;
the dividing module is used for dividing, based on the number of sampling points, the sampling range formed by taking the direction vector of any ray as the center and the spot radius as the radius;
the sampling module is used for sampling the laser radar ray set based on each division range to obtain a sampled laser radar ray set;
the lidar simulation module 74 is specifically configured to, when determining, based on each of the incident angle after the distortion processing satisfying the first preset condition, the pose of the lidar in the simulation scene, and the three-dimensional simulation sub-scene of the sphere, a coordinate of each ray satisfying the first preset condition in the spherical coordinate system of the lidar:
and determining the coordinate of each ray meeting the first preset condition in a laser radar spherical coordinate system based on the incidence angle of the sampled laser radar ray set after the distortion processing of each ray, the pose of the laser radar in the simulation scene and the spherical three-dimensional simulation sub-scene.
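A minimal sketch of such footprint sampling follows, assuming the sampling range is the disc of radius spot_radius perpendicular to the central ray at the hit distance; the evenly spread radii/angles and all names are illustrative choices, not taken from this application.

```python
import numpy as np

def sample_footprint_directions(center_dir, spot_radius, hit_dist, n_samples):
    """Generate n_samples unit direction vectors whose intersection points with the
    plane at hit_dist lie inside the beam footprint disc around the central ray."""
    d = np.asarray(center_dir, dtype=float)
    d /= np.linalg.norm(d)
    # Orthonormal basis (u, v) spanning the plane perpendicular to the central direction.
    helper = np.array([0.0, 0.0, 1.0]) if abs(d[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
    u = np.cross(d, helper)
    u /= np.linalg.norm(u)
    v = np.cross(d, u)
    dirs = []
    for i in range(n_samples):
        r = spot_radius * np.sqrt((i + 0.5) / n_samples)  # area-uniform radii
        theta = 2.0 * np.pi * 0.618 * i                   # golden-angle angular spread
        offset = r * (np.cos(theta) * u + np.sin(theta) * v)
        p = d * hit_dist + offset                         # point inside the footprint disc
        dirs.append(p / np.linalg.norm(p))
    return np.stack(dirs)

# Example: 16 sampled directions around one central ray.
samples = sample_footprint_directions([1.0, 0.2, 0.05], spot_radius=0.03,
                                      hit_dist=20.0, n_samples=16)
```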
In another possible implementation manner of the embodiment of the application, the coordinate of any ray in the laser radar spherical coordinate system is (m, n); the lidar simulation module 74 is specifically configured to, when determining a distance from any ray to the surface of the obstacle based on the coordinates of any ray in the lidar spherical coordinate system and the plurality of spherical element grid voxels:
traversing, from k = k_min to k = k_max, the spherical element grid voxels with index (m, n, k) in the three-dimensional data array, and carrying out bounding box detection on each traversed voxel;
if the intersection exists, stopping traversing, and calculating the distance from any ray to the surface of the obstacle based on ray detection.
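The range traversal with bounding box detection can be sketched as follows in Python; the voxel-to-box lookup, the slab intersection test and all names are assumptions used for illustration rather than the application's actual implementation.

```python
import numpy as np

def ray_aabb_intersect(origin, direction, box_min, box_max, eps=1e-9):
    """Slab test: smallest non-negative t at which the ray enters the box, else None."""
    o = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    inv = 1.0 / np.where(np.abs(d) < eps, eps, d)
    t1 = (np.asarray(box_min, dtype=float) - o) * inv
    t2 = (np.asarray(box_max, dtype=float) - o) * inv
    t_near = np.max(np.minimum(t1, t2))
    t_far = np.min(np.maximum(t1, t2))
    if t_near <= t_far and t_far >= 0.0:
        return max(t_near, 0.0)
    return None

def range_traverse(m, n, k_min, k_max, voxel_boxes, ray_origin, ray_dir):
    """Walk the (m, n, k) range bins of one ray from k_min to k_max; stop at the first
    voxel whose bounding box the ray hits and return the distance to the obstacle surface."""
    for k in range(k_min, k_max + 1):
        for box_min, box_max in voxel_boxes.get((m, n, k), []):
            t = ray_aabb_intersect(ray_origin, ray_dir, box_min, box_max)
            if t is not None:
                return t
    return None  # no return within the effective detection range

# Example: one obstacle box registered in range bin k = 36 of grid cell (m, n) = (9, 1113).
boxes = {(9, 1113, 36): [((17.8, 3.5, -0.4), (18.6, 4.4, 1.2))]}
dist = range_traverse(9, 1113, 0, 199, boxes, (0.0, 0.0, 0.0), (0.97, 0.22, 0.05))
```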
In another possible implementation manner of the embodiment of the present application, the collision detection result of any ray includes: distance of any ray to the surface of the obstacle; the beam model comprises a beam waist value of the beam and the wavelength of the beam; the lidar simulation module 74 is specifically configured to, when performing lidar simulation based on the collision detection result and the beam model:
determining the radius of the light spot based on the distance from any ray to the surface of the obstacle, the beam waist value of the light beam and the wavelength of the light beam;
determining a laser radar ray set which takes any ray direction vector as a center and takes the spot radius as a radius;
and simulating the divergence phenomenon of the rays in the propagation process based on the laser radar ray set.
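The application does not spell out the beam model here, but a common choice consistent with a "beam waist value and wavelength" is the Gaussian beam, whose spot radius grows with range as sketched below; treat the formula and the numeric example as assumptions.

```python
import math

def gaussian_spot_radius(distance, beam_waist, wavelength):
    """Spot radius w(z) of a Gaussian beam at range `distance` (all lengths in metres):
    w(z) = w0 * sqrt(1 + (z / z_R)^2) with Rayleigh range z_R = pi * w0^2 / wavelength."""
    rayleigh_range = math.pi * beam_waist ** 2 / wavelength
    return beam_waist * math.sqrt(1.0 + (distance / rayleigh_range) ** 2)

# Example: a 905 nm beam with a 5 mm waist at 30 m range -> roughly 5.3 mm spot radius.
w = gaussian_spot_radius(30.0, 5e-3, 905e-9)
```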
In another possible implementation manner of the embodiment of the present application, the laser radar simulation result includes: a distance value of each ray to the obstacle; the apparatus 70 further comprises: a first processing module, a second processing module, a simulation module, and a fourth determination module, wherein,
the first processing module is used for processing the distance value from each ray to the obstacle by combining a noise model to obtain a distance value corresponding to each ray added with noise;
the second processing module is used for processing the distance value corresponding to each ray added with the noise by combining the weather model to obtain the laser intensity corresponding to each ray under each weather type;
the simulation module is used for simulating the laser intensity corresponding to each ray in the target weather type through an echo mode and a Dropoff mechanism;
and the fourth determination module is used for determining, through three-dimensional point calculation processing and normalization processing, the three-dimensional point cloud data of each laser radar ray and the normalized intensity value based on the distortion-processed incidence angle of each laser radar ray and the simulation result.
In another possible implementation manner of the embodiment of the present application, the laser radar simulation result includes: the distance value of each ray to the obstacle; wherein the device 70 further comprises: a second obtaining module and a fifth determining module, wherein,
the second acquisition module is used for acquiring the variance of the measured distance;
a fifth determining module for determining an expectation and variance of the noise model based on the variance of the measured distances;
when the first processing module is combined with the noise model to process the laser radar simulation result, the first processing module is specifically configured to:
and adding noise to the distance value of each ray from the obstacle based on the expectation and the variance of the noise model to obtain the distance value after the noise is added.
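A minimal sketch of this noise step, assuming a zero-mean Gaussian noise model whose variance equals the measured-distance variance; the application may parameterize the expectation and variance differently.

```python
import numpy as np

def add_range_noise(distances, measured_variance, seed=None):
    """Add Gaussian noise (expectation 0, variance = measured_variance) to each
    ray's distance to the obstacle and return the noise-added distances."""
    rng = np.random.default_rng(seed)
    sigma = float(np.sqrt(measured_variance))
    distances = np.asarray(distances, dtype=float)
    return distances + rng.normal(0.0, sigma, size=distances.shape)

# Example: three rays with a 2 cm standard deviation (variance 4e-4 m^2).
noisy = add_range_noise([12.3, 48.7, 75.1], measured_variance=4e-4, seed=0)
```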
In another possible implementation manner of the embodiment of the application, the weather model includes a corresponding relationship among a weather type, a measurement range and an attenuation degree; the second processing module is used for processing the distance value corresponding to each ray after the noise is added in combination with the weather model, and specifically used for:
determining whether the distance value corresponding to each ray added with the noise belongs to the measurement range corresponding to the target weather type;
if the ray belonging to the measurement range corresponding to the target weather type exists, acquiring the attenuation rate corresponding to the target weather type;
and determining the laser intensity corresponding to the ray meeting the measurement range under the target weather type based on the distance value corresponding to the ray meeting the measurement range and the attenuation rate corresponding to the target weather type.
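The weather step could look like the sketch below: a lookup of (measurement range, attenuation rate) per weather type, with a two-way Beer-Lambert law standing in for the unspecified attenuation rule; the table values and the attenuation law are assumptions, not taken from this application.

```python
import math

# Assumed weather model: weather type -> (measurement range [m], attenuation rate [1/m]).
WEATHER_MODEL = {
    "clear": (200.0, 0.004),
    "rain":  (120.0, 0.02),
    "fog":   (60.0,  0.08),
}

def weather_intensity(noisy_dist, weather_type, i0=1.0):
    """Return the laser intensity of one ray under `weather_type`, or None if the
    noise-added distance falls outside that weather's measurement range."""
    max_range, alpha = WEATHER_MODEL[weather_type]
    if noisy_dist > max_range:
        return None
    return i0 * math.exp(-2.0 * alpha * noisy_dist)  # two-way attenuation over the path

print(weather_intensity(45.0, "clear"))  # mildly attenuated
print(weather_intensity(45.0, "fog"))    # heavily attenuated
print(weather_intensity(80.0, "fog"))    # outside the fog measurement range -> None
```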
In another possible implementation manner of the embodiment of the present application, when the simulation module performs the simulation based on the laser intensity corresponding to any ray in the target weather type and in the echo mode, the simulation module is specifically configured to:
determining an echo mode of any ray;
if the echo mode of any ray is a single echo mode, returning a first laser intensity and a corresponding first distance value, wherein the first laser intensity is the laser intensity corresponding to any ray in the target weather type, and the first distance value is the distance value from the ray corresponding to the first laser intensity to the obstacle;
if the echo mode of any ray is a multi-echo mode, returning the first laser intensity, the first distance value, the laser intensity meeting a second preset condition and the distance value corresponding to the laser intensity meeting the second preset condition, wherein the laser intensity meeting the second preset condition is the first N laser intensities selected from the laser intensities corresponding to the rays meeting the first preset condition from large to small; the distance values corresponding to the laser intensity meeting the second preset condition are the first N distance values selected from the distance values corresponding to the rays meeting the first preset condition from large to small.
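The echo-mode branch can be sketched as below, where the central ray supplies the first intensity/distance and the footprint rays meeting the first preset condition supply the candidates for the extra returns; pairing each selected intensity with its own distance is one simple reading of the description and is an assumption.

```python
def simulate_echoes(central, footprint_returns, mode="single", n=2):
    """central: (intensity, distance) of the central ray under the target weather type.
    footprint_returns: list of (intensity, distance) pairs for the rays that met the
    first preset condition. Returns the list of echoes to report."""
    if mode == "single":
        return [central]
    # Multi-echo: also return the N strongest footprint returns, strongest first.
    strongest = sorted(footprint_returns, key=lambda p: p[0], reverse=True)[:n]
    return [central] + strongest

# Example: multi-echo mode keeping the two strongest additional returns.
echoes = simulate_echoes((0.82, 18.4),
                         [(0.61, 18.5), (0.35, 19.1), (0.74, 18.4)],
                         mode="multi", n=2)
# -> [(0.82, 18.4), (0.74, 18.4), (0.61, 18.5)]
```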
In another possible implementation manner of the embodiment of the present application, when the simulation module performs the simulation based on the laser intensity corresponding to any ray in the target weather type and through a Dropoff mechanism, the simulation module is specifically configured to:
determining whether the second laser intensity is greater than a first intensity threshold value, wherein the second laser intensity is the laser intensity corresponding to any ray under the target weather type, and the first intensity threshold value is the laser intensity threshold value for generating a Dropoff phenomenon;
if the laser intensity is greater than the first intensity threshold, outputting a second laser intensity and a corresponding second distance value;
if the intensity is not larger than the first intensity threshold value, calculating an attenuation value based on the second laser intensity;
and if the attenuation value is larger than a second intensity threshold value, outputting a second laser intensity and a first distance value, wherein the second intensity threshold value is a random value.
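A rough sketch of the Dropoff branch, assuming the attenuation value is the intensity scaled by the first threshold and the second (random) threshold is drawn uniformly from [0, 1); both choices are illustrative, since the application does not give the attenuation formula here.

```python
import random

def dropoff(intensity, distance, first_threshold=0.1, rng=None):
    """Return (intensity, distance) if the return survives the Dropoff mechanism,
    otherwise None (the return is dropped)."""
    rng = rng or random.Random(0)
    if intensity > first_threshold:
        return intensity, distance             # strong return: always output
    attenuation = intensity / first_threshold  # assumed attenuation value in [0, 1]
    second_threshold = rng.random()            # the second intensity threshold is random
    if attenuation > second_threshold:
        return intensity, distance
    return None

print(dropoff(0.40, 22.0))   # strong return, always kept
print(dropoff(0.02, 60.0))   # weak return, kept only with some probability
```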
In another possible implementation manner of the embodiment of the present application, the simulation result includes a third laser intensity and a third distance value, wherein the third laser intensity is the laser intensity corresponding to any laser radar ray, and the third distance value is the distance value of the ray corresponding to the third laser intensity;
the fourth determining module is specifically configured to, when determining the three-dimensional point cloud data of any laser radar ray and the normalized intensity value through three-dimensional point calculation processing and normalization processing based on the distortion-processed incidence angle, the third laser intensity and the third distance value of the laser radar ray:
calculating a three-dimensional point coordinate measured by any laser radar ray under a laser radar Cartesian coordinate system based on the incidence angle, the third laser intensity and the third distance value after any laser radar ray is subjected to distortion processing;
and determining the intensity value after normalization processing based on the third laser intensity, the maximum laser intensity and the minimum laser intensity, wherein the maximum laser intensity and the minimum laser intensity are the maximum value and the minimum value in the laser intensity values respectively corresponding to all the rays.
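The final three-dimensional point calculation and normalization step can be sketched as follows; the axis convention (x forward, y left, z up) is an assumption, and the min-max normalization uses the maximum and minimum intensities over all rays as described above.

```python
import math

def to_cartesian(elev_rad, azim_rad, dist):
    """Convert one ray's distortion-processed angles and distance into
    lidar-frame Cartesian coordinates (assumed convention: x forward, y left, z up)."""
    x = dist * math.cos(elev_rad) * math.cos(azim_rad)
    y = dist * math.cos(elev_rad) * math.sin(azim_rad)
    z = dist * math.sin(elev_rad)
    return x, y, z

def normalize_intensities(intensities):
    """Min-max normalise the intensity values of all rays into [0, 1]."""
    lo, hi = min(intensities), max(intensities)
    span = (hi - lo) or 1.0          # guard against all-equal intensities
    return [(v - lo) / span for v in intensities]

# Example: one three-dimensional point and the normalized intensities of four rays.
point = to_cartesian(math.radians(3.1), math.radians(42.7), 18.4)
normalized = normalize_intensities([0.82, 0.74, 0.61, 0.35])
```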
The embodiment of the application provides a laser radar simulation apparatus. In this embodiment, a three-dimensional simulation scene is constructed and the laser radar simulation parameters containing the incidence angle of each laser radar ray are read; motion distortion processing is performed on the incidence angle of each laser radar ray to obtain the distortion-processed incidence angles; laser radar simulation is then carried out according to the three-dimensional simulation scene, the distortion-processed incidence angles and the other parameters required for the simulation. In other words, when the laser radar simulation is performed, the incidence angle of every ray undergoes motion distortion processing so as to reproduce the influence of motion distortion on laser radar rays in a real scene, which makes the simulation closer to the real situation and enables more accurate simulation of the laser radar on the simulation platform.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the laser radar simulation apparatus described above may refer to the corresponding process in the foregoing method embodiment, and is not described herein again.
In an embodiment of the present application, an electronic device is provided, and as shown in fig. 8, an electronic device 800 shown in fig. 8 includes: a processor 801 and a memory 803. Wherein the processor 801 is coupled to a memory 803, such as via a bus 802. Optionally, the electronic device 800 may also include a transceiver 804. It should be noted that the transceiver 804 is not limited to one in practical applications, and the structure of the electronic device 800 is not limited to the embodiment of the present application.
The Processor 801 may be a CPU (Central Processing Unit), a general-purpose Processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other Programmable logic device, a transistor logic device, a hardware component, or any combination thereof. Which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. The processor 801 may also be a combination of computing functions, e.g., comprising one or more microprocessors, a combination of a DSP and a microprocessor, or the like.
Bus 802 may include a path that transfers information between the above components. The bus 802 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 802 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 8, but this is not intended to represent only one bus or type of bus.
The Memory 803 may be a ROM (Read Only Memory) or other type of static storage device that can store static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read Only Memory), a CD-ROM (Compact Disc Read Only Memory) or other optical Disc storage, optical Disc storage (including Compact Disc, laser Disc, optical Disc, digital versatile Disc, blu-ray Disc, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to these.
The memory 803 is used for storing application program codes for executing the present application scheme, and the execution is controlled by the processor 801. The processor 801 is configured to execute application program code stored in the memory 803 to implement the content shown in the foregoing method embodiments.
The electronic device includes, but is not limited to: mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players) and in-vehicle terminals (e.g., in-vehicle navigation terminals), and fixed terminals such as digital TVs and desktop computers. It may also be a server, for example a cloud server in the embodiment of the application. The electronic device shown in fig. 8 is only an example, and should not impose any limitation on the functions and the scope of use of the embodiments of the present application.
The present application provides a computer-readable storage medium, on which a computer program is stored, which, when running on a computer, enables the computer to execute the corresponding content in the foregoing method embodiments. In the embodiment of the application, a three-dimensional simulation scene is constructed and the laser radar simulation parameters including the incidence angle of each laser radar ray are read; motion distortion processing is performed on the incidence angle of each laser radar ray to obtain the distortion-processed incidence angles; laser radar simulation is then performed according to the three-dimensional simulation scene, the distortion-processed incidence angles and the other parameters required for the simulation. That is, when the laser radar simulation is performed, the incidence angle of each laser radar ray undergoes motion distortion processing so as to simulate the influence of motion distortion on laser radar rays in a real scene, which makes the laser radar simulation closer to the real situation and enables more accurate simulation of the laser radar on the simulation platform.
It will be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to perform all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
In the embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one type of logical functional division, and other divisions may be realized in practice, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: u disk, removable hard disk, read only memory, random access memory, magnetic or optical disk, etc. for storing program codes.
The above embodiments are only used to describe the technical solutions of the present application in detail, but the above embodiments are only used to help understanding the method and the core idea of the present application, and should not be construed as limiting the present application. Those skilled in the art should also appreciate that various modifications and substitutions can be made without departing from the scope of the present disclosure.

Claims (22)

1. A method of lidar simulation, comprising:
constructing a three-dimensional simulation scene;
reading parameters for lidar simulation, the parameters for lidar simulation including: the angle of incidence of each lidar ray;
carrying out motion distortion processing on the incident angle of each laser radar ray to obtain the distorted incident angle;
and carrying out laser radar simulation based on the three-dimensional simulation scene, the incidence angle after the distortion processing and other parameters except the incidence angle of the laser radar ray.
2. The method of claim 1, wherein the building a three-dimensional simulation scene comprises any one of:
acquiring sensor data, and constructing the three-dimensional simulation scene based on the sensor data;
constructing the three-dimensional simulation scene through a physical engine;
wherein the three-dimensional simulation scene is described by a triangular grid or a square grid.
3. The method of claim 1, wherein the parameters for lidar simulation comprise: field angle and angular resolution;
wherein, reading the parameters for the lidar simulation, and then further comprises:
and establishing a three-dimensional incident angle table based on the angle of view and the angular resolution.
4. The method of claim 1, wherein reading the angle of incidence of each lidar ray further comprises:
acquiring a multi-frame measurement result;
removing noise points from the multi-frame measurement results;
calculating an incidence angle for an effective point corresponding to each ray in each frame;
and determining the incidence angle corresponding to the ith ray based on the incidence angle corresponding to the ith ray in each frame, wherein i belongs to [1, n ], and n is the number of rays contained in each frame.
5. The method of claim 1, wherein the motion distorting the angle of incidence of each lidar ray comprises:
acquiring current pose information;
determining a relative transformation matrix generated by motion distortion based on the current pose information;
and performing motion distortion processing on the incidence angle of each laser radar ray based on the relative transformation matrix.
6. The method of claim 5, wherein the motion distorting the incident angle of each lidar ray based on the relative transformation matrix to obtain a distorted incident angle comprises:
determining a unit direction vector of the incidence angle of each laser radar ray in a specific coordinate system to obtain a unit direction vector corresponding to each ray;
determining a unit direction vector after each ray is distorted based on the unit direction vector corresponding to each ray and the relative transformation matrix;
and determining the distorted incidence angle of each ray based on the unit direction vector of each ray after motion distortion.
7. The method of claim 5 or 6, wherein performing lidar simulation based on the three-dimensional simulation scene, any distorted incident angle, and other parameters than the incident angle of the lidar ray comprises:
constructing a sphere three-dimensional simulation sub-scene based on the three-dimensional simulation scene;
performing collision detection based on the sphere three-dimensional simulation sub-scene, the incidence angle after any distortion processing and other parameters except the incidence angle of the laser radar ray;
and carrying out laser radar simulation based on the collision detection result and the beam model.
8. The method of claim 7, wherein the other parameters than the angle of incidence of the lidar rays include: an effective detection range of the lidar, a vertical angular resolution, a horizontal angular resolution and a diameter resolution;
wherein the performing collision detection based on the sphere three-dimensional simulation sub-scene, the incidence angle after any distortion processing, and other parameters except the incidence angle of the lidar ray comprises:
performing spherical element rasterization on the sphere three-dimensional simulation sub-scene according to the vertical angular resolution, the horizontal angular resolution and the diameter resolution to obtain a spherical element rasterized three-dimensional simulation sub-scene;
and performing collision detection based on the three-dimensional simulation sub-scene rasterized by the spherical elements and the incidence angle processed by any distortion.
9. The method of claim 8, wherein the spherical element rasterized three-dimensional simulation sub-scene includes a plurality of spherical element grid voxels corresponding to each ray; the method further comprises:
determining the index values corresponding to the plurality of spherical element grid voxels corresponding to each ray based on the distortion-processed incidence angle of each ray, the propagation distance of each ray, the effective detection range of the lidar, the vertical angular resolution and the horizontal angular resolution;
and creating a three-dimensional data array based on the index values corresponding to the plurality of spherical element grid voxels corresponding to each ray.
10. The method according to claim 8 or 9, wherein the spherical element rasterized three-dimensional simulation sub-scene comprises a plurality of spherical element grid voxels;
wherein performing collision detection based on the spherical element rasterized three-dimensional simulation sub-scene and any distortion-processed incidence angle comprises:
acquiring semantic information in the spherical element rasterized three-dimensional simulation sub-scene;
carrying out bounding box division on the obstacles in each spherical element grid voxel based on the semantic information in the spherical element rasterized three-dimensional simulation sub-scene;
determining the coordinate of any ray under a laser radar spherical coordinate system based on the incidence angle after any distortion processing, the pose of the laser radar in a simulation scene and the spherical three-dimensional simulation sub-scene;
determining the distance from any ray to the surface of an obstacle based on the coordinate of the any ray in a laser radar spherical coordinate system and the plurality of spherical element grid voxels;
determining the size of a light spot generated by any ray to the surface of the obstacle based on the distance from the any ray to the surface of the obstacle and the light beam model;
determining an incidence angle after ray distortion processing meeting a first preset condition, wherein the incidence angle after the ray distortion processing meeting the first preset condition is a distorted incidence angle corresponding to a ray with the light spot size as a radius and the direction vector of any ray as a center;
determining the coordinate of each ray meeting the first preset condition under a laser radar spherical coordinate system based on the incidence angle after distortion processing of each ray meeting the first preset condition, the pose of the laser radar in a simulation scene and the spherical three-dimensional simulation sub-scene;
and determining the distance from each ray meeting the first preset condition to the surface of the obstacle based on the coordinate of each ray meeting the first preset condition in the laser radar spherical coordinate system and the plurality of spherical element grid voxels.
11. The method of claim 10, wherein the determining the incidence angle after the ray distortion processing satisfying the first preset condition further comprises:
determining the number of sampling points;
dividing a sampling range which is formed by taking any ray direction vector as a center and taking the spot radius as a radius on the basis of the number of the sampling points;
sampling the laser radar ray set based on each division range to obtain a sampled laser radar ray set;
determining the coordinate of each ray meeting the first preset condition under a laser radar spherical coordinate system based on each incidence angle meeting the first preset condition after distortion processing, the pose of the laser radar in a simulation scene and the spherical three-dimensional simulation sub-scene, wherein the determining comprises the following steps:
and determining the coordinate of each ray meeting the first preset condition under the spherical coordinate system of the laser radar based on the incidence angle of the sampled laser radar ray set after the distortion processing, the pose of the laser radar in the simulation scene and the spherical three-dimensional simulation sub-scene.
12. The method of claim 10 or 11, wherein the coordinates of any ray in the lidar spherical coordinate system are (m, n);
the determining the distance from any ray to the surface of the obstacle based on the coordinate of any ray in the spherical laser radar coordinate system and the plurality of spherical element grid voxels comprises:
traversing, from k = k_min to k = k_max, the spherical element grid voxels with index (m, n, k) in the three-dimensional data array, and carrying out bounding box detection on each traversed voxel;
and if the intersection exists, stopping traversing, and calculating the distance from any ray to the surface of the obstacle based on ray detection.
13. The method of claim 7, wherein the collision detection result of any ray comprises: the distance of any ray to the surface of the obstacle; the beam model comprises a beam waist value of the light beam and the wavelength of the light beam;
the performing laser radar simulation based on the collision detection result and the beam model includes:
determining a spot radius based on a distance of the any ray to the surface of the obstacle, a beam waist value of the light beam and a wavelength of the light beam;
determining a laser radar ray set which takes any ray direction vector as a center and takes the spot radius as a radius;
and simulating the divergence phenomenon of the rays in the propagation process based on the laser radar ray set.
14. The method of claim 1, wherein the lidar simulation results comprise: a distance value of each ray to the obstacle; the method further comprises the following steps:
processing the distance value from each ray to the obstacle by combining a noise model to obtain a distance value corresponding to each ray after noise is added;
processing the distance value corresponding to each ray added with the noise by combining a weather model to obtain the laser intensity corresponding to each ray under the target weather type;
simulating by an echo mode and a Dropoff mechanism based on the corresponding laser intensity of each ray in the target weather type;
and determining three-dimensional point cloud data of each laser radar ray and an intensity value after normalization processing through three-dimensional point calculation processing and normalization processing based on the incidence angle and the simulation result after distortion processing of each laser radar ray.
15. The method of claim 14, wherein the lidar simulation result comprises: a distance value of each ray to the obstacle;
wherein, before processing the lidar simulation result in combination with the noise model, the method further comprises:
acquiring the variance of the measured distance, and determining the expectation and the variance of a noise model based on the variance of the measured distance;
wherein processing the lidar simulation result in combination with the noise model comprises:
and adding noise to the distance value of each ray from the obstacle based on the expectation and the variance of the noise model to obtain a distance value after the noise is added.
16. The method according to claim 15, wherein the weather model includes a corresponding relationship among a weather type, a measurement range, and an attenuation degree;
combining with a weather model, processing the distance value corresponding to each ray added with noise to obtain the laser intensity corresponding to each ray under the target weather type, including:
determining whether the distance value corresponding to each ray added with the noise belongs to the measurement range corresponding to the target weather type;
if the ray belonging to the measurement range corresponding to the target weather type exists, acquiring the attenuation rate corresponding to the target weather type;
and determining the laser intensity corresponding to the ray meeting the measurement range under the target weather type based on the distance value corresponding to the ray meeting the measurement range and the attenuation rate corresponding to the target weather type.
17. The method of claim 16, wherein simulating by echo mode based on the laser intensity corresponding to any ray in the target weather type comprises:
determining an echo pattern of the any ray;
if the echo mode of any ray is a single echo mode, returning a first laser intensity and a corresponding first distance value, wherein the first laser intensity is the laser intensity corresponding to any ray under the target weather type, and the first distance value is the distance value from the ray corresponding to the first laser intensity to the obstacle;
if the echo mode of any ray is a multi-echo mode, returning the first laser intensity, the first distance value, the laser intensity meeting a second preset condition and the distance value corresponding to the laser intensity meeting the second preset condition, wherein the laser intensity meeting the second preset condition is the first N laser intensities selected from the laser intensities corresponding to the rays meeting the first preset condition from large to small; the distance values corresponding to the laser intensity meeting the second preset condition are the first N distance values selected from the distance values corresponding to the rays meeting the first preset condition from large to small.
18. The method of claim 16 or 17, wherein the simulation is performed by a Dropoff mechanism based on the laser intensity corresponding to any ray in the target weather type, and comprises:
determining whether a second laser intensity is greater than a first intensity threshold, wherein the second laser intensity is the laser intensity corresponding to any ray in a target weather type, and the first intensity threshold is the laser intensity threshold generating a Dropoff phenomenon;
if the intensity is larger than the first intensity threshold value, outputting the second laser intensity and a corresponding second distance value;
if the intensity is not larger than the first intensity threshold value, calculating an attenuation value based on the second laser intensity;
and if the attenuation value is larger than a second intensity threshold value, outputting the second laser intensity and the first distance value, wherein the second intensity threshold value is a random value.
19. The method of claim 14, wherein the simulation results comprise: a third laser intensity and a third distance value, wherein the third laser intensity is the laser intensity corresponding to any laser radar ray, and the third distance value is the distance value of the ray corresponding to the third laser intensity;
wherein determining, through three-dimensional point calculation processing and normalization processing, the three-dimensional point cloud data of any laser radar ray and the normalized intensity value based on the distortion-processed incidence angle, the third laser intensity and the third distance value of the laser radar ray comprises:
calculating a three-dimensional point coordinate measured by any laser radar ray under a laser radar Cartesian coordinate system based on the incidence angle, the third laser intensity and the third distance value after any laser radar ray is subjected to distortion processing;
and determining the intensity value after normalization processing based on the third laser intensity, the maximum laser intensity and the minimum laser intensity, wherein the maximum laser intensity and the minimum laser intensity are the maximum value and the minimum value in the laser intensity values respectively corresponding to all the rays.
20. An apparatus for lidar simulation, comprising:
the building module is used for building a three-dimensional simulation scene;
a reading module, configured to read parameters for lidar simulation, where the parameters for lidar simulation include: the angle of incidence of each lidar ray;
the motion distortion processing module is used for carrying out motion distortion processing on the incidence angle of each laser radar ray to obtain the incidence angle after distortion processing;
and the laser radar simulation module is used for carrying out laser radar simulation based on the three-dimensional simulation scene, the incidence angle after the distortion processing and other parameters except the incidence angle of the laser radar ray.
21. An electronic device, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications being configured to perform the method of lidar simulation according to any one of claims 1 to 19.
22. A computer readable storage medium having stored thereon a computer program, characterized in that the program, when executed by a processor, implements a method for lidar simulation according to any of claims 1 to 19.
CN202210631656.8A 2022-06-06 2022-06-06 Laser radar simulation method and device, electronic equipment and storage medium Pending CN115081195A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210631656.8A CN115081195A (en) 2022-06-06 2022-06-06 Laser radar simulation method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210631656.8A CN115081195A (en) 2022-06-06 2022-06-06 Laser radar simulation method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115081195A true CN115081195A (en) 2022-09-20

Family

ID=83248310

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210631656.8A Pending CN115081195A (en) 2022-06-06 2022-06-06 Laser radar simulation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115081195A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115793715A (en) * 2023-01-05 2023-03-14 雄安雄创数字技术有限公司 Unmanned aerial vehicle auxiliary flight method, system, device and storage medium
CN115793715B (en) * 2023-01-05 2023-04-28 雄安雄创数字技术有限公司 Unmanned aerial vehicle auxiliary flight method, system, device and storage medium
CN116644616A (en) * 2023-07-25 2023-08-25 北京赛目科技股份有限公司 Point cloud distortion effect reduction method and device, electronic equipment and storage medium
CN116644616B (en) * 2023-07-25 2023-09-22 北京赛目科技股份有限公司 Point cloud distortion effect reduction method and device, electronic equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination