CN113433568B - Laser radar observation simulation method and device - Google Patents


Info

Publication number
CN113433568B
CN113433568B (granted publication of application CN202010206893.0A)
Authority
CN
China
Prior art keywords
point cloud
laser radar
observation
cloud map
map
Prior art date
Legal status
Active
Application number
CN202010206893.0A
Other languages
Chinese (zh)
Other versions
CN113433568A (en)
Inventor
孟超
吕吉鑫
孙杰
Current Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd
Priority claimed from application CN202010206893.0A
Publication of CN113433568A
Application granted
Publication of CN113433568B
Legal status: Active


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/4876: Extracting wanted echo signals, e.g. pulse detection, by removing unwanted signals
    • G01S17/08: Systems determining position data of a target, for measuring distance only
    • G01S17/89: Lidar systems specially adapted for mapping or imaging
    • G01S7/493: Extracting wanted echo signals (non-pulse systems)
    • Y02A90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The embodiments of the present application provide a laser radar observation simulation method and device. Radar data collected by a laser radar reflect the real scene in which the radar is located, so a point cloud map established in advance from those radar data is a point cloud map of the real scene. An observation model is generated from the parameters of the laser radar, the point clouds in the point cloud map of the real scene are extracted and screened based on the observation model, and the point cloud observation result obtained from the screened point clouds serves as the laser radar observation result. Because both the point cloud map and the laser radar parameters are real, the resulting observation is more realistic, which in turn provides more realistic observation data for the environment perception function of the laser radar.

Description

Laser radar observation simulation method and device
Technical Field
The application relates to the technical field of laser radars, in particular to a laser radar observation simulation method and device.
Background
In recent years, with the continuous development of artificial intelligence, the demand for intelligent devices has grown rapidly. Artificial intelligence technologies, represented by intelligent driving, are widely applied; they provide great convenience for people's life and work, and their applications carry substantial commercial and social value.
The laser radar is a primary sensor in intelligent driving: it achieves high-precision distance measurement by exploiting the diffuse reflection of laser light from objects. Owing to its unique three-dimensional environment modeling capability, the laser radar has become a core component of multi-sensor fusion in intelligent driving. An intelligent vehicle with an intelligent driving function must independently implement functions such as environment perception, navigation and positioning, and decision and control. The laser radar plays an important role in environment perception, and environment perception requires a large number of laser radar observation results for model training.
To realize environment perception, a virtual scene model is usually built by means of simulation; under that scene model, a large number of laser radar observations are produced according to a laser radar observation model, and a network model for environment perception is trained on those observations. However, because the constructed virtual scene model is not a real scene, the resulting laser radar observations differ considerably from actual observations.
Disclosure of Invention
An object of the embodiments of the present application is to provide a laser radar observation simulation method and apparatus, so as to obtain more realistic laser radar observation results. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a laser radar observation simulation method, where the method includes:
acquiring a point cloud map established in advance based on radar data acquired by a laser radar and parameters of the laser radar;
generating an observation model of the laser radar according to the parameters of the laser radar;
extracting point clouds conforming to the observation model from the point cloud map;
and screening the extracted point cloud, eliminating the point cloud with perspective effect, and determining a point cloud observation result based on the screened point cloud.
Optionally, the method for establishing the point cloud map includes:
acquiring radar data collected by a laser radar;
and obtaining a point cloud map by utilizing a preset pose estimation and reconstruction method according to the radar data.
Optionally, the parameters include a vertical observation range, a vertical angle resolution, a horizontal observation range, and a horizontal angle resolution;
the method comprises the following steps of generating an observation model of the laser radar according to parameters of the laser radar, wherein the steps comprise:
and constructing an observation model of the laser radar by setting scanning lines by taking the central position of the laser radar as an origin and taking a vertical observation range, a vertical angle resolution, a horizontal observation range and a horizontal angle resolution as constraint conditions.
Optionally, the step of constructing the observation model of the laser radar by setting the scan line with the central position of the laser radar as the origin and the vertical observation range, the vertical angular resolution, the horizontal observation range, and the horizontal angular resolution as the constraint conditions includes:
calculating the number of scanning lines according to the vertical observation range and the vertical angle resolution, and determining the vertical angle position of each scanning line in the space;
calculating the spatial angular position of the point cloud on each scanning line according to the horizontal observation range and the horizontal angular resolution;
and emitting the scanning lines at preset intervals by taking the central position of the laser radar as an origin according to the vertical angular position of each scanning line in the space and the spatial angular position of the point cloud on each scanning line, so as to obtain an observation model of the laser radar.
Optionally, before the step of extracting the point cloud conforming to the observation model from the point cloud map, the method further includes:
acquiring the pose and the detection distance range of the laser radar;
with the pose as an origin, screening out point clouds in a detection distance range from a point cloud map;
converting the screened point cloud into a coordinate system taking the pose as a reference system to obtain a local point cloud map;
the method for extracting the point cloud conforming to the observation model from the point cloud map comprises the following steps:
aiming at each point cloud in the local point cloud map, calculating the spatial angle error between the direction of the point cloud relative to the origin and each scanning line;
and aiming at each point cloud in the local point cloud map, if a spatial angle error which is not greater than a preset threshold exists, extracting the point cloud.
Optionally, the step of screening point clouds and rejecting the point clouds with perspective effect includes:
and keeping the point cloud closest to the origin on one scanning line, and rejecting other point clouds on the scanning line.
In a second aspect, an embodiment of the present application provides a lidar observation simulation apparatus, which includes:
the acquisition module is used for acquiring a point cloud map established in advance based on radar data acquired by the laser radar and parameters of the laser radar;
the virtual laser radar generating module is used for generating an observation model of the laser radar according to the parameters of the laser radar;
the laser radar observation acquisition module is used for extracting point cloud which accords with the observation model from the point cloud map;
and the point cloud screening and data integration module is used for screening the extracted point cloud, eliminating the point cloud with perspective effect, and determining a point cloud observation result based on the screened point cloud.
Optionally, the apparatus further comprises: a point cloud map creation module;
the point cloud map creating module is used for acquiring radar data acquired by the laser radar; and obtaining a point cloud map by utilizing a preset pose estimation and reconstruction method according to the radar data.
Optionally, the parameters include a vertical observation range, a vertical angle resolution, a horizontal observation range, and a horizontal angle resolution;
the virtual lidar generation module is specifically configured to:
and constructing an observation model of the laser radar by setting scanning lines by taking the central position of the laser radar as an origin and taking a vertical observation range, a vertical angle resolution, a horizontal observation range and a horizontal angle resolution as constraint conditions.
Optionally, the virtual lidar generating module is specifically configured to:
calculating the number of scanning lines according to the vertical observation range and the vertical angle resolution, and determining the vertical angle position of each scanning line in the space;
calculating the spatial angular position of the point cloud on each scanning line according to the horizontal observation range and the horizontal angular resolution;
and emitting the scanning lines at preset intervals by taking the central position of the laser radar as an origin according to the vertical angular position of each scanning line in the space and the spatial angular position of the point cloud on each scanning line, so as to obtain an observation model of the laser radar.
Optionally, the apparatus further comprises: a local point cloud map creation module;
the local point cloud map creation module is used for acquiring the pose and the detection distance range of the laser radar; screening out, with the pose as the origin, the point clouds within the detection distance range from the point cloud map; and converting the screened point clouds into a coordinate system with the pose as the reference frame, obtaining a local point cloud map;
the laser radar observation acquisition module is specifically configured to: for each point cloud in the local point cloud map, calculate the spatial angle error between the direction of the point cloud relative to the origin and each scanning line; and, for each point cloud in the local point cloud map, extract the point cloud if a spatial angle error not greater than a preset threshold exists.
Optionally, the point cloud screening and data integration module is specifically configured to:
and keeping the point cloud closest to the origin on one scanning line, and rejecting other point clouds on the scanning line.
In a third aspect, an embodiment of the present application provides an electronic device comprising a processor and a machine-readable storage medium, where the machine-readable storage medium stores machine-executable instructions executable by the processor; when executed by the processor, the instructions cause the processor to implement the laser radar observation simulation method provided in the first aspect of the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a machine-readable storage medium storing machine-executable instructions which, when invoked and executed by a processor, implement the laser radar observation simulation method provided in the first aspect of the embodiments of the present application.
According to the laser radar observation simulation method and device, a point cloud map established in advance from radar data collected by a laser radar, together with the parameters of the laser radar, is acquired; an observation model of the laser radar is generated from those parameters; point clouds conforming to the observation model are extracted from the point cloud map; the point clouds are screened to eliminate those with a perspective effect; and a point cloud observation result is determined from the screened point clouds. Because radar data collected by a laser radar reflect the real scene in which the radar is located, a point cloud map established in advance from those data is a point cloud map of the real scene. The observation model is generated from the parameters of the laser radar, the point clouds of the real-scene map are extracted and screened based on the observation model, and the point cloud observation result obtained from the screened point clouds serves as the laser radar observation result.
Drawings
In order to more clearly illustrate the embodiments of the present application and the technical solutions of the prior art, the drawings required in their description are briefly introduced below. Obviously, the drawings described below cover only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic flowchart of a lidar observation simulation method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an observation model according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a local point cloud map according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a single-frame point cloud observation in an embodiment of the present application;
FIG. 5 is a schematic flow chart illustrating a lidar observation simulation method according to another embodiment of the present disclosure;
FIG. 6 is a schematic flow chart of laser point cloud simulation according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a lidar observation simulation apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In order to obtain more realistic laser radar observation results and improve the accuracy of the laser radar's environment perception, the embodiments of the present application provide a laser radar observation simulation method and device. The laser radar observation simulation method is described first.
The execution subject of the laser radar observation simulation method provided by the embodiments of the present application may be a background server with a simulation function. The method may be implemented by at least one of software, a hardware circuit, and a logic circuit arranged in the execution subject.
As shown in fig. 1, a lidar observation simulation method provided in an embodiment of the present application may include the following steps.
S101, a point cloud map established in advance based on radar data collected by a laser radar and parameters of the laser radar are obtained.
And S102, generating an observation model of the laser radar according to the parameters of the laser radar.
S103, extracting the point cloud which accords with the observation model from the point cloud map.
S104, screening the extracted point clouds, eliminating the point clouds with perspective effect, and determining a point cloud observation result based on the screened point clouds.
By applying this method, a point cloud map established in advance from radar data collected by a laser radar, together with the parameters of the laser radar, is acquired; an observation model of the laser radar is generated from those parameters; point clouds conforming to the observation model are extracted from the point cloud map; the point clouds are screened to eliminate those with a perspective effect; and a point cloud observation result is determined from the screened point clouds. Because radar data collected by a laser radar reflect the real scene in which the radar is located, a point cloud map established in advance from those data is a point cloud map of the real scene. The observation model is generated from the parameters of the laser radar, the point clouds of the real-scene map are extracted and screened based on the observation model, and the point cloud observation result obtained from the screened point clouds serves as the laser radar observation result. Based on the point cloud map of the real scene, the observations of the laser radar at different poses can be simulated, and realistic laser radar perception data can be obtained efficiently and quickly.
The laser radar emits laser scanning lines at different vertical angles in space through a laser emission source, and collects radar data of obstacles from the reflections of those scanning lines. The radar data are collected by the laser radar in a real scene and mainly comprise the coordinates, distances, and so on of the obstacles in that scene; a point cloud map established in advance from these radar data is therefore a point cloud map of the real scene.
Optionally, the establishing method of the point cloud map may include: acquiring radar data collected by a laser radar; and obtaining a point cloud map by utilizing a preset pose estimation and reconstruction method according to the radar data.
The point cloud map is a map formed by dense three-dimensional information of a real environment acquired by a high-precision pose estimation and reconstruction method according to a radar detection result, wherein the high-precision pose estimation and reconstruction method can be an SLAM (Simultaneous Localization and Mapping) technology.
The parameters of the laser radar can comprise attribute parameters such as maximum observation distance, horizontal observation range, vertical observation range, radar line number, point cloud number and the like, and can also comprise external parameters such as installation position, attitude and the like. The attribute parameters of the laser radar can be directly obtained from the laser radar, and the external parameters of the laser radar can be obtained by calibration after the laser radar is installed.
After the parameters of the laser radar are obtained, an observation model of the laser radar can be generated according to the parameters of the laser radar, and the observation model is constructed according to the parameters of the laser radar and a detection principle and is used for realizing observation simulation of the laser radar with different parameters and models.
Alternatively, the parameters may include a vertical viewing range, a vertical angular resolution, a horizontal viewing range, and a horizontal angular resolution. Correspondingly, S102 may specifically be: and constructing an observation model of the laser radar by setting scanning lines by taking the central position of the laser radar as an origin and taking a vertical observation range, a vertical angle resolution, a horizontal observation range and a horizontal angle resolution as constraint conditions.
To realize the observation simulation of the laser radar, an observation model must be constructed; it is built by setting scanning lines based on the detection principle of the laser radar. The observation model takes the central position of the laser radar as the origin and emits horizontal scanning lines within the vertical observation range, each scanning line covering 360° in the horizontal direction; the observation model of the laser radar can be constructed from these scanning lines, as shown in fig. 2. The spatial relationship between the scanning lines of the observation model should satisfy the constraints of the laser radar parameters, such as the horizontal observation range (the horizontal coverage of a single scanning line, usually 360°), the horizontal angular resolution (the horizontal field angle between two adjacent point clouds on a single scanning line), the vertical observation range (the maximum field angle of the laser radar scanning lines in the vertical direction, typically on the order of 30°), and the vertical angular resolution (the vertical field angle between two adjacent scanning lines, which determines the number of scanning lines of the laser radar).
For different types of laser radars, the attribute parameters such as the vertical observation range, the vertical angle resolution, the horizontal observation range and the horizontal angle resolution are different, so that corresponding observation models can be established for the different types of laser radars, and the adaptability of the observation models is stronger.
Optionally, the step of constructing the observation model of the laser radar by setting the scan line with the central position of the laser radar as the origin and the vertical observation range, the vertical angle resolution, the horizontal observation range, and the horizontal angle resolution as the constraint conditions may specifically be: calculating the number of scanning lines according to the vertical observation range and the vertical angle resolution, and determining the vertical angle position of each scanning line in the space; calculating the spatial angular position of the point cloud on each scanning line according to the horizontal observation range and the horizontal angular resolution; and according to the vertical angle position of each scanning line in the space and the spatial angle position of the point cloud on each scanning line, taking the central position of the laser radar as an original point, and transmitting the scanning lines at preset intervals to obtain an observation model of the laser radar.
The number of scanning lines of the laser radar to be simulated can be calculated from the vertical observation range and the vertical angular resolution in the radar parameters. For example, if the vertical observation range is 30° and the vertical angular resolution is 2°, the number of scanning lines is S = 30°/2° + 1 = 16. When the vertical angular resolution between adjacent scanning lines is not constant, the number of scanning lines must instead be accumulated iteratively. With the vertical field of view known and the number of scanning lines calculated, the vertical angular position of each scanning line in space can be determined from these two parameters. The horizontal observation range is generally 360°, and the horizontal angular resolution is the horizontal field angle between two adjacent point clouds on a single scanning line, so the spatial angular position of every point on each scanning line can be determined from these two parameters. Once the vertical angular position of each scanning line and the spatial angular positions of the points on each line are determined, the scanning lines are emitted at preset intervals with the central position of the laser radar as the origin, and the observation model of the laser radar is obtained.
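A minimal Python sketch of the scan-line layout described above (the function and parameter names are ours, not from the patent, and a uniform vertical angular resolution is assumed; as noted, real radars may space lines non-uniformly):

```python
import numpy as np

def build_observation_model(v_fov_deg=30.0, v_res_deg=2.0,
                            h_fov_deg=360.0, h_res_deg=0.2):
    """Return the vertical angle of each scanning line and the horizontal
    angles sampled on every line, assuming uniform angular resolutions.
    All names here are illustrative."""
    n_lines = int(v_fov_deg / v_res_deg) + 1        # e.g. 30/2 + 1 = 16 lines
    v_angles = np.linspace(-v_fov_deg / 2, v_fov_deg / 2, n_lines)
    n_h = int(round(h_fov_deg / h_res_deg))         # samples per 360° sweep
    h_angles = np.arange(n_h) * h_res_deg
    return np.deg2rad(v_angles), np.deg2rad(h_angles)

v, h = build_observation_model()
print(len(v), len(h))  # 16 1800
```

Each (vertical angle, horizontal angle) pair defines one simulated ray emitted from the radar center.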
The observation model is a mathematical model that imposes constraints on the point clouds, while the point cloud map holds the point clouds of the real scene. After the observation model is established, the point clouds conforming to the observation model can be extracted from the point cloud map; that is, the point clouds satisfying the mathematical constraints are extracted from the point clouds of the real scene.
Optionally, before executing S103, the method provided in the embodiment of the present application may further execute: acquiring the pose and the detection range of the laser radar; with the pose as an origin, screening out point clouds in a detection distance range from the point cloud map; and converting the screened point cloud into a coordinate system taking the pose as a reference system to obtain a local point cloud map.
Correspondingly, S103 may specifically be: aiming at each point cloud in the local point cloud map, calculating the spatial angle error between the direction of the point cloud relative to the origin and each scanning line; and aiming at each point cloud in the local point cloud map, if a spatial angle error which is not greater than a preset threshold exists, extracting the point cloud.
In the implementation in which the observation model is constructed from scanning lines based on the detection principle of the laser radar, the input pose of the laser radar is first taken as the origin and the detection distance range of the radar is used as a threshold to screen point clouds from the point cloud map; that is, the point clouds within the detection distance range are selected first. The screened point clouds are then converted, through a rigid body transformation, into the coordinate system that takes the pose of the laser radar as its reference frame, yielding a local point cloud map, as shown in fig. 3, where the dotted circular area is the maximum detection range of the laser radar. A rigid body transformation here refers to the conversion of point clouds between different coordinate systems, for example from the map coordinate system to the coordinate system referenced to the pose of the laser radar.
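A sketch of this local-map extraction under stated assumptions (row-vector points, with the pose given as the rotation matrix R and translation t of the radar in the map frame; all names are illustrative):

```python
import numpy as np

def crop_local_map(map_points, pose_R, pose_t, max_range):
    """Select the map points within the radar's detection range and express
    them in the radar coordinate frame via the inverse rigid-body transform
    p_local = R^T (p_map - t). map_points: (N, 3) array in the map frame."""
    dist = np.linalg.norm(map_points - pose_t, axis=1)
    near = map_points[dist <= max_range]
    return (near - pose_t) @ pose_R   # row-vector form of R^T (p - t)
```

For an identity pose the local map is simply the set of points within `max_range` of the origin.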
After the local point cloud map is obtained, it is traversed: for each point cloud, the spatial angle error between the direction of the point cloud relative to the origin and each scanning line of the laser radar is calculated. If the spatial angle error between the point cloud and some scanning line is not greater than the threshold, the point cloud is extracted and stored as a simulated observation point of the corresponding scanning line; if the spatial angle errors between the point cloud and all scanning lines are greater than the threshold, the point cloud is skipped.
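The traversal can be sketched as follows, here simplified to match only the vertical (elevation) angle of each point against the scan-line angles; a full implementation would check the horizontal angle the same way (all names are ours, not from the patent):

```python
import numpy as np

def extract_matching_points(local_points, v_angles_rad, threshold_rad):
    """Keep a point if the angular error between its elevation (direction
    relative to the origin) and some scanning line is within the threshold;
    otherwise skip it. Simplified sketch: vertical angle only."""
    kept = []
    for p in local_points:
        r_xy = np.hypot(p[0], p[1])                    # ground-plane distance
        elev = np.arctan2(p[2], r_xy)                  # vertical angle of point
        err = np.abs(np.asarray(v_angles_rad) - elev)  # error to every line
        if err.min() <= threshold_rad:
            kept.append(p)
    return np.array(kept)
```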
After the point clouds conforming to the observation model are extracted from the point cloud map, the extracted point clouds are screened. Point clouds generated from radar data may exhibit a perspective effect, that is, points that could not actually be observed from the given pose appear in the point cloud map. The point clouds with the perspective effect are eliminated, and the screened point clouds form the point cloud observation result.
Optionally, the step of screening the point clouds and rejecting those with the perspective effect may specifically be: keeping the point cloud closest to the origin on each scanning line and rejecting the other point clouds on that scanning line.
In general, only one point cloud on a scanning line is observable: the one closest to the origin along that line; every other point cloud on the line is a perspective-effect point. Therefore, when the point clouds are screened, only the point cloud closest to the origin on each scanning line is kept, and the other point clouds on the line are eliminated.
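A minimal sketch of this screening rule, assuming the candidate points have already been grouped per emitted ray (scan line plus azimuth bin, represented here by an opaque dictionary key; the grouping scheme is an assumption, not specified by the patent):

```python
import numpy as np

def remove_perspective_points(per_ray_points):
    """For each ray, keep only the candidate point closest to the origin
    (the lidar pose); the remaining points on that ray would be occluded
    in a real scan, i.e. they are perspective-effect points."""
    kept = {}
    for ray_id, pts in per_ray_points.items():
        if not pts:
            continue  # no return on this ray
        pts = np.asarray(pts, dtype=float)
        dists = np.linalg.norm(pts, axis=1)
        kept[ray_id] = pts[int(np.argmin(dists))]
    return kept
```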
To summarize the above, in one implementation, as shown in fig. 5, the input comprises three parts: (1) a dense point cloud map of the real scene, (2) the 3D pose of the laser radar to be simulated, and (3) the laser radar parameters. The dense point cloud map carries dense 3D information of the environment, obtained through a high-precision pose estimation and reconstruction method from radar data collected by a laser radar in the real scene. The 3D pose of the laser radar is the position and attitude of the laser radar to be simulated in the map. The laser radar parameters describe the sensor to be simulated, such as its maximum observation distance, horizontal field of view, vertical field of view, line number and point count; the line number is the number of laser scanning lines within the vertical observation range, commonly 1, 4, 8, 16, 32 or 64 lines. Laser point cloud simulation is performed on these three inputs to finally obtain a single simulated laser radar point cloud frame.
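The parameter set listed above can be captured in a small container at the entry of a simulation pipeline. The field names below are illustrative, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class LidarParams:
    """Sensor parameters of the lidar to be simulated (hypothetical names)."""
    max_range_m: float        # maximum observation distance
    h_fov_deg: tuple          # horizontal field of view (min, max)
    v_fov_deg: tuple          # vertical field of view (min, max)
    n_lines: int              # number of scan lines; 1/4/8/16/32/64 are common
    points_per_line: int      # point count per revolution of one scan line

# Example: a typical 16-line sensor configuration
vlp16 = LidarParams(100.0, (0.0, 360.0), (-15.0, 15.0), 16, 1800)
```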
As shown in fig. 6, the laser point cloud simulation mainly comprises three parts: virtual laser radar generation, laser radar observation acquisition, and point cloud screening and data integration.
Virtual laser radar generation produces the observation model of the laser radar from the laser radar parameters; laser radar observation acquisition extracts from the point cloud map all point clouds that conform to the observation model; point cloud screening and data integration screens the extracted point clouds, eliminates those with the perspective effect, and divides, stores and outputs the screened point clouds according to the laser radar data format and beam layout. The specific processes of each part are described in the above embodiments and are not repeated here.
Applying the observation model to the dense point cloud of a real scene simulates the observation results of laser radars with various parameters at different positions. Compared with approaches based on synthetic scenes, this yields a more realistic laser radar observation result and avoids the simulation distortion caused by an artificial simulation scene. By configuring the laser radar parameters, observation results of laser radars with various line numbers, types and mounting extrinsics can be simulated against the point cloud map. In particular, a point cloud map collected with a low-line-count laser radar can support observation simulation of a high-line-count laser radar, improving point cloud acquisition efficiency and reducing laser radar cost. The observation result of the laser radar can be simulated at any position within the coverage of the real-scene point cloud map, so laser radar observations at different positions are obtained without repeatedly collecting radar data for a known point cloud map, further improving the acquisition efficiency of laser radar point clouds.
In the following, the application of the lidar observation simulation method provided in the embodiment of the present application is described with reference to several specific scenarios.
Example one: in an obstacle-recognition scenario, radar data collected in the current scene are input into a pre-trained recognition model to identify the obstacles in the scene. The recognition model is usually a neural network model obtained by sample training; the more samples, the more accurate the trained model's recognition results. The traditional way of obtaining samples is to continuously collect radar data in the actual scene and use the collected data as training samples, which consumes a great deal of time, makes training inefficient and yields poor recognition accuracy. With the laser radar observation simulation method provided in the embodiment of the present application, the obtained point cloud observation results are close to radar data from the real scene; that is, abundant realistic radar data (point cloud observation results) can be obtained efficiently without repeated collection and used as samples to train the recognition model. In other words, using the point cloud observation results obtained by observation simulation as sample input allows a more accurate recognition model to be trained efficiently and quickly, improving both the training efficiency and the recognition accuracy of the model.
Based on the above analysis, an obstacle-recognition scenario mainly involves two steps: model training and obstacle recognition. Model training comprises: acquiring radar data collected in advance by a laser radar in a specified scene; establishing a point cloud map based on the radar data; acquiring the laser radar parameters and generating the laser radar observation model from them; extracting from the point cloud map the point clouds conforming to the observation model; screening the extracted point clouds and eliminating those with the perspective effect; determining a point cloud observation result based on the screened point clouds; and training an initial neural network model with the point cloud observation results as training samples to obtain the recognition model. Obstacle recognition comprises: while the vehicle drives in the specified scene, collecting real-time radar data through the mounted laser radar and inputting the real-time radar data into the recognition model, which outputs the real-time obstacle recognition result for the specified scene.
Example two: in a global-positioning scenario, especially on a two-way road, the radar data collected differ when the vehicle travels in different directions. To guarantee the accuracy of the global positioning result, the traditional mapping process requires the vehicle to drive repeatedly in different directions when the point cloud map is established, collecting large amounts of radar data so that the resulting map approximates the real scene; the process is therefore very cumbersome. With the laser radar observation simulation method provided in the embodiment of the present application, the obtained point cloud observation results are close to radar data from the real scene; that is, abundant radar data (point cloud observation results) can be obtained efficiently without repeated collection, so a point cloud map can be established quickly and accurately based on the point cloud observation results. Such a map reflects the real scene more comprehensively, thereby ensuring the accuracy of the global positioning result.
Based on the above analysis, a global-positioning scenario mainly involves two steps: point cloud map updating and global positioning. Point cloud map updating comprises: acquiring radar data collected in advance by a laser radar in a specified scene; establishing a point cloud map based on the radar data; acquiring the laser radar parameters and generating the laser radar observation model from them; extracting from the point cloud map the point clouds conforming to the observation model; screening the extracted point clouds and eliminating those with the perspective effect; determining a point cloud observation result based on the screened point clouds; and updating the point cloud map according to the point cloud observation result. Global positioning comprises: while the vehicle drives in the specified scene, collecting real-time radar data through the mounted laser radar, matching the real-time radar data against the point cloud map, and determining the vehicle's positioning result according to the matching result.
It can thus be seen that the laser radar observation simulation method provided in the embodiment of the present application supplies more realistic laser radar observation results for the environment-perception function of the laser radar, acquires realistic laser radar sensing data efficiently and quickly, and improves the acquisition efficiency of laser radar point clouds. Furthermore, in an obstacle-recognition scenario, a more accurate recognition model can be trained efficiently and quickly, improving training efficiency and recognition accuracy; in a global-positioning scenario, a point cloud map that reflects the real scene more comprehensively can be established quickly and accurately, ensuring the accuracy of the global positioning result.
Based on the foregoing method embodiment, an embodiment of the present application provides a lidar observation simulation apparatus, and as shown in fig. 7, the apparatus may include:
the acquisition module 710 is configured to acquire a point cloud map established in advance based on radar data acquired by a laser radar, and parameters of the laser radar;
the virtual laser radar generating module 720 is configured to generate an observation model of the laser radar according to the parameters of the laser radar;
the laser radar observation acquisition module 730 is used for extracting point clouds conforming to the observation model from the point cloud map;
and the point cloud screening and data integrating module 740 is configured to screen the extracted point cloud, remove the point cloud with the perspective effect, and determine a point cloud observation result based on the screened point cloud.
Optionally, the apparatus may further include: a point cloud map creation module;
the point cloud map creating module is used for acquiring radar data acquired by the laser radar; and obtaining a point cloud map by utilizing a preset pose estimation and reconstruction method according to the radar data.
Optionally, the parameters may include a vertical observation range, a vertical angular resolution, a horizontal observation range, and a horizontal angular resolution;
virtual lidar generation module 720 may be specifically configured to:
and constructing an observation model of the laser radar by setting scanning lines by taking the central position of the laser radar as an origin and taking a vertical observation range, a vertical angle resolution, a horizontal observation range and a horizontal angle resolution as constraint conditions.
Optionally, virtual lidar generating module 720 may be specifically configured to:
calculating the number of scanning lines according to the vertical observation range and the vertical angle resolution, and determining the vertical angle position of each scanning line in the space;
calculating the spatial angular position of the point cloud on each scanning line according to the horizontal observation range and the horizontal angular resolution;
and emitting the scanning lines at preset intervals by taking the central position of the laser radar as an origin according to the vertical angular position of each scanning line in the space and the spatial angular position of the point cloud on each scanning line, so as to obtain an observation model of the laser radar.
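The three steps above can be sketched as follows. This is an illustrative implementation that assumes angles in degrees and inclusive vertical bounds; the patent does not pin down these conventions:

```python
import numpy as np

def generate_observation_model(v_fov, v_res, h_fov, h_res):
    """Build the ray directions of a virtual lidar from its angular
    parameters (all in degrees). Every (elevation, azimuth) pair is one
    ray emitted from the sensor centre.

    v_fov, h_fov: (min, max) vertical / horizontal observation ranges
    v_res, h_res: vertical / horizontal angular resolutions
    """
    v_min, v_max = v_fov
    h_min, h_max = h_fov
    # Step 1: number of scan lines from the vertical range and resolution,
    # and the vertical angular position of each line in space.
    n_lines = int(round((v_max - v_min) / v_res)) + 1
    elevations = v_min + v_res * np.arange(n_lines)
    # Step 2: spatial angular positions of the points on each scan line,
    # from the horizontal range and resolution.
    n_points = int(round((h_max - h_min) / h_res))
    azimuths = h_min + h_res * np.arange(n_points)
    # Step 3: the observation model is the grid of all (elevation, azimuth)
    # ray directions emitted from the lidar centre.
    return elevations, azimuths
```

For example, a 16-line sensor with a ±15° vertical field of view at 2° resolution and a 360° horizontal sweep at 0.2° resolution yields 16 scan lines of 1800 points each.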
Optionally, the apparatus may further include: a local point cloud map generation module;
the local point cloud map generation module is used for acquiring the pose and the detection distance range of the laser radar; screening out, with the pose as the origin, the point clouds within the detection distance range from the point cloud map; and converting the screened point clouds into a coordinate system that takes the pose as its reference frame, to obtain a local point cloud map;
lidar observation acquisition module 730 may be specifically configured to: aiming at each point cloud in the local point cloud map, calculating the spatial angle error between the direction of the point cloud relative to the origin and each scanning line; and aiming at each point cloud in the local point cloud map, if a spatial angle error which is not greater than a preset threshold exists, extracting the point cloud.
Optionally, the point cloud screening and data integration module 740 may be specifically configured to:
and keeping the point cloud closest to the origin on one scanning line, and rejecting other point clouds on the scanning line.
By applying the above scheme, a point cloud map established in advance based on radar data collected by a laser radar, and the parameters of the laser radar, are acquired; an observation model of the laser radar is generated according to the parameters; point clouds conforming to the observation model are extracted from the point cloud map; the extracted point clouds are screened to eliminate those with the perspective effect; and a point cloud observation result is determined based on the screened point clouds. Because the radar data collected by a laser radar reflect the real scene in which the laser radar is located, the point cloud map established in advance from those data is a point cloud map of the real scene. The observation model is generated from the laser radar parameters, the point clouds of the real-scene map are extracted and screened against it, and the point cloud observation result obtained from the screened point clouds is therefore a realistic simulated laser radar observation result.
An embodiment of the present application provides an electronic device, as shown in fig. 8, comprising a processor 801 and a machine-readable storage medium 802, where the machine-readable storage medium 802 stores machine-executable instructions executable by the processor 801; the machine-executable instructions cause the processor 801 to implement the laser radar observation simulation method provided in the embodiment of the present application.
In the embodiment of the present application, by reading and executing the machine-executable instructions stored in the machine-readable storage medium 802, the processor 801 is caused to: acquire a point cloud map established in advance based on radar data collected by a laser radar, and the parameters of the laser radar; generate an observation model of the laser radar according to the parameters; extract point clouds conforming to the observation model from the point cloud map; screen the point clouds and eliminate those with the perspective effect; and determine a point cloud observation result based on the screened point clouds. Because the radar data collected by a laser radar reflect the real scene in which the laser radar is located, the point cloud map established in advance from those data is a point cloud map of the real scene; the observation model is generated from the laser radar parameters, the point clouds of the real-scene map are extracted and screened against it, and the point cloud observation result obtained from the screened point clouds is a realistic simulated laser radar observation result.
The machine-readable storage medium may include RAM (Random Access Memory) and NVM (Non-Volatile Memory), for example at least one disk memory. Alternatively, the machine-readable storage medium may be at least one storage device located remotely from the processor.
The processor may be a general-purpose processor, including a CPU (Central Processing Unit), an NP (Network Processor), and the like; it may also be a DSP (Digital Signal Processor), an ASIC (Application-Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In addition, a machine-readable storage medium is provided, where machine-executable instructions are stored, and when the machine-readable storage medium is called and executed by a processor, the lidar observation simulation method provided in the embodiment of the present application is implemented.
In this embodiment, the machine-executable instructions stored in the machine-readable storage medium, when executed, can implement the following: acquiring a point cloud map established in advance based on radar data collected by a laser radar, and the parameters of the laser radar; generating an observation model of the laser radar according to the parameters; extracting point clouds conforming to the observation model from the point cloud map; screening the point clouds and eliminating those with the perspective effect; and determining a point cloud observation result based on the screened point clouds. Because the radar data collected by a laser radar reflect the real scene in which the laser radar is located, the point cloud map established in advance from those data is a point cloud map of the real scene; the observation model is generated from the laser radar parameters, the point clouds of the real-scene map are extracted and screened against it, and the point cloud observation result obtained from the screened point clouds is a realistic simulated laser radar observation result.
For the embodiments of the electronic device and the machine-readable storage medium, since the contents of the related methods are substantially similar to those of the foregoing method embodiments, the description is relatively simple, and for the relevant points, reference may be made to partial descriptions of the method embodiments.
It should be noted that, in this document, relational terms such as first and second are used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional like elements in the process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on differences from other embodiments. In particular, for the apparatus, electronic device, and machine-readable storage medium embodiments, the description is relatively simple because they are substantially similar to the method embodiments, and reference may be made to some descriptions of the method embodiments for relevant points.
The above description is only for the preferred embodiment of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application are included in the protection scope of the present application.

Claims (7)

1. A lidar observation simulation method, the method comprising:
acquiring a point cloud map established in advance based on radar data acquired by a laser radar and parameters of the laser radar; the point cloud map is a map formed by dense three-dimensional information of a real environment acquired by a high-precision pose estimation and reconstruction method according to the laser radar detection result; the parameters comprise a vertical observation range, a vertical angle resolution, a horizontal observation range and a horizontal angle resolution;
generating an observation model of the laser radar according to the parameters; the observation model of the laser radar being established by setting scanning lines, with the central position of the laser radar as an origin and the vertical observation range, the vertical angle resolution, the horizontal observation range and the horizontal angle resolution as constraint conditions;
extracting point clouds conforming to the observation model from the point cloud map;
screening the point cloud, eliminating the point cloud with perspective effect, and determining a point cloud observation result based on the screened point cloud;
before extracting, from the point cloud map, a point cloud conforming to the observation model, the method further comprises:
acquiring the pose and the detection distance range of the laser radar;
with the pose as an origin, screening out point clouds in the detection distance range from the point cloud map;
converting the screened point cloud into a coordinate system taking the pose as a reference system to obtain a local point cloud map;
the extracting of the point cloud conforming to the observation model from the point cloud map includes:
calculating the space angle error between the direction of the point cloud relative to the origin and each scanning line aiming at each point cloud in the local point cloud map;
and aiming at each point cloud in the local point cloud map, if a spatial angle error which is not greater than a preset threshold value exists, extracting the point cloud.
2. The method of claim 1, wherein the point cloud map is created by:
acquiring radar data collected by a laser radar;
and obtaining a point cloud map by utilizing a preset pose estimation and reconstruction method according to the radar data.
3. The method according to claim 1, wherein the constructing the observation model of the lidar by setting scan lines with the central position of the lidar as an origin and the vertical observation range, the vertical angular resolution, the horizontal observation range, and the horizontal angular resolution as constraints comprises:
calculating the number of scanning lines according to the vertical observation range and the vertical angle resolution, and determining the vertical angle position of each scanning line in the space;
calculating the spatial angular position of the point cloud on each scanning line in the scanning lines according to the horizontal observation range and the horizontal angular resolution;
and emitting the scanning lines at preset intervals by taking the central position of the laser radar as an origin according to the vertical angular position of each scanning line in the space and the spatial angular position of the point cloud on each scanning line, so as to obtain an observation model of the laser radar.
4. The method of claim 1, wherein the screening the point clouds to remove point clouds with perspective effect comprises:
and keeping the point cloud closest to the origin on one scanning line, and removing other point clouds on the scanning line.
5. A lidar observation simulation apparatus, the apparatus comprising:
the system comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring a point cloud map which is established in advance based on radar data acquired by a laser radar and parameters of the laser radar; the point cloud map is a map formed by dense three-dimensional information of a real environment, which is obtained by a high-precision pose estimation and reconstruction method according to the detection result of the laser radar; the parameters comprise a vertical observation range, a vertical angle resolution, a horizontal observation range and a horizontal angle resolution;
the virtual laser radar generating module is used for generating an observation model of the laser radar according to the parameters; establishing an observation model of the laser radar by setting scanning lines by taking the central position of the laser radar as an origin and the vertical observation range, the vertical angle resolution, the horizontal observation range and the horizontal angle resolution as constraint conditions;
the laser radar observation acquisition module is used for extracting point clouds conforming to the observation model from the point cloud map;
the point cloud screening and data integration module is used for screening the point cloud, eliminating the point cloud with perspective effect and determining a point cloud observation result based on the screened point cloud;
the laser radar observation acquisition module is specifically used for:
acquiring the pose and the detection distance range of the laser radar;
with the pose as an origin, screening out point clouds in the detection distance range from the point cloud map;
converting the screened point cloud into a coordinate system taking the pose as a reference system to obtain a local point cloud map;
the extracting of the point cloud conforming to the observation model from the point cloud map includes:
calculating the space angle error between the direction of the point cloud relative to the origin and each scanning line aiming at each point cloud in the local point cloud map;
and aiming at each point cloud in the local point cloud map, if a spatial angle error which is not greater than a preset threshold value exists, extracting the point cloud.
6. The apparatus of claim 5, further comprising: a point cloud map creation module;
the point cloud map creating module is used for acquiring radar data acquired by a laser radar; and obtaining a point cloud map by utilizing a preset pose estimation and reconstruction method according to the radar data.
7. The apparatus of claim 5, wherein the virtual lidar generation module is specifically configured to:
calculating the number of scanning lines according to the vertical observation range and the vertical angle resolution, and determining the vertical angle position of each scanning line in space;
calculating the spatial angular position of the point cloud on each scanning line in the scanning lines according to the horizontal observation range and the horizontal angular resolution;
and emitting the scanning lines at preset intervals by taking the central position of the laser radar as an origin according to the vertical angular position of each scanning line in the space and the spatial angular position of the point cloud on each scanning line, so as to obtain an observation model of the laser radar.
CN202010206893.0A 2020-03-23 2020-03-23 Laser radar observation simulation method and device Active CN113433568B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010206893.0A CN113433568B (en) 2020-03-23 2020-03-23 Laser radar observation simulation method and device

Publications (2)

Publication Number Publication Date
CN113433568A CN113433568A (en) 2021-09-24
CN113433568B true CN113433568B (en) 2023-04-07

Family

ID=77752544

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010206893.0A Active CN113433568B (en) 2020-03-23 2020-03-23 Laser radar observation simulation method and device

Country Status (1)

Country Link
CN (1) CN113433568B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116719054B (en) * 2023-08-11 2023-11-17 光轮智能(北京)科技有限公司 Virtual laser radar point cloud generation method, computer equipment and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101572618B1 (en) * 2014-10-16 2015-12-02 연세대학교 산학협력단 Apparatus and method for simulating lidar
US11455565B2 (en) * 2017-08-31 2022-09-27 Ford Global Technologies, Llc Augmenting real sensor recordings with simulated sensor data
CN108564615B (en) * 2018-04-20 2022-04-29 驭势(上海)汽车科技有限公司 Method, device and system for simulating laser radar detection and storage medium
CN108732556B (en) * 2018-08-17 2020-03-27 西南交通大学 Vehicle-mounted laser radar simulation method based on geometric intersection operation
CN109271893B (en) * 2018-08-30 2021-01-01 百度在线网络技术(北京)有限公司 Method, device, equipment and storage medium for generating simulation point cloud data
CN109459734B (en) * 2018-10-30 2020-09-11 百度在线网络技术(北京)有限公司 Laser radar positioning effect evaluation method, device, equipment and storage medium


Similar Documents

Publication Publication Date Title
CN109285220B (en) Three-dimensional scene map generation method, device, equipment and storage medium
CN109949372B (en) Laser radar and vision combined calibration method
US11455565B2 (en) Augmenting real sensor recordings with simulated sensor data
CN109459734B (en) Laser radar positioning effect evaluation method, device, equipment and storage medium
US11092444B2 (en) Method and system for recording landmarks in a traffic environment of a mobile unit
US20190065933A1 (en) Augmenting Real Sensor Recordings With Simulated Sensor Data
KR102249769B1 (en) Estimation method of 3D coordinate value for each pixel of 2D image and autonomous driving information estimation method using the same
US20200233061A1 (en) Method and system for creating an inverse sensor model and method for detecting obstacles
US11941888B2 (en) Method and device for generating training data for a recognition model for recognizing objects in sensor data of a sensor, in particular, of a vehicle, method for training and method for activating
CN110889808A (en) Positioning method, device, equipment and storage medium
CN114547866B (en) Prefabricated part intelligent detection method based on BIM-unmanned aerial vehicle-mechanical dog
CN111856499B (en) Map construction method and device based on laser radar
CN113724387A (en) Laser and camera fused map construction method
CN115825067A (en) Geological information acquisition method and system based on unmanned aerial vehicle and electronic equipment
CN113433568B (en) Laser radar observation simulation method and device
Jing et al. Efficient point cloud corrections for mobile monitoring applications using road/rail-side infrastructure
CN117830772A (en) Local map generation method and system based on point cloud image fusion
Dehbi et al. Improving gps trajectories using 3d city models and kinematic point clouds
Hebel et al. Change detection in urban areas by direct comparison of multi-view and multi-temporal ALS data
CN116129669A (en) Parking space evaluation method, system, equipment and medium based on laser radar
CN115965847A (en) Three-dimensional target detection method and system based on multi-modal feature fusion under cross view angle
CN115236643A (en) Sensor calibration method, system, device, electronic equipment and medium
US11846523B2 (en) Method and system for creating a localization map for a vehicle
CN115249347A (en) Mapping vehicle environments
CN112747757A (en) Method and device for providing radar data, computer program and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant