CN112530022A - Method for computer-implemented simulation of LIDAR sensors in a virtual environment - Google Patents


Info

Publication number
CN112530022A
Authority
CN
China
Prior art keywords
data set
laser beam
lidar sensor
virtual environment
dataset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010973580.8A
Other languages
Chinese (zh)
Inventor
Turgay Isik Aslandere
Alain Marie Roger Chevalier
Michael Marbaix
Frederic Stefan
Evangelos Bitsanis
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ford Global Technologies LLC filed Critical Ford Global Technologies LLC
Publication of CN112530022A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/06 Ray-tracing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/006 Theoretical aspects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/10 Geometric CAD
    • G06F30/12 Geometric CAD characterised by design entry means specially adapted for CAD, e.g. graphical user interfaces [GUI] specially adapted for CAD
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/10 Geometric CAD
    • G06F30/15 Vehicle, aircraft or watercraft design
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/20 Design optimisation, verification or simulation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/80 Shading
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/003 Navigation within 3D models or images
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Architecture (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to a method for the computer-implemented simulation of a LIDAR sensor in a virtual environment, comprising the following steps: (S100) reading a virtual reality data set (VR) representing the virtual environment, (S200) reading a parameter data set (PM) for parameterizing the LIDAR sensor simulation module (4), (S300) creating a laser beam data set (LD) representing the laser beams (LS1, LS2, LS3, LS4 and LS5) emitted by the LIDAR sensor by using the LIDAR sensor simulation module (4) and the parameter data set (PM), (S400) determining a range data set (RD) representing the ranges of the laser beams (LS1, LS2, LS3, LS4 and LS5) by evaluating the virtual reality data set (VR) and the laser beam data set (LD) by means of ray tracing, and (S500) creating image data (BD) representing the virtual environment by combining the virtual reality data set (VR) with the laser beam data set (LD) and the range data set (RD).

Description

Method for computer-implemented simulation of LIDAR sensors in a virtual environment
Technical Field
The invention relates to a method for the computer-implemented simulation of a LIDAR sensor in a virtual environment. The invention also relates to a computer program product and a system for the computer-implemented simulation of such a LIDAR sensor.
Background
Motor vehicles can be designed for so-called autonomous driving. An autonomously driving motor vehicle is a vehicle that can drive, steer, and park without any influence from a human driver (highly automated or autonomous driving). The term robotic car is also used when no manual operation by a driver is required. In this case, the driver's seat can remain empty; a steering wheel, brake pedal, and accelerator pedal may not be present.
Such autonomous motor vehicles can capture their environment by means of various sensors, determine their own position and that of other road users from the acquired information, drive to a destination in cooperation with navigation software, and avoid collisions on the way there.
Such a sensor may be a so-called LIDAR sensor. LIDAR (an abbreviation of light detection and ranging) denotes here a method related to radar for optically measuring distances and velocities and for ranging. Unlike RADAR, which uses radio-frequency electromagnetic waves, LIDAR uses laser beams.
The advantage of LIDAR sensors is that they can scan a 360-degree region around a motor vehicle with high resolution and at high speed. LIDAR sensors typically use an arrangement of laser-based sensors (e.g., 64 of them) that rotates at high speed (several hundred revolutions per minute). The LIDAR sensor can thereby detect obstacles struck by the laser beams. The coordinates of each point of incidence, and thus of each object in the vicinity of the motor vehicle, can be determined in this way. By evaluating the LIDAR data, information about the topography of the area surrounding the motor vehicle can also be obtained.
To test such autonomous driving, motor vehicles are tested in the real world. However, this is an expensive process, and there is a significant risk of accidents. To avoid accidents and at the same time reduce costs, tests can be performed in a computer-generated virtual environment, for example in a virtual town or city. VR technology (virtual reality technology), together with virtual environments, opens up many possibilities. The main advantage of VR technology is that it allows users, e.g., engineers, to take part in the test and interact with the test scenario or its configuration parameters.
KR101572618 discloses a LIDAR simulation unit for simulating a LIDAR sensor in a virtual environment.
There is therefore a need to show how the computer-implemented simulation of LIDAR sensors in a virtual environment can be improved.
Disclosure of Invention
The object of the invention is achieved by a method for computer-implemented simulation of a LIDAR sensor in a virtual environment, comprising the steps of:
reading a virtual reality data set representing the virtual environment,
reading a parameter data set for parameterizing a LIDAR sensor simulation module,
creating a laser beam data set representing the laser beams emitted by the LIDAR sensor by using the LIDAR sensor simulation module and the parameter data set,
determining a range data set representing the ranges of the laser beams by evaluating the virtual reality data set and the laser beam data set by means of ray tracing, and
creating image data representing the virtual environment by combining the virtual reality data set with the laser beam data set and the range data set.
Using the parameter data set, various parameters of the LIDAR sensor to be simulated can be configured. At the same time, a method that is particularly sparing of computing resources is provided, in which the range data set representing the ranges of the laser beams is determined by evaluating the virtual reality data set and the laser beam data set by means of ray tracing. In this way, the computer-implemented simulation of a LIDAR sensor in a virtual environment can be improved.
According to one embodiment, the parameter data set parameterizes a static LIDAR sensor having a horizontal scanning region and/or a vertical scanning region and/or a maximum range and/or a scanning rate. A static LIDAR sensor can thus be simulated in the virtual environment.
According to another embodiment, the parameter data set parameterizes a rotating LIDAR sensor, wherein the laser beam data set represents rotating laser beams. In addition, the parameter data set can define a horizontal scanning region and/or a vertical scanning region and/or a maximum range and/or a scanning rate. A rotating LIDAR sensor can thus be simulated in the virtual environment, as illustrated by the sketch below.
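By way of illustration, such a parameter data set could be represented as a simple record; the following is a minimal sketch in Python, in which all field names are hypothetical and not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class LidarParameters:
    """Hypothetical parameter data set (PM) for the LIDAR sensor simulation module."""
    hfov_deg: float                  # horizontal scanning region HFOV in degrees
    vfov_deg: float                  # vertical scanning region VFOV in degrees
    max_range_m: float               # maximum range of the laser beams in metres
    time_step_s: float               # fixed time step T; scanning rate t = 1/T
    rotating: bool = False           # False: static sensor, True: rotating sensor
    rotation_step_deg: float = 0.0   # rotation angle per step (DW1/DW2) if rotating

# Example: a static sensor with a 180 x 18 degree field of view and T = 5 ms
pm = LidarParameters(hfov_deg=180.0, vfov_deg=18.0, max_range_m=120.0,
                     time_step_s=0.005)
```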
According to another embodiment, scan data representing the scan points of the LIDAR sensor are cached in a ring memory. The ring memory is implemented such that the oldest data is deleted when new data arrives. In this way, the size of the memory can be minimized. Each storage element of the ring memory stores the data of all LIDAR points scanned at one scan time point.
According to another embodiment, the laser beams are visualized for a user in the virtual environment in accordance with the laser beam data set. Visualizing the laser beams allows a user (e.g., an engineer) to interact with them in the virtual environment. In this way, the user can see the objects captured by the LIDAR sensor in the virtual environment, which simplifies the evaluation of the simulation results.
According to another embodiment, a user can utilize an HMI in the virtual environment. A user located in the virtual environment can thus make various inputs, such as entering or modifying the parameter data set, via the HMI, which is embodied in the form of a virtual HMI. The user can therefore act entirely within the virtual environment without having to leave it in order to change, for example, the parameter data set.
The invention also includes a computer program product and a system for computer-implemented simulation of such a LIDAR sensor.
Drawings
The invention will now be explained with reference to the drawings. In the drawings:
FIG. 1 shows a schematic diagram of optional components of a system for computer-implemented simulation of a LIDAR sensor in a virtual environment;
FIG. 2A shows a schematic diagram of the simulation of a static LIDAR sensor;
FIG. 2B shows another schematic diagram of the simulation of a static LIDAR sensor;
FIG. 3A shows a perspective view of a visualization of a static LIDAR sensor in a virtual environment;
FIG. 3B shows a top view of a visualization of a static LIDAR sensor in a virtual environment;
FIG. 4 shows a further visualization of a static LIDAR sensor;
FIG. 5A shows a schematic diagram of the simulation of a LIDAR sensor rotating about its x-axis;
FIG. 5B shows another schematic diagram of the simulation of a LIDAR sensor rotating about its x-axis;
FIG. 6A shows a schematic diagram of the simulation of a LIDAR sensor rotating about its z-axis;
FIG. 6B shows another schematic diagram of the simulation of a LIDAR sensor rotating about its z-axis;
FIG. 7 shows a perspective view of a visualization of a rotating LIDAR sensor in a virtual environment;
FIG. 8 shows a visualization of an object captured with a LIDAR sensor;
FIG. 9 shows a visualization of an HMI for use in a virtual environment;
FIG. 10 shows a schematic diagram of a method sequence for operating the system shown in FIG. 1.
Detailed Description
Reference will first be made to fig. 1.
It shows a system 2 for computer-implemented simulation of LIDAR sensors in a virtual environment.
Virtual reality, or VR for short, denotes the representation and simultaneous perception of reality and its physical properties in an interactive virtual environment that is computer-generated in real time.
To create a sense of immersion, a special output device (e.g., a virtual reality headset) is used to display the virtual environment. To give a spatial impression, two images are generated from different perspectives and displayed separately (stereoscopic projection).
Special input devices (not shown), such as a 3D mouse, a data glove, or a flystick, are required for interacting with the virtual world. The flystick is used for navigation by means of an optical tracking system: infrared cameras permanently report its position in space to the system 2 by capturing markers on the flystick, so that the user 8 can move freely without cables. Optical tracking systems can also be used to capture tools and complete mock-ups, allowing them to be manipulated in real time within the VR scenario.
Some input devices provide force feedback to the hands or other body parts of the user 8, so that the user 8 can orient himself in the virtual environment via haptics and sensor technology as an additional sensory impression.
Generating a virtual environment requires software developed specifically for this purpose. The software must be able to compute complex three-dimensional worlds stereoscopically (separately for the left and right eye of the user 8) in real time, i.e., at a rate of at least 25 frames per second. This value varies from application to application: a driving simulation, for example, requires at least 60 frames per second to prevent nausea (simulator sickness).
Among the components of the system 2, fig. 1 shows a LIDAR sensor simulation module 4 and a ring memory 6, as well as a human-machine interface (HMI) 10 that is connected to the system 2 for data exchange and is worn on the head by a user 8.
The system 2 is implemented to provide image data BD based on a virtual environment, which image data BD takes into account the current viewing direction of the user 8 and is then visualized by the HMI 10 for the user 8.
In this case, the system 2 is also implemented to read a virtual reality data set VR representing the virtual environment and to read a parameter data set PM for parameterizing the LIDAR sensor simulation module 4.
In the present exemplary embodiment, the LIDAR sensor simulation module 4 is implemented to create a laser beam data set LD (see fig. 10) representing the laser beams emitted by the LIDAR sensor using the parameter data set PM, and to determine a range data set RD (see fig. 10) representing the ranges of the laser beams by evaluating the virtual reality data set VR and the laser beam data set LD by means of ray tracing.
Ray tracing is understood here to mean an algorithm, based on the emission of rays, for calculating occlusions, that is to say for determining the visibility of three-dimensional objects from a particular point in space.
In other words, ray tracing is primarily a method for calculating occlusion, i.e., for determining the visibility of objects as seen from a viewpoint 12 (see fig. 2A).
As part of the ray tracing, the laser beam data set LD is evaluated; it indicates the starting point and direction of each simulated laser beam in the form of a spatial half-line. For each object in the virtual environment given by the virtual reality data set VR, the point of intersection at which a simulated laser beam strikes the object is determined geometrically. The distance from the viewpoint 12 of the LIDAR sensor to this intersection point is also calculated and provided in the form of the range data set RD. A sketch of this evaluation is given below.
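The following is a minimal sketch of this ray-tracing step, under the simplifying assumption that the objects of the virtual environment are reduced to spheres (the patent does not prescribe any particular object representation). Each beam is a half-line from the viewpoint, and the nearest intersection distance, clamped to the maximum range, yields one entry of the range data set RD:

```python
import math

def ray_sphere_distance(origin, direction, center, radius):
    """Distance along a normalized ray to a sphere, or None if the ray misses it."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0   # nearest intersection along the half-line
    return t if t >= 0.0 else None

def trace_ranges(viewpoint, beam_directions, spheres, max_range):
    """Evaluate VR (here: spheres) and LD (beam directions) to build the range data set RD."""
    ranges = []
    for d in beam_directions:
        hits = [ray_sphere_distance(viewpoint, d, c, r) for c, r in spheres]
        hits = [h for h in hits if h is not None and h <= max_range]
        ranges.append(min(hits) if hits else max_range)  # beam stops at first object
    return ranges
```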
The parameter dataset PM may parameterize a static LIDAR sensor with a horizontal scan area and/or a vertical scan area and/or a maximum range and/or scan rate, or the parameter dataset PM may parameterize a rotating LIDAR sensor, wherein the laser beam dataset LD represents a rotating laser beam, as will be explained in detail later on.
Scan data D representing the scan points of the LIDAR sensor can be buffered in the ring memory 6. Furthermore, the system 2 is implemented, as will be explained in more detail below, to visualize the laser beams for the user 8 in the virtual environment in accordance with the determined laser beam data set LD, and to allow the user 8 to use the HMI 10 in the virtual environment.
For these tasks and functions, and for those described below, the system 2, the LIDAR sensor simulation module 4, and the ring memory 6 may each have corresponding hardware and/or software components.
The simulation of a static LIDAR sensor will now be explained with additional reference to figs. 2A and 2B.
The number of vertical laser beams LS1, LS2, LS3, LS4, LS5 in the virtual environment is determined. The tilt angle of the LIDAR sensor is also taken into account and the direction of the laser beams LS1, LS2, LS3, LS4, LS5 is adjusted accordingly.
A first laser beam LS1 is emitted at an initial angle SW1 at the position of the viewpoint 12 of the LIDAR sensor (see fig. 2A). If-as in the present exemplary embodiment-a value of VFOV equal to 40 ° is selected for the vertical scan region, the initial angle should be SW1 equal to 70 °. In other words, the value of the initial angle SW1 is calculated as follows:
SW1=π-VFOV/2。
The laser beams LS1, LS2, LS3, LS4, and LS5 are arranged uniformly at equal angular intervals. If, as in the present exemplary embodiment, the value of the vertical scanning region VFOV is 40° and the number of laser beams LS1, LS2, LS3, LS4, LS5 is 5, a step angle of 40°/5 = 8° is obtained.
Similarly (see fig. 2B), an initial angle SW2 of the horizontal scanning region HFOV may be determined:
SW2=π-HFOV/2。
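A short sketch of how the start angles and beam angles could be generated from these formulas; the helper below is an illustrative assumption, not code from the patent:

```python
def beam_angles(fov_deg, num_beams):
    """Start angle SW = (180° - FOV)/2 and uniform step angle FOV/num_beams."""
    start = (180.0 - fov_deg) / 2.0   # e.g. (180 - 40)/2 = 70 degrees
    step = fov_deg / num_beams        # e.g. 40/5 = 8 degrees
    return [start + i * step for i in range(num_beams)]

vertical = beam_angles(40.0, 5)      # [70.0, 78.0, 86.0, 94.0, 102.0]
horizontal = beam_angles(180.0, 8)   # SW2 = 0 degrees, step angle 22.5 degrees
```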
reference is now additionally made to fig. 3A and 3B.
Here, the laser beams emitted by the LIDAR sensor are visualized in the virtual environment. The value of the vertical scanning region VFOV is 18°, the value of the horizontal scanning region HFOV is 180°, the number of horizontal laser beams m is 8, and the number of vertical laser beams n is 3.
In the present exemplary embodiment, with a fixed time step T = 0.005 s, the total number of scanning points per second is G = 8 × 3 × (1/0.005 s) = 4800.
In other words, the total number of scanning points G is derived from the total number of laser beams. The product of the number m of horizontal laser beams and the number n of vertical laser beams relates to one simulation run. For an accurate simulation, the values of m and n should be chosen in accordance with the LIDAR parameters. Since each simulation run is performed at a fixed point in time, a simulation run is also referred to as a fixed update. In the virtual environment there is no restriction on the choice of the fixed time step T; however, a duration of 5 to 25 milliseconds (0.005 s to 0.025 s) is typically selected. The simulation is therefore performed at a scanning rate t = 1/T.
The total number of points per second can therefore be determined by G = m × n × t. Conversely, the scanning rate t can be determined according to the following equation:
t = 1/T = G/(m × n).
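These relationships can be checked directly with the figures of the exemplary embodiment; the snippet below is plain arithmetic for illustration, not code from the patent:

```python
m, n = 8, 3     # number of horizontal and vertical laser beams
T = 0.005       # fixed time step in seconds
t = 1.0 / T     # scanning rate: 200 fixed updates per second
G = m * n * t   # 8 * 3 * 200 = 4800 scanning points per second
assert G == 4800.0 and t == G / (m * n)
```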
reference is now additionally made to fig. 4.
The figure shows a visualization of the laser beam according to the image data BD.
The extent of the LIDAR region is determined by the lengths of the laser beams from the range data set RD.
If a laser beam intersects an object, the laser beam does not pass through the object. The maximum range and other values can be determined from the LIDAR parameters, e.g., via the parameter data set PM.
The simulation of a LIDAR sensor rotating about its x-axis (i.e., about its roll axis) will now be explained with additional reference to figs. 5A and 5B.
To simulate a rotation about the x-axis, the laser beams LS1, LS2, LS3, LS4, LS5 are likewise rotated about the x-axis, specifically stepped clockwise by the value of the rotation angle DW1 (see fig. 5B). In the present exemplary embodiment, the rotation takes place in one direction only.
In the present exemplary embodiment, the LIDAR sensor emits only one laser beam that is rotatable 360 °.
If, in the present exemplary embodiment, a value of 2° is selected for the rotation angle DW1 and the fixed time step is T = 0.005 s, a full rotation of 360° takes a rotation time RT = (360°/DW1) × T = (360/2) × 0.005 s = 0.9 s. The total number of points per second is G = m × n/T = (1 × 64)/0.005 s = 12800 points. By choosing a higher scanning rate, the total number of points G can be increased.
In the present exemplary embodiment, only one laser beam is simulated in order to keep the demands on hardware and computing resources low, since in this way the number of captured objects can also be kept low. In contrast to the present exemplary embodiment, several laser beams can also be simulated simultaneously.
The simulation of a LIDAR sensor rotating about its z-axis (i.e., about its vertical axis) will now be explained with additional reference to figs. 6A and 6B.
Here too, the LIDAR sensor emits only one laser beam, which is rotated through 360° about the z-axis, stepped clockwise by the value of the rotation angle DW2 (see fig. 6B).
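A minimal sketch of this stepping scheme, assuming (as in the exemplary embodiment) a single beam advanced by a fixed rotation angle per fixed update; the helper functions are illustrative and not taken from the patent:

```python
import math

def rotation_time(step_deg, time_step_s, full_turn_deg=360.0):
    """RT = (360°/DW) × T, e.g. (360/2) × 0.005 s = 0.9 s per full rotation."""
    return (full_turn_deg / step_deg) * time_step_s

def rotate_beam(azimuth_deg, step_deg):
    """Advance the single simulated beam clockwise by the rotation angle DW."""
    return (azimuth_deg + step_deg) % 360.0

assert math.isclose(rotation_time(2.0, 0.005), 0.9)

m, n, T = 1, 64, 0.005
points_per_second = m * n / T   # (1 × 64)/0.005 s = 12800 points
```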
A buffer is required to cache the scan data D belonging to the total number G of points; this scan data D is subsequently processed by the control device of the motor vehicle. The ring memory 6 is used for this purpose.
The ring memory 6 has a predetermined size. When new data arrives, the oldest data is deleted. Each storage element of the ring memory 6 stores the scan data D of all LIDAR points scanned at one point in time.
In the present exemplary embodiment, the size of the ring memory 6, i.e., the total number of its storage elements, is calibrated automatically. For a single axis of rotation, the memory size S can be determined, for example, using the following formula:
S = scanning region/rotation angle, e.g., S = HFOV/DW1.
For the case of two axes of rotation:
S = HFOV/DW1 + VFOV/DW2.
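A minimal ring-memory sketch along these lines, with the capacity auto-sized from the scanning regions and rotation angles according to the formulas above; the class and its interface are illustrative assumptions, not code from the patent:

```python
class RingMemory:
    """Fixed-size buffer; each element holds the scan data D of one scan time point."""

    def __init__(self, hfov_deg, dw1_deg, vfov_deg=0.0, dw2_deg=0.0):
        # S = HFOV/DW1 (+ VFOV/DW2 when there is a second axis of rotation)
        size = int(hfov_deg / dw1_deg)
        if dw2_deg > 0.0:
            size += int(vfov_deg / dw2_deg)
        self._slots = [None] * size
        self._next = 0

    def push(self, scan_points):
        """Store one scan; the oldest entry is overwritten when the buffer is full."""
        self._slots[self._next] = scan_points
        self._next = (self._next + 1) % len(self._slots)

# Example sized for the rotating sensor of fig. 7: S = 360/1 + 18/1 = 378 elements
ring = RingMemory(hfov_deg=360.0, dw1_deg=1.0, vfov_deg=18.0, dw2_deg=1.0)
```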
reference is now additionally made to fig. 7.
The present exemplary embodiment provides a visualization of the laser beams for the user 8 in the virtual environment in accordance with the laser beam data set LD.
The figure shows a visualization of the laser beams of a motor vehicle's LIDAR sensor on a road with a curb and a large body of water. The laser beam of the LIDAR sensor is rotated through a full 360° about its vertical axis and through 18° about its roll axis (x-axis), with the rotation angles DW1 and DW2 each equal to 1°.
Reference is now additionally made to fig. 8.
This figure shows a visualization, in accordance with the image data BD, of laser beams incident on an object in the virtual environment (in the present exemplary embodiment, a pedestrian). For this purpose, the laser beams can be visualized using raster-based or ray-tracing-based methods (e.g., OpenGL, DirectX, or a ray tracing engine). In the present exemplary embodiment, shader code (e.g., OpenGL GLSL) is implemented to enable real-time visualization of the laser beams. The shader code visualizes each laser beam with two nodes: the beam origin at the viewpoint 12 and the point of incidence. The shader can be embedded into a game engine such as Unity3d or Unreal. The laser beams can also be rendered on the same nodes using a ray-tracing-based approach. The laser beams can be visualized in every frame (frame in the computer graphics sense).
Visualizing the laser beams allows a user 8, such as an engineer, to interact with them in the virtual environment. In this way, the user 8 can see the objects captured by the LIDAR sensor, which simplifies the evaluation of the simulation results.
Furthermore, the laser beams can be visualized in the virtual environment by combining the laser beams incident on an object and representing them in the form of a 3D object, mesh, or lines.
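In line with the two-node-per-beam scheme described above, the following sketch shows how the vertex data for such a line visualization could be assembled before being handed to a shader or game engine; the data layout is an assumption for illustration:

```python
def beam_line_segments(viewpoint, beam_directions, ranges):
    """Build one line segment (origin, point of incidence) per laser beam.

    viewpoint: position of the LIDAR viewpoint 12; ranges: the range data set RD.
    Returns a flat vertex list that a line-rendering shader could consume.
    """
    vertices = []
    for d, r in zip(beam_directions, ranges):
        end = tuple(viewpoint[i] + r * d[i] for i in range(3))
        vertices.append(viewpoint)  # node 1: beam origin at the viewpoint
        vertices.append(end)        # node 2: point of incidence (or max range)
    return vertices
```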
Reference is now additionally made to fig. 9.
It shows a visualization of the image data BD.
The present exemplary embodiment enables a user 8 to utilize an HMI 10 in a virtual environment.
In principle, the LIDAR sensor could be configured on a desktop computer via a menu using a keyboard and mouse. However, this is not practicable when VR hardware is used, for example when the user 8 (e.g., an engineer) wears an HMI embodied in the form of an HMD on his head.
In other words, the HMI 10 is a virtual HMI in the virtual environment: the user 8, represented in the virtual environment by his avatar 14, can use it to make various inputs. The HMI 10 can also be used to enter or modify the parameter data set PM.
The sequence of the method in relation to the operation of the system 2 will now be described with additional reference to fig. 10.
In a first step S100, the system 2 reads a virtual reality data set VR representing the virtual environment.
In a further step S200, the system 2 reads a parameter dataset PM for the parameterized LIDAR sensor simulation module 4.
In a further step S300, the LIDAR sensor simulation module 4 creates a laser beam dataset LD representing the laser beams LS1, LS2, LS3, LS4, LS5 emitted by the LIDAR sensor by using the parameter dataset PM.
In a further step S400, the LIDAR sensor simulation module 4 determines a range data set RD representing the ranges of the laser beams LS1, LS2, LS3, LS4, LS5 by evaluating the virtual reality data set VR and the laser beam data set LD by means of ray tracing.
In a further step S500, the LIDAR sensor simulation module 4 in the present exemplary embodiment combines the virtual reality data set VR with the laser beam data set LD and the range data set RD to create image data BD.
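Taken together, steps S100 to S500 form the following sequence; the sketch below is a high-level illustration in which all function and method names are placeholders for the operations described above, not an API defined by the patent:

```python
def simulate_lidar(read_vr, read_pm, simulation_module):
    vr = read_vr()                                    # S100: virtual reality data set VR
    pm = read_pm()                                    # S200: parameter data set PM
    ld = simulation_module.create_beams(pm)           # S300: laser beam data set LD
    rd = simulation_module.trace(vr, ld)              # S400: range data set RD (ray tracing)
    bd = simulation_module.compose_image(vr, ld, rd)  # S500: image data BD
    return bd
```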
The parameter data set PM can here parameterize a static LIDAR sensor with a horizontal scan region HFOV and/or a vertical scan region VFOV and/or a maximum range and/or a scan rate t.
Alternatively, the parameter data set PM can parameterize a rotating LIDAR sensor, in which case the laser beam data set LD represents the rotating laser beams LS1, LS2, LS3, LS4, LS5.
In this case, the scan data D representing the scan points of the LIDAR sensor are buffered in the ring memory 6. Furthermore, the laser beams LS1, LS2, LS3, LS4, LS5 are visualized for the user 8 in the virtual environment in accordance with the laser beam data set LD. In addition, the user 8 can use the HMI 10 in the virtual environment to make various inputs, such as entering or modifying the parameter data set PM.
The order of the steps may also be different, unlike the present exemplary embodiment. Furthermore, multiple steps may be performed simultaneously. In addition, various steps may be skipped or omitted, unlike the present exemplary embodiment.
In this way, the computer-implemented simulation of LIDAR sensors in a virtual environment can be improved.
List of reference numerals
2 System
4 LIDAR sensor simulation module
6 ring memory
8 users
10 human-machine interface (HMI)
12 view point
14 avatar
BD image data
D scan data
DW1 rotation angle
DW2 rotation angle
G total number of scanning points
HFOV horizontal scanning area
LD laser beam dataset
LS1 laser beam
LS2 laser beam
LS3 laser beam
LS4 laser beam
LS5 laser beam
m number of horizontal laser beams
n number of perpendicular laser beams
PM parameter data set
RD range data set
RT rotation time
S memory size
SW1 initial angle
SW2 initial angle
t scan rate
T time step
VFOV vertical scan region
VR virtual reality data set
S100 step
S200 step
S300 step
S400 step
S500 step

Claims (13)

1. A method for computer-implemented simulation of LIDAR sensors in a virtual environment, comprising the steps of:
(S100) reading a virtual reality data set (VR) representing the virtual environment,
(S200) reading a parameter data set (PM) for parameterizing a LIDAR sensor simulation module (4),
(S300) creating a laser beam dataset (LD) representing laser beams (LS1, LS2, LS3, LS4 and LS5) emitted by the LIDAR sensor by using the LIDAR sensor simulation module (4) and the parameter dataset (PM),
(S400) determining a range data set (RD) representing the ranges of the laser beams (LS1, LS2, LS3, LS4 and LS5) by evaluating the virtual reality data set (VR) and the laser beam data set (LD) by means of ray tracing, and
(S500) creating image data (BD) representative of the virtual environment by combining the virtual reality data set (VR) with the laser beam data set (LD) and the range data set (RD).
2. Method according to claim 1, wherein the parameter data set (PM) parameterizes a static LIDAR sensor with a horizontal scanning area (HFOV) and/or a vertical scanning area (VFOV) and/or a maximum range and/or a scanning rate (t).
3. Method according to claim 1 or 2, wherein the parameter dataset (PM) parameterises a rotating LIDAR sensor, wherein the laser beam dataset (LD) represents rotating laser beams (LS1, LS2, LS3, LS4 and LS 5).
4. Method according to claim 3, wherein scan data (D) representative of scan points of the LIDAR sensor are cached in a ring memory (6).
5. Method according to one of the claims 1 to 4, wherein laser beams (LS1, LS2, LS3, LS4 and LS5) are visualized for a user (8) in the virtual environment according to the laser beam data set (LD).
6. Method according to one of claims 1 to 5, wherein a user (8) can utilize an HMI (10) in the virtual environment.
7. A computer program product designed to perform the method according to one of claims 1 to 6.
8. A system (2) for computer-implemented simulation of LIDAR sensors in a virtual environment, wherein the system (2) is implemented to read a virtual reality data set (VR) representing the virtual environment, to read a parameter data set (PM) for parameterizing a LIDAR sensor simulation module (4) of the system (2), to create a laser beam data set (LD) representing laser beams (LS1, LS2, LS3, LS4 and LS5) emitted by the LIDAR sensor by using the LIDAR sensor simulation module (4) and the parameter data set (PM), to determine a range data set (RD) representing the ranges of the laser beams (LS1, LS2, LS3, LS4 and LS5) by evaluating the virtual reality data set (VR) and the laser beam data set (LD) by means of ray tracing, and to create image data (BD) representing the virtual environment by combining the virtual reality data set (VR) with the laser beam data set (LD) and the range data set (RD).
9. The system (2) according to claim 8, wherein the parameter data set (PM) parameterizes a static LIDAR sensor having a horizontal scanning region (HFOV) and/or a vertical scanning region (VFOV) and/or a maximum range and/or a scanning rate (t).
10. The system (2) as claimed in claim 8 or 9, wherein the parameter data set (PM) parameterizes a rotating LIDAR sensor, wherein the laser beam data set (LD) represents rotating laser beams (LS1, LS2, LS3, LS4 and LS5).
11. The system (2) according to claim 8, 9 or 10, wherein the system (2) has a ring memory (6), the ring memory (6) being used for buffering scan data (D) representing scan points of the LIDAR sensor.
12. The system (2) as claimed in one of claims 8 to 11, wherein the system (2) is realized to visualize laser beams (LS1, LS2, LS3, LS4 and LS5) for a user (8) in the virtual environment in accordance with the laser beam dataset (LD).
13. The system (2) according to one of claims 8 to 12, wherein the system (2) is implemented such that a user (8) can utilize an HMI (10) in the virtual environment.
CN202010973580.8A 2019-09-18 2020-09-16 Method for computer-implemented simulation of LIDAR sensors in a virtual environment Pending CN112530022A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019125075.7A DE102019125075A1 (en) 2019-09-18 2019-09-18 Method for the computer-implemented simulation of a LIDAR sensor in a virtual environment
DE102019125075.7 2019-09-18

Publications (1)

Publication Number Publication Date
CN112530022A (zh) 2021-03-19

Family

ID=74686272

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010973580.8A Pending CN112530022A (en) 2019-09-18 2020-09-16 Method for computer-implemented simulation of LIDAR sensors in a virtual environment

Country Status (2)

Country Link
CN (1) CN112530022A (en)
DE (1) DE102019125075A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114895316B (en) * 2022-07-11 2022-10-28 之江实验室 Rapid numerical simulation method and device for multi-laser radar ranging

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116577762A (en) * 2023-07-12 2023-08-11 西安深信科创信息技术有限公司 Simulation radar data generation method, device, equipment and storage medium
CN116577762B (en) * 2023-07-12 2023-10-31 西安深信科创信息技术有限公司 Simulation radar data generation method, device, equipment and storage medium

Also Published As

Publication number Publication date
DE102019125075A1 (en) 2021-03-18

Similar Documents

Publication Publication Date Title
WO2018116790A1 (en) Inconsistency detection system, mixed reality system, program, and inconsistency detection method
CN109856993B (en) Autonomous driving simulation platform
US8907950B2 (en) Driving simulation apparatus, wide-angle camera video simulation apparatus, and image deforming/compositing apparatus
US8243061B2 (en) Image processing apparatus and method of controlling operation of same
JP2004209641A (en) Method and system for programming industrial robot
Platt et al. Comparative analysis of ros-unity3d and ros-gazebo for mobile ground robot simulation
CN112530022A (en) Method for computer-implemented simulation of LIDAR sensors in a virtual environment
CN115244492A (en) Occlusion of virtual objects in augmented reality by physical objects
JP2020532797A (en) Generating a new frame with rendered and unrendered content from the previous perspective
US7092860B1 (en) Hardware simulation systems and methods for vision inspection systems
CN112847336A (en) Action learning method, action learning device, storage medium and electronic equipment
Koestler et al. Mobile robot simulation with realistic error models
JP4841903B2 (en) Driving simulator
Weber et al. Approach for improved development of advanced driver assistance systems for future smart mobility concepts
CN113238556A (en) Water surface unmanned ship control system and method based on virtual reality
Papa et al. A new interactive railway virtual simulator for testing preventive safety
CN112634342A (en) Method for computer-implemented simulation of optical sensors in a virtual environment
CN114896817A (en) Vehicle in-loop fusion test system and method
US11113435B2 (en) Evaluation of a simulated vehicle functionality feature
Peng et al. A Vehicle Driving Simulator Based on Virtual Reality
US20210141972A1 (en) Method for generating an image data set for a computer-implemented simulation
JP7531005B2 (en) Autonomous Driving Sensor Simulation
US20210097769A1 (en) Virtual reality vehicle testing
Karlsson et al. Vehicle sensor data real time visualizer
CN118229863B (en) Image synthesis method, device, equipment and medium for robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination