CN117270506A - Motion control method and system for host vehicle in virtual simulation scene - Google Patents

Motion control method and system for host vehicle in virtual simulation scene

Info

Publication number
CN117270506A
Authority
CN
China
Prior art keywords
data
laser radar
virtual
vehicle
virtual simulation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311386135.1A
Other languages
Chinese (zh)
Inventor
朱冰
薛经纬
赵健
吴坚
赵男男
郭运娇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changsha Automobile Innovation Research Institute
Original Assignee
Changsha Automobile Innovation Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changsha Automobile Innovation Research Institute filed Critical Changsha Automobile Innovation Research Institute
Priority to CN202311386135.1A priority Critical patent/CN117270506A/en
Publication of CN117270506A publication Critical patent/CN117270506A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00Testing or monitoring of control systems or parts thereof
    • G05B23/02Electric testing or monitoring
    • G05B23/0205Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults
    • G05B23/0218Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults
    • G05B23/0256Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults injecting test signals and analyzing monitored process response, e.g. injecting the test signal while interrupting the normal operation of the monitored system; superimposing the test signal onto a control signal during normal operation of the monitored system

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a method and a system for controlling the motion of a host vehicle in a virtual simulation scene. The method comprises the following steps: 1. generating virtual laser radar data; 2. parsing the virtual laser radar data through ROS and transmitting it to the ECU; 3. acquiring real inertial navigation data and injecting it into the ECU; 4. generating a virtual sensor decision perception result; 5. recharging actual sensor data to the ECU; 6. embedding the decision perception result into a vehicle dynamics model to control the motion of the host vehicle; 7. evaluating the virtually generated laser radar data. The system comprises a data generation module, a data forwarding module, an acquisition module, a first generation module, a second generation module, a control module and an evaluation module. The ROS (Robot Operating System) is adopted to realize a dual data injection strategy, which aims to effectively fuse the virtual simulation environment with actual vehicle data and accurately control the motion of the host vehicle, contributing to a comprehensive evaluation of algorithm performance and safety in a safe environment so as to meet strict intelligent connected vehicle system standards.

Description

Motion control method and system for host vehicle in virtual simulation scene
Technical Field
The invention belongs to the technical field of automatic driving, and particularly relates to a method and a system for controlling movement of a host vehicle in a virtual simulation scene.
Background
In the fast-developing field of autonomous driving, efficient and accurate laser radar algorithms have become a core challenge. However, algorithm development based on actual road data involves expensive vehicle acquisition, time investment and human resource costs, while also having to meet strict vehicle safety standards. To solve this complex problem, a laser radar data generation and recharging method based on ROS has been developed, providing an efficient and economical solution for intelligent connected vehicle algorithm research.
Disclosure of Invention
In order to solve the problems, the invention provides a method and a system for controlling the movement of a host vehicle in a virtual simulation scene, which adopt an ROS robot system to realize a double data injection strategy, aim to effectively fuse virtual simulation environment and actual vehicle data and accurately control the movement of the host vehicle in the virtual simulation scene.
The technical scheme of the invention is as follows in combination with the accompanying drawings:
in a first aspect, the present invention provides a method for controlling motion of a host vehicle in a virtual simulation scene, including the following steps:
step one, simulating the physical operation principle of a laser radar in a virtual simulation environment by using the OptiX technology to generate virtual laser radar data;
step two, parsing the virtual laser radar data through ROS, sending it to the vehicle-mounted Ethernet board through UDP, and transmitting it to the domain controller (ECU) through the board;
step three, acquiring pose information of the virtual simulation vehicle through virtual laser radar data, transmitting the pose information of the vehicle to a six-degree-of-freedom platform, assembling an IMU on the six-degree-of-freedom platform to acquire real inertial navigation data of the virtual simulation vehicle, and injecting the real inertial navigation data into an ECU;
step four, fusing the real inertial navigation data with the laser radar data carrying actual environmental characteristics in the virtual simulation environment, integrating camera and GNSS real data in the virtual simulation scene, and generating a virtual sensor decision perception result to support the decision perception task;
step five, installing a Ubuntu system carrying the virtual simulation software platform in the virtual simulation server, and replaying the data collected by the vehicle on actual roads through the Rosbag data playback mechanism; recharging the collected actual sensor data to the ECU through the UDP communication protocol to generate decision and perception results from actual road collection;
step six, embedding the decision sensing result of the virtual sensor into a vehicle dynamics model to control the movement of a host vehicle in the virtual simulation scene;
and step seven, comparing the laser radar data level with a decision algorithm perception level, and evaluating the virtually generated laser radar data.
Furthermore, in step one, different types of laser radar need to be simulated, together with various weather conditions.
Further, the specific method of the first step is as follows:
11 Acquiring parameters of the laser radar in a virtual simulation environment;
the parameters of the laser radar comprise the detection distance d, the emitted laser power P_T, the wavelength λ of the emitted laser, the pulse width τ, the pulse repetition frequency f, the field of view Fov_h, the resolution Δd and the extinction coefficient γ;
wherein the detection distance d is calculated by the following formula:
d = c·Δt/2
wherein d is the detection distance; c is the speed of light; Δt is the time difference between emission of a laser pulse and reception of its echo;
describing the detection capability of the laser radar from the angle of energy, the detection capability is expressed by a laser radar acting distance equation, and the detection capability is shown as follows:
P_R = [P_T·G_T/(4πR²)]·[σ/(4πR²)]·(πD²/4)·η_Atm·η_Sys
wherein P_R is the received laser power; P_T is the emitted laser power; G_T is the transmit antenna gain; σ is the target scattering cross section; D is the receiving aperture; η_Atm is the single-pass atmospheric transmission coefficient; η_Sys is the transmission coefficient of the optical system of the laser radar; R is the distance to the target;
wherein the transmit antenna gain G_T is decomposed into:
G_T = 4π·K_a/θ_T²
wherein θ_T is the beamwidth of the emitted laser; λ is the wavelength of the emitted laser, set to 905 nm; K_a is the aperture light transmission constant;
therefore, the laser radar range equation is as follows:
P_R = P_T·K_a·σ·D²·η_Atm·η_Sys/(16·R⁴·θ_T²)
the maximum detectable distance R_max of the laser radar is determined as follows:
R_max = [P_T·K_a·σ·D²·η_Atm·η_Sys/(16·θ_T²·P_Rmin)]^(1/4)
wherein σ is the target scattering cross section, determining the reflection attribute of the object; P_Rmin is the minimum detectable power required by the laser radar system;
wherein the single-pass atmospheric transmission coefficient η_Atm is represented by the formula:
η_Atm = exp[−2γ(λ)R]
wherein γ is the atmospheric attenuation coefficient over the distance R from the transmitting end; the atmospheric attenuation coefficient is a function of wavelength and arises from two parts of the environment, one part being atmospheric gas molecules and the other part being atmospheric aerosols, namely:
γ(λ) = γ_molecules(λ) + γ_aerosol(λ)
wherein γ_molecules(λ) is the attenuation coefficient of atmospheric gas molecules; γ_aerosol(λ) is the atmospheric aerosol attenuation coefficient;
at a wavelength of 905 nm, the laser attenuation caused by the atmosphere simplifies to the attenuation caused by atmospheric aerosols:
γ(λ) ≈ γ_aerosol(λ)
wherein the atmospheric aerosol attenuation coefficient γ_aerosol(λ) is expressed as:
γ_aerosol(λ) = σ_α(λ) + k_α(λ)
wherein σ_α(λ) is the scattering coefficient of the aerosol; k_α(λ) is the absorption coefficient of the aerosol;
the combined effect of the scattering and absorption coefficients of the aerosol is expressed as an overall attenuation coefficient:
γ(λ) = γ_haze(λ) + γ_fog(λ) + γ_rain(λ) + γ_snow(λ)
wherein γ_haze(λ) is the extinction coefficient due to haze at wavelength λ; γ_fog(λ) is the extinction coefficient due to fog at wavelength λ; γ_rain(λ) is the extinction coefficient due to rain at wavelength λ; γ_snow(λ) is the extinction coefficient due to snow at wavelength λ;
12 Performing laser radar simulation by using an Nvidia OptiX ray tracing technology according to the acquired parameters of the laser radar to generate virtual laser radar data;
13 Receiving virtual laser radar data, analyzing and visualizing to obtain pose information of the virtual simulation vehicle.
Further, the specific method of the third step is as follows:
31 Mapping pose information of the virtual simulation vehicle to the six-degree-of-freedom turntable;
32 Mounting an IMU on the six-degree-of-freedom turntable to obtain real inertial navigation data.
Further, the pose information of the virtual simulation vehicle comprises a pitch angle, a roll angle and a course angle.
In a second aspect, the present invention further provides a motion control system for a host vehicle in a virtual simulation scene, including:
the data generation module is used for simulating the physical operation principle of the laser radar in a virtual simulation environment by using the OptiX technology to generate virtual laser radar data;
the data forwarding module is used for parsing the virtual laser radar data through ROS, sending it to the vehicle-mounted Ethernet board through UDP, and transmitting it to the domain controller (ECU) through the board;
the acquisition module is used for acquiring pose information of the virtual simulation vehicle through the virtual laser radar data, transmitting the pose information of the vehicle to the six-degree-of-freedom platform, assembling an IMU on the six-degree-of-freedom platform to acquire real inertial navigation data of the virtual simulation vehicle, and injecting the real inertial navigation data into the ECU;
the first generation module is used for fusing real inertial navigation data with laser radar data with actual environmental characteristics in the virtual simulation environment, integrating cameras and GNSS real data in the virtual simulation scene, and generating a virtual sensor decision sensing result so as to support a decision sensing task;
the second generation module is used for carrying a virtual simulation software platform, namely a Ubuntu system, in the virtual simulation server and playing back the data collected by the actual road of the vehicle through a Rosbag data playback mechanism; recharging the collected actual sensor data to the ECU through a UDP communication protocol to generate a decision and a perception result from actual road collection;
the control module is used for embedding the decision sensing result of the virtual sensor into the vehicle dynamics model so as to control the movement of the host vehicle in the virtual simulation scene;
and the evaluation module is used for comparing the laser radar data level with the decision algorithm perception level and evaluating the virtually generated laser radar data.
Further, the data generating module includes:
a laser transmitter module for simulating laser emission;
the control and direction module is used for accurately controlling the transmitting direction of the laser radar wire harness;
the motor movement module is used for realizing 360-degree omnidirectional vision of the laser radar wire harness;
the diversity module is used for simulating the configuration and the performance of different laser radar sensors;
the receiving simulation data module is used for receiving simulation data of the laser radar and analyzing and visualizing the simulation data;
further, the laser emitting module includes:
the wire harness number simulation module is used for simulating various laser radar models;
the ranging adaptability module is used for setting different ranging ranges;
and the rain and snow refraction module is used for simulating the refraction influence of raindrops or snowflakes on light rays in ray tracing.
Further, the control and direction module can adjust the scanning range of the wire harness according to the requirement.
The beneficial effects of the invention are as follows:
1) The invention not only can reduce the development cost, but also is beneficial to comprehensively evaluating the algorithm performance and the safety in a safe environment so as to meet the strict intelligent network connection automobile system standard;
2) According to the invention, the six-degree-of-freedom platform and the real laser radar data are used as the ground truth, so that the authenticity of the virtual laser radar point cloud data in the virtual simulation scene is continuously corrected, making its effect as close as possible to the real data;
3) The invention adopts the ROS robot operating system to realize a dual data injection strategy; by taking the decision-algorithm output driven by the injected actual sensor data as the true value, the effect of the decision-algorithm output driven by the virtual laser radar data can be evaluated.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for controlling the motion of a host vehicle in a virtual simulation scene according to the present invention;
fig. 2 is a schematic structural diagram of a motion control system of a host vehicle in a virtual simulation scene according to the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Example 1
Referring to fig. 1, the invention provides a method for controlling the motion of a host vehicle in a virtual simulation scene, which comprises the following steps:
step one, simulating the physical operation principle of a laser radar in a virtual simulation environment by using the OptiX technology to generate virtual laser radar data;
the static scene in the virtual simulation environment originates from high-precision maps of the real area, which maps are converted into static maps in the OpenDRIVE format and then imported into the virtual simulation environment. This step ensures the consistency of the static scene in the virtual simulation scene with the real high-precision map.
The specific method comprises the following steps:
11 Acquiring parameters of the laser radar in a virtual simulation environment;
the parameters of the laser radar include:
detection distance d (m): the farthest distance that the laser radar system can detect;
emitted laser power P_T (W): the intensity of the light emitted by the laser source;
wavelength λ (μm): the color or frequency of the laser, which affects its propagation characteristics in the atmosphere;
pulse width τ (ms): the duration of a laser pulse;
pulse repetition frequency f (Hz): the emission frequency of the laser pulses;
field of view Fov_h: the horizontal and vertical scanning ranges of the laser radar;
resolution Δd (m): the minimum distance interval measurable by the laser radar;
extinction coefficient γ (km⁻¹): the extinction effect of haze, rain, snow and the like in the atmosphere on the laser.
Wherein the detection distance d is calculated by the following formula:
d = c·Δt/2
wherein d is the detection distance; c is the speed of light; Δt is the time difference;
the time difference reflects the Time-of-Flight principle that the laser radar sensor mainly uses for distance detection: the time of flight is the time that elapses between the moment a laser beam is emitted from the laser transmitter and the moment it is reflected back to the receiver by a target.
Describing the detection capability of the laser radar from the angle of energy, the detection capability is expressed by a laser radar acting distance equation, and the detection capability is shown as follows:
P_R = [P_T·G_T/(4πR²)]·[σ/(4πR²)]·(πD²/4)·η_Atm·η_Sys
wherein P_R is the received laser power; P_T is the emitted laser power; G_T is the transmit antenna gain; σ is the target scattering cross section; D is the receiving aperture; η_Atm is the single-pass atmospheric transmission coefficient; η_Sys is the transmission coefficient of the optical system of the laser radar; R is the distance to the target;
wherein the transmit antenna gain G_T is decomposed into:
G_T = 4π·K_a/θ_T²
wherein θ_T is the beamwidth of the emitted laser; λ is the wavelength of the emitted laser, set to 905 nm; K_a is the aperture light transmission constant;
therefore, the laser radar range equation is as follows:
P_R = P_T·K_a·σ·D²·η_Atm·η_Sys/(16·R⁴·θ_T²)
the maximum detectable distance R_max of the laser radar is determined as follows:
R_max = [P_T·K_a·σ·D²·η_Atm·η_Sys/(16·θ_T²·P_Rmin)]^(1/4)
wherein σ is the target scattering cross section, determining the reflection attribute of the object; P_Rmin is the minimum detectable power required by the laser radar system;
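The range equation and the maximum detectable distance can be checked numerically with a short sketch (the parameter values used in testing are illustrative assumptions, not figures from the patent):

```python
# Sketch of the lidar range equation
#   P_R = P_T*K_a*sigma*D^2*eta_atm*eta_sys / (16 * R^4 * theta_T^2)
# and of the maximum detectable distance R_max obtained by solving
# it for R at the minimum detectable power P_Rmin.

def received_power(p_t, k_a, sigma, d_ap, eta_atm, eta_sys, theta_t, r):
    """Received laser power at range r (SI units throughout)."""
    return (p_t * k_a * sigma * d_ap ** 2 * eta_atm * eta_sys
            / (16.0 * r ** 4 * theta_t ** 2))

def max_range(p_t, k_a, sigma, d_ap, eta_atm, eta_sys, theta_t, p_rmin):
    """Maximum detectable distance: the range at which the received
    power drops to the system's minimum detectable power p_rmin."""
    return (p_t * k_a * sigma * d_ap ** 2 * eta_atm * eta_sys
            / (16.0 * p_rmin * theta_t ** 2)) ** 0.25
```

By construction, evaluating the received power at R_max returns exactly P_Rmin, which is a convenient self-consistency check for the two formulas.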
wherein the single-pass atmospheric transmission coefficient η_Atm is represented by the formula:
η_Atm = exp[−2γ(λ)R]
wherein γ is the atmospheric attenuation coefficient over the distance R from the transmitting end; the atmospheric attenuation coefficient is a function of wavelength and arises from two parts of the environment, one part being atmospheric gas molecules and the other part being atmospheric aerosols, namely:
γ(λ) = γ_molecules(λ) + γ_aerosol(λ)
wherein γ_molecules(λ) is the attenuation coefficient of atmospheric gas molecules; γ_aerosol(λ) is the atmospheric aerosol attenuation coefficient;
at a wavelength of 905 nm, the laser attenuation caused by the atmosphere simplifies to the attenuation caused by atmospheric aerosols:
γ(λ) ≈ γ_aerosol(λ)
wherein the atmospheric aerosol attenuation coefficient γ_aerosol(λ) is expressed as:
γ_aerosol(λ) = σ_α(λ) + k_α(λ)
wherein σ_α(λ) is the scattering coefficient of the aerosol; k_α(λ) is the absorption coefficient of the aerosol;
the combined effect of the scattering and absorption coefficients of the aerosol is expressed as an overall attenuation coefficient:
γ(λ) = γ_haze(λ) + γ_fog(λ) + γ_rain(λ) + γ_snow(λ)
wherein γ_haze(λ) is the extinction coefficient due to haze at wavelength λ; γ_fog(λ) is the extinction coefficient due to fog at wavelength λ; γ_rain(λ) is the extinction coefficient due to rain at wavelength λ; γ_snow(λ) is the extinction coefficient due to snow at wavelength λ;
12) Performing laser radar simulation in the virtual simulation environment by using the Nvidia OptiX ray tracing technology according to the acquired parameters of the laser radar, thereby generating virtual laser radar data;
13) Receiving the virtual laser radar data, then parsing and visualizing it to obtain the pose information of the virtual simulation vehicle;
lidar simulation effects performed in a Virtual Test Drive (VTD) environment. ROS1 (first generation of robot operating system) and ROS2 (second generation of robot operating system) can both parse RDB (Runtime Data Bus) data of the virtual simulation environment VTD, RDB (Runtime Data Bus) being a core component of the VTD for transferring data between different parts of the simulation. In the present invention, ROS2 is selected as the parsing RDB tool. One of the main reasons when choosing ROS2 over ROS1 is that ROS2 introduces a de-centralized node communication mechanism and uses DDS (Data Distribution Service) as a communication middleware. These improvements make it easier for ROS2 to establish multi-source heterogeneous communication connections with other devices.
The pose information of the vehicle comprises the following three parameters:
pitch angle (Pitch): it represents the angle of rotation about the X-axis, simulating the degree of fore-aft tilting of the vehicle.
Roll angle (Roll): this is the angle of rotation about the Y-axis, which is used to simulate the degree of tilting of the vehicle left and right.
Heading angle (Yaw): this is the angle of rotation about the Z-axis used to simulate the direction and extent of a vehicle turn.
Step two, parsing the virtual laser radar data through ROS, sending it to the vehicle-mounted Ethernet board through UDP, and transmitting it to the domain controller (ECU) through the board.
Step three, acquiring pose information of a virtual simulation vehicle through a virtual simulation environment, transmitting the pose information of the vehicle to a six-degree-of-freedom platform, and assembling an IMU on the six-degree-of-freedom platform to acquire real inertial navigation data of the virtual simulation vehicle;
31 Mapping pose information of the virtual simulation vehicle to the six-degree-of-freedom turntable;
the pose information of the virtual simulation vehicle comprises a pitch angle, a roll angle and a course angle.
The pose information of the vehicle is acquired from the virtual simulation environment and transmitted to the six-degree-of-freedom platform, and the pose information comprises the following three parameters:
pitch angle (Pitch): it represents the angle of rotation about the X-axis, simulating the degree of fore-aft tilting of the vehicle.
Roll angle (Roll): this is the angle of rotation about the Y-axis, which is used to simulate the degree of tilting of the vehicle left and right.
Heading angle (Yaw): this is the angle of rotation about the Z-axis used to simulate the direction and extent of a vehicle turn.
By installing an Inertial Measurement Unit (IMU) on a six-degree-of-freedom platform, the rotational motion of a simulated vehicle on the platform can be measured and recorded in real time, thereby obtaining the following real data:
pitch angle (Pitch): it indicates the rotation angle around the X-axis, reflecting the degree of forward and backward tilting of the vehicle.
Roll angle (Roll): this is the rotation angle around the Y-axis, reflecting the degree of tilting of the vehicle from side to side.
Heading angle (Yaw): this is the angle of rotation about the Z-axis, reflecting the direction and extent of the vehicle turn.
32 Mounting an IMU on the six-degree-of-freedom turntable to obtain real inertial navigation data.
Virtual simulation environments typically rely on modeling and physical simulation methods; although every effort can be made to improve their accuracy, a perfectly accurate level can never be achieved. Thus, the virtual pose data may be affected by model errors. In contrast, the IMU data from the actual six-degree-of-freedom platform are real and not subject to model errors.
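To illustrate how the three pose angles can drive the six-degree-of-freedom platform, the sketch below builds a rotation matrix using the axis convention stated above (pitch about X, roll about Y, yaw about Z); the composition order Rz·Ry·Rx is our own assumption, as the patent does not specify one:

```python
import math

def rotation_matrix(pitch, roll, yaw):
    """Rotation matrix built from the three pose angles (radians),
    using the convention stated in the text: pitch about X, roll
    about Y, yaw about Z, composed as Rz @ Ry @ Rx (an assumption)."""
    cx, sx = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(roll), math.sin(roll)
    cz, sz = math.cos(yaw), math.sin(yaw)
    rx = [[1, 0, 0], [0, cx, -sx], [0, sx, cx]]
    ry = [[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]
    rz = [[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3))
                 for j in range(3)] for i in range(3)]

    return matmul(rz, matmul(ry, rx))
```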
Step four, fusing the real inertial navigation data with the laser radar data carrying actual environmental characteristics in the virtual simulation environment, integrating camera and GNSS real data in the virtual simulation scene, and generating a virtual sensor decision perception result to support the decision perception task;
in the field of autopilot, decision-making algorithms are generally responsible for making decisions on the behavior of a host vehicle (i.e., an autopilot vehicle) based on perceived data and environmental information, including, but not limited to, acceleration, deceleration, steering, and lane change. The decision algorithm adopted in the invention for generating the decision perception result mainly refers to an open-source automatic driving framework, such as Autoware of Japan TIER IV company and Apollo of China hundred degree company. Autoware which is mainly used for laser radar and assisted by other sensors is deployed on the domain controller as a decision algorithm.
Step five, a Ubuntu system carrying the virtual simulation software platform is installed in the virtual simulation server, and the data collected by the vehicle on actual roads are replayed through the Rosbag data playback mechanism; the collected actual sensor data are recharged to the ECU through the UDP communication protocol to generate decision and perception results from actual road collection;
laser radar point cloud data collected from an actual vehicle is recharged into a domain controller by deploying a Rosbag data playback mechanism in the ROS system.
The real data recharging system plays the role of ground truth (the actual situation or accurate reference) here; the aim is to evaluate the quality of the virtually generated laser radar data by comparing the difference between the laser radar data generated in the virtual simulation environment and the actual host-vehicle behavior decisions recharged to the ECU.
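The patent does not specify a concrete comparison metric at the point-cloud level; as one hypothetical example, a mean nearest-neighbour distance between the virtual cloud and the recharged real cloud could serve:

```python
def mean_nearest_distance(cloud_a, cloud_b):
    """Illustrative quality metric (our own choice, not the patent's):
    mean Euclidean distance from each point in cloud_a to its nearest
    neighbour in cloud_b. Lower values mean the virtual cloud matches
    the recharged real cloud more closely. O(n*m), for small clouds."""
    total = 0.0
    for ax, ay, az in cloud_a:
        total += min(((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2) ** 0.5
                     for bx, by, bz in cloud_b)
    return total / len(cloud_a)
```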
Step six, embedding the decision sensing result of the virtual sensor into a vehicle dynamics model to control the movement of a host vehicle in the virtual simulation scene;
and step seven, comparing the laser radar data level with a decision algorithm perception level, and evaluating the virtually generated laser radar data.
Example two
Referring to fig. 2, the present embodiment provides a motion control system for a host vehicle in a virtual simulation scene, including:
the data generation module is used for simulating the physical operation principle of the laser radar in a virtual simulation environment by using the OptiX technology to generate virtual laser radar data;
the data generation module is deployed on the Ubuntu operating system and is used for realizing automatic driving virtual simulation software constructed based on the Optix technology. The software environment comprises VTD, CARLA, LGSVL, gazebo and other components, and can analyze laser radar data deployed on a host vehicle in virtual simulation software through ros.
The data generation module comprises:
a laser transmitter module for simulating laser emission; the laser emission module includes:
the wire harness number simulation module is used for simulating various laser radar models; the number of beams allows simulating the lidar types commonly found on the market, such as 16-, 32- and 64-line models. Different beam counts influence the angular resolution and occlusion behavior of laser radar scanning, thereby affecting the precision and accuracy of the data.
The ranging adaptability module is used for setting different ranging ranges; this is very useful in modeling obstacle detection and ranging capabilities at different distances. Different ranging ranges may simulate the behavior of a lidar in different scenarios.
And the rain and snow refraction module is used for simulating the refraction influence of raindrops or snowflakes on light rays in ray tracing. By calculating parameters such as the incident angle of the light and the refractive index of the medium, the refractive angle of the light at the intersection point of raindrops or snowflakes can be determined, so that the propagation path of the light in the environment is changed.
The angle of refraction may be calculated using Snell's Law, which describes how a light ray refracts as it passes from one medium into another. According to Snell's law, the relationship between the angle of incidence and the angle of refraction of a ray is expressed by the following formula:
n₁·sin θ₁ = n₂·sin θ₂
wherein n₁ and n₂ are the refractive indices of the two media (the ratio of the speed of light in vacuum to the speed of light in the medium), θ₁ is the angle of incidence, θ₂ is the angle of refraction, and sin denotes the sine function. With a known angle of incidence and the refractive indices of the two media, the refraction angle can be calculated from the above formula.
In addition to the angle of incidence and the refractive index of the medium, it may be necessary to consider parameters such as the physical properties of the raindrops or snowflakes, such as size, shape, density, etc., and the propagation speed of light in air. These parameters may influence the calculation of the refraction angle and thus the values of these parameters need to be taken into account when simulating the effect of raindrops or snowflakes on the light.
Once the angle of refraction is calculated, i.e., determined by snell's law, the new propagation direction of the ray at the point of intersection with the raindrop or snowflake can be accurately described. As a result of snell's law, a ray will change direction as it enters a different medium, causing its path to bend and no longer follow the original propagation path. This refraction effect causes the light to deflect in the presence of rain drops or snow. In the simulation, the propagation direction of the light can be updated accordingly to simulate the interaction effect between the light and the raindrops or snowflakes. This process helps to more accurately simulate the performance of a lidar in rainy and snowy weather conditions.
In this way, the performance of the lidar under severe weather conditions can be simulated more realistically.
The control and direction module is used to precisely control the emission direction of the laser radar beams, which is essential for simulating different scanning angles. The module provides setting options in the horizontal and vertical directions, so that the user can adjust the scanning range of the beams as needed. This capability is critical for generating data that simulates different terrains, scenes and traffic conditions.
The motor movement module is used to give the laser radar beams a 360-degree omnidirectional view: by controlling the movement of the motor, the laser radar rotates continuously in the horizontal direction and thus scans the surrounding environment in all directions. The motor speed setting, together with the beam firing rate, determines the data-acquisition resolution in the horizontal direction, i.e. the number of data points per revolution.
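For a spinning lidar, the relation between motor speed, firing rate and horizontal resolution can be sketched as follows; this is the standard relation for real rotating sensors, and the function names and example figures are illustrative.

```python
def points_per_revolution(rotation_hz, firings_per_second):
    """Number of returns one beam produces in a single 360-degree sweep."""
    return int(firings_per_second / rotation_hz)

def horizontal_resolution_deg(rotation_hz, firings_per_second):
    """Angular spacing between consecutive firings of one beam, in degrees."""
    return 360.0 * rotation_hz / firings_per_second

# Example: a 10 Hz motor with 18 000 firings per second per beam
# yields 1800 points per revolution spaced 0.2 degrees apart.
```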
The diversity module is used to simulate the configuration and performance of different laser radar sensors so as to meet different simulation requirements. Each data packet can be designed to contain a different number of longitudinal data sets; for example, the user may configure 24 sets, 32 sets, etc. of longitudinal data according to actual needs. This diversity makes it possible to reproduce the configuration and performance of different lidar sensors. Besides basic information such as position coordinates and time stamps, each data packet can also carry multidimensional attributes: for example, reflectivity and energy intensity can be added to the packet to simulate more accurately the information perceived by a lidar in the real world, making the generated data more realistic. For data transmission, different communication protocols such as TCP and UDP can be selected according to requirements.
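A data packet carrying such multidimensional attributes could be laid out as in the following sketch; the field set (x, y, z, reflectivity, intensity, timestamp) and the binary layout are illustrative assumptions rather than the patent's actual format.

```python
import struct

# Hypothetical per-point layout: x, y, z, reflectivity, intensity as
# little-endian float32, followed by a float64 timestamp (28 bytes total).
POINT_FMT = "<5fd"

def pack_point(x, y, z, reflectivity, intensity, stamp):
    """Serialize one simulated lidar return into bytes for transmission."""
    return struct.pack(POINT_FMT, x, y, z, reflectivity, intensity, stamp)

def unpack_point(buf):
    """Recover the tuple (x, y, z, reflectivity, intensity, stamp)."""
    return struct.unpack(POINT_FMT, buf)
```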
The simulation data receiving module is used to receive, parse and visualize the lidar simulation data. A dedicated node is created in ROS to parse and publish the simulated lidar data, and the data are subscribed to and visualized with the RViz tool. At the same time, UDP/TCP communication is implemented in the lidar simulation program so that data can be sent over the network to the host computer and to the six-degree-of-freedom platform: the generated lidar data and the host-vehicle state are packed into a data packet and sent to a designated IP address and port using the selected communication mode (UDP or TCP). On the host-computer side, a receiving program created in LabVIEW parses the packets sent by the lidar simulation program and extracts the lidar data and the host-vehicle state information. For the six-degree-of-freedom platform, the pose information in the host-vehicle state is parsed so that the platform can accurately reproduce, in the real world, the pitch, roll and yaw changes of the host vehicle in the virtual simulation environment.
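The packet transmission described above can be sketched with standard sockets; the address, port and payload below are placeholders, and the real program would send the packed lidar/vehicle-state packet instead.

```python
import socket

def send_packet(payload: bytes, ip: str, port: int, use_tcp: bool = False):
    """Send one data packet to the given IP address and port over the
    selected communication mode (UDP by default, TCP on request)."""
    if use_tcp:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.connect((ip, port))
            s.sendall(payload)
    else:
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.sendto(payload, (ip, port))
```

UDP fits the high-rate, loss-tolerant lidar stream, while TCP suits configuration or state messages that must arrive intact.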
The data forwarding module is used to parse the virtual laser radar data through ROS, send them via UDP to the vehicle-mounted Ethernet board, and transmit them through the board to the domain controller (ECU).
The forwarding hardware consists of a National Instruments (NI) PXI chassis, an NI controller and a vehicle-mounted Ethernet board; its main purpose is to process and convert the incoming data into a standardized format that the domain controller can receive and process.
The acquisition module is used to obtain the pose information of the virtual simulation vehicle from the virtual laser radar data and transmit it to the six-degree-of-freedom platform; an IMU mounted on the platform acquires real inertial navigation data of the virtual simulation vehicle, which are injected into the ECU.
In order to reproduce physically the pitch, roll and yaw of the host vehicle in the virtual simulation environment, and to ensure that the IMU acquires data reflected by real hardware, a corresponding control program is implemented on the six-degree-of-freedom turntable. The control program receives the host-vehicle state information, including pitch, roll and yaw angle data, from the virtual simulation environment. By parsing these data, it adjusts the attitude of the six-degree-of-freedom turntable accordingly to reproduce the attitude changes of the host vehicle in the virtual simulation environment. Real-time attitude control ensures that the turntable accurately reproduces the motion state of the host vehicle. By mapping the host-vehicle attitude information from the virtual environment onto the physical six-degree-of-freedom turntable, high consistency with the virtual simulation environment is achieved, so that the IMU acquires real data related to actual motion.
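One piece of such a control program is mapping the received pitch/roll/yaw onto the turntable's travel range; the limit values below are illustrative assumptions, as the real limits come from the platform's hardware specification.

```python
# Illustrative travel limits of a six-degree-of-freedom turntable (degrees);
# the actual limits must be taken from the platform datasheet.
LIMITS = {"pitch": 25.0, "roll": 25.0, "yaw": 30.0}

def map_pose_to_platform(pitch, roll, yaw):
    """Clamp the host-vehicle attitude received from the virtual
    environment so that every commanded pose is reachable by the hardware."""
    cmd = {}
    for name, value in (("pitch", pitch), ("roll", roll), ("yaw", yaw)):
        limit = LIMITS[name]
        cmd[name] = max(-limit, min(limit, float(value)))
    return cmd
```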
The first generation module is used to fuse the real inertial navigation data with the lidar data carrying actual environmental characteristics in the virtual simulation environment, integrate the camera and GNSS real data in the virtual simulation scene, and generate virtual-sensor decision-perception results to support decision-perception tasks.
The second generation module is used to run a virtual simulation software platform, namely a Ubuntu system, in the virtual simulation server and to replay the data collected on actual roads by the vehicle through the rosbag data playback mechanism; the collected real sensor data are recharged into the ECU through the UDP communication protocol to generate decision and perception results derived from actual road acquisition. This module does not need to acquire the pose information of the host vehicle.
The control module is used to embed the virtual-sensor decision-perception results into the vehicle dynamics model so as to control the motion of the host vehicle in the virtual simulation scene.
The deployment of the CarSim vehicle dynamics model is controlled by means of the MATLAB/Simulink tools, so that the decision-perception results of the generation module can be embedded into the vehicle dynamics model. The vehicle dynamics model maps these embedded results onto the host-vehicle object in the virtual simulation software environment to effectively control the motion behavior of the host vehicle.
For the communication between the generation module and the control module, an effective connection between the two different systems and platforms is established by setting the same ROS_DOMAIN_ID on the same local area network. This enables the vehicle dynamics unit to subscribe successfully to the perception-result topics published by the generation module, achieving organic collaboration between decision perception and the vehicle dynamics model.
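In ROS 2, the shared domain is set through the environment before any node starts; the value 42 below is an arbitrary illustration, and both machines on the LAN must export the same value.

```python
import os

# Every ROS 2 node launched from this process (and any shell exporting the
# same variable) joins DDS domain 42, so the vehicle dynamics unit can
# discover the perception topics published by the generation module.
os.environ["ROS_DOMAIN_ID"] = "42"
```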
The evaluation module is used to compare results at the lidar data level and at the decision-algorithm perception level, and thereby evaluate the virtually generated lidar data.
In summary, the present invention employs the ROS robot operating system to implement a dual data injection strategy aimed at comparing the virtual simulation environment with actual vehicle data. First, the system acquires lidar data from the virtual simulation environment in real time and injects them into the decision algorithm, providing it with perception information carrying virtual rain and snow effects. Through the rosbag mechanism, the system can also replay data collected by the actual vehicle and inject those data into the decision algorithm, realizing verification against real-scene data. Meanwhile, a six-degree-of-freedom platform introduced into the system maps the attitude of the host vehicle in the virtual simulation environment onto the physical environment in real time, so that the IMU device can sense the pose of the host vehicle in the virtual simulation software as reflected by the platform. This pose information is then used as input to the decision algorithm, making the decision output approximate the actual vehicle state more closely. The dual injection mode greatly facilitates the comparison of actual data with virtual simulation data and provides a basis for system evaluation and verification, so that the motion of the host vehicle can be controlled more accurately.
Meanwhile, in the lidar data simulation module, it must be ensured that the simulated laser beams and the rotation range of the virtual motor are consistent with the parameters of the lidar used when the actual vehicle collected its data. In addition, the transmission frequency of the lidar data must also be kept consistent. This consistency is designed so that the lidar perception results obtained when real data are recharged and when virtual simulation data are injected are comparable. By comparing and analyzing the two sets of results, the performance of the system in the virtual simulation and in the actual environment can be estimated accurately, providing a more reliable reference for algorithm verification and performance analysis.
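This consistency requirement can be sketched as a configuration comparison; all field names and values below are hypothetical stand-ins for the real sensor's datasheet parameters.

```python
# Hypothetical record of the lidar used for real-road data collection.
REAL_LIDAR = {"beams": 32, "rotation_hz": 10, "packet_hz": 10, "h_fov_deg": 360}

def inconsistent_parameters(sim_cfg, real_cfg=REAL_LIDAR):
    """List the parameters on which the simulated lidar deviates from the
    real sensor; an empty list means recharge and injection results are
    comparable on equal terms."""
    return [k for k in real_cfg if sim_cfg.get(k) != real_cfg[k]]
```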
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (9)

1. A motion control method of a host vehicle in a virtual simulation scene, characterized by comprising the following steps:
step one, simulating the physical operating principle of a laser radar in a virtual simulation environment by using the OptiX technology to generate virtual laser radar data;
step two, parsing the virtual laser radar data through the ROS, sending the virtual laser radar data to the vehicle-mounted Ethernet board through UDP, and transmitting the virtual laser radar data to the domain controller (ECU) through the board;
step three, acquiring pose information of the virtual simulation vehicle through virtual laser radar data, transmitting the pose information of the vehicle to a six-degree-of-freedom platform, assembling an IMU on the six-degree-of-freedom platform to acquire real inertial navigation data of the virtual simulation vehicle, and injecting the real inertial navigation data into an ECU;
step four, merging the real inertial navigation data with the laser radar data with the actual environmental characteristics in the virtual simulation environment, integrating the camera and the GNSS real data in the virtual simulation scene, and generating a virtual sensor decision sensing result so as to support a decision sensing task;
step five, mounting a virtual simulation software platform, namely a Ubuntu system, in the virtual simulation server, and replaying data collected on actual roads by the vehicle through the rosbag data playback mechanism; recharging the collected actual sensor data into the ECU through the UDP communication protocol to generate decision and perception results from actual road acquisition;
step six, embedding the decision sensing result of the virtual sensor into a vehicle dynamics model to control the movement of a host vehicle in the virtual simulation scene;
and step seven, comparing the laser radar data level with a decision algorithm perception level, and evaluating the virtually generated laser radar data.
2. The method for controlling the movement of a host vehicle in a virtual simulation scene according to claim 1, wherein in the first step, when simulating the laser radar, different types of laser radar and various weather conditions are simulated.
3. The method for controlling the motion of a host vehicle in a virtual simulation scene according to claim 1, wherein the specific method in the first step is as follows:
11 Acquiring parameters of the laser radar in a virtual simulation environment;
the parameters of the laser radar comprise detection distance d and emitted laser power P T Wavelength lambda of the emitted laser, pulse width tau, pulse repetition frequency f, angle of view Fov h Resolution Δd and extinction coefficient γ;
wherein the detection distance d is calculated by the following formula:
d = c·Δt/2
wherein d is the detection distance; c is the speed of light; Δt is the time difference between emission and reception of the laser pulse;
the detection capability of the laser radar is described from the viewpoint of energy by the laser radar range equation, shown as follows:
P_R = P_T · G_T/(4πR²) · σ/(4πR²) · (πD²/4) · η_Atm · η_Sys
wherein P_R is the received laser power; P_T is the emitted laser power; G_T is the transmit antenna gain; R is the distance to the target; σ is the target scattering cross-section; D is the receiving aperture; η_Atm is the single-pass atmospheric transmission coefficient; η_Sys is the transmission coefficient of the optical system of the laser radar;
wherein the transmit antenna gain G_T is decomposed in terms of θ_T, the beamwidth of the emitted laser; λ, the wavelength of the emitted laser, set to 905 nm; and K_a, the aperture light-transmission constant;
the complete laser radar range equation is then obtained by substituting the transmit antenna gain G_T into the expression for the received power P_R;
the maximum detectable distance R_max of the laser radar is determined by setting the received power equal to the minimum detectable power:
R_max = [P_T · G_T · σ · D² · η_Atm · η_Sys / (64π · P_Rmin)]^(1/4)
wherein σ is the target scattering cross-section, which determines the target reflection attribute; P_Rmin is the minimum power required for detection by the laser radar system;
wherein the single-pass atmospheric transmission coefficient η_Atm is represented by the formula:
η_Atm = exp[-2γ(λ)R]
wherein γ is the atmospheric attenuation coefficient at distance R from the transmitting end; the atmospheric attenuation coefficient is a function of wavelength and arises from two components of the environment, atmospheric gas molecules and atmospheric aerosols, namely:
γ(λ) = γ_molecules(λ) + γ_aerosol(λ)
wherein γ_molecules(λ) is the attenuation coefficient of atmospheric gas molecules; γ_aerosol(λ) is the atmospheric aerosol attenuation coefficient;
when the wavelength is 905 nm, the laser attenuation caused by the atmosphere simplifies to the attenuation caused by atmospheric aerosols:
γ(λ) ≈ γ_aerosol(λ)
wherein the atmospheric aerosol attenuation coefficient γ_aerosol(λ) is expressed as:
γ_aerosol(λ) = σ_α(λ) + k_α(λ)
wherein σ_α(λ) is the aerosol scattering coefficient; k_α(λ) is the aerosol absorption coefficient;
the effects of the aerosol scattering and absorption coefficients are combined into the attenuation coefficient, expressed as:
γ(λ) = γ_haze(λ) + γ_fog(λ) + γ_rain(λ) + γ_snow(λ)
wherein γ_haze(λ) is the extinction coefficient due to haze at wavelength λ; γ_fog(λ) is the extinction coefficient due to fog at wavelength λ; γ_rain(λ) is the extinction coefficient due to rain at wavelength λ; γ_snow(λ) is the extinction coefficient due to snow at wavelength λ;
12 Performing laser radar simulation by using an Nvidia OptiX ray tracing technology according to the acquired parameters of the laser radar to generate virtual laser radar data;
13 Receiving virtual laser radar data, analyzing and visualizing to obtain pose information of the virtual simulation vehicle.
4. The method for controlling the motion of a host vehicle in a virtual simulation scene according to claim 3, wherein the specific method in the third step is as follows:
31 Mapping pose information of the virtual simulation vehicle to the six-degree-of-freedom turntable;
32 Mounting an IMU on the six-degree-of-freedom turntable to obtain real inertial navigation data.
5. A method for controlling movement of a host vehicle in a virtual simulation scene according to claim 3, wherein the pose information of the virtual simulation vehicle includes a pitch angle, a roll angle and a heading angle.
6. A motion control system for a host vehicle in a virtual simulation scene, comprising:
the data generation module is used for simulating a physical operation principle of the laser radar in a virtual simulation environment by using an Optix technology to generate virtual laser radar data;
the data forwarding module is used for analyzing the virtual laser radar data through the ROS, sending the virtual laser radar data to the vehicle-mounted Ethernet board through the UDP, and transmitting the virtual laser radar data to the domain controller (ECU) through the board;
the acquisition module is used for acquiring pose information of the virtual simulation vehicle through the virtual laser radar data, transmitting the pose information of the vehicle to the six-degree-of-freedom platform, assembling an IMU on the six-degree-of-freedom platform to acquire real inertial navigation data of the virtual simulation vehicle, and injecting the real inertial navigation data into the ECU;
the first generation module is used for fusing real inertial navigation data with laser radar data with actual environmental characteristics in the virtual simulation environment, integrating cameras and GNSS real data in the virtual simulation scene, and generating a virtual sensor decision sensing result so as to support a decision sensing task;
the second generation module is used for carrying a virtual simulation software platform, namely a Ubuntu system, in the virtual simulation server and playing back the data collected by the actual road of the vehicle through a Rosbag data playback mechanism; recharging the collected actual sensor data to the ECU through a UDP communication protocol to generate a decision and a perception result from actual road collection;
the control module is used for embedding the decision sensing result of the virtual sensor into the vehicle dynamics model so as to control the movement of the host vehicle in the virtual simulation scene;
and the evaluation module is used for comparing the laser radar data level with the decision algorithm perception level and evaluating the virtually generated laser radar data.
7. The motion control system of a host vehicle in a virtual simulation scenario of claim 6, wherein the data generation module comprises:
a laser transmitter module for simulating laser emission;
the control and direction module is used for accurately controlling the emission direction of the laser radar beams;
the motor movement module is used for realizing a 360-degree omnidirectional view of the laser radar beams;
the diversity module is used for simulating the configuration and the performance of different laser radar sensors;
and the simulation data receiving module is used for receiving, parsing and visualizing the simulation data of the laser radar.
8. The motion control system of a host vehicle in a virtual simulation scene according to claim 7, wherein the laser emitting module comprises:
the beam number simulation module is used for simulating various laser radar models;
the ranging adaptability module is used for setting different ranging ranges;
and the rain and snow refraction module is used for simulating the refraction influence of raindrops or snowflakes on light rays in ray tracing.
9. The motion control system of a host vehicle in a virtual simulation scene according to claim 7, wherein the control and direction module can adjust the scanning range of the beams as required.
CN202311386135.1A 2023-10-25 2023-10-25 Motion control method and system for host vehicle in virtual simulation scene Pending CN117270506A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311386135.1A CN117270506A (en) 2023-10-25 2023-10-25 Motion control method and system for host vehicle in virtual simulation scene


Publications (1)

Publication Number Publication Date
CN117270506A true CN117270506A (en) 2023-12-22

Family

ID=89215986


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118050697A (en) * 2024-04-16 2024-05-17 中国电子科技集团公司第十四研究所 Space-based air detection flow verification method based on simulator



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination