CN111752261A - Automatic driving test platform based on autonomous driving robot - Google Patents


Info

Publication number
CN111752261A
CN111752261A (application number CN202010673172.0A; granted publication CN111752261B)
Authority
CN
China
Prior art keywords
driving
robot
simulation
decision
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010673172.0A
Other languages
Chinese (zh)
Other versions
CN111752261B (en)
Inventor
雷银
刘富强
吴迪
Current Assignee
Tongji University
Original Assignee
Tongji University
Priority date
Filing date
Publication date
Application filed by Tongji University
Priority to CN202010673172.0A
Publication of CN111752261A
Application granted
Publication of CN111752261B
Anticipated expiration
Legal status: Active

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00 - Testing or monitoring of control systems or parts thereof
    • G05B23/02 - Electric testing or monitoring
    • G05B23/0205 - Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults
    • G05B23/0208 - Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults, characterized by the configuration of the monitoring system
    • G05B23/0213 - Modular or universal configuration of the monitoring system, e.g. monitoring system having modules that may be combined to build a monitoring program; monitoring system that can be applied to legacy systems; adaptable monitoring system; using different communication protocols
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B17/00 - Systems involving the use of models or simulators of said systems
    • G05B17/02 - Systems involving the use of models or simulators of said systems electric

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides an automatic driving test platform based on an autonomous driving robot, comprising: an automobile driving simulator having a driver seat, a display screen and a simulated driving mechanism arranged in front of the driver seat, and a computer communicatively connected to the display screen and the simulated driving mechanism; and a driving robot that sits on the driver seat and operates the simulated driving mechanism, having a robot body matched to the driver seat, a binocular camera and a driving operation mechanism mounted on the robot body, and a central controller communicatively connected to the binocular camera, the driving operation mechanism and the computer. The automatic driving test platform is low in cost and can reproduce the various application scenarios required in unmanned-vehicle testing.

Description

Automatic driving test platform based on autonomous driving robot
Technical Field
The invention belongs to the field of unmanned driving, relates to optimization and testing of unmanned driving algorithms, and particularly relates to an automatic driving test platform based on an autonomous driving robot.
Background
In the unmanned-vehicle field, every step of technical development must be validated to ensure personal safety. At present, the development of unmanned vehicles faces two bottlenecks: first, official requirements on the minimum test mileage of a vehicle; second, the driving decision model must be tested and verified against massive data covering different driving scenarios. Research shows that for unmanned driving to be proven as safe as a human driver, billions of miles of test mileage are required. However, even under the most optimistic assumptions, existing driverless vehicles would need decades or even centuries to complete such a mileage test, so completing it on real roads in the short term is an impossible task. In general, real-road testing is expensive and time-consuming, so virtual testing is a necessary route, although the validity of the virtual environment and of its test results still needs to be demonstrated and improved.
Software-to-hardware simulation, when reasonably modeled, gives companies the possibility of testing and verifying their vehicle models across a wide variety of application scenarios, including traffic information, driver behavior, weather and road environment. Google, Tesla, Zoox and a growing number of other companies have attempted to reach billions of test miles as quickly as possible by means of simulation. Today, products such as Vires, TASS PreScan, CarSim, Oktal SCANeR and ROS/Gazebo allow engineers to simulate sensors, their generation mechanisms and mechanical structures. Despite their respective strengths, these tools neglect areas of critical importance for simulation, including oversimplified sensor outputs and a limited understanding of how environmental complexity affects the autonomous driving model. In addition, such driving-scene simulation software is not only expensive, but its simulated scenes also differ considerably from the real environment, which makes it difficult to simulate the external perception of most sensors. Moreover, if problems go undiscovered in simulation, the unmanned vehicle may later cause traffic accidents and other incidents that endanger life.
As described above, compared with a conventional automobile, because of the complexity of an unmanned automobile's systems, the vehicle must undergo not only the laboratory simulations and proving-ground tests applied to conventional automobiles, but also massive road testing across various scenarios to train and verify its autonomous driving ability so as to meet safety requirements. At present, the main technical means for testing unmanned automobiles are intelligent connected-vehicle test fields and virtual simulation. Many countries around the world have built intelligent connected-vehicle test fields to serve the development of intelligent connected vehicles. Foreign test fields currently in operation include M-City in Ann Arbor (United States), Willow Run (United States), the European ITS Corridor, the AstaZero proving ground in Sweden, and Tsukuba Science City in Japan. The construction cost of a fully equipped cooperative intelligent connected-vehicle test field with a complete communication test environment integrating various test scenarios under all working conditions can exceed hundreds of millions of yuan. As for virtual simulation, besides the disadvantage of insufficient environmental fidelity, software licensing costs exceed the million-yuan level.
Disclosure of Invention
In order to solve the above problems, the invention provides an automatic driving simulation platform that is low in cost while providing a high-fidelity test environment for the various application scenarios required in vehicle testing, and adopts the following technical scheme:
the invention provides an automatic driving test platform based on an autonomous driving robot, which is characterized by comprising the following components: the automobile driving simulator is provided with a driver seat, a display screen and a driving simulation mechanism which are arranged in front of the driver seat, and a computer which is respectively in communication connection with the display screen and the driving simulation mechanism; and a driving robot which sits on the driver seat and operates the simulated driving mechanism, and has a robot body matched with the driver seat, a binocular camera and a driving operation mechanism which are arranged on the robot body, and a central controller which is respectively connected with the camera, the driving operation mechanism and the computer in a communication way, wherein the computer has a simulation engine storage part and a driving simulation picture generation control part, the central controller has a driving model storage part, a control instruction generation part, a driving control part, a driving simulation data acquisition storage part, a decision model iteration part and a controller communication part, the simulation engine storage part stores various high-simulation racing game engines, the driving model storage part stores a driving decision model for generating driving decision information which can make a decision on the driving operation of the vehicle and a vehicle control model for generating a corresponding robot control instruction according to the driving decision, the driving simulation picture generation control part generates a driving scene image simulating a virtual driving environment based on a high-simulation racing game engine and controls a display screen to display correspondingly, the binocular camera shoots the driving scene image displayed on the display screen in real time and outputs a pair of screen shot images shot by a pair of binoculars to a central controller in 
real time, once the controller communication part receives the screen shot images, the control instruction generation part generates corresponding driving decision information based on the screen shot images and a driving decision model in real time and inputs the driving decision information into a vehicle control model to obtain a corresponding robot control instruction, and the driving control part controls a driving operation mechanism to perform simulated driving operation on a simulated driving mechanism in real time based on the robot control instruction so that the simulated driving mechanism sends driving simulation information corresponding to the simulated driving operation to a computer, and the driving simulation data acquisition and storage part is used for at least acquiring all the robot control instructions generated by the control instruction generation part and corresponding screen shot images as driving simulation data and correspondingly storing the driving simulation data, and the decision model iteration part is used for iteratively updating the driving decision model according to all the driving simulation data stored in the driving simulation data acquisition and storage part.
The autonomous driving robot-based automatic driving test platform provided by the invention may further have the technical feature that the computer also has a driving score index generation part and the central controller also has a model verification part. The driving score index generation part generates corresponding score indexes based on game results that the high-simulation racing game engine produces from the driving simulation information, according to a predetermined scoring method, and the model verification part verifies the driving decision model and the vehicle control model according to the score indexes and outputs model evaluation results for evaluating the quality of the models.
The autonomous driving robot-based automatic driving test platform provided by the invention may further have the technical feature that the driving decision model includes a decision generation module and a decision correction module, and the control instruction generation part includes: a screen image preprocessing unit, which preprocesses the pair of screen shot images in real time to form an environment image to be input corresponding to the virtual driving environment, and recognizes from the screen shot images the state parameters of the virtual vehicle generated by the high-simulation racing game engine as vehicle state data; a preliminary decision generation unit, which inputs the environment image to be input into the decision generation module of the driving decision model to generate preliminary decision information; a driving decision correction unit, which inputs the preliminary decision information and the vehicle state data into the decision correction module to output the driving decision information; and an instruction generation unit, which inputs the driving decision information into the vehicle control model to generate the robot control instruction.
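The pipeline described above (preprocess the screen images, generate a preliminary decision, correct it with the vehicle state, then convert it into a robot control instruction) can be sketched as follows. This is a minimal, illustrative Python sketch rather than the patent's implementation; the function names, the speed-limit threshold and the 540° steering-lock figure are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    speed_kmh: float      # e.g. read from the on-screen speedometer
    heading_deg: float    # e.g. derived from the mini-map

def generate_preliminary_decision(env_image):
    # Stand-in for the decision generation module (a CNN in the patent);
    # here it just returns a fixed (steer, throttle) pair for illustration.
    return {"steer": 0.1, "throttle": 0.8}

def correct_decision(decision, state: VehicleState, speed_limit_kmh=120.0):
    # Decision correction module: cut the throttle when the recognized
    # speed already exceeds a limit, leaving the steering untouched.
    if state.speed_kmh >= speed_limit_kmh:
        decision = {**decision, "throttle": 0.0}
    return decision

def to_robot_command(decision):
    # Vehicle control model: map abstract decision values onto actuator
    # targets (steering-wheel angle, pedal position) for the manipulators.
    return {
        "steering_wheel_deg": decision["steer"] * 540.0,  # assumed 1.5-turn lock
        "accelerator_pedal": decision["throttle"],
    }

state = VehicleState(speed_kmh=130.0, heading_deg=0.0)
cmd = to_robot_command(correct_decision(generate_preliminary_decision(None), state))
print(cmd)  # throttle is cut to 0.0 because the vehicle is over the speed limit
```

The key point of the sketch is the separation of concerns: the preliminary decision depends only on the environment image, while the correction step is where recognized vehicle state data (speed, heading) modifies the decision before it is turned into an actuator command.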
The autonomous driving robot-based automatic driving test platform provided by the invention may further have the technical feature that the driving robot can also sit on the real driver seat of a vehicle under test and operate the driving mechanism of that vehicle. The driving robot further includes a monocular camera, a high-precision positioning system and a pose sensor. The monocular camera shoots the instrument panel of the vehicle under test and outputs the captured instrument shot images to the central controller in real time; the high-precision positioning system locates the position of the driving robot and generates corresponding high-precision positioning information in real time; the pose sensor detects and acquires the pose information of the vehicle under test; and the binocular camera additionally shoots the road scene in which the vehicle under test is running and outputs the pair of live-action shot images, one per eye, to the central controller in real time. The control instruction generation part further includes a live-action image preprocessing unit, which preprocesses the pair of live-action shot images, the instrument shot images, the high-precision positioning information and the pose information in real time, forms an environment image to be input corresponding to the actual driving environment from the live-action shot images, recognizes the instrument information in the instrument shot images, and then uses the instrument information, the high-precision positioning information and the pose information as the vehicle state data of the vehicle under test.
The autonomous driving robot-based automatic driving test platform provided by the invention may further have the technical feature that the simulated driving mechanism at least comprises a steering wheel, a gear lever and accelerator/brake pedals, and the driving operation mechanism at least comprises a steering manipulator for operating the steering wheel, a gear-shifting manipulator for operating the gear lever and mechanical legs for operating the accelerator and brake pedals.
The autonomous driving robot-based automatic driving test platform provided by the invention may further have the technical feature that the virtual driving environment includes one or more of the following effects: dynamic weather, day-night cycle, vehicle body dirt and lens flare.
Action and Effect of the invention
According to the autonomous driving robot-based automatic driving test platform, the automobile driving simulator generates driving scene images simulating a virtual driving environment with a high-simulation racing game engine and displays them on the display screen; the driving robot processes the screen shot images captured by the binocular camera through the driving decision model and the vehicle control model to form corresponding robot control instructions, and then operates the simulated driving mechanism of the automobile driving simulator according to those instructions. The driving decision model in the driving robot can thus be iteratively trained and verified in simulation, strengthening the perception and decision-making capability of the model in different scenarios. Combining the driving robot with the driving simulator realizes a complete chain of virtual driving behavior, from the high-fidelity simulated environment produced by the simulator to the robot driving a vehicle within that virtual environment. Through this automatic driving simulation platform, a large amount of initial training of the unmanned driving algorithm can be carried out with the game engine, reducing the cost of real-vehicle tests and of purchasing professional software, helping enterprises of all kinds build unmanned driving algorithms for various vehicles, and greatly saving the time and cost required to do so.
Furthermore, the driving robot has a robot body matched to the driver seat, on which a driving operation mechanism formed by a gear-shifting manipulator, a steering manipulator, and accelerator and brake mechanical legs is installed. The driving robot can therefore be installed in the cab without damage and without modifying the vehicle or the simulated environment, and can imitate a human driver to drive the vehicle automatically in severe conditions, dangerous environments or virtual environments, allowing enterprises or individuals to verify and test unmanned driving algorithms with the driving robot.
Drawings
FIG. 1 is a block diagram of an autonomous driving robot-based automatic driving test platform according to an embodiment of the present invention;
FIG. 2 is a structural diagram of an autonomous driving robot-based automatic driving test platform according to an embodiment of the invention;
FIG. 3 is a block diagram of a computer in an embodiment of the invention;
FIG. 4 is a schematic view of an image of a driving scene in an embodiment of the present invention;
FIG. 5 is a block diagram of a central controller according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of an end-to-end unmanned configuration in an embodiment of the present invention;
FIG. 7 is a flow chart of a simulated driving process in an embodiment of the present invention;
FIG. 8 is a flow chart of an actual vehicle driving process in an embodiment of the present invention; and
FIG. 9 is a schematic diagram of the fusion of the driving models in the embodiment of the present invention.
Detailed Description
In order to make the technical means, creative features, objectives and effects of the invention easy to understand, the autonomous driving robot-based automatic driving test platform of the invention is described in detail below with reference to the embodiments and accompanying drawings.
<Embodiment One>
Fig. 1 is a block diagram of an autonomous driving robot-based automatic driving test platform according to an embodiment of the present invention, and Fig. 2 is a schematic structural diagram of the same platform.
As shown in fig. 1 and 2, the autonomous driving robot-based automatic driving test platform 100 includes an automobile driving simulator 101, a driving robot 102, and a communication network 103.
The communication network 103 is a 5G wireless network, and the automobile driving simulator 101 is in communication connection with the driving robot 102 through the communication network 103.
The automobile driving simulator 101 is an automobile game simulation platform integrating a high-performance PC, driving operation components (a steering wheel, a gear lever, a seat, pedals and the like) and a display screen. In the present embodiment, the automobile driving simulator 101 includes a simulator frame 11, a driver seat 12, a display screen 13, a simulated driving mechanism 14 and a computer 15.
The simulator frame 11 is used to fix all hardware components of the automobile driving simulator 101, and the simulator frame 11 is a conventional plastic or metal frame.
A driver seat 12 is provided on the simulator frame 11 for the driving robot 102 to sit on. In the present embodiment, the configuration of the driver seat 12 is the same as that of a typical automobile driver seat.
The display screen 13 is disposed in front of and facing the driver seat 12, so that the driving robot 102 faces the display screen 13 when seated on the driver seat 12.
The simulated driving mechanism 14 is provided around the driver seat 12 and includes simulated driving components such as a steering wheel, a gear lever and pedals (i.e., an accelerator pedal, a brake pedal and a clutch pedal), and the arrangement positions of these simulated driving components match those in an actual vehicle (hereinafter referred to as a real vehicle).
In this embodiment, when each simulated driving mechanism is operated, corresponding driving simulation information, such as steering information, braking information, acceleration information, and the like, is generated. The simulated driving means 14 outputs the generated driving simulation information to the computer 15 in real time, so that the computer 15 simulates the behavior of the vehicle based on the driving simulation information.
Fig. 3 is a block diagram of a computer in the embodiment of the present invention.
As shown in fig. 3, the computer 15 includes a simulation engine storage unit 151, a driving simulation screen generation control unit 152, a driving score index generation unit 153, a simulator communication unit 154, and a simulator control unit 155 for controlling the above-described units,
the simulation engine storage unit 151 stores a plurality of different highly simulated racing game engines.
In this embodiment, the high-simulation racing game engines are various vehicle game engines with highly realistic effects, such as WRC 5 (World Rally Championship 5), Need for Speed 20, Forza Horizon 4 and DiRT 4. The computer 15 can construct a highly restored, realistic virtual driving environment through these game engines, supporting effects such as a dynamic weather system, day-night cycle, real physical damage, vehicle body dirt and lens flare in the virtual driving environment, so that the captured environment is more realistic and sufficiently complex.
The driving simulation screen generation control unit 152 generates driving scene images capable of simulating a virtual driving environment based on the high-simulation racing game engine stored in the simulation engine storage unit 151, and controls the display screen 13 to display the driving scene images in real time.
In the present embodiment, the driving scene image is the game screen generated by the high-simulation racing game engine, on which at least a virtual environment (such as a track) and a virtual vehicle are displayed (as shown in Fig. 4). When the simulator communication unit 154 receives the driving simulation information output as the simulated driving mechanism 14 is operated, the driving scene image is updated accordingly: the virtual vehicle in the game screen executes the corresponding driving behavior (such as acceleration or braking) based on the driving simulation information, and the virtual environment changes with it.
The driving score index generating unit 153 generates a corresponding score index based on the game result that the high-simulation racing game engine produces from the driving simulation information, according to a predetermined scoring method.
In this embodiment, the scoring method is to measure the time the virtual vehicle takes to complete a fixed number of laps and use that time as the score index. A shorter time indicates that the virtual vehicle was driven more smoothly in the virtual environment (i.e., fewer accidents occurred), so the score index reflects the quality of the unmanned driving performed by the driving robot 102.
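The lap-time scoring method can be expressed as a small helper function. This is a hypothetical Python sketch; in particular, returning `None` for runs that did not finish the required laps (e.g. after a crash) is an assumption, not something the patent specifies.

```python
def lap_time_score(lap_times_s, num_laps):
    """Score = total time (seconds) over a fixed number of laps; lower is better.
    Returns None if the run did not complete the required laps (e.g. a crash)."""
    if len(lap_times_s) < num_laps:
        return None
    return sum(lap_times_s[:num_laps])

run_a = [92.4, 90.1, 91.7]   # finished all 3 laps
run_b = [95.0, 101.2]        # crashed during lap 3
print(lap_time_score(run_a, 3))  # total time for run A
print(lap_time_score(run_b, 3))  # None: run B is unscorable
```

Because lower total time corresponds to smoother driving, the score index can be compared directly across runs of the same track and lap count.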
In other schemes of the invention, the scoring method may also judge whether the driving robot 102 can operate the virtual vehicle to complete overtaking and obstacle-avoidance maneuvers and give corresponding score indexes, helping the unmanned driving tester better judge whether the control algorithm of the driving robot 102 is good and whether it can be applied in practice.
The driving robot 102 includes a robot body 21, a binocular camera 22, a monocular camera 23, a high-precision positioning system 24, a pose sensor 25, a 5G communication module 26, a driving operation mechanism 27 and a central controller 28.
The robot body 21 is used to mount and fix all hardware components of the driving robot 102. In this embodiment, the robot body 21 is a human-like body that can sit on the driver seat 12 in the manner of a human driver.
The binocular camera 22 captures the driving scene image displayed on the display screen 13 in real time and outputs the captured screen shot images to the central controller 28 in real time when the driving robot 102 is installed on the automobile driving simulator 101 (i.e., when the driving robot 102 sits on the driver seat 12 to perform an unmanned driving test, hereinafter referred to as a virtual test).
In addition, to ensure the shooting quality of the binocular camera 22, for example to avoid stripe-like (moiré) noise in the screen shot images, the configurations of the automobile driving simulator 101 and the driving robot 102 need to be adjusted accordingly before starting them.
The monocular camera 23 captures the data displayed on the instrument panel of the vehicle to obtain instrument shot images when the driving robot 102 is installed in a real vehicle (that is, when the driving robot 102 sits in a real vehicle to perform an unmanned driving test, hereinafter referred to as a real-vehicle test).
In this embodiment, the monocular camera 23 is mounted at the head of the driving robot through a pan-tilt mechanism, which can automatically adjust the position and angle of the lens so that the lens center of the monocular camera 23 is aligned with the center of the instrument panel of the vehicle under test.
In this embodiment, since the monocular camera 23 cannot acquire position information, during the real-vehicle test the binocular camera 22 also shoots the road scene ahead of the vehicle and outputs the pair of live-action shot images, one per eye. The central controller 28 can then directly measure the distance of the scene in front (within the captured range of the images) by calculating the parallax between the two images, without needing to determine what type of obstacle appears ahead.
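The parallax-based distance measurement mentioned above follows the standard rectified-stereo relation Z = f·B/d (depth = focal length × baseline / disparity). A minimal sketch, with calibration numbers that are purely illustrative and not taken from the patent:

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Depth of a point from a rectified stereo pair: Z = f * B / d.
    focal_px     - focal length in pixels (from camera calibration)
    baseline_m   - distance between the two camera centres, in metres
    disparity_px - horizontal pixel offset of the same point in the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 700 px focal length, 12 cm baseline,
# a feature shifted 8 px between the left and right views.
print(stereo_depth_m(700.0, 0.12, 8.0))  # roughly 10.5 m
```

Note the inverse relationship: distant objects produce small disparities, which is why this approach gives range without needing to classify what the obstacle is.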
The high-precision positioning system 24 combines a GPS/BDS satellite navigation positioning system with an RTK enhancement system, and can perform high-precision positioning of the driving robot 102 and output corresponding high-precision positioning information during a real-vehicle test.
The pose sensor 25 is provided on the robot body 21 and is used to acquire the pose information of the vehicle itself.
In this embodiment, the monocular camera 23, the high-precision positioning system 24 and the pose sensor 25 may remain idle during the simulation test. During a real-vehicle unmanned driving test, these three components collect data in real time together with the binocular camera 22, which shoots the actual road scene ahead of the vehicle to obtain the corresponding live-action shot images.
The 5G communication module 26 is arranged on the robot body 21 and is used for performing communication connection between the central controller 28 and the computer 15.
The driving operation mechanism 27 is used to perform driving operations on the simulated driving mechanism 14 of the automobile driving simulator 101.
In the present embodiment, the driving operation mechanism 27 includes a manipulator or mechanical leg corresponding to each simulated driving component in the simulated driving mechanism 14, specifically: a steering manipulator 271 for operating the steering wheel, a gear-shifting manipulator 272 for operating the gear lever, and mechanical legs 273 for operating the pedals (performing the clutch, brake and accelerator operations). Since such manipulators (and legs) and their corresponding control methods are prior art, they are not described again here.
The central controller 28 is used for controlling the driving operation mechanism 27 to perform corresponding driving operation on the simulated driving mechanism 14.
Fig. 5 is a block diagram of the central controller according to the embodiment of the present invention.
As shown in fig. 5, the central controller 28 includes a driving model storage unit 281, a control instruction generation unit 282, a driving control unit 283, a driving simulation data acquisition and storage unit 284, a decision model iteration unit 285, a model verification unit 286, a controller communication unit 287, and a controller control unit 288 for controlling the above units.
The driving model storage unit 281 stores a driving decision model for generating driving decision information that decides the driving operation of the vehicle, and a vehicle control model for generating corresponding robot control instructions according to the driving decisions.
The driving decision model is a convolutional neural network that makes decisions on the driving behavior of the vehicle (i.e., outputs driving decision information) according to the images captured by the driving robot 102, for example: braking to a stop at a red light, what speed to maintain while driving, and whether to turn. The driving decision model requires a large amount of iterative training before it can output correct driving behavior.
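The iterative training mentioned here can be illustrated with a toy stand-in: a one-parameter linear model fitted by gradient descent takes the place of the CNN. This is purely schematic; the (lane-offset, steer) data, the learning rate and the loop counts are invented for the sketch.

```python
# Toy stand-in for iterative training of the decision model: fit
# steer = w * lane_offset by per-sample gradient descent on logged
# (observation, demonstrated-steer) pairs. In the patent, the model
# is a CNN and the data comes from the stored driving simulation data.
data = [(-0.4, -0.2), (-0.2, -0.1), (0.2, 0.1), (0.6, 0.3)]  # (offset, steer)

w, lr = 0.0, 0.1
for _ in range(200):                 # repeated iterative updates of the model
    for x, y in data:
        grad = 2 * (w * x - y) * x   # d/dw of the squared error (w*x - y)^2
        w -= lr * grad

print(round(w, 3))  # converges near 0.5, i.e. steer = offset / 2 in this toy data
```

The point of the sketch is the loop structure: each pass over the stored driving data nudges the model parameters, which is the role the decision model iteration part plays for the CNN.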
The vehicle control model is a conventional driving-robot control algorithm that generates specific robot control instructions from the driving decision information output by the driving decision model; for example, if the driving decision information is to brake, the algorithm generates the corresponding control instruction for pressing the brake pedal.
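For discrete decisions, such a decision-to-instruction mapping can be sketched as a simple lookup. The actuator names and pedal positions below are hypothetical, not taken from the patent.

```python
# Sketch of the vehicle control model as a lookup from discrete driving
# decisions to concrete actuator instructions for the manipulators/legs.
DECISION_TO_COMMAND = {
    "brake":      {"actuator": "brake_pedal_leg", "position": 1.0},
    "accelerate": {"actuator": "accelerator_leg", "position": 0.6},
    "hold":       {"actuator": "accelerator_leg", "position": 0.2},
}

def robot_command(decision: str):
    # Reject decisions the control model has no mapping for, rather than
    # silently doing nothing.
    try:
        return DECISION_TO_COMMAND[decision]
    except KeyError:
        raise ValueError(f"no control mapping for decision {decision!r}")

print(robot_command("brake"))  # full press of the brake-pedal leg
```

A real control algorithm would of course output continuous trajectories for the manipulators; the table form just makes the decision/actuation boundary explicit.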
In this embodiment, the driving decision model is composed of a decision generation module and a decision correction module.
The control instruction generating section 282 has a screen image preprocessing unit 2821, a real-scene image preprocessing unit 2822, a preliminary decision generating unit 2823, a driving decision correction unit 2824, and an instruction generating unit 2825.
The screen image preprocessing unit 2821 is configured to preprocess, in real time during a virtual test, the pair of screen shot images output by the binocular camera 22 to form an environment image to be input corresponding to the virtual driving environment, and to recognize, from the screen shot images, the state parameters of the virtual vehicle generated by the highly simulated racing game engine as vehicle state data.
In this embodiment, since the driving simulation screen displayed on the display screen 13 is generated by the highly simulated racing game engine, the screen displays the status parameters of the virtual vehicle, such as its instrument images (e.g., the speedometer) and a small map (indicating the location of the vehicle), and the screen image preprocessing unit 2821 can recognize these status parameters from the screen shot image.
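Because the game engine's HUD layout is fixed, the status parameters can be read from fixed screen regions. The sketch below crops assumed HUD regions and delegates recognition to a callback; the region coordinates are invented, and the digit-recognition step (which in practice would be OCR or template matching on the cropped pixels) is left as a placeholder.

```python
# Assumed HUD layout: (top, left, height, width) in pixels.
HUD_REGIONS = {
    "speedometer": (640, 1100, 80, 160),
    "minimap":     (560, 40, 160, 160),
}

def crop(frame, region):
    """Extract a rectangular region from a frame stored as rows of pixels."""
    top, left, h, w = region
    return [row[left:left + w] for row in frame[top:top + h]]

def read_state(frame, recognize):
    """Crop each HUD region and pass it to a recognizer callback."""
    return {name: recognize(name, crop(frame, region))
            for name, region in HUD_REGIONS.items()}
```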
The real-scene image preprocessing unit 2822 is configured to, during a real-vehicle test, preprocess, in real time, a pair of real-scene captured images output by the binocular camera 22, an instrument captured image output by the monocular camera 23, high-precision positioning information output by the high-precision positioning system 24, and attitude information output by the attitude sensor 25, form an environment image to be input corresponding to an actual driving environment according to the real-scene captured images, recognize instrument information in the instrument captured images, and further use the instrument information, the high-precision positioning information, and the attitude information as vehicle state data of the vehicle to be detected.
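The vehicle state data assembled by the real-scene image preprocessing unit can be pictured as a single record combining the instrument reading, the high-precision positioning information, and the pose-sensor output. The field names and units below are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    speed_kmh: float        # read from the instrument captured image
    lat: float              # high-precision positioning
    lon: float
    roll: float             # pose sensor, degrees
    pitch: float
    yaw: float

def build_state(instrument, positioning, pose):
    """Fuse the three preprocessed sensor sources into one state record."""
    return VehicleState(instrument["speed_kmh"],
                        positioning["lat"], positioning["lon"],
                        pose["roll"], pose["pitch"], pose["yaw"])
```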
The preliminary decision generating unit 2823 is configured to input the environmental image to be input into the decision generating module of the driving decision model so as to generate preliminary decision information.
The driving decision correction unit 2824 is configured to input the preliminary decision information output by the preliminary decision generating unit 2823, together with the vehicle state data, into the decision correction module so as to output the driving decision information.
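A minimal sketch of what such a correction stage might do, under assumed rules: the preliminary decision from the image branch is vetoed or adjusted against the vehicle state. The thresholds and state fields are illustrative assumptions, not the patent's decision correction module.

```python
def correct_decision(preliminary, state, speed_limit=60.0):
    """Adjust a preliminary decision using vehicle state data (illustrative)."""
    if state["speed"] > speed_limit:
        return "brake"                       # override: slow down first
    if preliminary in ("turn_left", "turn_right") and state["speed"] > 30.0:
        return "keep_lane"                   # too fast to turn safely
    return preliminary
```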
The instruction generating unit 2825 is configured to input the driving decision information into the vehicle control model to generate a robot control instruction.
In this embodiment, the central controller 28 mainly uses machine vision to realize unmanned driving, and the principle is an end-to-end unmanned driving technology based on a deep neural network.
Fig. 6 is a schematic structural diagram of end-to-end unmanned driving in the embodiment of the invention.
As shown in fig. 6, the core of end-to-end unmanned driving is a deep learning model. Image data of different road conditions is acquired in real time during driving, while the driver's control parameters for the automobile under those road conditions are recorded simultaneously. These data are input as training data to the deep learning model for training. When the deep learning model is used to control the automatic driving of the automobile, real-time road condition images (i.e., live-action shot images) are collected by the binocular camera and depth information is measured and calculated; the images are fused and then input into the deep learning model to obtain the automobile drive-by-wire parameters (i.e., robot control instructions), so that the robot can be controlled to operate the automobile for automatic driving, while the depth information enters the next layer of the decision network.
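The depth-measurement step can be illustrated with the standard stereo relation depth = f·B/d (focal length f in pixels, baseline B in metres, disparity d in pixels). The stereo-matching step that produces the disparity is outside this sketch, and the fusion function is a toy placeholder since the patent does not disclose how images and depth are fused; all parameter values are assumptions.

```python
def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.12):
    """Depth of a matched point pair from its stereo disparity."""
    if disparity_px <= 0:
        return float("inf")                  # no parallax: treat as far away
    return focal_px * baseline_m / disparity_px

def fuse(left_pixel, right_pixel, depth):
    """Toy fusion: average the image channels and append depth as a channel."""
    return [(l + r) / 2 for l, r in zip(left_pixel, right_pixel)] + [depth]
```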
The driving control unit 283 is configured to control the driving operation mechanism 27 in real time in accordance with the robot control command generated by the control command generating unit 282.
In this embodiment, the driving simulation picture generation control unit 152 generates a driving scene image in real time based on the highly simulated racing game engine and the driving simulation information output by the simulated driving mechanism 14, and controls the display screen 13 to display it; the control instruction generation unit 282 generates a robot control instruction from the screen shot images captured on the display screen 13 by the binocular camera 22; and the driving control unit 283 controls the driving operation mechanism 27 to perform driving operations on the simulated driving mechanism 14 according to the robot control instruction. A cyclic automatic driving process is thus formed, achieving the purpose of controlling the virtual vehicle in the virtual environment and, ultimately, closed-loop control and algorithm verification of the whole driving behavior.
The driving simulation data acquisition and storage unit 284 is configured to acquire and store all the robot control commands generated by the control command generation unit 282, the corresponding screen shot images, and the score index generated by the driving score index generation unit 153 as driving simulation data.
In the present embodiment, when the driving robot 102 performs the unmanned real-vehicle test, the driving simulation data acquisition and storage unit 284 acquires the high-precision positioning information acquired by the high-precision positioning system 24 and the images captured by the binocular camera 22 as driving simulation data.
The decision model iteration unit 285 iteratively updates the driving decision model according to all the driving simulation data stored in the driving simulation data acquisition and storage unit 284.
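One iteration of the model update from logged driving simulation data might look like the following sketch. A real system would back-propagate through the full convolutional network; here, plainly as a stand-in, a perceptron-style correction on a linear score table nudges the recorded (correct) decision up and a wrongly chosen one down. Features, labels, and the learning rate are assumptions.

```python
def iterate_model(weights, samples, lr=0.1):
    """One pass over logged samples.

    weights: {decision: (w, b)} linear score per decision class.
    samples: [(feature, correct_decision)] from the driving simulation data.
    """
    for feature, correct in samples:
        scores = {d: w * feature + b for d, (w, b) in weights.items()}
        chosen = max(scores, key=scores.get)
        if chosen != correct:
            w, b = weights[correct]
            weights[correct] = (w + lr * feature, b + lr)     # reinforce the target
            w, b = weights[chosen]
            weights[chosen] = (w - lr * feature, b - lr)      # penalise the mistake
    return weights
```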
The model verification unit 286 verifies the driving decision model and the vehicle control model based on the score indexes, and outputs a model evaluation result for evaluating the quality of the models.
In this embodiment, the model verification unit 286 may output this result to a terminal held by a tester, so that the tester can determine from the result whether the driving decision model and the vehicle control model can be put into practical use or need adjustment.
The controller communication section 287 is used to exchange data between the central controller 28 and the computer 15, the monocular camera 23, the binocular camera 22, the high-precision positioning system 24, and the driving operation mechanism 27.
Fig. 6 is a flowchart of a simulated driving process in an embodiment of the invention.
As shown in fig. 6, after the automobile driving simulator 101 and the driving robot 102 are started, the following steps are started:
step S1-1, the driving simulation screen generation control unit 152 in the computer 15 obtains a highly simulated racing game engine from the simulation engine storage unit 151, generates a driving scene image in real time based on the highly simulated racing game engine, controls the display screen 13 to display the driving scene image, and then proceeds to step S1-2;
step S1-2, the binocular camera 22 shoots the display screen 13 to obtain a pair of screen shot images corresponding to the driving simulation picture, and then proceeds to step S1-3;
in step S1-3, the control command generating part 282 generates a corresponding robot control command based on the screen shot images captured in step S1-2 and on the driving decision model and the vehicle control model stored in the driving model storage unit 281, and then proceeds to step S1-4;
in step S1-4, the driving control unit 283 controls the driving operation mechanism 27 to perform the driving operation on the simulated driving mechanism 14 in accordance with the robot control instruction generated in step S1-3, and then proceeds to step S1-5;
step S1-5, the simulated driving mechanism 14 generates corresponding driving simulation information according to the driving operation of the step S1-4 and sends the driving simulation information to the computer 15, and then the step S1-6 is carried out;
step S1-6, the driving simulation screen generation control part 152 generates a new driving scene image based on the highly simulated racing game engine acquired in step S1-1 and the driving simulation information transmitted in step S1-5 and controls the display screen 13 to display, and then proceeds to step S1-7;
step S1-7, the central controller 28 judges whether to end a round of simulated driving according to the preset simulated driving end condition, if not, the step S1-2 is carried out, and if so, the step S1-8 is carried out;
step S1-8, the driving simulation data acquisition and storage unit 284 acquires and stores, as driving simulation data, all the robot control commands and corresponding screen shot images generated in one round of simulated driving, and then proceeds to step S1-9;
step S1-9, the decision model iteration unit 285 performs one iteration on the driving decision model stored in the driving model storage unit 281 according to the driving simulation data stored in the driving simulation data acquisition and storage unit 284, and then proceeds to step S1-10;
in step S1-10, the central controller 28 determines whether the driving decision model meets a predetermined iteration completion condition, if not, proceeds to step S1-1, and if so, proceeds to an end state.
Through this simulated driving process, the driving decision model can be preliminarily perfected so that it can accurately make driving behavior decisions from the images captured by the driving robot; at this point the driving robot has a preliminary unmanned driving capability.
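The steps S1-1 to S1-10 can be sketched as two nested loops: an inner perception-decision-actuation cycle per round, and an outer iteration loop over the model. All collaborating objects below are stand-ins for the game engine, camera, robot, and combined decision/control model of the embodiment.

```python
def simulated_driving(engine, camera, robot, model, rounds):
    """Run `rounds` rounds of simulated driving with one model iteration each."""
    driving_data = []
    for _ in range(rounds):                       # S1-10: outer iteration loop
        scene = engine.render()                   # S1-1: draw the driving scene
        while not engine.round_over():            # S1-7: end-of-round condition
            image = camera.shoot(scene)           # S1-2: screen shot image
            command = model.decide(image)         # S1-3: decision + control model
            robot.operate(command)                # S1-4: drive the simulator
            scene = engine.render()               # S1-5/S1-6: new scene image
            driving_data.append((image, command)) # S1-8: log simulation data
        model.iterate(driving_data)               # S1-9: one model iteration
    return model
```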
In this embodiment, during steps S1-1 to S1-7 of the simulated driving process, the driving score index generation unit 153 generates a corresponding score index, and the simulated driving end condition is whether such a score index has been generated: a round of simulated driving ends once one is generated. In other aspects of the present invention, the simulated driving end condition may also be set according to actual requirements, for example, ending a round of simulated driving once the driving duration reaches a preset threshold.
In this embodiment, the iteration completion condition is whether the number of iteration rounds reaches a preset threshold. In other aspects of the present invention, the iteration completion condition may be set according to actual requirements, for example, whether the score index generated by the driving score index generation unit 153 meets a predetermined criterion.
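The two iteration-completion variants just mentioned, a round budget or a score criterion, can be captured in one small predicate. Parameter names are assumptions.

```python
def iteration_done(rounds_run, latest_score, max_rounds=100, target_score=None):
    """True when training may stop: score criterion if given, else round budget."""
    if target_score is not None:
        return latest_score >= target_score
    return rounds_run >= max_rounds
```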
After the iteration of the driving decision model has been completed through the simulated driving process, and the unmanned driving ability of the driving robot 102 has again been tested and passed using this process and the score indexes, the driving robot 102 can be placed on the driver's seat of an actual vehicle for a real-vehicle test.
Fig. 7 is a flowchart of an actual vehicle driving process of the robot according to the embodiment of the present invention.
As shown in fig. 7, after the driving robot 102 is set on the driver's seat of the actual vehicle and the driving robot 102 is started, the following steps are started:
step S2-1, the monocular camera 23 shoots the instrument panel of the actual vehicle and outputs the shot instrument shot image to the central controller 28, and then the step S2-2 is performed;
step S2-2, the binocular camera 22 shoots a road scene in front of the actual vehicle and outputs a pair of live-action shot images shot by the binocular camera to the central controller 28, and then the process proceeds to step S2-3;
step S2-3, the high-precision positioning system 24 positions the position of the driving robot 102 and outputs corresponding high-precision positioning information to the central controller 28, and then the step S2-4 is performed;
step S2-4, the control command generating part 282 preprocesses the meter captured image output in step S2-1, the live-action captured images output in step S2-2, and the high-precision positioning information output in step S2-3 to generate the data to be input, generates a corresponding robot control command based on the data to be input and on the driving decision model and the vehicle control model stored in the driving model storage unit 281, and then proceeds to step S2-5;
in step S2-5, the driving control unit 283 controls the driving operation mechanism 27 to perform the driving operation on the driving mechanism of the actual vehicle based on the robot control command generated in step S2-4, and then proceeds to step S2-6;
Step S2-6, the central controller 28 judges whether to end a round of real vehicle driving according to the preset real vehicle driving end condition, if not, the step S2-1 is carried out, and if so, the step S2-7 is carried out;
step S2-7, the driving simulation data acquisition and storage unit 284 acquires all the robot control commands generated by a round of real vehicle driving and corresponding instrument photographed images, real scene photographed images and high-precision positioning information as driving simulation data and stores them correspondingly, and then proceeds to step S2-8;
step S2-8, the decision model iteration unit 285 performs one iteration on the driving decision model stored in the driving model storage unit 281 according to the driving simulation data stored in the driving simulation data acquisition and storage unit 284, and then proceeds to step S2-9;
in step S2-9, the central controller 28 determines whether the driving decision model meets a predetermined iteration completion condition, if not, proceeds to step S2-1, and if so, proceeds to an end state.
Through the process, the driving robot trained by the automatic driving simulation platform can be used for carrying out real-vehicle test and verification.
In addition, the final driving decision model obtained through the simulated driving process and the actual vehicle driving process can be fused with the vehicle control model. As shown in fig. 8, combined with the vehicle model of an actual vehicle, a whole-vehicle control mathematical model based on a vehicle dynamics model can be formed; this whole-vehicle control mathematical model can be integrated into the system of an unmanned vehicle and perform unmanned control in combination with the various data acquired by the unmanned vehicle's sensors.
In addition, the central controller 28 may further include a data recording and output interface through which the entire set of driving behavior data (driving behaviors such as acceleration, braking, and gear shifting) can be exported to a third-party software platform such as MATLAB for further analysis. Since different drivers have different driving habits, driving behavior can be further optimized through big-data analysis to achieve smoothness and economy of vehicle operation.
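Such an export interface could be as simple as dumping the logged driving-behavior records to CSV, which MATLAB and similar tools load directly. The column set below is an assumption for illustration.

```python
import csv
import io

def export_behaviour(records, stream):
    """Write driving-behavior records to a CSV stream (illustrative columns)."""
    writer = csv.writer(stream)
    writer.writerow(["t", "steer_deg", "throttle", "brake", "gear"])
    for r in records:
        writer.writerow([r["t"], r["steer_deg"], r["throttle"],
                         r["brake"], r["gear"]])

# Usage with an in-memory stream standing in for a file:
buf = io.StringIO()
export_behaviour([{"t": 0.0, "steer_deg": 0.0, "throttle": 0.2,
                   "brake": 0.0, "gear": 1}], buf)
```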
Actions and Effects of the Embodiments
According to the autonomous-driving-robot-based automatic driving test platform provided by this embodiment, the automobile driving simulator uses a highly simulated racing game engine to generate a driving scene image simulating a virtual driving environment and displays it on the display screen; the driving robot processes the screen shot images captured by the binocular camera through the driving decision model and the vehicle control model to form corresponding robot control instructions, and then drives the simulated driving mechanism of the automobile driving simulator according to those instructions. The driving decision model in the driving robot can thus be iteratively trained and verified in simulation, enhancing the model's perception and decision-making capability in different scenes. By combining the driving robot with the driving simulator, a complete set of virtual driving behaviors can be realized, from the high-fidelity simulation environment provided by the simulator to the robot driving an automobile in the virtual environment. Through this automatic driving simulation platform, a large amount of initial training of the unmanned driving algorithm can be carried out using the game engine, reducing the cost of real-vehicle tests and of building professional software, enabling all kinds of enterprises to construct unmanned driving algorithms for various vehicles, and greatly saving the time and cost required to do so.
The semi-physical virtual simulation platform built by combining a game simulation engine (whose license costs only a few hundred yuan RMB) with the driving robot can reproduce high-fidelity test environments for various application scenarios, while the manufacturing cost of the whole system does not exceed ten thousand yuan RMB.
Furthermore, the driving robot has a robot body matched to the driver's seat and is fitted with a driving operation executing mechanism formed by a gear-shifting manipulator, a steering manipulator, and accelerator and brake mechanical legs, so that it can be installed in a cab without damage and without modifying the vehicle or the simulated environment, simulating a human driver to drive the vehicle automatically under severe conditions and dangerous environments, or in virtual environments, which facilitates the verification and testing of unmanned driving algorithms using the driving robot.
At present, truly unmanned vehicles have not yet appeared on the market; most are converted from traditional automobiles, and such conversion is expensive, time-consuming, and laborious. The autonomous driving robot can be installed non-destructively on various test automobiles without modifying them, greatly saving time and cost. It therefore serves as a low-cost, cost-effective simulation tool for training and testing unmanned driving models, practically solving the problems of both the test environment and the test vehicle at once, and it can be flexibly matched and extended universally, for example by switching to a different game engine to change the test scene. The material resources and time cost of testing unmanned vehicles can thus be greatly reduced, giving the platform high market application value.
Furthermore, the driving decision model obtained through the iterative training of the driving simulation platform can also be extracted and integrated with the vehicle control model to form an unmanned driving module, which can be installed in the system of an unmanned automobile to realize unmanned driving.
The above-mentioned embodiments are merely illustrative of specific embodiments of the present invention, and the present invention is not limited to the description of the above-mentioned embodiments.
For example, in the above-described embodiment, the simulated driving mechanism includes only mechanisms such as the steering wheel, the shift lever, and the accelerator and brake pedals. In other schemes of the invention, the simulated driving mechanism may also include a hand brake, an ignition switch, and the like, with the driving robot configured with corresponding operating arms and operating algorithms, so as to better carry out unmanned driving simulation.
For example, in the above-described embodiment, the driving model storage unit stores only one driving decision model, whose optimization is completed through multiple iterations. In another aspect of the present invention, the driving model storage unit may store a plurality of driving decision models (e.g., corresponding to different driving styles), all optimized through the automatic driving test platform of the present invention, so that after testing is completed a tester can select the most suitable driving decision model according to the test results and put it into practical application.

Claims (6)

1. An autonomous driving robot-based autopilot test platform, comprising:
the automobile driving simulator is provided with a driver seat, a display screen and a simulated driving mechanism which are arranged in front of the driver seat, and a computer which is respectively in communication connection with the display screen and the simulated driving mechanism; and
the driving robot is arranged on the driving seat and operates the simulated driving mechanism, and is provided with a robot body matched with the driving seat, a binocular camera and a driving operation mechanism which are arranged on the robot body, and a central controller which is respectively in communication connection with the binocular camera, the driving operation mechanism and the computer,
wherein the computer has a simulation engine storage unit and a driving simulation screen generation control unit,
the central controller is provided with a driving model storage part, a control instruction generation part, a driving control part, a driving simulation data acquisition and storage part, a decision model iteration part and a controller communication part,
the simulation engine storage part stores a plurality of high-simulation racing game engines,
the driving model storage part stores a driving decision model for generating driving decision information enabling a decision to be made on a driving operation of a vehicle and a vehicle control model for generating a corresponding robot control instruction according to the driving decision,
the driving simulation picture generation control part generates a driving scene image simulating a virtual driving environment based on the high-simulation racing game engine and controls the display screen to display correspondingly,
the binocular camera shoots the driving scene image displayed by the display screen in real time and outputs a pair of screen shot images respectively shot in binocular mode to the central controller in real time,
the control instruction generating part generates corresponding driving decision information based on the screen shot image and the driving decision model in real time once the controller communication part receives the screen shot image, and inputs the driving decision information into the vehicle control model to obtain a corresponding robot control instruction,
the driving control part controls the driving operation mechanism to carry out simulated driving operation on the simulated driving mechanism in real time based on the robot control instruction, so that the simulated driving mechanism sends driving simulation information corresponding to the simulated driving operation to the computer, and further causes the driving simulation picture generation control part to generate a new driving scene image based on the high-simulation racing game engine and the received driving simulation information and control the display screen to carry out display updating,
the driving simulation data acquisition and storage part is used for at least acquiring all the robot control instructions generated by the control instruction generation part and corresponding screen shot images as driving simulation data and correspondingly storing the driving simulation data,
the decision model iteration part is used for iteratively updating the driving decision model according to all the driving simulation data stored in the driving simulation data acquisition and storage part.
2. The autonomous-driving-robot-based autopilot testing platform of claim 1 wherein:
wherein the computer further has a driving score index generating section,
the central controller further has a model verification output part,
the driving score index generating part generates a corresponding score index based on a game result generated by the highly simulated racing game engine according to the driving simulation information and a predetermined scoring method,
and the model verification output part verifies the driving decision model and the vehicle control model according to the grading index and outputs a model evaluation result for evaluating the quality of the model.
3. The autonomous-driving-robot-based autopilot testing platform of claim 1 wherein:
wherein the driving decision model comprises a decision generation module and a decision modification module,
the control instruction generation unit includes:
the screen image preprocessing unit is used for preprocessing the pair of screen shot images in real time, forming an environment image to be input corresponding to the virtual driving environment, and recognizing the state parameters of the virtual vehicle generated by the high-simulation racing game engine from the screen shot images as vehicle state data input;
a preliminary decision generating unit, configured to input the to-be-input environment image into a decision generating module of the driving decision model to generate preliminary decision information;
a driving decision correction unit for inputting the preliminary decision information and the vehicle state data into the decision correction module to output the driving decision information; and
and the instruction generating unit is used for inputting the driving decision information into the vehicle control model to generate the robot control instruction.
4. The autonomous-driving-robot-based autopilot testing platform of claim 3 wherein:
wherein, the driving robot can also sit on a real driving seat of the vehicle to be tested and operate a driving mechanism of the vehicle to be tested,
the driving robot is also provided with a monocular camera, a high-precision positioning system and a pose sensor,
the monocular camera is used for shooting an instrument panel of the vehicle to be tested and outputting a shot instrument image to the central controller in real time,
the high-precision positioning system is used for positioning the position of the driving robot and generating corresponding high-precision positioning information in real time,
the pose sensor is used for detecting the pose of the vehicle to be tested and acquiring corresponding pose information,
the binocular camera is also used for shooting the road scene in which the vehicle to be tested is running and outputting a pair of live-action shot images, respectively shot by the two cameras, to the central controller in real time,
the control instruction generation unit further includes:
and the real-scene image preprocessing unit is used for preprocessing the pair of real-scene shot images, the instrument shot image, the high-precision positioning information and the pose information in real time, forming an environment image to be input corresponding to an actual driving environment according to the real-scene shot images, identifying instrument information in the instrument shot images, and further taking the instrument information, the high-precision positioning information and the pose information as vehicle state data of the vehicle to be tested.
5. The autonomous-driving-robot-based autopilot testing platform of claim 1 wherein:
wherein the simulated driving mechanism at least comprises a steering wheel, a speed regulating gear and an accelerator brake pedal,
the driving operation mechanism at least comprises a steering manipulator for operating the steering wheel, a gear shifting manipulator for operating the speed regulating gear and a mechanical leg for operating the accelerator brake pedal.
6. The autonomous-driving-robot-based autopilot testing platform of claim 1 wherein:
wherein, the virtual driving environment at least comprises one or more effects of dynamic weather, day and night circulation, vehicle body dirt and halation.
CN202010673172.0A 2020-07-14 2020-07-14 Automatic driving test platform based on autonomous driving robot Active CN111752261B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010673172.0A CN111752261B (en) 2020-07-14 2020-07-14 Automatic driving test platform based on autonomous driving robot

Publications (2)

Publication Number Publication Date
CN111752261A true CN111752261A (en) 2020-10-09
CN111752261B CN111752261B (en) 2021-07-06

Family

ID=72710938

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010673172.0A Active CN111752261B (en) 2020-07-14 2020-07-14 Automatic driving test platform based on autonomous driving robot

Country Status (1)

Country Link
CN (1) CN111752261B (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112987593A (en) * 2021-02-19 2021-06-18 中国第一汽车股份有限公司 Visual positioning hardware-in-the-loop simulation platform and simulation method
CN113146649A (en) * 2021-03-24 2021-07-23 北京航空航天大学 Helicopter piloting robot system for controlling helicopter steering column
CN113237669A (en) * 2021-04-08 2021-08-10 联合汽车电子有限公司 Automatic driving control system for vehicle hub test
US20220055651A1 (en) * 2020-08-24 2022-02-24 Toyota Research Institute, Inc. Data-driven warm start selection for optimization-based trajectory planning
CN114434466A (en) * 2022-03-14 2022-05-06 交通运输部公路科学研究所 Automobile intelligent cockpit performance evaluation simulation robot
WO2022141294A1 (en) * 2020-12-30 2022-07-07 深圳市大疆创新科技有限公司 Simulation test method and system, simulator, storage medium, and program product
CN115813676A (en) * 2022-10-25 2023-03-21 南京康尼机电股份有限公司 Remote control device of electric wheelchair
CN118135867A (en) * 2024-05-06 2024-06-04 成都运达科技股份有限公司 Signal equipment display method, driving training device and storage medium

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101530329A (en) * 2009-03-06 2009-09-16 北京理工大学 Semi-physical driving fatigue vision simulation system platform
CN101714302A (en) * 2009-12-15 2010-05-26 中国民航大学 Automatic-piloting simulator of aeroplane
CN101770708A (en) * 2010-01-28 2010-07-07 公安部交通管理科学研究所 Simulation testing platform system for researching driving behaviors
CN101794523A (en) * 2009-12-15 2010-08-04 中国民航大学 Aircraft hardware-in-the-loop simulation device
CN102435442A (en) * 2011-09-05 2012-05-02 北京航空航天大学 Automatic drive robot used in vehicle road tests
CN102637058A (en) * 2011-08-25 2012-08-15 上海交通大学 Automatic drive robot
CN103426338A (en) * 2013-09-05 2013-12-04 北京汽车股份有限公司 Automobile driving test simulating system
US20150138301A1 (en) * 2013-11-21 2015-05-21 Electronics And Telecommunications Research Institute Apparatus and method for generating telepresence
CN105068857A (en) * 2015-07-24 2015-11-18 同济大学 Driving behavior data acquisition method based on high fidelity driving simulator
CN107775641A (en) * 2016-08-24 2018-03-09 南京乐朋电子科技有限公司 In generation, drives robot
JP2018103352A (en) * 2016-12-22 2018-07-05 セイコーエプソン株式会社 Control apparatus, robot and robot system
CN108263307A (en) * 2017-01-03 2018-07-10 福特全球技术公司 For the spatial hearing alarm of vehicle
CN209505463U (en) * 2019-01-31 2019-10-18 上海淞泓智能汽车科技有限公司 A kind of automatic robot driver for autonomous driving vehicle test
CN111026162A (en) * 2019-12-10 2020-04-17 长沙中联重科环境产业有限公司 Self-following cleaning robot
CN111338235A (en) * 2020-03-28 2020-06-26 中汽数据(天津)有限公司 Driving robot simulation test platform based on VTD


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Hasan A. Poonawala, Mark W. Spong: "Time-optimal velocity tracking control for differential drive robots", Automatica *
Yesim Oniz, Okyay Kaynak: "Control of a direct drive robot using fuzzy spiking neural networks with variable structure systems-based learning algorithm", Neurocomputing *
Chen Gang: "Research progress of automobile driving robot systems", Auto Electric Parts *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220055651A1 (en) * 2020-08-24 2022-02-24 Toyota Research Institute, Inc. Data-driven warm start selection for optimization-based trajectory planning
US11958498B2 (en) * 2020-08-24 2024-04-16 Toyota Research Institute, Inc. Data-driven warm start selection for optimization-based trajectory planning
WO2022141294A1 (en) * 2020-12-30 2022-07-07 SZ DJI Technology Co., Ltd. Simulation test method and system, simulator, storage medium, and program product
CN112987593A (en) * 2021-02-19 2021-06-18 China FAW Co., Ltd. Visual positioning hardware-in-the-loop simulation platform and simulation method
CN112987593B (en) * 2021-02-19 2022-10-28 China FAW Co., Ltd. Visual positioning hardware-in-the-loop simulation platform and simulation method
CN113146649A (en) * 2021-03-24 2021-07-23 Beihang University Helicopter piloting robot system for controlling the helicopter steering column
CN113237669A (en) * 2021-04-08 2021-08-10 United Automotive Electronic Systems Co., Ltd. Automatic driving control system for vehicle hub test
CN114434466A (en) * 2022-03-14 2022-05-06 Research Institute of Highway, Ministry of Transport Automobile intelligent cockpit performance evaluation simulation robot
CN115813676A (en) * 2022-10-25 2023-03-21 Nanjing Kangni Mechanical & Electrical Co., Ltd. Remote control device for an electric wheelchair
CN118135867A (en) * 2024-05-06 2024-06-04 Chengdu Yunda Technology Co., Ltd. Signal equipment display method, driving training device and storage medium

Also Published As

Publication number Publication date
CN111752261B (en) 2021-07-06

Similar Documents

Publication Publication Date Title
CN111752261B (en) Automatic driving test platform based on autonomous driving robot
CN108803607B (en) Multifunctional simulation system for automatic driving
CN112987703B (en) System and method for developing and testing in-loop automatic driving of whole vehicle in laboratory
CN112925291B (en) Digital twin automatic driving test method based on camera dark box
CN111309600A (en) Virtual scene injection automatic driving test method and electronic equipment
CN108319249B (en) Unmanned driving algorithm comprehensive evaluation system and method based on driving simulator
CN112997060A (en) Method and system for modifying a control unit of an autonomous vehicle
CN110103983A (en) System and method for the verifying of end-to-end autonomous vehicle
Zhang et al. Roadview: A traffic scene simulator for autonomous vehicle simulation testing
CN109884916A (en) Automated driving simulation evaluation method and device
CN109901546A (en) Hardware-in-the-loop test method and system for driver-assistance vehicles
CN105068857B (en) Driving behavior data acquisition method based on a high-fidelity driving simulator
CN106530891A (en) Driving simulation system based on VR technology
CN110046833A (en) Virtual test system for a traffic congestion assist system
CN109461342B (en) Teaching system for unmanned motor vehicle and teaching method thereof
US12031883B2 (en) Apparatus and method for testing automated vehicles
Solmaz et al. A vehicle-in-the-loop methodology for evaluating automated driving functions in virtual traffic
CN111798718A (en) Driving adaptability training method, host and device based on virtual reality
CN110930811B (en) System suitable for unmanned decision learning and training
CN116755954A (en) Automatic driving test system and method based on digital twin virtual-real combination
CN208655066U (en) Automotive visibility evaluation system
Kowol et al. A-eye: Driving with the eyes of ai for corner case generation
CN106446335B (en) Road alignment quality evaluation method in three-dimensional space
CN114896817A (en) Vehicle-in-the-loop fusion test system and method
CN113946212A (en) Steady driving test system based on virtual reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant