CN109084992B - Method for testing intelligence of unmanned vehicle based on rack - Google Patents

Method for testing intelligence of unmanned vehicle based on rack

Info

Publication number
CN109084992B
CN109084992B (application CN201810842710.7A)
Authority
CN
China
Prior art keywords
vehicle
unmanned vehicle
behavior
data
bench
Prior art date
Legal status
Expired - Fee Related
Application number
CN201810842710.7A
Other languages
Chinese (zh)
Other versions
CN109084992A (en)
Inventor
赵祥模
徐志刚
王文威
连心雨
承靖钧
时恒心
王振
闵海根
周豫
陈南峰
冀建新
阚春辉
谷占勋
李玉
杨建辉
卢春波
李拓
Current Assignee
Changan University
Original Assignee
Changan University
Priority date
Filing date
Publication date
Application filed by Changan University filed Critical Changan University
Priority to CN201810842710.7A
Publication of CN109084992A
Application granted
Publication of CN109084992B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00 Testing of vehicles
    • G01M17/007 Wheeled or endless-tracked vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/243 Classification techniques relating to the number of classes
    • G06F18/24323 Tree-organised classifiers

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Traffic Control Systems (AREA)

Abstract

The bench-based method for testing the intelligence of an unmanned vehicle comprises the following steps: arranging the unmanned vehicle to be tested on a test bench to simulate driving; constructing a virtual scene and the data types in the virtual scene; the master vehicle obtains the operating parameters of the unmanned vehicle to be tested and the road information simulated by the test bench, and the master vehicle is mapped into the virtual scene accordingly; the master vehicle interacts with the data types in the virtual scene to generate virtual driving behavior data; and, taking the driving behavior data generated by the master vehicle as a sample set, a random forest algorithm takes the class output most often by the decision trees as the class of the test sample. Because a data-driven approach is used to collect the driving behavior data between the master vehicle and each data type, the data collection cost is low, the raw data provided are highly authentic, and the controllability of the data allows the test scene to be reproduced.

Description

Method for testing intelligence of unmanned vehicle based on rack
Technical Field
The application relates to the technical field of unmanned vehicles, and in particular to a bench-based method for testing the intelligence of an unmanned vehicle.
Background
As the degree of intelligence continues to increase, more and more unmanned vehicles come out in succession. How to effectively verify and evaluate the behavior of the unmanned vehicle is a problem worthy of intensive study.
Existing bench-based methods for testing the intelligence of an unmanned vehicle simulate only a single environment model, so the intelligence of the unmanned vehicle cannot be fully tested.
Disclosure of Invention
The application provides a bench-based method for testing the intelligence of an unmanned vehicle, which yields a more accurate test of the vehicle's intelligence.
According to a first aspect, an embodiment provides a bench-based method for testing the intelligence of an unmanned vehicle, comprising the following steps: arranging the unmanned vehicle to be tested on a test bench to simulate driving; constructing a virtual scene and the data types in the virtual scene; the master vehicle obtains the operating parameters of the unmanned vehicle to be tested and the road information simulated by the test bench, and the master vehicle is mapped into the virtual scene accordingly; the master vehicle interacts with the data types in the virtual scene to generate virtual driving behavior data; and, taking the driving behavior data generated by the master vehicle as a sample set, a random forest algorithm takes the class output most often by the decision trees as the class of the test sample.
Preferably, the data types in the virtual scene comprise a slave vehicle module, an environment module, a road surface module and a master vehicle behavior control module, each data type has autonomous communication capability, and the master vehicle behavior control module generates corresponding driving behavior data according to the sensed virtual scene event and self variables.
Preferably, the master vehicle behavior control module is provided with a basic driving behavior library, a low-level driving behavior library and a trigger rule library of corresponding behaviors, and the master vehicle matches sensed virtual scene events with events in the rule library to generate corresponding driving behavior data.
Preferably, models of sudden events and uncertain events are preset in the slave vehicle module, so that the master vehicle interacts with a slave vehicle in which a sudden or uncertain event occurs to generate driving behavior data.
Preferably, the master and slave vehicle modules are provided with virtual vision sensors, virtual radar sensors and virtual GPS sensors.
Preferably, the environment modules comprise traffic signs, traffic markings, plants, buildings, bridges, tunnel models, pedestrians, each model matching a rule base in the master vehicle behavior control module.
Preferably, the road modules include highway, urban road, rural road models, each model matching a rule base in the master behavior control module.
Preferably, the rule base in the master behavior control module is provided with a priority level, the priority level of the sudden behavior is higher than that of the medium-long term behavior, and the priority level of the simple behavior is higher than that of the complex behavior.
Preferably, the road inclination simulation is realized through a test bench, and the test bench also collects driving behavior data of the unmanned vehicle to be tested.
Preferably, the method further comprises grading the intelligence of the unmanned vehicle and setting vehicle intelligence evaluation indexes; the driving behavior data corresponding to the evaluation indexes, labeled with the intelligence grades, are taken as the training set, and a random forest algorithm outputs the graded class.
According to the bench-based method for testing the intelligence of an unmanned vehicle described above, a data-driven approach is used to collect the driving behavior data between the unmanned vehicle and each data type, so the data collection cost is low, the raw data provided are highly authentic, and the controllability of the data allows the test scene to be reproduced. Furthermore, slave vehicle, environment and road modules with autonomous communication capability interact with the master vehicle, so the master vehicle obtains more realistic driving behavior data, providing a more accurate basis for testing the intelligence of the unmanned vehicle.
Drawings
FIG. 1 is a flow chart of an embodiment;
FIG. 2 is a flow chart of a slave vehicle driving behavior control according to an embodiment;
FIG. 3 is a list of road modules in one embodiment;
FIG. 4 is an embodiment of an intelligent level classification table;
FIG. 5 is a flowchart of the bagging algorithm;
FIG. 6 shows an SVM-based classifier;
FIG. 7 is a schematic diagram of the calculation of the intelligence of the unmanned vehicle using the SVM classifier.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and accompanying drawings. Wherein like elements in different embodiments are numbered with like associated elements. In the following description, numerous details are set forth in order to provide a better understanding of the present application. However, those skilled in the art will readily recognize that some of the features may be omitted or replaced with other elements, materials, methods in different instances. In some instances, certain operations related to the present application have not been shown or described in detail in order to avoid obscuring the core of the present application from excessive description, and it is not necessary for those skilled in the art to describe these operations in detail, so that they may be fully understood from the description in the specification and the general knowledge in the art.
The numbering of the components as such, e.g., "first", "second", etc., is used herein only to distinguish the objects as described, and does not have any sequential or technical meaning. The term "connected" and "coupled" when used in this application, unless otherwise indicated, includes both direct and indirect connections (couplings).
Referring to fig. 1, the method for testing the intelligence of the unmanned vehicle based on the bench according to the present application includes the following steps:
101. and arranging the unmanned vehicle to be tested on the test bench to simulate driving.
The unmanned vehicle to be tested is an experimental unmanned vehicle placed on the test bench; it is the object being verified and evaluated. The test bench is a road surface simulation system that provides three-axis simulation of roll, heading and pitch for the unmanned vehicle to be tested, reproducing the vehicle motion states of a real scene.
102. The server constructs the virtual scene and the various data types in the virtual scene.
A data type is a computing program with autonomous and communication functions. Several models with autonomous and communication functions are set up in the virtual scene, so that the environment as a whole behaves intelligently and conditions are created for verification and evaluation. In one embodiment, the data types include a slave vehicle module, an environment module, a road surface module and a master vehicle behavior control module; each data type has autonomous communication capability, and the master vehicle behavior control module generates corresponding driving behavior data according to the sensed virtual scene events and its own variables.
The intelligent behavior of the unmanned vehicle to be tested is mainly reflected in its reaction to sudden and uncertain events. Therefore, when constructing the virtual scene, models of sudden or uncertain events of varying severity must be set. Presetting the sudden-event and uncertain-event models in the slave vehicle module, on the one hand, simulates the real traffic environment around the master vehicle more faithfully; on the other hand, introducing different sudden and uncertain event models into the virtual scene makes the master vehicle interact with a slave vehicle in which a sudden or uncertain event occurs, and the driving behavior data thus generated serve as part of the basis for verifying and evaluating the intelligent behavior of the unmanned vehicle to be tested.
The slave vehicle module can simulate the driving behaviors of different drivers, or of the same driver in different states, presenting the autonomous driving of the master vehicle with difficulties of varying degrees and providing diverse conditions for verifying and evaluating the intelligent behavior of the unmanned vehicle to be tested.
The slave vehicle module presets the speed, acceleration and steering wheel angle of each slave vehicle, so the design of the trigger rule base in the master vehicle behavior control module must establish the relationship between the speed, position and distance information of the master vehicle and the speed, acceleration and steering wheel angle of the slave vehicles. As shown in FIG. 2, the initial speed, acceleration, position and other parameters of a slave vehicle are first determined from the slave vehicle module; the surrounding-environment perception module of the virtual scene is then polled continuously, and once the master vehicle is detected within the detection range, the preset sudden-event scenario is invoked to challenge the master vehicle, providing conditions for intelligent behavior verification and evaluation. If no master vehicle appears within the set range, the slave vehicle executes its normal control behaviors according to whether obstacles are detected ahead, to the left or to the right.
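For illustration only, a minimal Python sketch of this slave-vehicle control loop might look as follows (it is not part of the original disclosure; SlaveState, detect_master_in_range, sense_obstacles and control_step are hypothetical names, and the numbers are invented for the example):

from dataclasses import dataclass
import random

@dataclass
class SlaveState:
    speed: float            # m/s
    acceleration: float     # m/s^2
    steering_angle: float   # degrees
    position: tuple         # (x, y) in the virtual scene

def detect_master_in_range(slave_pos, master_pos, radius=50.0):
    """True if the master vehicle lies within the slave's detection radius."""
    dx, dy = master_pos[0] - slave_pos[0], master_pos[1] - slave_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= radius

def sense_obstacles(position):
    """Stub for the virtual-scene perception call (front/left/right check)."""
    return []

def control_step(state, master_pos, emergency_scenarios):
    """One behavior-control update for a single slave vehicle."""
    if detect_master_in_range(state.position, master_pos):
        # Challenge the master vehicle with a preset sudden/uncertain event.
        return {"action": "trigger_event",
                "event": random.choice(emergency_scenarios)}
    obstacles = sense_obstacles(state.position)
    if obstacles:
        return {"action": "avoid", "targets": obstacles}
    return {"action": "cruise", "speed": state.speed}

slave = SlaveState(speed=12.0, acceleration=0.0, steering_angle=0.0, position=(0.0, 0.0))
print(control_step(slave, master_pos=(20.0, 10.0),
                   emergency_scenarios=["cut_in", "hard_brake"]))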
103. The master vehicle obtains the operating parameters of the unmanned vehicle to be tested and the road information simulated by the test bench, is mapped into the virtual scene, and interacts with each data type in the virtual scene to generate virtual driving behavior data.
The master vehicle behavior control module is provided with a basic driving behavior library, a low-level driving behavior library and a trigger rule library for the corresponding behaviors; the master vehicle matches sensed virtual scene events against the events in the rule library to generate corresponding driving behavior data. The low-level driving behavior library contains the basic actions of the virtual unmanned vehicle, such as steering, moving forward and reversing. Low-level behaviors are the automatic driving behaviors of normal scenes and mainly serve the master vehicle's timely reaction to normal traffic. The basic reaction behavior library stores the motivational behaviors of the virtual unmanned vehicle, such as autonomous perception and autonomous decision-making. The low-level behavior layer and the basic behavior layer are connected in a subsumption (inclusive) structure: a basic reaction behavior takes low-level driving behaviors as its basic units, subsuming their functions, and can in turn be composed into more complex task-level behaviors. All behaviors are matched to virtual scene events through the behavior trigger rule base to carry out automatic driving control. In this way the master vehicle can run low-level behaviors and basic reaction behaviors in parallel, ensuring real-time reaction to sudden traffic scenes while fully exploiting its cognitive planning capability, so that automatic driving in complex traffic scenes is realized.
The rule base in the master vehicle behavior control module carries priority levels: sudden behaviors have higher priority than medium- and long-term behaviors, and simple behaviors have higher priority than complex behaviors.
The master vehicle behavior control module can communicate autonomously with the master vehicle and record the behaviors occurring between the master vehicle and the respective models in the virtual scene. Interactions between the master vehicle and a given module are found through trigger detection, and the processed results are transmitted to the intelligent behavior verification and evaluation module, providing the basis for verifying and evaluating the intelligent behavior of the vehicle.
At present, the basic reaction behaviors of the virtual unmanned vehicles (both master and slave) mainly comprise automatic lights-on, U-turns, signal light recognition, lane changing, following, overtaking, roundabout exit, parking space recognition, parallel parking, perpendicular parking, emergency braking, lane departure warning, lane keeping and the like; low-level behaviors such as starting/stopping, turning left/right, moving forward/reversing and accelerating/decelerating are also encapsulated. These behaviors are represented in parameterized form and stored in their respective behavior libraries.
The behavior selection mechanism of the virtual unmanned vehicle determines the behavior decision process, and is the key for realizing high autonomy of the virtual unmanned vehicle. The virtual unmanned vehicle behavior controller is responsible for selection, activation and termination of basic reaction behaviors through a behavior selection mechanism.
During simulation, the task-level complex behaviors planned by the virtual unmanned vehicle's cognition module, perception events from the virtual environment and real-time user control instructions are treated as external events; together with the current internal state value (InnerState) of the virtual unmanned vehicle, they form the input of the behavior module and express the vehicle's current demands. When a demand exceeds its behavior selection threshold, the corresponding behavior rule is triggered. For simple or urgent situations, a low-level behavior is triggered to produce a timely reaction; for medium- and long-term complex behaviors, the behavior controller selects one or a group of basic reaction behaviors according to the corresponding rules to form an action instruction. The action instruction is executed through the motion layer, satisfying the current demand of the virtual unmanned vehicle. Once a demand is satisfied, the corresponding internal attribute value gradually returns to a normal level; if more than one demand needs to be handled, they are prioritized. The basic principle of prioritization is that the most important demand gets the highest priority, simple behaviors have higher priority than complex behaviors, and emergencies have higher priority than medium- and long-term events. The behavior selection rules specify the activation condition of each behavior and the result it triggers.
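For illustration, the threshold-and-priority selection described above could be sketched as follows (the names, rule set and numbers are invented for the example and are not part of the disclosure):

from dataclasses import dataclass

@dataclass
class BehaviorRule:
    name: str
    threshold: float
    sudden: bool       # emergency behavior vs medium/long-term behavior
    complexity: int    # lower value = simpler behavior

def select_behavior(demands, rules):
    """demands: {rule name -> current demand value from external events + inner state}."""
    fired = [r for r in rules if demands.get(r.name, 0.0) > r.threshold]
    if not fired:
        return None
    # Priority: sudden before medium/long-term, then simpler before complex, then higher demand.
    fired.sort(key=lambda r: (not r.sudden, r.complexity, -demands[r.name]))
    return fired[0].name

rules = [
    BehaviorRule("emergency_brake", threshold=0.8, sudden=True,  complexity=1),
    BehaviorRule("lane_keeping",    threshold=0.3, sudden=False, complexity=1),
    BehaviorRule("overtake",        threshold=0.6, sudden=False, complexity=3),
]
print(select_behavior({"emergency_brake": 0.9, "overtake": 0.7}, rules))  # emergency_brake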
In order to truly express the behavior characteristics of the unmanned vehicle, the virtual unmanned vehicle in the virtual scene can autonomously sense the external dynamic environment and the change of the internal attribute of the virtual unmanned vehicle, autonomously determine the behavior mode according to the current target or requirement, and simultaneously can exchange information with other virtual unmanned vehicles or users to change the state of the virtual unmanned vehicle. Through effective behavior control, the model can not only react to emergencies in real time, but also have strong cognitive planning capability and can generate vivid unmanned vehicle behaviors.
The virtual unmanned vehicle obtains external virtual-environment information and stimuli from the internal vehicle body state in real time through the perception module. For perception of the external virtual environment, the virtual unmanned vehicle is equipped with a virtual vision sensor, a virtual radar sensor and a virtual GPS sensor; it can obtain its current position and heading and perceive static objects, dynamic objects and sudden traffic scenes in the virtual environment. The cognitive part of the master vehicle is performed entirely by the vehicle under test, and the related data are transmitted to the server through threads.
The environment module comprises traffic signs, traffic markings, plants, buildings, bridges, tunnel models and pedestrians, and each model is matched with a rule base in the main vehicle behavior control module. When the main vehicle drives autonomously, the surrounding environment model must be accurately identified, and certain intelligent behaviors can be embodied.
The road modules comprise highway, urban road and rural road models, and each model is matched with a rule base in the master vehicle behavior control module. FIG. 3 lists three surrounding environment models. Model 1 is a highway environment with lane lines and no pedestrians; it requires the unmanned vehicle to be tested to have automatic driving, lane line recognition and overtaking functions. Model 2 is an urban street environment with lane lines and pedestrians; it requires the functions of model 1 plus the ability to recognize and automatically avoid pedestrians. Model 3 places the highest demands on the vehicle's intelligent behavior: it requires the functions of model 2 and, additionally, recognition of unstructured roads. In short, different surrounding environment modules are established to support verification and evaluation of vehicle behavior at different levels of intelligence.
The master vehicle senses road information, including road grade and lateral inclination, in real time and transmits it to the road surface simulation system. To do so, the master vehicle must detect the terrain surface in real time, obtain the elevation of each of the four wheels, and calculate the slope and roll angle of the road surface from those four wheel heights. The slave vehicle behavior simulation likewise needs to perceive the surrounding environment model so that automatic driving is realized.
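One possible way to compute the slope and roll angle from the four wheel heights, assuming a known wheelbase and track width (the patent does not specify the formula, so this is only an illustrative sketch):

import math

def slope_and_roll(h_fl, h_fr, h_rl, h_rr, wheelbase=2.7, track=1.6):
    """Front-left/right and rear-left/right wheel heights (m) -> (grade, roll) in degrees."""
    front = (h_fl + h_fr) / 2.0
    rear  = (h_rl + h_rr) / 2.0
    left  = (h_fl + h_rl) / 2.0
    right = (h_fr + h_rr) / 2.0
    grade = math.degrees(math.atan2(front - rear, wheelbase))  # longitudinal slope
    roll  = math.degrees(math.atan2(left - right, track))      # lateral inclination
    return grade, roll

print(slope_and_roll(0.10, 0.10, 0.0, 0.0))  # about 2.1 deg grade, 0 deg roll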
In one embodiment, road inclination simulation is achieved through a test bench which also collects driving behavior data of the unmanned vehicle to be tested.
104. The intelligent behavior verification and evaluation module takes the driving behavior data generated by the master vehicle as a sample set and, using a random forest algorithm, takes the class output most often by the decision trees as the class of the test sample.
In one embodiment, the method further comprises grading the intelligence of the unmanned vehicle and setting vehicle intelligence evaluation indexes; the driving behavior data corresponding to the evaluation indexes, labeled with the intelligence grades, are taken as the training set, and a random forest algorithm outputs the graded class.
As shown in FIG. 4, the intelligence grading of the bench-tested unmanned vehicle is based on a classification of the vehicle's intelligent attributes, so that unmanned vehicles can be admitted to the bench test and the bench test tasks and test requirements can be defined more simply. For the intelligence of intelligent automobiles, the SAE driving automation levels defined in the United States are currently the generally accepted industry reference. On that basis, and considering the complexity of current Chinese road traffic, intelligence can be divided into five levels (as shown in FIG. 4): Driving Assistance (DA), partial autonomous driving (PA), conditional autonomous driving (CA), highly autonomous driving (HA) and fully autonomous driving (FA). These five levels serve as the output of the unmanned vehicle intelligence evaluation; the feedback of each behavior serves as the input, and an unmanned vehicle intelligence evaluation and analysis model is established to evaluate the intelligence of the vehicle.
The random forest algorithm was proposed by Leo Breiman and Adele Cutler. It combines Breiman's "bootstrap aggregating" (bagging) idea with Ho's "random subspace" method. In essence it is a classifier comprising multiple decision trees; because the trees are built by a random procedure they are also called random decision trees, and the trees in a random forest are mutually independent. When test data enter the random forest, every decision tree classifies them, and the class that appears most often among all the decision trees' outputs is taken as the final result. A random forest is thus a classifier comprising multiple decision trees whose output class is the mode of the classes output by the individual trees.
Resampling by Bootstrap method
Suppose the set S contains n different samples {x1, x2, …, xn}. If one sample is drawn from S with replacement each time, n times in total, to form a new set S', the probability that a given sample xi (i = 1, 2, …, n) is not contained in S' is

P = (1 - 1/n)^n.

As n tends to infinity,

lim_{n->infinity} (1 - 1/n)^n = 1/e ≈ 0.368.

Therefore, although the new set S' contains the same total number of samples (n), it contains duplicates because the drawing is with replacement; after removing duplicates, S' contains only about (1 - 0.368) × 100% = 63.2% of the samples of the original set S.
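A quick numerical check of the 63.2% figure (an illustrative sketch, not part of the disclosure): draw n samples with replacement from a set of n and count how many distinct originals appear.

import random

def unique_fraction(n, trials=200):
    total = 0.0
    for _ in range(trials):
        drawn = {random.randrange(n) for _ in range(n)}  # n draws with replacement
        total += len(drawn) / n
    return total / trials

print(unique_fraction(1000))  # approximately 0.632, i.e. 1 - 1/e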
Overview of Bagging Algorithm
The Bagging (short for bootstrap aggregating) algorithm is the earliest ensemble learning algorithm; its basic idea is shown in FIG. 5. The specific steps can be described as follows:
resample with the Bootstrap method to randomly generate T training sets S1, S2, …, ST;
generate the corresponding decision trees C1, C2, …, CT from each training set;
for a test set sample X, test with each decision tree to obtain the corresponding classes C1(X), C2(X), …, CT(X);
by voting, take the class output most often by the T decision trees as the class of the test set sample X.
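A from-scratch sketch of these four steps, using scikit-learn's DecisionTreeClassifier as the base learner (the library choice and synthetic data are our illustration, not part of the disclosure):

import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import make_classification

def bagging_fit(X, y, T=25, seed=0):
    """Steps 1-2: bootstrap T training sets and fit one decision tree per set."""
    rng = np.random.default_rng(seed)
    n = len(X)
    trees = []
    for _ in range(T):
        idx = rng.integers(0, n, size=n)  # Bootstrap sample S_t (drawn with replacement)
        trees.append(DecisionTreeClassifier(random_state=0).fit(X[idx], y[idx]))
    return trees

def bagging_predict(trees, X):
    """Steps 3-4: collect each tree's class C_t(X) and take the majority vote."""
    votes = np.stack([t.predict(X) for t in trees]).astype(int)  # shape (T, n_samples)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
trees = bagging_fit(X, y, T=15)
print(bagging_predict(trees, X[:5]))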
Algorithm flow of random forest
The random forest algorithm resembles the Bagging algorithm in that it uses Bootstrap resampling to generate multiple training sets. The difference is that the random forest algorithm randomly selects the set of candidate splitting attributes when constructing each decision tree. The detailed flow of the random forest algorithm is as follows (let M be the number of attributes of a sample, and let m be an integer greater than zero and less than M):
Resample with the Bootstrap method to randomly generate T training sets S1, S2, …, ST.
Generate the corresponding decision trees C1, C2, …, CT from each training set; before selecting an attribute at each non-leaf (internal) node, randomly extract m attributes from the M attributes as the splitting attribute set of the current node, and split the node on the best of these m attributes (generally the value of m is kept unchanged throughout the growth of the forest).
Each tree is grown fully, without pruning.
For a test set sample X, each decision tree is used for testing, yielding the corresponding classes C1(X), C2(X), …, CT(X).
By voting, the class output most often by the T decision trees is taken as the class of the test set sample X.
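The random-attribute-subset procedure above maps directly onto scikit-learn's RandomForestClassifier; the snippet below is only an illustration on synthetic stand-in data, not the patent's implementation:

from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic stand-in for labeled driving-behavior samples (five intelligence levels).
X, y = make_classification(n_samples=500, n_features=15, n_informative=8,
                           n_classes=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# n_estimators = T trees, max_features = m attributes tried at each split,
# trees grown fully (no pruning) by default, prediction by majority vote.
forest = RandomForestClassifier(n_estimators=100, max_features=4, random_state=0)
forest.fit(X_tr, y_tr)
print(forest.score(X_te, y_te), forest.predict(X_te[:3]))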
The classifier is designed based on a support vector machine (SVM); the architecture of the SVM is shown in FIG. 6, where b is the bias parameter.
The unmanned vehicle test data come from actual measurements of unmanned vehicles with known intelligence levels on the test bench. The data contain a certain number of samples with certain feature components, and the intelligence level serves as the class label of each sample. Before formal classification, an SVM (support vector machine) is trained to obtain a classification model, and the model is then used to predict the class labels of the test set, giving the intelligence level evaluation.
Intelligent evaluation process
Intelligent vehicles that have already been certified at different intelligence levels are tested on the bench. Through a large number of repeated tests, the data of the different scenes generated in the main server are acquired. The acquired data are used as the training set to train the SVM classifier and to determine the optimal penalty parameter and kernel function parameter. Next, test data are collected, the test set samples are sent to the decision trees, the mode is selected by voting, and the intelligence of the tested vehicle is determined.
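One way (our illustration, not the patent's prescribed tooling) to "train the SVM classifier and determine the optimal penalty and kernel-function parameters" is a grid search over C and gamma of an RBF-kernel SVC; the data below are a synthetic stand-in for the bench measurements:

from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.datasets import make_classification

# Synthetic stand-in: features per evaluation index, labels = intelligence levels DA..FA.
X, y = make_classification(n_samples=300, n_features=15, n_informative=8,
                           n_classes=5, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.001, 0.01, 0.1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X_train, y_train)
print(search.best_params_)                          # best penalty (C) and kernel (gamma) parameters
print(search.best_estimator_.predict(X_test[:5]))   # predicted intelligence levels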
Model overall process
The vehicle intelligence evaluation indexes are divided into basic vehicle behaviors i, advanced vehicle behaviors j, basic traffic behaviors k and advanced traffic behaviors m, with the following parameters. Basic vehicle behaviors i: straight-lane keeping i1, stop-line parking i2, U-turn i3, speed limiting i4 and static obstacle avoidance i5. Advanced vehicle behaviors j: curved-lane keeping j1, vehicle language instruction j2, intersection passing j3, dynamic planning j4 and GPS navigation performance j5. Basic traffic behaviors k: prohibition of reverse driving k1 and keeping the vehicle distance k2. Advanced traffic behaviors m: traffic sign recognition m1, signal light recognition m2 and emergency braking m3.
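A hypothetical encoding of these evaluation indexes (the index names follow the list above; the scoring scheme and function names are assumptions for illustration), flattening each tested vehicle's per-index scores into one feature vector in a fixed i1 to m3 order:

INDEX_ORDER = [
    "i1", "i2", "i3", "i4", "i5",   # basic vehicle behaviors
    "j1", "j2", "j3", "j4", "j5",   # advanced vehicle behaviors
    "k1", "k2",                     # basic traffic behaviors
    "m1", "m2", "m3",               # advanced traffic behaviors
]

def to_feature_vector(scores: dict) -> list:
    """scores: {index name -> measured score}; missing indexes default to 0.0."""
    return [float(scores.get(name, 0.0)) for name in INDEX_ORDER]

sample = to_feature_vector({"i1": 0.95, "i4": 0.80, "m3": 1.0})
print(len(sample), sample[:3])   # 15-dimensional feature vector X(i1) to X(m3)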
The driving behavior data contain a certain number of samples with certain feature components, and the intelligence level serves as the class label of each sample. Before formal classification, an SVM is trained to obtain a classification model, and the model is then used to predict the class labels of the test set, giving the intelligence level evaluation.
As shown in FIG. 7, the intelligence level evaluation is Y = [Y1, Y2, Y3, Y4, Y5], where Y1 is Driving Assistance (DA), Y2 is partial autonomous driving (PA), Y3 is conditional autonomous driving (CA), Y4 is highly autonomous driving (HA) and Y5 is fully autonomous driving (FA).
Here X(i1) to X(m3) are the sample sets collected under each intelligence evaluation index; K(·) is the kernel function; X is the feature vector corresponding to X(i1) to X(m3); and Xi1 to Xm3 are the parameters corresponding to the vehicle behaviors.
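Read in the standard SVM form, the output of FIG. 7 is a weighted sum of kernel terms plus the bias b; a minimal sketch, assuming an RBF kernel and the label signs folded into the coefficients (the specific kernel and values are our assumptions):

import numpy as np

def rbf_kernel(x_i, x, gamma=0.1):
    """K(X_i, X) = exp(-gamma * ||X_i - X||^2)."""
    return float(np.exp(-gamma * np.sum((np.asarray(x_i) - np.asarray(x)) ** 2)))

def decision_value(support_vectors, alphas, b, x):
    """f(X) = sum_i alpha_i * K(X_i, X) + b."""
    return sum(a * rbf_kernel(sv, x) for sv, a in zip(support_vectors, alphas)) + b

print(decision_value([[0.9, 0.8], [0.2, 0.1]], alphas=[1.0, -1.0], b=0.05, x=[0.85, 0.75]))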
The present invention has been described in terms of specific examples, which are provided to aid understanding of the invention and are not intended to be limiting. For a person skilled in the art to which the invention pertains, several simple deductions, modifications or substitutions may be made according to the idea of the invention.

Claims (7)

1. The method for testing the intelligence of the unmanned vehicle based on the bench is characterized by comprising the following steps of:
arranging the unmanned vehicle to be tested on a test bench frame to simulate driving;
the method comprises the steps that a virtual scene and each data type in the virtual scene are built, the data types in the virtual scene comprise a slave vehicle module, an environment module, a road surface module and a master vehicle behavior control module, each data type has autonomous communication capacity, an emergent event and an uncertain event model are preset in the slave vehicle module, and the master vehicle behavior control module is provided with a basic driving behavior library, a low-level driving behavior library and a trigger rule library of corresponding behaviors;
the method comprises the following steps that a main vehicle obtains operation parameters of an unmanned vehicle to be tested and road information simulated by a test bench and simulates the main vehicle into a virtual scene; the host vehicle interacts with various data types in the virtual scene to generate virtual driving behavior data, wherein the driving behavior data comprises: the driving behavior data generated by interaction between the master vehicle and the slave vehicles in which sudden events or uncertain events occur, and the driving behavior data generated by matching the perceived virtual scene events with the events in the rule base by the master vehicle;
and taking driving behavior data generated by the main vehicle as a sample set, and taking the category with the maximum output times in the decision tree as the category of the test sample by adopting a random forest algorithm.
2. The method for bench-based testing of unmanned vehicle intelligence of claim 1, further comprising: the master vehicle module and the slave vehicle module are provided with a virtual vision sensor, a virtual radar sensor and a virtual GPS sensor.
3. The method for bench-based testing of unmanned vehicle intelligence of claim 1, further comprising: the environment module comprises traffic signs, traffic marking lines, plants, buildings, bridges, tunnel models and pedestrians, and each model is matched with a rule base in the main vehicle behavior control module.
4. The method for bench-based testing of unmanned vehicle intelligence of claim 1, further comprising: the road modules comprise highway, urban road and country road models, and each model is matched with a rule base in the master vehicle behavior control module.
5. The method for bench-based testing of unmanned vehicle intelligence of claim 1, further comprising: the rule base in the master behavior control module is provided with a priority level, the priority level of the sudden behavior is higher than that of the medium-long term behavior, and the priority level of the simple behavior is higher than that of the complex behavior.
6. The method for bench-based testing of unmanned vehicle intelligence of claim 1, further comprising: road inclination simulation is achieved through the test bench, and the test bench is used for collecting driving behavior data of the unmanned vehicle to be tested.
7. The method for bench-based testing of unmanned vehicle intelligence of claim 1, further comprising: the method further comprises the steps of dividing the intelligent grade of the unmanned vehicle, setting an intelligent evaluation index of the vehicle, taking driving behavior data corresponding to the evaluation index as a training set after the intelligent grade division, and outputting the category after the intelligent grade division by adopting a random forest algorithm.
CN201810842710.7A 2018-07-27 2018-07-27 Method for testing intelligence of unmanned vehicle based on rack Expired - Fee Related CN109084992B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810842710.7A CN109084992B (en) 2018-07-27 2018-07-27 Method for testing intelligence of unmanned vehicle based on rack

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810842710.7A CN109084992B (en) 2018-07-27 2018-07-27 Method for testing intelligence of unmanned vehicle based on rack

Publications (2)

Publication Number Publication Date
CN109084992A CN109084992A (en) 2018-12-25
CN109084992B true CN109084992B (en) 2020-06-16

Family

ID=64831053

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810842710.7A Expired - Fee Related CN109084992B (en) 2018-07-27 2018-07-27 Method for testing intelligence of unmanned vehicle based on rack

Country Status (1)

Country Link
CN (1) CN109084992B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110864913B (en) * 2019-11-28 2021-09-03 苏州智加科技有限公司 Vehicle testing method and device, computer equipment and storage medium
CN110955972B (en) * 2019-11-28 2023-08-18 苏州智加科技有限公司 Virtual scene generation method, device, computer equipment and storage medium
CN111079800B (en) * 2019-11-29 2023-06-23 上海汽车集团股份有限公司 Acceleration method and acceleration system for intelligent driving virtual test
CN111178402B (en) * 2019-12-13 2023-04-07 赛迪检测认证中心有限公司 Scene classification method and device for road test of automatic driving vehicle
CN113574530B (en) * 2020-02-12 2024-09-20 深圳元戎启行科技有限公司 Driving scene information processing method, driving scene information processing device, electronic equipment and readable storage medium
CN111649957A (en) * 2020-06-08 2020-09-11 山东省交通规划设计院有限公司 Tunnel environment automatic driving vehicle driving capability test system and test method
CN111881031A (en) * 2020-07-23 2020-11-03 深圳慕智科技有限公司 Intelligent transportation software and hardware precision disturbance method library and risk index construction method
CN112230228B (en) * 2020-09-30 2024-05-07 中汽院智能网联科技有限公司 Intelligent automobile vision sensor testing method based on field testing technology
CN112373483B (en) * 2020-11-23 2022-07-29 浙江天行健智能科技有限公司 Vehicle speed and steering prediction method based on forward neural network
CN112912883B (en) * 2021-02-07 2022-06-28 华为技术有限公司 Simulation method and related equipment
CN113219944B (en) * 2021-04-27 2022-05-31 同济大学 Intelligent vehicle control strategy test platform for mixed traffic flow working condition
CN113627740B (en) * 2021-07-20 2024-06-25 东风汽车集团股份有限公司 Driving load evaluation model construction system and construction method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107024927A (en) * 2016-02-01 2017-08-08 上海无线通信研究中心 A kind of automated driving system and method
CN107957583A (en) * 2017-11-29 2018-04-24 江苏若博机器人科技有限公司 A kind of round-the-clock quick unmanned vehicle detection obstacle avoidance system of Multi-sensor Fusion

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102393744B (en) * 2011-11-22 2014-09-10 湖南大学 Navigation method of pilotless automobile
CN103234763B (en) * 2013-04-09 2015-04-15 北京理工大学 System and method for quantitatively evaluating unmanned vehicles
CN103335853B (en) * 2013-07-18 2015-09-16 中国科学院自动化研究所 A kind of automatic driving vehicle Cognitive Aptitude Test system and method
CN104477167B (en) * 2014-11-26 2018-04-10 浙江大学 A kind of intelligent driving system and its control method
GB2557252B (en) * 2016-12-02 2020-03-25 Christian Schwazl Physical environment simulator for vehicle testing
CN108267322A (en) * 2017-01-03 2018-07-10 北京百度网讯科技有限公司 The method and system tested automatic Pilot performance
CN107918392B (en) * 2017-06-26 2021-10-22 深圳瑞尔图像技术有限公司 Method for personalized driving of automatic driving vehicle and obtaining driving license
CN107272683A (en) * 2017-06-19 2017-10-20 中国科学院自动化研究所 Parallel intelligent vehicle control based on ACP methods
CN107272687A (en) * 2017-06-29 2017-10-20 深圳市海梁科技有限公司 A kind of driving behavior decision system of automatic Pilot public transit vehicle
CN107526906A (en) * 2017-10-11 2017-12-29 吉林大学 A kind of driving style device for identifying and method based on data acquisition
CN108267325A (en) * 2018-01-29 2018-07-10 上海测迅汽车科技有限公司 Unmanned vehicle material object is in ring test method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107024927A (en) * 2016-02-01 2017-08-08 上海无线通信研究中心 A kind of automated driving system and method
CN107957583A (en) * 2017-11-29 2018-04-24 江苏若博机器人科技有限公司 A kind of round-the-clock quick unmanned vehicle detection obstacle avoidance system of Multi-sensor Fusion

Also Published As

Publication number Publication date
CN109084992A (en) 2018-12-25

Similar Documents

Publication Publication Date Title
CN109084992B (en) Method for testing intelligence of unmanned vehicle based on rack
US12030520B2 (en) Method and system for validating autonomous control software for a self-driving vehicle
CN112703459B (en) Iterative generation of confrontational scenarios
US9665802B2 (en) Object-centric fine-grained image classification
Dogan et al. Autonomous driving: A comparison of machine learning techniques by means of the prediction of lane change behavior
CN109774724A (en) The scene of exploitation & assessment for autonomous driving system generates and the method and apparatus of parameter scanning
CN109466543A (en) Plan autokinetic movement
CN116134292A (en) Tool for performance testing and/or training an autonomous vehicle planner
CN109782751A (en) Method and apparatus for autonomous system performance and benchmark test
Thammachantuek et al. Comparison of machine learning algorithm's performance based on decision making in autonomous car
CN113918615A (en) Simulation-based driving experience data mining model construction method and system
CN117521389A (en) Vehicle perception test method based on vehicle-road collaborative perception simulation platform
CN117217314A (en) Driving situation reasoning method based on metadata driving and causal analysis theory
CN117242460A (en) Computerized detection of unsafe driving scenarios
Xiong et al. Research on the quantitative evaluation system for unmanned ground vehicles
CN115855531A (en) Test scene construction method, device and medium for automatic driving automobile
CN114987495A (en) Man-machine hybrid decision-making method for highly automatic driving
CN115285134A (en) Improved machine learning
Guo et al. How will humans cut through automated vehicle platoons in mixed traffic environments? A simulation study of drivers’ gaze behaviors based on the dynamic areas of interest
KR20230144646A (en) Generating unknown-unsafe scenarios, improving automated vehicles, computer systems
Wang et al. Driver modeling based on vehicular sensing data
Khezaz et al. Driving Context Detection and Validation using Knowledge-based Reasoning.
Zyner Naturalistic driver intention and path prediction using machine learning
Luo et al. UML-based design of intelligent vehicles virtual reality platform
CN116680932B (en) Evaluation method and device for automatic driving simulation test scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee (granted publication date: 20200616)