CN117111578A - Automatic driving system detection blind area guiding fuzzy test method and system - Google Patents
- Publication number
- CN117111578A (application number CN202311060753.7A)
- Authority
- CN
- China
- Prior art keywords
- scene
- scenes
- vehicle
- fitness
- scores
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B23/00—Testing or monitoring of control systems or parts thereof
- G05B23/02—Electric testing or monitoring
- G05B23/0205—Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults
- G05B23/0218—Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults
- G05B23/0243—Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults model based detection method, e.g. first-principles knowledge model
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/24—Pc safety
- G05B2219/24065—Real time diagnostics
Abstract
A method and a system for detection-blind-area-guided fuzzy testing of an automatic driving system. A plurality of initial scenes are randomly generated and run in a simulator, and a fitness score is calculated for each; according to each scene's score, the instantaneous speed and lane-change behavior of the traffic participants at each time node in the scene are mutated. The mutated scenes are run in the simulator as new test cases to obtain new fitness scores, and mutation is performed again based on the new scores; after several repetitions, scenes whose scores are above the mean are selected for local search, and scenes whose scores are below the mean are randomly restarted. For the scenes scoring below the mean, the similarity between scenes is calculated, and the several scenes with the lowest similarity are kept to update the initial scenes, after which the above steps are repeated. Finally, the parameters of the scenes with higher fitness are mutated once more, and whichever of the pre- and post-mutation scenes scores higher is kept as an output scene. The invention solves the problems of monotonous generated scenes and low test efficiency.
Description
Technical Field
The invention belongs to the technical field of intelligent vehicle-road systems, and particularly relates to a method and a system for detection-blind-area-guided fuzzy testing of an automatic driving system.
Background
With the rapid development of autonomous driving technology, safety performance testing of automatic driving systems is also continuously evolving. Early testing of automatic driving systems focused mainly on vehicle performance, sensor precision and the reliability of electronic equipment; as the complexity and criticality of automatic driving systems grow, testing of functional safety, algorithm robustness and similar aspects becomes increasingly important. External factors such as changeable weather conditions, complex traffic environments and the diversity of driving tasks all pose new challenges for testing autonomous vehicles. In the early stages of development, the safety performance of an automatic driving system was usually verified through real-vehicle tests, but real-vehicle testing is mainly suited to relatively simple functional tests and verification. As the automation level of autonomous vehicles increases, the required test scenarios become complex and unpredictable, and real-vehicle testing can no longer cover all the situations the vehicle may face.
Simulation testing is favored by enterprises and research institutions because of its rich test scenarios, low test cost, high test efficiency and high scene repeatability. Compared with real-vehicle testing, simulation testing can expose functional problems of an autonomous vehicle at lower cost and, in particular, conveniently supports test-driven development, in which the test results feed into the iterative training of the algorithm, making it one of the most promising test methods.
Fuzz testing is currently a mainstream and effective software defect detection method used in simulation testing. It continuously feeds unexpected inputs to the program under test and monitors whether the program behaves abnormally, thereby detecting software defects; it is an automated software testing technique based on defect injection. It uses a large number of automatically generated test cases as inputs to the application to discover security flaws that may exist in it. Current research applying fuzz testing to safety performance testing of automatic driving systems mainly adopts different mutation strategies when generating test scenarios in order to obtain critical violation scenarios that strongly affect the safety of the automatic driving system, thereby reducing the number of test cases and improving test efficiency.
Disclosure of Invention
Aiming at the problems of the critical-scenario search methods in the prior art, such as monotonous generated scenes and low test efficiency, the invention provides a method and a system for detection-blind-area-guided fuzzy testing of an automatic driving system. A specific traffic scene represented by an array in the simulation simulator is abstracted into a test case, a fitness function guided by the sensor detection blind area is designed to characterize the safety of each test case, and operations such as grouping, mutation and iteration are performed on multiple cases, giving the method the ability to search complex test scenes containing multiple vehicles.
In order to achieve the above purpose, the present invention has the following technical scheme:
a method for guiding fuzzy test of a detection blind area of an automatic driving system comprises the following steps:
according to the semantic constraints of the simulator, randomly generating a plurality of initial scenes, each containing a host vehicle running the automatic driving system under test and a plurality of traffic participants, setting the duration of each scene and the interval time nodes, and taking the instantaneous speed and lane-change behavior of the traffic participants at each time node as the variable parameters;
running each initial scene in the simulator, calculating its fitness score using a pre-designed fitness function that describes the safety of the vehicles in the scene according to the occlusion of the detection area of the host vehicle's sensor, retaining, according to the score of each initial scene, the scenes with higher scores and poorer safety performance, and mutating the instantaneous speed and lane-change behavior of the traffic participants at each time node in those scenes;
running the mutated scene queue in the simulator as new test cases to obtain new fitness scores, mutating again based on the new scores, calculating, after repeating this process several times, the mean of all scene fitness scores, selecting the scenes whose scores are above the mean for local search, and randomly restarting the other scenes whose scores are below the mean;
after the random restart, for the scenes whose fitness scores are below the mean, calculating the similarity between the scenes, keeping the several scenes with the lowest similarity as the initial scenes, and performing the above steps again with the updated initial scenes;
mutating once more the parameters of the scenes with higher fitness, running the mutated scenes in the simulator, calculating their fitness scores, comparing the scores of the scenes before and after mutation, and keeping the higher-scoring scene as the output scene.
As a preferable scheme, the variable parameters are restricted to value ranges within the scene: the host vehicle is controlled to drive at a constant speed of 30 km/h, the speed of each traffic participant is kept within 20 km/h to 120 km/h, and the lane-change behavior of a traffic participant is represented by 0, 1 and 2, where 0 means keeping straight, 1 means changing to the left lane and 2 means changing to the right lane; in addition, all vehicle behaviors in the scene are constrained so that invalid scenes are eliminated.
As a preferable scheme, the fitness function describing the safety of the vehicles in the scene according to the occlusion of the detection area of the host vehicle's sensor is designed as follows: for same-direction car following, let the vehicle length be l and the width be d; take the midpoint of the front of the host vehicle and the lower-left corner of the leading vehicle's rectangle as reference points, let the lateral distance between the leading vehicle and the midpoint of the front of the host vehicle be x and the longitudinal distance be y, and let S be the blind-area angle caused by the leading vehicle's occlusion within the detector's detection range. While driving, the closer other vehicles in the scene are to the host vehicle, the larger the blind-area angle S, and the occlusion angle S is calculated in different ways as the relative position between the vehicles changes.
As a preferable mode, the calculation mode of the shielding angle S along with the change of the relative positions between the vehicles is as follows:
when 0 < x ≤ y − d:
when y − d < x < y:
when y < x < y + l:
as a preferable scheme, when more than two non-host vehicles simultaneously appear in the detection area of the host vehicle sensor, the two vehicles simultaneously detect the host vehicle sensor to generate shielding, if the blind area angles of the vehicles are overlapped, the corresponding scene is considered to be a dangerous scene, otherwise, the safety is ensured;
The blind-zone overlap angle S_b is calculated as follows:
S_b = S_area1 ∩ S_area2
The blind-zone overlap angle S_b satisfies the following relationship: when S_b > 0 the scene is dangerous, and when S_b = 0 it is safe;
The blind-area angular range of each vehicle is defined as S_area; based on the above calculation results, S_area is calculated as follows:
The larger S_b is, the more severely the leading vehicles occlude the host vehicle's sensor and the closer the vehicles are to each other; conversely, when the overlap ratio of the blind-area angles is high but the vehicles are far apart, or when the overlap ratio is low but the vehicles are close, the corresponding scene is less dangerous and the value of S_b is correspondingly lower.
As a preferred solution, the step of running the mutated scene queue in the simulator as new test cases to obtain new fitness scores, mutating again based on the new scores, repeating this process several times, calculating the mean of all scene fitness scores, selecting the scenes whose scores are above the mean for local search, and randomly restarting the other scenes whose scores are below the mean comprises:
the aim is to maximize the discovery of highly adversarial dangerous test scenarios within a given search budget. Define b_t as the set of adversarial scenes found for input x_t ∈ X when the number of fuzzing iterations is t; the found adversarial scenes y_t are then expressed as:
y_t = max{b_1, b_2, b_3, …, b_t}
a fuzzy search method based on a genetic algorithm is used; the final objective of the fuzzy search is to maximize the variety of adversarial scenes found, which corresponds to finding the optimal individuals among the offspring in the genetic algorithm;
in the genetic algorithm, the fitness score of an individual is used to determine whether its genes should be retained or eliminated. The fitness function is designed based on vehicle domain knowledge and the scene situation: in each time step it calculates, from the vehicle-related parameters in the scene, the maximum distance α that the host vehicle EV can travel without colliding with the other vehicles (NPCs), and the occlusion rate β of the host vehicle's radar caused by the other vehicles in the scene; the fitness function at a time t in the scene is:
f_t(EV, NPC) = α_t + β_t
If a collision occurs in the scene, a constant c is additionally added to the fitness function as collision compensation:
f_t(EV, NPC) = α_t + β_t + c
During the scene simulation, the fitness score is measured once per time step t; the larger the f_t measured at time step t, the higher the risk of a vehicle collision at time t. Depending on the dynamic change of the scene, the fitness score of the scene may increase or decrease after each time step; the fitness score of the scene is calculated as follows:
Chromosome crossover is designed as an exchange operation of NPCs between scenes: in each generation one NPC is randomly selected in each of two scenes, and the crossover is completed by exchanging the motion states of the two NPCs over the running time of the two scenes. Chromosome mutation randomly selects an NPC during the genetic process and randomly changes its driving state at a certain moment.
As a preferred solution, the selection procedure aims to eliminate unsuitable individuals from each generation of scenes and obtain individuals with higher fitness; a roulette-wheel selection method is used to achieve this, and the higher the fitness of a scene, the more easily it is selected. In roulette-wheel selection, the selection probability of an individual is proportional to its fitness: the larger the fitness, the larger the selection probability. Let the fitness value of an individual x_i be denoted f(x_i), its selection probability p(x_i) and its cumulative probability q(x_i); they are calculated as follows:
as a preferred solution, in the step of calculating the similarity between the scenes for the scenes with the fitness score smaller than the average value and reserving several scenes with the lowest similarity as the initial scenes, a random restarting mechanism is used to avoid trapping in a locally optimal solution;
When the scene fitness score in the iterative process is not improved with the passage of time, executing random restarting at the moment;
comparing the fitness score of the current scene with the average fitness score of the previous generations, and if the current fitness score is smaller than the average value of the previous generations, selecting a scene with the lowest similarity with the running scene from the scenes of the randomly initialized NPC tracks according to the NPC tracks in each scene recorded before as a new input of a random restarting genetic algorithm; the similarity is realized by calculating Euclidean distance between NPC tracks in candidate scenes and NPC tracks in previous scenes, and the larger the distance is, the lower the similarity between scenes is; the scene similarity is calculated as follows:
S t npc,1 the state information of the vehicle indicating the state of the NPC1 at the time of the scene t includes the position information L of each vehicle t npc,1 And velocity information V t npc,1 Calculated according to the following expression:
S t npc,1 =L t npc,1 +V t npc,1
by usingRepresenting Euclidean distance of different NPCs at time t, assuming that the position information of the same NPC in different test cases can be represented as L t npc,1′ And L is equal to t npc,1 Scene similarity of scenes Scenario1 and Scenario2The calculation is performed according to the following expression:
As a preferable scheme, the method further comprises measuring the validity of the test cases and the completeness of the test through a coverage index;
scene coverage is described in a parameter space whose abscissa and ordinate are two parameters: one parameter is the Euclidean distance between the vehicle under test and the vehicle causing the occlusion, and the other is the proposed occlusion parameter;
assuming that a total of n test cases are executed and that the i-th test case T_i with parameters (A_i, B_i) covers a small portion of the parameter space, the cumulative coverage Cov of the parameter space is expressed as the ratio of the area of the parameter space covered by all test cases T to the area of the total parameter space, as follows:
the coverage of each test case is obtained by calculating the area of the rectangle in which it lies; assuming that the parameter area covered by the i-th test case T_i is the rectangle from (a_i, b_i) to (A_i, B_i), its coverage area is expressed as:
Area(T_i) = (A_i − a_i) × (B_i − b_i)
the cumulative coverage Cov is then:
A detection-blind-area-guided fuzzy test system for an automatic driving system, comprising:
an initial scene generation module, used for randomly generating a plurality of initial scenes according to the semantic constraints of the simulator, each scene containing a host vehicle running the automatic driving system under test and a plurality of traffic participants, setting the duration of each scene and the interval time nodes, and taking the instantaneous speed and lane-change behavior of the traffic participants at each time node as the variable parameters;
a scene queue mutation module, used for running each initial scene in the simulator, calculating its fitness score using a pre-designed fitness function that describes the safety of the vehicles in the scene according to the occlusion of the detection area of the host vehicle's sensor, retaining, according to the score of each initial scene, the scenes with higher scores and poorer safety performance, and mutating the instantaneous speed and lane-change behavior of the traffic participants at each time node in those scenes;
a local search module, used for running the mutated scene queue in the simulator as new test cases to obtain new fitness scores, mutating again based on the new scores, calculating, after repeating this process several times, the mean of all scene fitness scores, selecting the scenes whose scores are above the mean for local search, and randomly restarting the other scenes whose scores are below the mean;
an initial scene update module, used for calculating, after the random restart, the similarity between the scenes whose fitness scores are below the mean, keeping the several scenes with the lowest similarity as the initial scenes, and repeating the above steps with the updated initial scenes;
an output scene selection module, used for mutating once more the parameters of the scenes with higher fitness, running the mutated scenes in the simulator, calculating their fitness scores, comparing the scores of the scenes before and after mutation, and keeping the higher-scoring scene as the output scene.
Compared with the prior art, the invention has at least the following beneficial effects:
According to the semantic constraints of the simulator, a plurality of initial scenes are randomly generated, each containing a host vehicle running the automatic driving system under test and several traffic participants; a specific traffic scene represented by an array in the simulation simulator is abstracted into a test case, each initial scene is run in the simulator, and a fitness function guided by the sensor detection blind area is designed to characterize the safety of each test case. The automatic driving system makes decisions and exercises control based on the environment information acquired by its sensors; if an obstruction blocks the sensor's ability to acquire information, the automatic driving system may not make the correct decisions and control actions, resulting in an accident. The test method can effectively search out typical test cases that cause the intelligent vehicle to lose control or collide with other vehicles.
Furthermore, in order to improve test efficiency, the application introduces the concept of scene vehicle behavior constraints to reduce the generation of invalid test cases during the fuzzy test, and combines these constraints with the fitness function, thereby effectively improving test efficiency.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings needed in the description of the present application are briefly introduced below; they show only some embodiments of the present application, and other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flowchart of an overall test method for blind spot guidance ambiguity in an autopilot system according to an embodiment of the present application;
FIG. 2 is a schematic view of a vehicle passive collision scenario in accordance with an embodiment of the present application;
FIG. 3 is a schematic diagram of a non-host vehicle collision scenario in accordance with an embodiment of the present application;
FIG. 4 is a schematic view of a front-to-host sensor detection zone occlusion in accordance with an embodiment of the present application;
FIG. 5 (a) is a schematic view of the front vehicle occluding the detection area of the host vehicle's sensor when the front vehicle is at position 1 according to an embodiment of the present application;
FIG. 5 (b) is a schematic view of the front vehicle occluding the detection area of the host vehicle's sensor when the front vehicle is at position 2 according to an embodiment of the present application;
FIG. 5 (c) is a schematic view of the front vehicle occluding the detection area of the host vehicle's sensor when the front vehicle is at position 3 according to an embodiment of the present application;
FIG. 6 (a) is a schematic view of the blind zone overlap angle of two non-host vehicles at position 1 according to an embodiment of the present application;
FIG. 6 (b) is a schematic view of the blind zone overlap angle of two non-host vehicles at position 2 according to an embodiment of the present application;
FIG. 6 (c) is a schematic view of the blind zone overlap angle of two non-host vehicles at position 3 according to an embodiment of the present application;
FIG. 7 is a schematic representation of chromosome crossover according to an embodiment of the present application;
FIG. 8 is a schematic representation of chromosomal variations according to an embodiment of the present application;
FIG. 9 (a) is a schematic view of scene 1 at the same occlusion angle according to an embodiment of the present application;
FIG. 9 (b) is a schematic view of scene 2 at the same occlusion angle according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a final test scenario according to an embodiment of the present application;
FIG. 11 is a schematic diagram of the relationship between occlusion and collision according to an embodiment of the present application;
FIG. 12 is a ratio chart of the detection dead zone at different stages of the method according to the embodiment of the application and other methods;
FIG. 13 (a) is a comparison of the number of dangerous scenes found when the fuzzers search for the same number of generations;
FIG. 13 (b) is a comparison of the average time of the dangerous scene search when the fuzzers search for the same number of generations.
Detailed Description
The present application will be described and illustrated with reference to the accompanying drawings and examples in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application. All other embodiments, which can be made by a person of ordinary skill in the art based on the embodiments provided by the present application without making any inventive effort, are intended to fall within the scope of the present application.
Referring to fig. 1, the detection-blind-area-guided fuzzy test method for an automatic driving system according to the embodiment of the invention includes the following steps:
step one, randomly generating 4 initial scenes according to the related semantic limitation of a simulator, wherein the scenes comprise 1 host vehicle with built-in tested automatic driving system and 2 traffic participants, and the initial positions of all vehicles are fixed on a three-lane expressway with the lane width of 3.7 meters. The time length of each scene is set to be 9 seconds, 3 seconds is taken as a time node, the instantaneous speed and the lane change condition of the traffic participants on each time node are taken as variable parameters, and the left content shown in fig. 1 is the visualization of each parameter and the meaning thereof in the scene.
Step two, after obtaining the 4 initial scenes, design a fitness function describing the safety of the vehicles in the scene according to the occlusion of the detection area of the host vehicle's sensor, run the 4 scenes in the simulator, calculate their fitness scores respectively, and according to the scores retain the scenes with higher scores and poorer safety performance so as to mutate the instantaneous speed and lane-change behavior of the traffic participants at each time node in those scenes.
Step three, run the mutated scene queue in the simulator as new test cases to obtain new fitness scores and mutate again based on the new scores; after repeating this process 5 times, calculate the mean of all scene fitness scores, select the scenes whose scores are above the mean for local search, and randomly restart the other scenes whose scores are below the mean.
Step four, random restart: for the scenes whose fitness scores are below the mean, calculate the similarity between the scenes and keep the 4 scenes with the lowest similarity as the initial scenes, then perform all the processes of steps two to four again.
Step five, local search: mutate once more the parameters of the scenes with higher fitness, run the mutated scenes in the simulator, calculate their fitness scores, compare the scores of the scenes before and after mutation, and keep the higher-scoring scene as the output scene.
In a possible implementation, when generating the test cases in step one, in order to ensure that each test case is valid, the value ranges of the variable parameters of the test scene are restricted: the host vehicle is controlled to drive at a constant speed of 30 km/h, the speed of each traffic participant is kept within 20 km/h-120 km/h, and the lane-change behavior of a traffic participant is represented by 0, 1 and 2, where 0 means keeping straight, 1 means changing to the left lane and 2 means changing to the right lane. In addition, all vehicle behaviors in the scene are constrained, so that a large number of invalid scenes are eliminated during the fuzzy test and the test efficiency is improved.
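The following is a minimal Python sketch of how such a scene encoding and behavior-constraint check might look. It is illustrative only: the function and field names (random_npc_track, is_valid_scene, the collision-event fields, and so on) are assumptions and do not appear in the original description.

import random

# Hypothetical encoding of one test scene: each traffic participant (NPC) is a
# list of (speed_kmh, lane_action) pairs, one per 3-second time node of the
# 9-second scene. lane_action: 0 = keep straight, 1 = change left, 2 = change right.
TIME_NODES = 3
NPC_SPEED_RANGE = (20.0, 120.0)
HOST_SPEED_KMH = 30.0          # the host vehicle drives at a constant 30 km/h

def random_npc_track():
    """Randomly generate one NPC's variable parameters within the allowed ranges."""
    return [(random.uniform(*NPC_SPEED_RANGE), random.choice([0, 1, 2]))
            for _ in range(TIME_NODES)]

def random_scene(n_npcs=2):
    """A scene is the collection of NPC tracks; the host vehicle itself is fixed."""
    return [random_npc_track() for _ in range(n_npcs)]

def is_valid_scene(collision_events):
    """Behavior constraint of figs. 2 and 3: a scene is invalid if it contains a
    collision the host vehicle cannot be blamed for (being struck from the side
    rear) or a collision purely between non-host vehicles.
    `collision_events` is assumed to be the simulator's collision log."""
    for event in collision_events:
        if event.get("struck_from") == "side_rear" and event.get("victim") == "host":
            return False      # rear vehicle is fully responsible
        if "host" not in (event.get("victim"), event.get("offender")):
            return False      # NPC-vs-NPC collision, not a meaningful test case
    return True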
The main purpose of the invention is to generate scenes in which vehicles collide. However, defining a valid collision scene is difficult: merely constraining the parameters of the scene still leaves many different situations, while the goal of the scene search should be to find scenes that match real situations and are meaningful for the safety performance test of an automatic driving system. The constraints described in figs. 2 and 3 are therefore defined according to the scene test requirements and the relevant traffic regulations.
As shown in fig. 2, the host vehicle is suddenly struck from the side rear during normal travel, resulting in a collision for which, according to the relevant traffic regulations, the rear vehicle bears full responsibility. The embodiment of the invention defines collisions occurring behind the side of the host vehicle as a vehicle behavior constraint, and scenes whose collisions are caused by this class of behavior are regarded as invalid scenes.
As shown in fig. 3, in this scene two non-host vehicles ahead of and to the side of the host vehicle collide with each other, but this has no impact on the driving of the host vehicle; since no control algorithm is usually set for the non-host vehicles in automatic driving tests, collisions between non-host vehicles mostly do not correspond to real traffic scenes.
In one possible implementation, taking the same-direction car-following scene as an example, the fitness function described in step two calculates in real time the host vehicle's detection of the leading vehicle within the sensor range. As shown in fig. 4, the host vehicle follows in the same direction, and the sensor mounted on the host vehicle is a lidar with parameters θ = 90° and r = 100 m.
The occlusion of the host vehicle differs with the position of the other vehicle, so the calculation of the host vehicle's detection blind area in the scene also differs. Let the length of a vehicle be l and its width d; for convenience of calculation, every vehicle is treated as a cuboid of the same length and width. Take the midpoint of the front of the host vehicle and the lower-left corner of the leading vehicle's rectangle as reference points, let the lateral distance between the leading vehicle and the midpoint of the front of the host vehicle be x and the longitudinal distance be y, and define S as the blind-area angle caused by the leading vehicle's occlusion within the detector's detection range, as shown in fig. 4.
While the vehicle is driving, the closer other vehicles in the scene are to the host vehicle, the larger the blind-area angle S. The occlusion angle S is calculated in different ways as the relative position between the vehicles changes. As shown in fig. 5 (a) to 5 (c), when the leading vehicle is at different positions within the detection range, the occlusion angle S is calculated differently; the specific calculations are given in formulas (1), (2) and (3).
When 0 < x ≤ y − d:
When y − d < x < y:
When y < x < y + l:
When two or more non-host vehicles appear within the radar detection range of the host vehicle, they occlude the host vehicle's detector simultaneously; if the blind-area angles of the vehicles overlap, the scene is considered a dangerous scene, otherwise it is safe. The blind-zone overlap angle S_b is calculated as follows:
S_b = S_area1 ∩ S_area2 (4)
The blind-zone overlap angle S_b satisfies the following relationship: when S_b > 0 the scene is dangerous, and when S_b = 0 it is safe. Define the blind-area angular range of each vehicle as S_area; based on the above calculation results, S_area is calculated as shown in formula (5):
As shown in FIGS. 6 (a) to 6 (c), the larger S_b is, the more severely the leading vehicles occlude the host vehicle's sensor and the closer the vehicles are to each other. Conversely, when the overlap ratio of the blind-area angles is high but the vehicles are far apart, or when the overlap ratio is low but the vehicles are close, the scene is less dangerous and the value of S_b is correspondingly lower.
In one possible implementation, the fundamental purpose of step three is to find the most highly adversarial dangerous test scenarios within a given search budget. This can be understood as an optimization problem: how to maximize the number Y of adversarial scenes found within a fixed number of iterations T from the space X of all valid input scenes. The embodiment of the invention therefore defines b_t as the set of adversarial scenes found for input x_t ∈ X when the number of fuzzing iterations is t; the found adversarial scenes y_t can then be expressed as:
y_t = max{b_1, b_2, b_3, …, b_t} (6)
since the input space X is too large, the exhaustive method cannot be used for search optimization. In addition, it is necessary to determine the search emphasis and place it in a promising area to optimize the search efficiency for the resistance scenario. The embodiment of the invention uses a fuzzy search method based on a genetic algorithm. Starting from some initial inputs, the fuzzy searcher tends to select new inputs, find new conflicts, to generate further inputs. The fuzzy search algorithm provided by the embodiment of the invention comprises the following three parts, namely a genetic algorithm, a random restarting module and a local search module. The algorithm pseudocode is as follows:
The final goal of the fuzzy search is to maximize the discovery of various adversarial scenes, which corresponds to finding the optimal individuals among the offspring in the genetic algorithm. However, since few scene inputs lead to a collision, more guidance toward a specific violation is required to help the host vehicle produce the corresponding collision. For example, to obtain a scene in which a vehicle collides with a pedestrian, a fitness function is needed that helps induce traffic violations between the vehicle and the pedestrian.
In the genetic algorithm, the fitness scores of individuals are used to determine whether an individual's genes should be retained or eliminated. For the purposes of testing, the embodiment of the invention designs the fitness function based on vehicle domain knowledge and the scene situation. In each time step, the fitness function of the embodiment of the invention calculates, from the relative positions, speeds and other related parameters of the vehicles in the scene, the maximum distance α that the host vehicle (EV) can travel without colliding with the other vehicles (NPCs), and the occlusion rate β of the host vehicle's radar caused by the other vehicles in the scene. The fitness function at a time t in the scene is:
f_t(EV, NPC) = α_t + β_t (7)
If a collision occurs in a scene, a constant c is additionally added to the fitness function as collision compensation, so that more dangerous scenes are preserved during the evolution of the scene:
f_t(EV, NPC) = α_t + β_t + c (8)
In the scene simulation process, the fitness score is measured every 0.1 s; the larger the f_t measured at time step t, the higher the risk of a vehicle collision at time t. The fitness score of a scene may increase or decrease after each time step according to the dynamic change of the scene. The fitness score of a scene is calculated as follows:
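A minimal sketch of formulas (7) and (8), and of how the per-step scores might be accumulated over a scene, is given below. Treating the scene score as the sum of the per-step values sampled every 0.1 s is an assumption on our part, since the cumulative expression itself appears only as an image; the value of the collision constant c and the structure of the simulation log are likewise assumptions.

def step_fitness(alpha_t, beta_t, collided, c=100.0):
    """Formulas (7)/(8): per-step fitness is the maximum safe travel distance
    alpha_t plus the radar occlusion rate beta_t, plus a constant collision
    compensation c when a collision occurs (the value of c is an assumption)."""
    return alpha_t + beta_t + (c if collided else 0.0)

def scene_fitness(simulation_log):
    """Accumulate the per-step fitness over the whole scene, one sample every
    0.1 s. `simulation_log` is assumed to be an iterable of
    (alpha_t, beta_t, collided) tuples produced by the simulator."""
    return sum(step_fitness(a, b, col) for a, b, col in simulation_log)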
The process of genetic variation in nature is complex; after repeated simplification, the genetic algorithm finally simulates only two parts: crossover and mutation.
In the embodiment of the invention, chromosome crossover is designed as an exchange operation of NPCs between scenes: in each generation, one NPC is randomly selected in each of two scenes, and the crossover is completed by exchanging the motion states (vehicle speed and driving direction) of the two NPCs over the running time of the two scenes, as shown in fig. 7.
Chromosome mutation randomly selects an NPC during the genetic process and randomly changes the driving state of the vehicle at a certain moment; the specific operation is shown in fig. 8.
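A sketch of the two genetic operators, applied to the scene encoding used in the earlier sketch (each scene is a list of NPC tracks, each track a list of per-time-node (speed, lane_action) pairs), might look as follows; the function names and encoding are assumptions.

import copy
import random

def crossover(scene_a, scene_b):
    """Chromosome crossover: randomly pick one NPC in each of the two scenes
    and swap their motion states (speed and lane action at every time node)
    between the scenes, as shown in fig. 7."""
    child_a, child_b = copy.deepcopy(scene_a), copy.deepcopy(scene_b)
    i, j = random.randrange(len(child_a)), random.randrange(len(child_b))
    child_a[i], child_b[j] = child_b[j], child_a[i]
    return child_a, child_b

def mutate(scene, speed_range=(20.0, 120.0)):
    """Chromosome mutation: randomly pick one NPC and one time node and
    randomly change the vehicle's driving state at that moment, as in fig. 8."""
    mutant = copy.deepcopy(scene)
    npc = random.randrange(len(mutant))
    node = random.randrange(len(mutant[npc]))
    mutant[npc][node] = (random.uniform(*speed_range), random.choice([0, 1, 2]))
    return mutant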
In one possible embodiment, the scenes in step three are selected in order to eliminate unsuitable individuals from each generation and thereby obtain individuals with higher fitness. The invention uses roulette-wheel selection to achieve this: the higher the fitness of a scene, the more often it is selected. In this method, the selection probability of each individual is proportional to its fitness; the larger the fitness, the larger the selection probability. When performing roulette-wheel selection, individuals are usually selected not according to their individual selection probabilities but according to the cumulative probabilities.
Obviously, the individual occupying the largest proportion of the wheel is selected with the greatest probability. At the same time, some scenes with lower fitness scores can also survive the selection process, which enriches the gene pool of the population and prevents it from getting trapped in a local optimum during evolution. The probability that each part is selected is proportional to its fitness value. Let the fitness value of an individual x_i be denoted f(x_i), its selection probability p(x_i) and its cumulative probability q(x_i); the corresponding formulas are as follows:
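The selection formulas themselves are not reproduced in this text, but roulette-wheel selection over cumulative probabilities is standard; a minimal sketch, assuming non-negative fitness values, is shown below.

import random

def roulette_select(scored):
    """Roulette-wheel selection: individual x_i is chosen with probability
    p(x_i) = f(x_i) / sum_j f(x_j), implemented by walking the cumulative
    probabilities q(x_i). `scored` is a list of (scene, fitness) pairs."""
    total = sum(f for _, f in scored)
    if total <= 0:                       # degenerate case: choose uniformly
        return random.choice(scored)[0]
    r, cumulative = random.random(), 0.0
    for scene, fitness in scored:
        cumulative += fitness / total    # q(x_i) = p(x_1) + ... + p(x_i)
        if r <= cumulative:
            return scene
    return scored[-1][0]                 # guard against floating-point drift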
In a possible implementation, the embodiment of the present invention further uses a random restart mechanism in step four to avoid getting trapped in a locally optimal solution. When the scene fitness scores do not increase over time during the iteration, the random restart module is executed. The fitness score of the current scene is compared with the average fitness score of the previous five generations; if the current fitness score is found to be smaller than that average, then, based on the NPC trajectories recorded for each previously run scene, the scene whose NPC trajectories are least similar to those of the scenes already run is selected from 1000 scenes with randomly initialized NPC trajectories and used as the new input of the randomly restarted genetic algorithm. The similarity is obtained by calculating the Euclidean distance between the NPC trajectories in a candidate scene and those in the previous scenes; the larger the distance, the lower the similarity between scenes. Scene similarity is defined as follows:
S_t^{npc,1} denotes the state information of vehicle NPC1 at time t of the scene and consists of the position information L_t^{npc,1} and the velocity information V_t^{npc,1} of the vehicle, as shown in formula (11):
S_t^{npc,1} = L_t^{npc,1} + V_t^{npc,1} (11)
The Euclidean distance between different NPCs at time t is used; assuming that the position information of the same NPC in different test cases is denoted L_t^{npc,1′} and L_t^{npc,1}, the scene similarity of scenes Scenario1 and Scenario2 is given by the following formula:
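A sketch of the trajectory-distance computation used by the random-restart module is given below. Summing the per-time-step Euclidean distances over all NPCs, and choosing the candidate scene with the largest minimum distance to the previously executed scenes, are assumptions about formulas (11) and (12), which appear only as images; the data layout is likewise assumed.

import math

def scene_distance(traj_a, traj_b):
    """Accumulated Euclidean distance between the NPC trajectories of two
    scenes; a larger value means a lower similarity. Each trajectory is
    assumed to be a dict {npc_id: [(x, y), ...]} sampled at the same times."""
    total = 0.0
    for npc_id, points_a in traj_a.items():
        total += sum(math.dist(p, q) for p, q in zip(points_a, traj_b[npc_id]))
    return total

def pick_restart_scene(candidates, previous_trajs):
    """From the randomly initialized candidate scenes, select the one least
    similar to every previously executed scene (largest minimum distance)."""
    return max(candidates,
               key=lambda cand: min(scene_distance(cand, prev) for prev in previous_trajs))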
On this basis, the invention proposes a new coverage index for measuring the validity of the test cases and the completeness of the test.
The automatic driving system acquires environment information through its sensors and makes decisions and exercises control on that basis. If an obstruction blocks the sensor's ability to acquire information, the automatic driving system may not make the correct decisions and control actions, resulting in an accident.
It is therefore important to test the performance of an automatic driving system under occlusion, because occlusion may leave the vehicle unable to identify, track or avoid other vehicles, pedestrians, obstacles and the like, thereby endangering the life of the occupants.
As shown in figs. 9 (a) and 9 (b), the occlusion angles in the two scenes satisfy θ_1 = θ_2, yet the occlusion situations of the two scenes are clearly completely different: in scene 1 the host vehicle can still detect part of the occluded vehicle because the distance is relatively short, whereas in scene 2 most of the area of the foremost vehicle cannot be detected by the host vehicle because the host vehicle is farther away.
Further analysis of the two scenes shows that their risk differs significantly when the vehicles in the scenes behave differently. For example, when front vehicle 2 also turns left, scene 1 is occluded by a smaller angle, but because the vehicles are closer together, less reaction space is left for the host vehicle and danger is more likely to occur.
In contrast, the host vehicle in scene 2 has more time to react in the same situation because it is farther from front vehicle 2. Conversely, when front vehicle 1 in the scene drives off to the right, there is no risk of collision in scene 1 because the host vehicle is not in the same lane as front vehicle 2, whereas in scene 2 the host vehicle, although farther from front vehicle 2, is in the same lane, so when front vehicle 1 drives away there is a risk of collision between the host vehicle and front vehicle 2.
In view of this, the invention describes scene coverage using a parameter space whose abscissa and ordinate are two parameters: one parameter is the Euclidean distance between the vehicle under test and the vehicle causing the occlusion, and the other is the occlusion parameter proposed above. Assume that a total of n test cases are executed in the experiment, and that the i-th test case T_i with parameters (A_i, B_i) covers a small part of the parameter space. The cumulative coverage Cov of the parameter space can be expressed as the ratio of the area of the parameter space covered by all test cases T to the total area of the parameter space, i.e.:
the coverage immunity of each test case can be obtained by calculating the area of the rectangle where the test case is located. Suppose the ith test case T i The parameter area covered is rectangular (a i ,b i ) To (A) i ,B i ) Its coverage area can be expressed as:
Area(T i )=(A i -a i )×(B i -b i ) (15)
substituting the formula into the cumulative coverage formula to obtain:
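A sketch of the cumulative coverage computation is given below. The original formulas express Cov as the ratio of the covered area to the total parameter-space area; approximating the union of the test-case rectangles on a grid, so that overlapping rectangles are not double-counted, is an implementation choice of this sketch rather than something stated in the original, and the axis ranges are assumptions.

import numpy as np

def cumulative_coverage(test_cases, space=((0.0, 100.0), (0.0, 1.0)), resolution=200):
    """Cov = area of the parameter space covered by the union of all test-case
    rectangles divided by the total area. Each test case is a rectangle
    ((a_i, b_i), (A_i, B_i)) in the (Euclidean distance, occlusion parameter)
    space."""
    (x_lo, x_hi), (y_lo, y_hi) = space
    xs = np.linspace(x_lo, x_hi, resolution)
    ys = np.linspace(y_lo, y_hi, resolution)
    covered = np.zeros((resolution, resolution), dtype=bool)
    for (a, b), (A, B) in test_cases:
        covered |= ((xs[:, None] >= a) & (xs[:, None] <= A)
                    & (ys[None, :] >= b) & (ys[None, :] <= B))
    return covered.mean()

# Example: two partially overlapping test-case rectangles.
print(cumulative_coverage([((10, 0.1), (30, 0.4)), ((20, 0.3), (50, 0.6))]))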
the technical effects of the invention are further verified and illustrated by simulation experiments.
Experimental environment:
The machine used in the experiments of the invention is a Dell Precision 3640 Tower desktop with an Intel i9-10900K processor, 32 GB of memory and a 64-bit Windows 10 operating system.
In order to ensure the fairness of the experimental results, the same initial scenes are selected for the different scene-generation methods. The simulation software used is the automated driving toolbox in Matlab R2021b, which provides driving scenarios, sensors, functional modules of the vehicle and 3D simulation modules, from which the various required automatic driving simulation models can be built. Using the helpergridbase planngscenario module and the Motion Planning in Urban Environments Using Dynamic Occupancy Grid Map algorithm provided by the toolbox as the test platform and automatic driving control algorithm in the virtual safety test, the behavior logic of the autonomous vehicle is simulated in a virtual driving scenario, and driving tasks including obstacle avoidance, path planning and controlling the vehicle in a given driving scenario are performed using the simulated sensor data provided by the module.
In order to verify the effectiveness of the method, each test in the experiments iterates the scenes 100 times; finally, after multiple experiments, 3 typical test cases that cause the intelligent vehicle to lose control or collide with other vehicles were found.
Fig. 10 shows a highly adversarial test scenario found during the experiments, in which the host vehicle is driving straight along a highway at a constant speed. When the host vehicle tries to overtake two non-host vehicles in the adjacent lane travelling in the same direction, the yellow vehicle in the right lane changes lanes; because of the high speed and the occlusion of the radar detector by non-host vehicle 1, the host vehicle cannot detect the lane-changing vehicle ahead in time, and a collision accident finally occurs. In general, a human driver appropriately reduces the vehicle speed according to the actual situation when encountering a visual blind zone, and drives even more cautiously when noticing the lane-changing intention of a leading vehicle, thereby avoiding an accident. For an automatic driving system, if the detection blind area is large and the speed of the host vehicle is higher than that of the surrounding vehicles, the system should adopt a conservative driving strategy and reduce the vehicle speed.
To verify whether the collision of the vehicle is related to the occlusion of the host vehicle by the leading vehicle, a supplementary experiment was performed, as shown in fig. 11: the position of the red vehicle was changed so that it no longer occludes the host vehicle, and the host vehicle was then found to successfully identify the yellow vehicle and successfully perform the avoidance operation.
To verify the validity of the method it is necessary to check whether the method of the embodiment of the invention increases the proportion of host-vehicle detection blind areas in the generated inputs. Newly generated inputs were collected during the fuzzy test using the method of the embodiment of the invention, AV-fuzzer and a random fuzzer; after the fuzzy test, the proportion of host-vehicle detection blind areas in the inputs was measured and compared, and the results are shown in fig. 12. In fig. 12, the horizontal axis is the number of iterations and the vertical axis is the proportion of the host-vehicle detection blind area corresponding to each generation of inputs. As can be seen from fig. 12, at the beginning of the test the blind-area proportion of the inputs generated by the method of the embodiment of the invention is slightly lower than that of AV-fuzzer, because the inputs are generated randomly in the early stage and the method is still in the exploration stage. In the middle and later stages of the test, the blind-area proportion of the method of the embodiment of the invention is significantly higher than that of AV-fuzzer and the random fuzzer, because after the earlier exploration the test tool based on this method selects and modifies actions using the knowledge learned through the fitness function defined in the embodiment of the invention, which increases the possibility that the non-host vehicles cause more occlusion of the host vehicle in the new inputs and thus increases the proportion of detection blind areas.
The embodiment of the invention selected AV-fuzzer and a random fuzzer for comparative experiments. The test object is the path-planning algorithm built into Matlab R2021b. To evaluate the overall effect of the test methods, the initial scenes and mutation probability of each method were set to the same values; after 100 iterative searches each, the effectiveness and efficiency of the three fuzzers were evaluated and presented graphically.
Fig. 13 (a) and 13 (b) show the number of dangerous test cases found by the three methods when the fuzzers search for the same number of generations. BlindSpotsFuzz provided by the embodiment of the invention found 9 different dangerous test cases in total, AV-fuzzer found 5, and the random fuzzer did not find any. In terms of time, BlindSpotsFuzz required the least average time to find a dangerous test case. In summary, BlindSpotsFuzz detects the most dangerous scenes within the same fuzzy test time and number of iterations. Compared with the other methods, BlindSpotsFuzz mutates more dangerous collision-based test cases and rejects invalid test cases during the fuzzy test, further improving the efficiency of the fuzzy search. In addition, 4 of the 9 dangerous test cases found by BlindSpotsFuzz were not detected by the other fuzzers, while all 5 test cases found by AV-fuzzer were also detected by BlindSpotsFuzz, which shows that the method is a powerful complement to current intelligent-vehicle fuzzy testing.
The invention further provides an automatic driving system detection blind area guided fuzzy test system, which comprises:
the initial scene generation module is used for randomly generating a plurality of initial scenes according to the relevant semantic limits of the simulator, wherein each scene comprises a host vehicle with a built-in tested automatic driving system and a plurality of traffic participants, the duration of each scene and its interval time nodes are set, and the instantaneous speed and lane changing condition of each traffic participant at each time node are taken as variable parameters;
the scene queue mutation module is used for running each initial scene in the simulator, calculating a fitness score according to a fitness function designed in advance, which describes the safety condition among the vehicles in a scene according to the blind area occlusion of the detection area of the host vehicle sensor, retaining, according to the score of each initial scene, the scenes with higher scores and poorer safety performance, and mutating the instantaneous speed and lane changing condition of the traffic participants at each time node in those scenes;
the local search module is used for running the mutated scene queue in the simulator as new use cases to obtain new fitness scores, mutating again based on the new scores, calculating the mean of all scene fitness scores after repeating the above process several times, selecting the scenes whose scores are greater than the mean for local search, and randomly restarting the other scenes whose scores are smaller than the mean;
the initial scene updating module is used for calculating, after the random restart, the similarity among the scenes whose fitness scores are smaller than the mean, keeping the several scenes with the lowest similarity as initial scenes, and performing the above steps again after updating the initial scenes;
the output scene selection module is used for mutating again based on the parameters of the scenes with higher fitness, running the mutated scenes in the simulator, calculating their fitness scores, comparing the score of each scene before and after mutation, and retaining the higher-scoring scene as the output scene.
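For readability, the following is a minimal, non-authoritative sketch of how the five modules described above could be wired together in code. All names (Scenario, run_and_score, the simulator API, the pop_size, rounds and generations parameters) are illustrative assumptions and not the disclosed implementation.

```python
import statistics

# Hypothetical scenario container: one NPC trajectory is a list of
# (speed_km_h, lane_change) pairs, one pair per time node.
class Scenario:
    def __init__(self, npc_trajectories):
        self.npc_trajectories = npc_trajectories
        self.score = None

def run_and_score(scenario, simulator, fitness_fn):
    """Run one scenario in the simulator and attach its fitness score."""
    trace = scenario if simulator is None else simulator.run(scenario)  # assumed simulator API
    scenario.score = fitness_fn(trace)  # higher score means a less safe scene
    return scenario.score

def fuzz(simulator, fitness_fn, make_random_scenario, mutate,
         similarity, pop_size=20, rounds=10, generations=100):
    # Initial scene generation module: random scenarios under simulator limits.
    population = [make_random_scenario() for _ in range(pop_size)]
    outputs = []
    for _ in range(generations):
        # Scene queue mutation module: score, keep unsafe scenes, mutate them.
        for _ in range(rounds):
            for s in population:
                run_and_score(s, simulator, fitness_fn)
            population.sort(key=lambda s: s.score, reverse=True)
            survivors = population[: pop_size // 2]
            population = survivors + [mutate(s) for s in survivors]
        # Local search module: split the queue around the mean fitness.
        mean_score = statistics.mean(s.score for s in population if s.score is not None)
        promising = [s for s in population if s.score is not None and s.score > mean_score]
        restarted = [make_random_scenario() for s in population
                     if s.score is None or s.score <= mean_score]
        # Initial scene update module: keep the restarted scenes least similar to the rest.
        restarted.sort(key=lambda s: max((similarity(s, p) for p in promising), default=0.0))
        population = promising + restarted[: pop_size - len(promising)]
        # Output scene selection module: mutate the best scene, keep the better one.
        if promising:
            best = max(promising, key=lambda s: s.score)
            candidate = mutate(best)
            run_and_score(candidate, simulator, fitness_fn)
            outputs.append(candidate if candidate.score > best.score else best)
    return outputs
```

The sketch keeps the module boundaries of the system above but leaves the simulator, fitness function, mutation and similarity measures as injected callables, since those are defined elsewhere in the description.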
Another embodiment of the present invention also proposes an electronic device, comprising: a memory storing at least one instruction; and a processor that executes the instruction stored in the memory to implement the automatic driving system detection blind area guided fuzzy test method.
Another embodiment of the present invention further provides a computer readable storage medium storing at least one instruction, where the at least one instruction is executed by a processor in an electronic device to implement the automatic driving system detection blind area guided fuzzy test method of the present invention.
For example, the instructions stored in the memory may be partitioned into one or more modules/units, which are stored in the computer-readable storage medium and executed by the processor to complete the automatic driving system detection blind area guided fuzzy test method of the present invention. The one or more modules/units may be a series of computer readable instruction segments capable of performing specified functions, the instruction segments being used to describe the execution of the computer program in the server.
The electronic equipment can be a smart phone, a notebook computer, a palm computer, a cloud server and other computing equipment. The electronic device may include, but is not limited to, a processor, a memory. Those skilled in the art will appreciate that the electronic device may also include more or fewer components, or may combine certain components, or different components, e.g., the electronic device may also include input and output devices, network access devices, buses, etc.
The processor may be a central processing unit (Central Processing Unit, CPU), but may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory may be an internal storage unit of the server, such as a hard disk or a memory of the server. The memory may also be an external storage device of the server, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card (Flash Card) or the like, which are provided on the server. Further, the memory may also include both an internal storage unit and an external storage device of the server. The memory is used to store the computer readable instructions and other programs and data required by the server. The memory may also be used to temporarily store data that has been output or is to be output.
It should be noted that, because the content of information interaction and execution process between the above module units is based on the same concept as the method embodiment, specific functions and technical effects thereof may be referred to in the method embodiment section, and details thereof are not repeated herein.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the method of the above embodiments, and may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the computer program may implement the steps of each of the method embodiments described above. Wherein the computer program comprises computer program code which may be in source code form, object code form, executable file or some intermediate form etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing device/terminal apparatus, recording medium, computer Memory, read-Only Memory (ROM), random access Memory (RAM, random Access Memory), electrical carrier signals, telecommunications signals, and software distribution media. Such as a U-disk, removable hard disk, magnetic or optical disk, etc.
Each of the foregoing embodiments is described with its own emphasis; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.
Claims (10)
1. An automatic driving system detection blind area guided fuzzy test method, characterized by comprising the following steps:
randomly generating a plurality of initial scenes according to the relevant semantic limits of the simulator, wherein each scene comprises a host vehicle with a built-in tested automatic driving system and a plurality of traffic participants, setting the duration of each scene and its interval time nodes, and taking the instantaneous speed and lane changing condition of each traffic participant at each time node as variable parameters;
running each initial scene in a simulator, calculating fitness scores according to a fitness function designed in advance, which describes the safety condition among the vehicles in a scene according to the blind area occlusion of the detection area of the host vehicle sensor, retaining, according to the score of each initial scene, the scenes with higher scores and poorer safety performance, and mutating the instantaneous speed and lane changing condition of the traffic participants at each time node in those scenes;
running the mutated scene queue in the simulator as new use cases to obtain new fitness scores, mutating again based on the new scores, calculating the mean of all scene fitness scores after repeating the above process several times, selecting the scenes whose scores are greater than the mean for local search, and randomly restarting the other scenes whose scores are smaller than the mean;
after the random restart, calculating the similarity among the scenes whose fitness scores are smaller than the mean, keeping the several scenes with the lowest similarity as initial scenes, and performing the above steps again after updating the initial scenes;
mutating again based on the parameters of the scenes with higher fitness, running the mutated scenes in the simulator, calculating their fitness scores, comparing the score of each scene before and after mutation, and retaining the higher-scoring scene as an output scene.
2. The automatic driving system detection blind area guided fuzzy test method according to claim 1, wherein value ranges are defined for the variable parameters in a scene: the host vehicle is controlled to drive at a constant speed of 30 km/h, the speed of each traffic participant is controlled within the range of 20 km/h to 120 km/h, and the lane changing condition of a traffic participant is represented by 0, 1 and 2, where 0 represents driving straight without changing lanes, 1 represents changing to the left lane and 2 represents changing to the right lane; in addition, the behaviors of all vehicles in the scene are constrained, and invalid scenes are eliminated.
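As an illustration of the parameter encoding in claim 2, the sketch below samples one traffic-participant trajectory under the stated value ranges; the lane-count bookkeeping and the specific validity check are assumptions added for the example, since the claim does not spell out the behavior constraints.

```python
import random

HOST_SPEED_KMH = 30          # host vehicle drives at a constant 30 km/h
NPC_SPEED_RANGE = (20, 120)  # NPC instantaneous speed range in km/h
LANE_ACTIONS = (0, 1, 2)     # 0: keep lane, 1: change left, 2: change right

def random_npc_trajectory(time_nodes, lanes=3, start_lane=1):
    """Sample one NPC trajectory: a (speed, lane_action) pair per time node."""
    traj, lane = [], start_lane
    for _ in range(time_nodes):
        speed = random.uniform(*NPC_SPEED_RANGE)
        action = random.choice(LANE_ACTIONS)
        # Illustrative validity constraint: reject lane changes that would
        # leave the road, so the sampled scene parameters stay valid.
        if action == 1 and lane == 0:
            action = 0
        if action == 2 and lane == lanes - 1:
            action = 0
        lane += (-1 if action == 1 else 1 if action == 2 else 0)
        traj.append((speed, action))
    return traj

# Example: a five-node trajectory for one NPC on a three-lane road.
print(random_npc_trajectory(time_nodes=5))
```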
3. The automatic driving system detection blind area guided fuzzy test method according to claim 1, wherein the fitness function describing the safety condition among the vehicles in a scene is designed according to the blind area occlusion of the detection area of the host vehicle sensor in the following manner: when the host vehicle follows a front vehicle in the same direction, the vehicle length is l and the vehicle width is d; taking the midpoint of the front of the host vehicle rectangle and the lower left corner of the front vehicle rectangle as reference points, the transverse distance between the front vehicle and the host vehicle is x and the longitudinal distance is y; S is the blind area angle caused within the detection range of the sensor by the occlusion of the front vehicle; while driving, the closer another vehicle in the scene is to the host vehicle, the larger the blind area angle S becomes, and the calculation of the occlusion angle S differs with the change of the relative position between the vehicles.
4. The automatic driving system detection blind area guided fuzzy test method according to claim 3, wherein the occlusion angle S is calculated in the following respective cases:
when 0 < x ≤ y - d:
when y - d < x ≤ y:
when y < x < y + l:
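The case-by-case formulas referred to in claim 4 are not reproduced in this text. As a hedged illustration only, the sketch below computes the occlusion angle S generically as the angular spread of the leading vehicle's rectangle seen from the host sensor; the corner layout and axis convention are assumptions, not the patent's exact expressions.

```python
import math

def occlusion_angle(x, y, l, d):
    """Occlusion angle S (radians) caused by a leading vehicle of length l
    and width d whose nearest corner lies at lateral offset x and
    longitudinal offset y from the host vehicle sensor at the origin.

    Generic reading, not the patent's case-by-case expressions: S is taken
    as the angular spread of the four corners of the leading vehicle's
    bounding rectangle as seen from the sensor.
    """
    corners = [(x, y), (x + d, y), (x, y + l), (x + d, y + l)]
    angles = [math.atan2(cy, cx) for cx, cy in corners]
    return max(angles) - min(angles)

# Example: a 4.5 m x 1.8 m vehicle, 2 m to the side and 10 m ahead.
print(round(math.degrees(occlusion_angle(x=2.0, y=10.0, l=4.5, d=1.8)), 2))
```

The sketch reflects the qualitative behavior stated in claim 3: the closer the leading vehicle, the larger the subtended blind area angle.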
5. The automatic driving system detection blind area guided fuzzy test method according to claim 3, wherein, when two or more non-host vehicles appear in the detection area of the host vehicle sensor at the same time, the host vehicle sensor is occluded by both vehicles simultaneously; if the blind area angles of the two vehicles overlap, the corresponding scene is considered a dangerous scene, otherwise it is safe;
the blind area overlap angle S_b is calculated by the following expression:
S_b = S_area1 ∩ S_area2
the following relationship holds for the blind area overlap angle S_b: the scene is dangerous when S_b > 0 and safe when S_b = 0;
the blind area angular range of each vehicle is defined as S_area; based on the calculation result of the occlusion angle S, S_area is calculated by the following expression:
the larger S_b is, the more severe the occlusion of the host vehicle sensor by the front vehicles, and the closer the vehicles are to each other; conversely, when the overlapping proportion of the blind area angles is high but the vehicles are far apart, or when the overlapping proportion is low but the vehicles are close, the danger of the corresponding scene is lower, and the value of S_b is also lower.
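A small sketch of the overlap test in claim 5, under the assumption that each vehicle's blind area S_area is represented as an angular interval seen from the host sensor; the interval representation is an assumption made for the example.

```python
def blind_zone_overlap(zone1, zone2):
    """Overlap angle S_b of two blind-zone angular intervals.

    Each zone is a (start, end) pair of angles in radians as seen from the
    host sensor. S_b > 0 is treated as a dangerous scene, S_b == 0 as safe.
    """
    start = max(zone1[0], zone2[0])
    end = min(zone1[1], zone2[1])
    return max(0.0, end - start)

# Example: two NPCs whose blind zones overlap by 0.1 rad, so the scene is dangerous.
assert abs(blind_zone_overlap((0.2, 0.5), (0.4, 0.9)) - 0.1) < 1e-9
```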
6. The automatic driving system detection blind area guided fuzzy test method according to claim 1, wherein the step of running the mutated scene queue in the simulator as new use cases to obtain new fitness scores, mutating again based on the new scores, calculating the mean of all scene fitness scores after repeating the above process several times, selecting the scenes whose scores are greater than the mean for local search, and randomly restarting the other scenes whose scores are smaller than the mean comprises:
maximizing the number of highly antagonistic dangerous test scenes found within a given search budget; defining b_t as the set of antagonistic scenes found when the input x_t ∈ X has been fuzzed for t iterations, the antagonistic scene y_t found so far is expressed as:
y_t = max{b_1, b_2, b_3, …, b_t}
a fuzzy search method based on a genetic algorithm is used; the final objective of the fuzzy search is to maximize the variety of antagonistic scenes found, which corresponds, in the genetic algorithm, to finding the optimal individuals among the offspring;
in the genetic algorithm, the fitness score of an individual is used to determine whether the individual's genes should be retained or eliminated; the fitness function is designed based on vehicle domain knowledge and the scene situation: according to the vehicle-related parameters in the scene, it calculates, for each time step, the maximum distance α that the host vehicle EV can move without colliding with the other vehicles NPC, and the occlusion rate β of the host vehicle radar caused by the other vehicles in the scene; the fitness function at a moment t in the scene is:
f_t(EV, NPC) = α_t + β_t
if a vehicle collision occurs in the scene, a constant c is additionally added to the fitness function as collision compensation:
f_t(EV, NPC) = α_t + β_t + c
during scene simulation, the fitness score is measured once per time step t; the larger the f_t measured at time step t, the higher the risk of a vehicle collision at time t; since the scene changes dynamically, the fitness score of the scene increases or decreases after each time step; the fitness score of the scene is calculated by the following expression:
the method comprises the steps of designing chromosome crossover as exchange operation of NPCs in scenes, randomly selecting one NPC in each of two scenes in each generation, and completing chromosome crossover operation by exchanging motion states of the NPCs in running time of the two scenes; chromosomal variation is a random change of a vehicle running state at a certain moment by randomly selecting a certain NPC in the genetic process.
7. The automatic driving system detection blind area guided fuzzy test method according to claim 6, wherein the selection of scenes aims at eliminating unsuitable individuals from each generation of scenes and obtaining individuals with higher fitness, which is achieved by a roulette-wheel selection method, so that a scene with higher fitness is easier to select; in the roulette-wheel selection method, the selection probability of an individual is proportional to its fitness, and the larger the fitness, the larger the selection probability; letting the fitness value of an individual x_i be denoted f(x_i), its selection probability be p(x_i), and its cumulative probability be q(x_i), the calculation expressions are as follows:
p(x_i) = f(x_i) / Σ_j f(x_j),  q(x_i) = Σ_{j ≤ i} p(x_j)
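A compact sketch of the roulette-wheel selection in claim 7, drawing individuals with probability proportional to their fitness via the cumulative probabilities q(x_i); the helper names and the direct use of a score list are illustrative.

```python
import random

def roulette_select(population, scores, k):
    """Roulette-wheel selection: draw k individuals, each with probability
    proportional to its fitness score, using the cumulative probabilities
    q(x_i) = sum_{j<=i} f(x_j) / sum_j f(x_j)."""
    total = sum(scores)
    cumulative, acc = [], 0.0
    for s in scores:
        acc += s / total
        cumulative.append(acc)
    cumulative[-1] = 1.0  # guard against floating-point rounding
    selected = []
    for _ in range(k):
        r = random.random()
        for individual, q in zip(population, cumulative):
            if r <= q:
                selected.append(individual)
                break
    return selected

# Example: scenes with fitness 1, 3 and 6; the last is drawn most often.
print(roulette_select(["s1", "s2", "s3"], [1.0, 3.0, 6.0], k=5))
```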
8. The automatic driving system detection blind area guided fuzzy test method according to claim 1, wherein, in the step of calculating the similarity among the scenes whose fitness scores are smaller than the mean and keeping the several scenes with the lowest similarity as initial scenes, a random restart mechanism is used to avoid falling into a local optimal solution;
when the scene fitness scores do not improve over time during the iterative process, a random restart is executed;
the fitness score of the current scene is compared with the average fitness score of the previous generations; if the current fitness score is smaller than the average of the previous generations, a scene with the lowest similarity to the scenes already run is selected, according to the NPC trajectories recorded for each previous scene, from scenes with randomly initialized NPC trajectories, and used as the new input of the randomly restarted genetic algorithm; the similarity is obtained by calculating the Euclidean distance between the NPC trajectories in a candidate scene and the NPC trajectories in the previous scenes, and the larger the distance, the lower the similarity between the scenes; the scene similarity is calculated as follows:
S_t^{npc,1} denotes the state information of vehicle NPC1 at time t of the scene, which includes the position information L_t^{npc,1} and the velocity information V_t^{npc,1} of the vehicle, and is calculated according to the following expression:
S_t^{npc,1} = L_t^{npc,1} + V_t^{npc,1}
the Euclidean distance between corresponding NPCs at time t is used to measure similarity; assuming that the position information of the same NPC in different test cases is denoted L_t^{npc,1'} and L_t^{npc,1}, the scene similarity of the scenes Scenario1 and Scenario2 is calculated according to the following expression:
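As an illustration of the restart logic in claim 8, the sketch below measures scene dissimilarity as the mean Euclidean distance between the positions of the same NPC at matching time steps and picks the least similar candidate; representing a trajectory as a list of (x, y) positions is an assumption made for the example (the patent's state also folds in velocity).

```python
import math

def trajectory_distance(traj_a, traj_b):
    """Mean Euclidean distance between the positions of the same NPC at
    matching time steps of two scenes; a larger distance means a lower
    similarity between the scenes."""
    steps = min(len(traj_a), len(traj_b))
    return sum(math.dist(traj_a[t], traj_b[t]) for t in range(steps)) / steps

def pick_restart_scene(candidates, history):
    """On a random restart, pick the candidate whose NPC trajectory is least
    similar (largest minimum distance) to every previously executed scene."""
    return max(candidates,
               key=lambda cand: min(trajectory_distance(cand, past) for past in history))

# Example: the second candidate is farther from the single past scene.
past = [[(0.0, 0.0), (1.0, 0.0)]]
cands = [[(0.5, 0.0), (1.5, 0.0)], [(5.0, 5.0), (6.0, 5.0)]]
print(pick_restart_scene(cands, past) == cands[1])  # True
```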
9. The automatic driving system detection blind area guided fuzzy test method according to claim 1, further comprising measuring the validity of the test cases and the completeness of the test by a coverage index;
scene coverage is described by a parameter space whose horizontal and vertical axes are formed by two parameters: one is the Euclidean distance between the vehicle under test and the vehicle causing the occlusion, and the other is the proposed occlusion parameter;
assuming that a total of n test cases are executed, the parameters (A_i, B_i) of the i-th test case T_i cover a small part of the parameter space; the cumulative coverage Cov of the parameter space is expressed as the ratio of the area of the parameter space covered by all test cases T to the area of the whole parameter space, expressed as follows:
wherein the coverage area of each test case is obtained by calculating the area of the rectangle in which the test case lies; assuming that the parameter area covered by the i-th test case T_i is the rectangle from (a_i, b_i) to (A_i, B_i), its coverage area is expressed as:
Area(T_i) = (A_i - a_i) × (B_i - b_i)
and the cumulative coverage Cov is:
Cov = Area(T_1 ∪ T_2 ∪ … ∪ T_n) / Area_total
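A sketch of the coverage index in claim 9: the union area of the test-case rectangles is approximated on a grid and divided by the total parameter-space area. The grid approximation and the assumption that the parameter space starts at the origin are choices made for the example, not part of the claimed method.

```python
def cumulative_coverage(rects, space, grid=200):
    """Cumulative coverage Cov: area of the union of the rectangles covered
    by all test cases, divided by the area of the whole parameter space.
    Each rectangle is (a_i, b_i, A_i, B_i); space is (A_max, B_max), i.e.
    both axes are assumed to run from 0 to their maximum value."""
    A_max, B_max = space
    covered = 0
    for i in range(grid):
        for j in range(grid):
            # Center of grid cell (i, j) in parameter coordinates.
            u = (i + 0.5) * A_max / grid
            v = (j + 0.5) * B_max / grid
            if any(a <= u <= A and b <= v <= B for a, b, A, B in rects):
                covered += 1
    return covered / (grid * grid)

# Example: two overlapping test-case rectangles in a 10 x 10 parameter space.
print(cumulative_coverage([(0, 0, 5, 5), (3, 3, 8, 8)], (10, 10)))
```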
10. the utility model provides an automatic driving system detection blind area direction fuzzy test system which characterized in that includes:
the initial scene generation module is used for randomly generating a plurality of initial scenes according to the related semantic limit of the simulator, wherein the scenes comprise a host vehicle with a built-in tested automatic driving system and a plurality of traffic participants, the duration of each scene and interval time nodes are set, and the instantaneous speed and the channel changing condition of the traffic participants on each time node are taken as variable parameters;
the scene queue variation module is used for running each initial scene in the simulator, calculating the fitness score according to the fitness function which is designed in advance and used for describing the safety conditions among vehicles in the scene according to the blind area shielding condition of the detection area of the sensor of the main vehicle, reserving the scene with higher score and poorer safety performance according to the corresponding score condition of each initial scene, and varying the instantaneous speed and the lane changing condition of each time node of the traffic participant in the scene;
The local search module is used for operating the mutated scene queue in the simulator as a new use case to obtain a new fitness score, carrying out mutation again based on the new score, calculating the average value of all scene fitness scores after repeating the above processes for a plurality of times, selecting scenes with scores larger than the average value for local search, and restarting other scenes with scores smaller than the average value at random;
the initial scene updating module is used for calculating the similarity among the scenes according to the scenes with the fitness scores smaller than the mean value after being restarted randomly, keeping a plurality of scenes with the lowest similarity as initial scenes, and carrying out the steps again after updating the initial scenes;
the output scene selection module is used for carrying out mutation again based on parameters in scenes with higher fitness, operating the mutated scenes in the simulator, calculating fitness scores, comparing the scores of the scenes before and after mutation, and reserving the higher scene as the output scene.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311060753.7A CN117111578A (en) | 2023-08-22 | 2023-08-22 | Automatic driving system detection blind area guiding fuzzy test method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117111578A true CN117111578A (en) | 2023-11-24 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117909249A (en) * | 2024-03-20 | 2024-04-19 | 中国汽车技术研究中心有限公司 | Method and equipment for generating test cases of automatic driving scene |
CN118467363A (en) * | 2024-05-16 | 2024-08-09 | 青岛科技大学 | Parallel program basic path coverage test based on differential evolution algorithm |
CN118709576A (en) * | 2024-08-29 | 2024-09-27 | 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) | Accident scene generation method and system based on automatic driving virtual simulation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||