CN115979679A - Method, device, and storage medium for actual-road testing of an automatic driving system

Method, device, and storage medium for actual-road testing of an automatic driving system

Info

Publication number
CN115979679A
CN115979679A (application CN202310280259.5A)
Authority
CN
China
Prior art keywords
vehicle
data
vehicle dynamics
driving
dynamics parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310280259.5A
Other languages
Chinese (zh)
Other versions
CN115979679B
Inventor
孙航
华一丁
王兆
陈振宇
张琳琳
张行
周博林
王霁宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Automotive Technology and Research Center Co Ltd
CATARC Automotive Test Center Tianjin Co Ltd
Original Assignee
China Automotive Technology and Research Center Co Ltd
CATARC Automotive Test Center Tianjin Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Automotive Technology and Research Center Co Ltd, CATARC Automotive Test Center Tianjin Co Ltd filed Critical China Automotive Technology and Research Center Co Ltd
Priority to CN202310280259.5A priority Critical patent/CN115979679B/en
Publication of CN115979679A publication Critical patent/CN115979679A/en
Application granted granted Critical
Publication of CN115979679B publication Critical patent/CN115979679B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The invention relates to the field of automatic driving and discloses a method, a device, and a storage medium for actual-road testing of an automatic driving system. The method comprises the following steps: acquiring driving environment perception data in real time while the vehicle under test drives on an actual public road in an automatic driving mode; segmenting the driving period of the vehicle under test into a plurality of time periods, segmenting the driving environment perception data accordingly, and inputting each segment into a driver model to obtain anthropomorphic driver control data; inputting the anthropomorphic driver control data into a vehicle dynamics model to obtain first vehicle dynamics parameter data; and comparing the first vehicle dynamics parameter data and the second vehicle dynamics parameter data corresponding to the same time period to determine the test result of the automatic driving system. The embodiment realizes actual-road testing of the automatic driving system and addresses the difficulty of testing and evaluating its intelligence performance.

Description

Method, device, and storage medium for actual-road testing of an automatic driving system
Technical Field
The invention relates to the field of automatic driving, and in particular to a method, a device, and a storage medium for actual-road testing of an automatic driving system.
Background
Automatic driving is not only a research frontier in vehicle engineering but also the development direction of the future automotive industry, and it is an important technical means for addressing traffic safety, energy waste, environmental pollution, and similar problems. As automated vehicles enter a new stage of in-depth development, how to test the comprehensive performance of an automatic driving system thoroughly and accurately has become a research hotspot for the whole automotive industry and academia. Compared with traditional vehicle testing, there is as yet no generally agreed scheme at home or abroad for testing the functions and performance of automated-driving products, in particular for comprehensively testing and evaluating the intelligence attributes specific to automated vehicles. The internationally accepted approach, the "multi-pillar" testing criterion for automated driving, combines several channels such as simulation testing, proving-ground testing, and actual-road testing; among these, actual-road testing is best suited to verifying the intelligence performance of an automatic driving system when coping with varied, random traffic.
Driver models, originally proposed and implemented by vehicle dynamics engineers, are referred to as "virtual test drivers" for closed-loop testing and simulation: they operate the vehicle so that it travels along a specified route at a given or self-set speed. Chinese patent 202011312152.7 proposes a driver-model-based test method and device that assists the testing of the automatic driving algorithm of a vehicle under test by triggering driver models corresponding to other vehicles and having those vehicles drive according to their respective driver models, so as to test the performance of the vehicle under test accurately. However, it does not consider how to test the functions and performance of the automatic driving system based on a driver model during actual road testing, and therefore cannot verify the system's intelligence performance in a more complex real environment. Chinese patent 201410055985.8 proposes an aided design system and method for vehicle steering system parameters based on a driver model; the whole test process simulates human driving, the state response of the vehicle is obtained by changing the steering system parameters, and the parameters are optimized by analyzing this response so that the steering system better matches human driving characteristics. However, it does not consider the overall vehicle performance of an automatic driving system and is concerned only with how to verify the vehicle steering system based on a driver model. Chinese patent 202210432328.5 provides a method for generating automated-driving test scenarios based on individualized driver models; it uses human driving data as the data source and, by constructing individualized driver models of different styles, can effectively improve the authenticity and complexity of the test scenarios. However, it does not consider how to apply the driver model to a specific link of an actual road test, and gives no concrete test system or test scheme.
In view of the above, the present invention is particularly proposed.
Disclosure of Invention
To solve the above technical problems, the invention provides a method, a device, and a storage medium for actual-road testing of an automatic driving system, thereby realizing actual-road testing of the automatic driving system and addressing the difficulty of testing and evaluating its intelligence performance.
An embodiment of the invention provides a method for actual-road testing of an automatic driving system, which comprises the following steps:
acquiring driving environment perception data in real time while the vehicle under test drives on an actual public road in an automatic driving mode;
based on the driving environment perception data, segmenting the driving period of the vehicle under test according to environmental scene type to obtain a plurality of time periods, wherein different time periods correspond to different environmental scene types;
segmenting the driving environment perception data according to the plurality of time periods to determine sub-driving environment perception data corresponding to each time period;
inputting the sub-driving environment perception data corresponding to each time period into a driver model to obtain anthropomorphic driver control data corresponding to each time period;
inputting the anthropomorphic driver control data corresponding to each time period into a vehicle dynamics model to obtain first vehicle dynamics parameter data corresponding to each time period; and comparing the first vehicle dynamics parameter data and second vehicle dynamics parameter data corresponding to the same time period to determine the test result of the automatic driving system, wherein the second vehicle dynamics parameter data are obtained by reading data transmitted on a CAN bus of the vehicle under test.
An embodiment of the present invention provides an electronic device, including:
a processor and a memory;
the processor is used for executing the steps of the automatic driving system actual road testing method according to any embodiment by calling the program or the instructions stored in the memory.
Embodiments of the present invention provide a computer-readable storage medium storing a program or instructions for causing a computer to execute the steps of the method for testing an actual road of an automatic driving system according to any one of the embodiments.
The embodiment of the invention has the following technical effects:
the test method can be used for testing the randomness of the automatic driving actual road, fully utilizing the generalization capability of the mature driver model, automatically generating anthropomorphic driving behaviors aiming at an actual and random automatic driving test route, generating a test result in the aspect of the intelligent performance of the automatic driving system by transversely comparing the test result with the automatic behavior of the automatic driving system, and scientifically providing a reasonable method for testing and evaluating the intelligent performance of the automatic driving.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart of a method for actual-road testing of an automatic driving system according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an actual-road testing system for an automatic driving system according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The method for actual-road testing of an automatic driving system provided by the embodiment of the invention faces the randomness of actual-road automated-driving tests: it makes full use of the generalization capability of a mature driver model to automatically generate anthropomorphic driving behavior for an actual, random test route, compares that behavior laterally with the behavior produced by the automatic driving system to obtain a test result for the system's intelligence performance, and thus provides a scientific and reasonable method for testing and evaluating automated-driving intelligence. The method may be executed by an electronic device.
FIG. 1 is a flowchart of the actual-road testing method for an automatic driving system according to an embodiment of the present invention. Referring to FIG. 1, the method specifically includes the following steps.
and S110, acquiring driving environment perception data in real time in the process that the detected vehicle runs in an automatic driving mode on the actual social road.
The driving environment perception data includes at least one of: information on obstacles ahead of the vehicle under test, information on vehicles ahead, traffic light information, speed limit sign information, road curvature information, position information of the vehicle under test, lateral and longitudinal dynamics parameter information, and the relative speed and relative distance of the vehicle ahead with respect to the vehicle under test.
The driving environment perception data are acquired by a camera, a GPS/INS integrated navigation system, and a millimeter-wave radar sensor on the vehicle under test; the camera is mounted on the inner side of the front windshield. Specifically, the calibrated camera perceives, in real time, obstacles ahead of the vehicle, vehicles ahead, traffic lights, speed limit signs, and road curvature. The GPS/INS integrated navigation system acquires the position and the lateral and longitudinal dynamics parameters of the vehicle in real time. The millimeter-wave radar sensor measures the relative speed and relative distance of the vehicle ahead with respect to the automated vehicle under test.
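For illustration only, one way such multi-sensor perception data might be organized per time-stamped frame is sketched below; every field name and numerical value is a hypothetical example, since the patent does not prescribe a data schema.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class PerceptionFrame:
    """One time-stamped sample of driving environment perception data.

    Field names are illustrative only; the patent does not define a schema.
    """
    timestamp: float                          # seconds since start of the drive
    ego_position: Tuple[float, float]         # (latitude, longitude) from GPS/INS
    ego_speed: float                          # m/s, longitudinal
    longitudinal_accel: float                 # m/s^2, from GPS/INS
    lateral_accel: float                      # m/s^2, from GPS/INS
    lead_relative_speed: Optional[float]      # m/s, from millimeter-wave radar
    lead_relative_distance: Optional[float]   # m, from millimeter-wave radar
    traffic_light_state: Optional[str]        # e.g. "red"/"green", from camera
    speed_limit: Optional[float]              # m/s, from camera sign recognition
    road_curvature: Optional[float]           # 1/m, from camera lane detection

# Example frame at t = 12.3 s (all values invented)
frame = PerceptionFrame(
    timestamp=12.3,
    ego_position=(39.0851, 117.1994),
    ego_speed=16.7,
    longitudinal_accel=-0.5,
    lateral_accel=0.2,
    lead_relative_speed=-1.2,
    lead_relative_distance=28.5,
    traffic_light_state=None,
    speed_limit=22.2,
    road_curvature=0.001,
)
print(frame.timestamp, frame.lead_relative_distance)
```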
S120: based on the driving environment perception data, segmenting the driving period of the vehicle under test according to environmental scene type to obtain a plurality of time periods, wherein different time periods correspond to different environmental scene types.
S130: segmenting the driving environment perception data according to the plurality of time periods to determine the sub-driving environment perception data corresponding to each time period.
The environmental scene types include a right turn of a social (surrounding) vehicle, a left turn of a social vehicle, following a social vehicle, a social vehicle cutting in ahead, a social vehicle cutting out ahead, or a lane change of a social vehicle.
In other words, the portion of the driving history in which the vehicle under test experiences one specific environmental scene is treated as one time period.
By segmenting the driving period of the vehicle under test according to environmental scene type based on the driving environment perception data, the scenes encountered during the actual road test are partitioned, and the start and end times of each environmental scene are obtained.
Classifying by environmental scene makes it possible to test the performance of the automatic driving system in specific scenes, which improves test accuracy and coverage and ensures test comprehensiveness.
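A minimal sketch of the segmentation step, assuming each perception frame has already been labelled with an environmental scene type by an upstream classifier (the labelling itself is not shown and the scene names are illustrative):

```python
from itertools import groupby

def segment_by_scene(timestamps, scene_labels):
    """Group consecutive frames that share a scene label into time periods.

    Returns a list of (scene_type, t_start, t_end) tuples.
    """
    periods = []
    idx = 0
    for scene, group in groupby(scene_labels):
        n = len(list(group))
        periods.append((scene, timestamps[idx], timestamps[idx + n - 1]))
        idx += n
    return periods

# Toy example: 1 Hz labels for a 6-second drive
ts = [0, 1, 2, 3, 4, 5]
labels = ["following", "following", "cut_in", "cut_in", "lane_change", "lane_change"]
print(segment_by_scene(ts, labels))
# [('following', 0, 1), ('cut_in', 2, 3), ('lane_change', 4, 5)]
```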
S140: inputting the sub-driving environment perception data corresponding to each time period into the driver model to obtain the anthropomorphic driver control data corresponding to each time period.
The anthropomorphic driver control data includes at least one of accelerator pedal opening, brake pedal opening, and steering wheel angle. The driver model simulates the driving behavior of a real driver in a specific environmental scene; comparing this behavior with the automated behavior of the automatic driving system determines the system's intelligence performance and thereby realizes the test.
S150: inputting the anthropomorphic driver control data corresponding to each time period into a vehicle dynamics model to obtain the first vehicle dynamics parameter data corresponding to each time period.
Furthermore, to improve the fidelity of the simulated driving behavior and thus the testing precision of the automatic driving system, the driver model is extended so that it can produce anthropomorphic driver control data of different driving styles from the driving environment perception data.
Correspondingly, the anthropomorphic driver control data corresponding to each time period includes control data of different driving styles (conservative, normal, and aggressive) for that period.
In this case, the step of inputting the anthropomorphic driver control data corresponding to each time period into the vehicle dynamics model to obtain the first vehicle dynamics parameter data comprises:
and inputting the anthropomorphic driver control data of different driving styles corresponding to each time interval into a vehicle dynamics model respectively to obtain sub-vehicle dynamics parameter data of different driving styles corresponding to each time interval, wherein the sub-vehicle dynamics parameter data of different driving styles form the first vehicle dynamics parameter data. Namely, three sub-vehicle dynamics parameter data are corresponding to one time interval, namely, the sub-vehicle dynamics parameter data corresponding to a conservative driver, the sub-vehicle dynamics parameter data corresponding to a common driver and the sub-vehicle dynamics parameter data corresponding to an aggressive driver.
These three sets therefore need to be screened by a certain strategy so that only one of them is finally retained. Before comparing the first vehicle dynamics parameter data and the second vehicle dynamics parameter data corresponding to the same time period to determine the test result of the automatic driving system, the method further comprises:
and respectively carrying out similarity calculation on the sub-vehicle dynamics parameter data of different driving styles corresponding to a period of time and the second vehicle dynamics parameter data corresponding to the period of time, and determining the sub-vehicle dynamics parameter data with the maximum similarity as the first vehicle dynamics parameter data corresponding to the period of time.
S160: comparing the first vehicle dynamics parameter data and the second vehicle dynamics parameter data corresponding to the same time period to determine the test result of the automatic driving system.
The second vehicle dynamics parameter data are obtained by reading the data transmitted on the CAN bus of the vehicle under test.
For example, comparing the first vehicle dynamics parameter data and the second vehicle dynamics parameter data corresponding to the same time period to determine the test result of the automatic driving system includes:
comparing the first vehicle dynamics parameter data and the second vehicle dynamics parameter data corresponding to the same time period index by index, and determining the similarity of each index based on the root-mean-square error;
performing a weighted summation of the similarities of the different indices to obtain the test result (see the sketch after this list);
wherein the first vehicle dynamics parameter data and the second vehicle dynamics parameter data each include at least one of the following indices: longitudinal acceleration, longitudinal jerk, lateral acceleration, lateral jerk, yaw angle, and yaw rate.
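A numerical sketch of the RMSE-based similarity and the weighted summation. The index weights and the mapping from RMSE to a similarity in (0, 1] are assumptions, since the patent does not specify them.

```python
import numpy as np

def index_similarity(first_trace, second_trace):
    """Similarity of one dynamics index, derived from the root-mean-square error."""
    first_trace = np.asarray(first_trace, dtype=float)
    second_trace = np.asarray(second_trace, dtype=float)
    rmse = np.sqrt(np.mean((first_trace - second_trace) ** 2))
    return 1.0 / (1.0 + rmse)   # assumed monotone mapping of RMSE into (0, 1]

def test_result(first_data, second_data, weights):
    """Weighted sum of per-index similarities for one time period.

    first_data / second_data: dicts mapping index name -> trace
    weights: dict mapping index name -> weight (illustrative values only)
    """
    return sum(w * index_similarity(first_data[k], second_data[k])
               for k, w in weights.items())

weights = {"longitudinal_accel": 0.3, "lateral_accel": 0.3, "yaw_rate": 0.4}
first = {"longitudinal_accel": [0.50, 0.60], "lateral_accel": [0.10, 0.20], "yaw_rate": [0.010, 0.020]}
second = {"longitudinal_accel": [0.55, 0.58], "lateral_accel": [0.12, 0.18], "yaw_rate": [0.012, 0.021]}
print(round(test_result(first, second, weights), 3))   # closer traces give a score nearer 1.0
```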
This embodiment has the following technical effects: facing the randomness of actual-road automated-driving tests, the method makes full use of the generalization capability of a mature driver model to automatically generate anthropomorphic driving behavior for an actual, random test route, compares that behavior laterally with the behavior produced by the automatic driving system to obtain a test result for the system's intelligence performance, and thus provides a scientific and reasonable method for testing and evaluating automated-driving intelligence.
On the basis of the above technical solution, an embodiment of the invention further provides a scheme for obtaining the driver model, aimed at capturing the driving behavior of drivers with different driving styles in different environmental scenes.
Specifically, P sets of driver model training data are obtained; each set comprises driving environment perception training data, driver control training data, and driving style evaluation information, and the driving style evaluation information differs among at least some of the sets;
and a generalized regression neural network (GRNN) is trained on the P sets of driver model training data to obtain the driver model. A minimal GRNN sketch is given below.
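A generalized regression neural network is essentially a kernel (Nadaraya-Watson) regressor, so a minimal sketch of the driver-model regression can be written in a few lines. The network configuration, the smoothing factor sigma, and the input/output signals in the toy example are assumptions; the patent only states that a GRNN is trained on the P training sets.

```python
import numpy as np

class GRNN:
    """Minimal generalized regression neural network (Nadaraya-Watson form)."""

    def __init__(self, sigma=0.5):
        self.sigma = sigma   # Gaussian smoothing factor (assumed hyperparameter)

    def fit(self, X, Y):
        # GRNN "training" is one pass: it memorises the training samples.
        self.X = np.asarray(X, dtype=float)
        self.Y = np.asarray(Y, dtype=float)
        return self

    def predict(self, X_new):
        X_new = np.atleast_2d(np.asarray(X_new, dtype=float))
        preds = []
        for x in X_new:
            d2 = np.sum((self.X - x) ** 2, axis=1)
            w = np.exp(-d2 / (2.0 * self.sigma ** 2))
            w = w / (w.sum() + 1e-12)          # normalised pattern-layer weights
            preds.append(w @ self.Y)           # weighted average of stored outputs
        return np.array(preds)

# Toy usage: inputs = [relative distance (m), relative speed (m/s)],
# output = [brake pedal opening]; values are invented.
X = [[30.0, -2.0], [15.0, -4.0], [60.0, 0.0]]
Y = [[0.1], [0.5], [0.0]]
model = GRNN(sigma=10.0).fit(X, Y)
print(model.predict([[20.0, -3.0]]))   # interpolated brake command
```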
After the P sets of driver model training data are obtained and before the generalized regression neural network (GRNN) is trained on them, the method further includes:
determining the values of N characteristic parameters for each set of driver model training data based on the driving environment perception training data and/or the driver control training data in that set;
constructing an N-dimensional space and, using the values of the N characteristic parameters as coordinates, determining the point corresponding to each set of driver model training data in the N-dimensional space;
clustering the P corresponding points in the N-dimensional space with a K-means clustering algorithm to obtain M clustering results;
determining a representative point for each clustering result, wherein the representative point is a corresponding point in the clustering result to which it belongs, its distance to the cluster center of its own clustering result is greater than a first set threshold, and its distance to the cluster centers of the other clustering results is greater than a second set threshold;
and correcting the driving style evaluation information in each set of driver model training data so that, after correction, it is consistent with the driving style evaluation information of the representative point belonging to the same clustering result (a sketch of this clustering and correction step follows this list).
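One possible reading of the steps above, sketched with scikit-learn's KMeans. The distance thresholds, the choice of a single representative point when several candidates qualify, and the demo data are all assumptions not given in the text.

```python
import numpy as np
from sklearn.cluster import KMeans

def correct_style_labels(points, labels, n_clusters=3, d_own=0.5, d_other=1.0):
    """Cluster the training points and align style labels with each cluster's representative.

    points: (P, N) array of characteristic-parameter coordinates, one row per training set
    labels: list of P original driving-style labels
    """
    points = np.asarray(points, dtype=float)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(points)
    centers = km.cluster_centers_
    corrected = list(labels)
    for c in range(n_clusters):
        idx = np.where(km.labels_ == c)[0]
        own_dist = np.linalg.norm(points[idx] - centers[c], axis=1)
        other_dist = np.min(
            [np.linalg.norm(points[idx] - centers[o], axis=1)
             for o in range(n_clusters) if o != c],
            axis=0,
        )
        candidates = idx[(own_dist > d_own) & (other_dist > d_other)]
        if candidates.size == 0:
            continue                       # no representative point found; leave labels unchanged
        representative = candidates[0]     # arbitrary tie-breaking rule (assumption)
        for i in idx:
            corrected[i] = labels[representative]
    return corrected

# Toy demo: three well-separated groups of 2-D feature points
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0, 0.3, (5, 2)), rng.normal(3, 0.3, (5, 2)), rng.normal(6, 0.3, (5, 2))])
lbls = ["conservative"] * 5 + ["normal"] * 5 + ["aggressive"] * 5
print(correct_style_labels(pts, lbls))
```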
Before determining the values of the N characteristic parameters for each set of driver model training data based on the driving environment perception training data and/or the driver control training data in that set, the method further includes:
performing principal component analysis over candidate parameters to determine the N characteristic parameters; the candidate parameters comprise the mean, standard deviation, and maximum of vehicle speed, of longitudinal acceleration, of lateral acceleration, and of steering wheel angle.
Further, referring to the schematic structural diagram of the actual-road testing system for an automatic driving system shown in FIG. 2, the testing system includes a forward test camera 210, a GPS/INS integrated navigation system 220, a millimeter-wave radar sensor 230, an on-board CAN bus data reading module 240, an automatic driving test scene segmentation processing module 250, a mature driver model 260, a vehicle dynamics model 270, a vehicle dynamics parameter similarity comparison module 280, an automatic driving system intelligence performance test result display module 290, a driving style classification and judgment module 300, and a mature driver natural driving behavior database 310.
The forward test camera 210 perceives, in real time, obstacle information ahead of the vehicle, information on vehicles ahead, traffic light information, speed limit sign information, and road curvature information. The GPS/INS integrated navigation system 220 acquires the position information and the lateral and longitudinal dynamics parameter information of the vehicle in real time. The millimeter-wave radar sensor 230 measures the relative speed and relative distance of the vehicle ahead with respect to the automated vehicle under test. The on-board CAN bus data reading module 240 reads the second vehicle dynamics parameter data of the automatic driving system.
The automatic driving test scene segmentation processing module 250 segments the scenes encountered during the actual road test: based on the driving environment perception data, it divides the driving period of the vehicle under test into a plurality of time periods according to environmental scene type, with different time periods corresponding to different scene types, and then segments the driving environment perception data by these time periods to determine the sub-driving environment perception data corresponding to each time period.
The mature driver model 260 is used to obtain anthropomorphic driver control data corresponding to each time period based on the sub-driving environment perception data corresponding to each time period.
The vehicle dynamics model 270 is configured to obtain first vehicle dynamics parameter data corresponding to each time period based on the personified driver control data corresponding to each time period.
The vehicle dynamics parameter similarity comparison module 280 is configured to compare the first vehicle dynamics parameter data and the second vehicle dynamics parameter data corresponding to the same time period, and determine a test result of the autopilot system.
The automatic driving system intelligence performance test result display module 290 displays the test result.
The driving style classification and judgment module 300 performs cluster analysis on a large amount of mature-driver driving behavior data to form scene-oriented driving-style classes, namely conservative, normal, and aggressive, and labels the data accordingly. Specifically, Principal Component Analysis (PCA) is used to reduce the dimensionality of the high-dimensional feature data. Because the candidate feature parameters have different units and their magnitudes differ widely, so that large-magnitude data would mask the information carried by small-magnitude data, the data are first processed with Z-score standardization, scaling all dimensions to the same range to remove the effect of units, and PCA is then applied to the standardized data set. When the cumulative variance contribution of the principal components approaches 1 (typically 85%), the first m feature variables are used in place of the original p variables for the subsequent analysis, which simplifies computation while retaining the information of the original features. After the input quantity K is chosen, the K-means clustering algorithm partitions the n data objects into K clusters according to a similarity principle such that, after clustering, objects within the same cluster are highly similar while objects in different clusters are dissimilar.
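The Z-score standardization and the 85% cumulative-variance criterion described in this paragraph might be sketched as follows with scikit-learn; the demo matrix is random and only illustrates the shapes involved, not real driving data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def select_principal_components(candidate_matrix, target_ratio=0.85):
    """Z-score standardise the candidate statistics and keep enough principal
    components to reach the target cumulative variance contribution."""
    X = StandardScaler().fit_transform(candidate_matrix)   # remove unit/magnitude effects
    pca = PCA().fit(X)
    cum = np.cumsum(pca.explained_variance_ratio_)
    m = int(np.searchsorted(cum, target_ratio) + 1)        # number of retained components
    return PCA(n_components=m).fit_transform(X), m

# candidate_matrix: rows = samples, columns = the 12 candidate statistics
rng = np.random.default_rng(0)
X_demo = rng.normal(size=(50, 12))
X_reduced, m = select_principal_components(X_demo)
print(m, X_reduced.shape)
```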
The mature driver natural driving behavior database 310 stores a large amount of natural driving behavior data from mature drivers. Specifically, the drivers are 20 to 50 years old, the data acquisition frequency is 10 Hz, and there are 170 male drivers and 30 female drivers. To cover diverse road conditions, the selected routes include several road types, such as expressways, urban expressways, national roads, and rural roads, and contain curves, straight sections, and ramps. The data also include evaluations of each driver's overall driving style by three professional driving evaluators. The evaluators are professional drivers with more than 20 years of driving experience. During evaluation, each driver drives the vehicle along a given route according to his or her usual habits while the three evaluators ride along, closely observing the driver's operations (such as use of the accelerator and brake pedals) together with their own subjective impressions (the driver's demeanor and the ride comfort). The evaluation results fall into three classes, conservative, normal, and aggressive, and the evaluators assign the corresponding class label to each driver's driving style.
At present there is no unified set of characteristic parameters for describing driving style; many researchers at home and abroad have used speed, acceleration, accelerator pedal position, and accelerator pedal pressure as representative parameters, with good results. Building on this prior work, the method selects the speed, longitudinal acceleration, lateral acceleration, and steering wheel angle of the 200 mature drivers and, from the raw 10 Hz data, computes three statistics (mean, standard deviation, and maximum) of these four vehicle parameters for every second, forming a 12-dimensional characteristic parameter vector that provides the data samples for the driving style classification and judgment module.
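A minimal sketch of this per-second feature extraction (mean, standard deviation, and maximum of the four vehicle parameters from 10 Hz samples); the numerical values in the example are invented.

```python
import numpy as np

def per_second_features(samples_10hz):
    """Compute the 12-dimensional feature vector for one second of data.

    samples_10hz: (10, 4) array with columns
        [speed, longitudinal_accel, lateral_accel, steering_wheel_angle]
    Returns a length-12 vector: mean, std and max of each of the 4 signals.
    """
    x = np.asarray(samples_10hz, dtype=float)
    return np.concatenate([x.mean(axis=0), x.std(axis=0), x.max(axis=0)])

# One second of 10 Hz data (values are made up)
second = np.column_stack([
    np.linspace(16.0, 16.5, 10),    # speed (m/s)
    np.full(10, 0.4),               # longitudinal acceleration (m/s^2)
    np.full(10, 0.1),               # lateral acceleration (m/s^2)
    np.linspace(2.0, 3.0, 10),      # steering wheel angle (deg)
])
print(per_second_features(second).shape)   # (12,)
```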
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 3, the electronic device 400 includes one or more processors 401 and memory 402.
The processor 401 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities and may control other components in the electronic device 400 to perform desired functions.
Memory 402 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, read Only Memory (ROM), hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium and executed by processor 401 to implement the autopilot system actual road testing method of any of the embodiments of the invention described above and/or other desired functionality. Various contents such as initial external parameters, threshold values, etc. may also be stored in the computer-readable storage medium.
In one example, the electronic device 400 may further include: an input device 403 and an output device 404, which are interconnected by a bus system and/or other form of connection mechanism (not shown). The input device 403 may include, for example, a keyboard, a mouse, and the like. The output device 404 can output various information to the outside, including warning prompt information, braking force, etc. The output devices 404 may include, for example, a display, speakers, a printer, and a communication network and its connected remote output devices, among others.
Of course, for simplicity, only some of the components of the electronic device 400 relevant to the present invention are shown in fig. 3, omitting components such as buses, input/output interfaces, and the like. In addition, electronic device 400 may include any other suitable components depending on the particular application.
In addition to the above-described methods and apparatus, embodiments of the present invention may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform the steps of the autopilot system actual road testing method provided by any of the embodiments of the present invention.
The computer program product may carry program code for performing operations of embodiments of the present invention written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ and conventional procedural programming languages such as the "C" programming language. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present invention may also be a computer readable storage medium having stored thereon computer program instructions, which, when executed by a processor, cause the processor to perform the steps of the autopilot system actual road testing method provided by any of the embodiments of the present invention.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the scope of the present application. As used in the specification and claims of this application, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise. The terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of additional like elements in the process, method, or apparatus that comprises the element.
It is further noted that the terms "center," "upper," "lower," "left," "right," "vertical," "horizontal," "inner," "outer," and the like are used in the orientation or positional relationship indicated in the drawings for convenience in describing the invention and for simplicity in description, and do not indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and thus should not be construed as limiting the invention. Unless expressly stated or limited otherwise, the terms "mounted," "connected," "coupled," and the like are to be construed broadly and encompass, for example, both fixed and removable coupling or integral coupling; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions deviate from the technical solutions of the embodiments of the present invention.

Claims (10)

1. An actual road testing method for an automatic driving system, integrated in a vehicle under test, comprising the following steps:
acquiring driving environment perception data in real time while the vehicle under test drives on an actual public road in an automatic driving mode;
based on the driving environment perception data, segmenting the driving period of the vehicle under test according to environmental scene type to obtain a plurality of time periods, wherein different time periods correspond to different environmental scene types;
segmenting the driving environment perception data according to the plurality of time periods to determine sub-driving environment perception data corresponding to each time period;
inputting the sub-driving environment perception data corresponding to each time period into a driver model to obtain anthropomorphic driver control data corresponding to each time period;
inputting the anthropomorphic driver control data corresponding to each time period into a vehicle dynamics model to obtain first vehicle dynamics parameter data corresponding to each time period; and
comparing the first vehicle dynamics parameter data and second vehicle dynamics parameter data corresponding to the same time period to determine a test result of the automatic driving system, wherein the second vehicle dynamics parameter data are obtained by reading data transmitted on a CAN bus of the vehicle under test.
2. The method according to claim 1, wherein the driving environment perception data are acquired by a camera, a GPS/INS integrated navigation system, and a millimeter-wave radar sensor in the vehicle under test, and the camera is mounted on the inner side of the front windshield of the vehicle under test;
the driving environment perception data comprise at least one of: obstacle information ahead of the vehicle under test, information on vehicles ahead of the vehicle under test, traffic light information, speed limit sign information, road curvature information, position information of the vehicle under test, lateral and longitudinal dynamics parameter information, the relative speed of the vehicle ahead with respect to the vehicle under test, and the relative distance of the vehicle ahead with respect to the vehicle under test;
the anthropomorphic driver control data comprise at least one of an accelerator pedal opening, a brake pedal opening, and a steering wheel angle;
the environmental scene types comprise a right turn of a social vehicle, a left turn of a social vehicle, following a social vehicle, a social vehicle cutting in ahead, a social vehicle cutting out ahead, or a lane change of a social vehicle.
3. The method according to claim 1, wherein the driver model is capable of producing anthropomorphic driver control data of different driving styles based on the driving environment perception data;
the anthropomorphic driver control data corresponding to each time period comprise anthropomorphic driver control data of different driving styles for that time period, the different driving styles including conservative, normal, and aggressive;
the step of inputting the anthropomorphic driver control data corresponding to each time period into the vehicle dynamics model to obtain the first vehicle dynamics parameter data corresponding to each time period comprises:
inputting the anthropomorphic driver control data of the different driving styles corresponding to each time period into the vehicle dynamics model to obtain sub-vehicle dynamics parameter data of the different driving styles corresponding to each time period, wherein the sub-vehicle dynamics parameter data of the different driving styles form the first vehicle dynamics parameter data.
4. The method of claim 3, wherein, before comparing the first vehicle dynamics parameter data and the second vehicle dynamics parameter data corresponding to the same time period to determine the test result of the automatic driving system, the method further comprises:
computing the similarity between the sub-vehicle dynamics parameter data of each driving style corresponding to a time period and the second vehicle dynamics parameter data corresponding to that time period, and determining the sub-vehicle dynamics parameter data with the highest similarity as the first vehicle dynamics parameter data corresponding to that time period.
5. The method of claim 1, wherein comparing the first vehicle dynamics parameter data and the second vehicle dynamics parameter data corresponding to the same time period to determine the test result of the automatic driving system comprises:
comparing the first vehicle dynamics parameter data and the second vehicle dynamics parameter data corresponding to the same time period index by index, and determining the similarity of each index based on the root-mean-square error;
performing a weighted summation of the similarities of the different indices to obtain the test result;
wherein the first vehicle dynamics parameter data and the second vehicle dynamics parameter data each include at least one of the following indices: longitudinal acceleration, longitudinal jerk, lateral acceleration, lateral jerk, yaw angle, and yaw rate.
6. The method of claim 1, further comprising:
acquiring P sets of driver model training data, wherein each set comprises driving environment perception training data, driver control training data, and driving style evaluation information, and the driving style evaluation information differs among at least some of the sets; and
training a generalized regression neural network (GRNN) based on the P sets of driver model training data to obtain the driver model.
7. The method of claim 6, wherein, after acquiring the P sets of driver model training data and before training the generalized regression neural network (GRNN) based on the P sets of driver model training data, the method further comprises:
determining values of N characteristic parameters for each set of driver model training data based on the driving environment perception training data and/or the driver control training data in that set;
constructing an N-dimensional space and, using the values of the N characteristic parameters as coordinates, determining the point corresponding to each set of driver model training data in the N-dimensional space;
clustering the P corresponding points in the N-dimensional space with a K-means clustering algorithm to obtain M clustering results;
determining a representative point for each clustering result, wherein the representative point is a corresponding point in the clustering result to which it belongs, the distance between the representative point and the cluster center of its own clustering result is greater than a first set threshold, and the distance between the representative point and the cluster centers of the other clustering results is greater than a second set threshold; and
correcting the driving style evaluation information in each set of driver model training data so that, after correction, it is consistent with the driving style evaluation information of the representative point belonging to the same clustering result.
8. The method according to claim 7, wherein, before determining the values of the N characteristic parameters for each set of driver model training data based on the driving environment perception training data and/or the driver control training data in that set, the method further comprises:
performing principal component analysis over candidate parameters to determine the N characteristic parameters, wherein the candidate parameters comprise the mean, standard deviation, and maximum of vehicle speed, the mean, standard deviation, and maximum of longitudinal acceleration, the mean, standard deviation, and maximum of lateral acceleration, and the mean, standard deviation, and maximum of steering wheel angle.
9. An electronic device, characterized in that the electronic device comprises:
a processor and a memory;
the processor is adapted to perform the steps of the autopilot system actual road testing method of any one of claims 1 to 8 by invoking programs or instructions stored by the memory.
10. A computer-readable storage medium characterized in that it stores a program or instructions for causing a computer to execute the steps of the automatic driving system actual road test method according to any one of claims 1 to 8.
CN202310280259.5A 2023-03-22 2023-03-22 Method, device and storage medium for testing actual road of automatic driving system Active CN115979679B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310280259.5A CN115979679B (en) 2023-03-22 2023-03-22 Method, device and storage medium for testing actual road of automatic driving system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310280259.5A CN115979679B (en) 2023-03-22 2023-03-22 Method, device and storage medium for testing actual road of automatic driving system

Publications (2)

Publication Number Publication Date
CN115979679A true CN115979679A (en) 2023-04-18
CN115979679B CN115979679B (en) 2023-06-23

Family

ID=85961134

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310280259.5A Active CN115979679B (en) 2023-03-22 2023-03-22 Method, device and storage medium for testing actual road of automatic driving system

Country Status (1)

Country Link
CN (1) CN115979679B (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016216023A (en) * 2015-04-21 2016-12-22 パナソニックIpマネジメント株式会社 Driving support method, and driving support device, driving control device, vehicle and driving support program using the same
US20180354512A1 (en) * 2017-06-09 2018-12-13 Baidu Online Network Technology (Beijing) Co., Ltd. Driverless Vehicle Testing Method and Apparatus, Device and Storage Medium
US20200226467A1 (en) * 2019-01-11 2020-07-16 Arizona Board Of Regents On Behalf Of Arizona State University Systems and methods for evaluating perception systems for autonomous vehicles using quality temporal logic
CN111947938A (en) * 2020-08-03 2020-11-17 中国第一汽车股份有限公司 In-loop test system, method, server and storage medium for automatic driving vehicle
CN111856969A (en) * 2020-08-06 2020-10-30 北京赛目科技有限公司 Automatic driving simulation test method and device
CN112130472A (en) * 2020-10-14 2020-12-25 广州小鹏自动驾驶科技有限公司 Automatic driving simulation test system and method
CN112506170A (en) * 2020-11-20 2021-03-16 北京赛目科技有限公司 Driver model based test method and device
CN112526968A (en) * 2020-11-25 2021-03-19 东南大学 Method for building automatic driving virtual test platform for mapping real world road conditions
US20210316753A1 (en) * 2020-12-10 2021-10-14 Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. Road test method and apparatus for autonomous driving vehicle, device and storage medium
CN112721949A (en) * 2021-01-12 2021-04-30 重庆大学 Method for evaluating longitudinal driving personification degree of automatic driving vehicle
WO2022246852A1 (en) * 2021-05-28 2022-12-01 吉林大学 Automatic driving system testing method based on aerial survey data, testing system, and storage medium
CN114354219A (en) * 2022-01-07 2022-04-15 苏州挚途科技有限公司 Test method and device for automatic driving vehicle
CN115048972A (en) * 2022-03-11 2022-09-13 北京智能车联产业创新中心有限公司 Traffic scene deconstruction classification method and virtual-real combined automatic driving test method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
VAIBHAV SWAMINATHAN et al.: "Autonomous Driving System with Road Sign Recognition using Convolutional Neural Networks", Second International Conference on Computational Intelligence in Data Science (ICCIDS-2019), pages 1-12 *
刘法旺 et al.: "Research on Safety Testing and Evaluation Methods for Intelligent Connected Vehicles Equipped with Automated Driving Functions" (搭载自动驾驶功能的智能网联汽车安全测试与评估方法研究), Chinese Journal of Automotive Engineering (汽车工程学报), vol. 12, no. 3, pages 221-227 *
端帅 et al.: "Design and Implementation of an Actual Road Test System for Autonomous Vehicles" (自动驾驶汽车实际道路测试系统设计与实现), Manufacturing Automation (制造业自动化), vol. 44, no. 11, pages 208-214 *

Also Published As

Publication number Publication date
CN115979679B (en) 2023-06-23

Similar Documents

Publication Publication Date Title
CN109840660B (en) Vehicle characteristic data processing method and vehicle risk prediction model training method
US20190155291A1 (en) Methods and systems for automated driving system simulation, validation, and implementation
Lyu et al. Using naturalistic driving data to identify driving style based on longitudinal driving operation conditions
CN112631257B (en) Expected function safety test evaluation method for misoperation of automatic driving vehicle
Vasconcelos et al. Smartphone-based outlier detection: a complex event processing approach for driving behavior detection
CN115081508B (en) Traffic running risk parallel simulation system based on traffic digital twin
CN112937591B (en) Driving safety monitoring method, device, equipment and computer readable storage medium
CN110858312A (en) Driver driving style classification method based on fuzzy C-means clustering algorithm
CN114021840A (en) Channel switching strategy generation method and device, computer storage medium and electronic equipment
JP2023540613A (en) Method and system for testing driver assistance systems
Ding et al. An extended car-following model in connected and autonomous vehicle environment: Perspective from the cooperation between drivers
Chu et al. A review of driving style recognition methods from short-term and long-term perspectives
Cai et al. Cnn-lstm driving style classification model based on driver operation time series data
CN115979679B (en) Method, device and storage medium for testing actual road of automatic driving system
CN114692713A (en) Driving behavior evaluation method and device for automatic driving vehicle
Hao et al. Aggressive lane-change analysis closing to intersection based on UAV video and deep learning
Li A scenario-based development framework for autonomous driving
CN116151045A (en) Vehicle simulation test data accuracy analysis method, device, equipment and medium
CN115855531A (en) Test scene construction method, device and medium for automatic driving automobile
CN112596388B (en) LSTM neural network AEB system control method based on driver data
Zhao et al. Driver lane changing intention recognition based on multi-class information
Zhang et al. An Embedded Driving Style Recognition Approach: Leveraging Knowledge in Learning
Zhang et al. Finding critical scenarios for automated driving systems: The data extraction form
CN114861516B (en) Vehicle dynamic performance determining method, device and storage medium
CN117516956A (en) Methods, systems, and computer program products for objectively evaluating the performance of an ADAS/ADS system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant