CN115964296A - Evaluation method, device, equipment and medium for automatic driving safety verification platform - Google Patents

Evaluation method, device, equipment and medium for automatic driving safety verification platform

Info

Publication number
CN115964296A
Authority
CN
China
Prior art keywords
test
evaluation
scene
container
task
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211734833.1A
Other languages
Chinese (zh)
Inventor
刘书明
倪永富
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guoke Chushi Chongqing Software Co ltd
Original Assignee
Guoke Chushi Chongqing Software Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guoke Chushi Chongqing Software Co ltd filed Critical Guoke Chushi Chongqing Software Co ltd
Priority to CN202211734833.1A priority Critical patent/CN115964296A/en
Publication of CN115964296A publication Critical patent/CN115964296A/en
Pending legal-status Critical Current

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The disclosure relates to an evaluation method, device, equipment, and medium for an automatic driving safety verification platform. The method includes: generating a test task according to a test requirement, where the test task contains scene data and is sent to the message queue of the corresponding container group; dispatching the test task to its connected message queue according to the container-group identifier carried by the task; each scene control service system then fetching a test task from its message queue, calling the UE simulation engine, and loading the scene data carried in the task into the engine to create a test scene, which is fed back to that scene control service system; the scene control service system then calling the system-under-test algorithm, loading it into the created test scene for simulation testing, and evaluating the test result against preset evaluation indices. The invention allows multiple systems under test to run simulation tests of their algorithms independently and in parallel, solving the problems that an evaluation system cannot run multiple test tasks at the same time and that time cost is too high.

Description

Evaluation method, device, equipment and medium for automatic driving safety verification platform
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular, to an evaluation method, an evaluation device, an evaluation apparatus, and a medium for an automatic driving safety verification platform.
Background
As the level of autonomous driving increases, automotive systems become more complex. Variable weather, complex traffic environments, diverse driving tasks, and dynamic driving states all present new challenges for the test and evaluation of autonomous vehicles. To ensure the effectiveness and safety of advanced automatic driving functions, their effectiveness and functionality need to be verified both throughout the development stage and after development is finished.
Scene-based virtual testing offers flexible test-scene configuration, high test efficiency, strong repeatability, a safe test process, and low test cost; it enables automated and accelerated testing and saves a great deal of manpower and material resources. Virtual testing based on scenes has therefore become an indispensable link in testing and evaluating automatic driving systems.
In automatic driving simulation testing, an automatic driving system needs to be tested in many scenes to check its safety, compliance, and comfort; the prior art, however, provides no method or system for concurrently testing multiple systems under test in different test scenes.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an evaluation method and device for an automatic driving safety verification platform, addressing the fact that the prior art provides no method or system for concurrently testing multiple systems under test in different test scenes during automatic driving simulation testing.
According to a first aspect of the embodiments of the present disclosure, there is provided an evaluation method for an automatic driving safety verification platform, including:
generating a test task according to a test requirement, wherein the test task comprises scene data and is sent to a message queue corresponding to a corresponding container group;
sending the test task to a message queue correspondingly connected with the container group according to the container group identification carried by the test task; the container group comprises a first container and a second container, wherein the first container and the second container are used for respectively packaging a scene control service system algorithm and a tested system algorithm; the first container and the second container are in communication connection through a communication module;
each scene control service system acquires the test task in the corresponding message queue, calls a preset UE simulation engine, loads scene data carried in the test task to the UE simulation engine to create a test scene, and feeds the test scene back to the corresponding scene control service system; the scene control service system calls a tested system algorithm, loads the tested system algorithm to the corresponding test scene for simulation test, and obtains a test result;
and evaluating the test result through a preset evaluation index.
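The routing step above — a test task carrying scene data and a container-group identifier, dispatched to the message queue bound to that group — can be modeled minimally as follows. All class, method, and field names here are illustrative assumptions; the patent specifies no concrete API.

```python
from collections import defaultdict, deque

class TaskDispatcher:
    """Toy sketch of test-task generation and queue routing by
    container-group identifier (referred to as domain_id in the text)."""

    def __init__(self):
        # One FIFO message queue per container group.
        self.queues = defaultdict(deque)

    def generate_task(self, domain_id, scene_data):
        # A test task carries its scene data and its group identifier.
        return {"domain_id": domain_id, "scene_data": scene_data}

    def dispatch(self, task):
        # Route by the container-group identifier carried in the task.
        self.queues[task["domain_id"]].append(task)
        return task["domain_id"]

dispatcher = TaskDispatcher()
task = dispatcher.generate_task("group-1", {"map": "urban", "weather": "rain"})
dispatcher.dispatch(task)
assert dispatcher.queues["group-1"][0]["scene_data"]["map"] == "urban"
```

Because each scene control service system reads only its own queue, a task can never reach a container group whose system-under-test algorithm does not need that scene.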
According to a second aspect of the embodiments of the present disclosure, an evaluation apparatus of an automatic driving safety verification platform is provided, which includes a test task generation module, a task distribution module, a simulation test module, and an evaluation module;
the test task generation module: generating a test task according to a test requirement, wherein the test task comprises scene data and is sent to a message queue corresponding to a corresponding container group; the container group comprises a first container and a second container, wherein the first container and the second container are used for respectively packaging a scene control service system and a tested system algorithm; the first container and the second container are in communication connection through a communication module;
the task distribution module: the message queue is used for sending the test task to a message queue correspondingly connected with the container group according to the container group identification carried by the test task;
the simulation test module: the system is used for each scene control service system to obtain the test task corresponding to the message queue, and call a preset UE simulation engine, load scene data carried in the test task to the UE simulation engine to create a test scene, and feed the test scene back to the corresponding scene control service system; the scene control service system calls a tested system algorithm, loads the tested system algorithm to the corresponding test scene for simulation test, and obtains a test result;
the evaluation module: and the test result is evaluated through a preset evaluation index.
According to a third aspect of an embodiment of the present disclosure, there is provided an electronic apparatus including: a processor; a memory for storing the processor-executable instructions; the processor is configured to read the executable instructions from the memory and execute the instructions to implement the evaluation method of the automated driving safety verification platform provided by the first aspect of the present disclosure.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the steps of the evaluation method of the automated driving safety verification platform provided by the first aspect of the present disclosure.
The technical scheme provided by the embodiments of the disclosure can have the following beneficial effects. Each test task contains scene data and a container-group identifier, and the scene data is distributed to the corresponding message queue according to that identifier; when each scene control service system fetches task data from its own queue and runs it, the system-under-test algorithm is loaded into the required test scene for testing. This prevents a scene control service system from obtaining a scene that is not the one its system-under-test algorithm needs, making the test process more efficient. In addition, the scene control service system and the system under test are packaged in two communicatively connected containers; during a test run, the scene control service system calls the system-under-test algorithm and loads it into the corresponding test scene for simulation testing. Multiple systems under test can thus run simulation tests of their algorithms independently and in parallel, and one or more systems under test can be tested in one or more simulation environments, solving the problems that an evaluation system cannot run multiple test tasks at the same time, that different function tests must be queued and run sequentially, and that time cost is too high.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow diagram illustrating a method for evaluation of an automated driving safety verification platform, according to an exemplary embodiment.
FIG. 2 is a flowchart illustrating a first task execution of a method for evaluating an automated driving safety verification platform, according to an exemplary embodiment.
FIG. 3 is a flowchart illustrating a second task execution of a method for evaluating an automated driving safety verification platform, according to an exemplary embodiment.
FIG. 4 is a scene control service system task processing flow diagram illustrating a method for evaluation of an automated driving safety verification platform in accordance with an exemplary embodiment.
Fig. 5 is a sequence diagram illustrating a scene control service system and task processing of an evaluation method of an automated driving safety verification platform according to an exemplary embodiment.
FIG. 6 is a flow diagram illustrating a first evaluation service task processing of an evaluation method of an automated driving safety verification platform, according to an exemplary embodiment.
FIG. 7 is a flow diagram illustrating a second evaluation service task processing of a method for evaluating an automated driving safety verification platform, according to an exemplary embodiment.
FIG. 8 is a sequence diagram illustrating evaluation services and task processing for an evaluation method for an automated driving safety verification platform, according to an exemplary embodiment.
FIG. 9 is a block diagram illustrating an evaluation device of an automated driving safety verification platform according to an exemplary embodiment.
Fig. 10 is a block diagram schematically illustrating another structure of an evaluation device of an automated driving safety verification platform according to an exemplary embodiment.
Fig. 11 is a block diagram schematically illustrating another structure of an evaluation device of an automated driving safety verification platform according to an exemplary embodiment.
FIG. 12 is a block diagram illustrating an electronic device in accordance with an example embodiment.
Detailed Description
Exemplary embodiments will be described in detail below with reference to the accompanying drawings.
It should be noted that the related embodiments and the accompanying drawings are provided only for the purpose of describing exemplary embodiments provided by the present disclosure, and not for the purpose of describing all embodiments of the present disclosure, and it should be understood that the present disclosure is not limited by the related exemplary embodiments.
It should be noted that the terms "first", "second", etc. are used in this disclosure only to distinguish different steps, devices, or modules; they carry no particular technical meaning and denote no order or mutual dependency.
It should be noted that the terms "a", "an", "the", and "at least one" as used in this disclosure are intended to be illustrative rather than limiting. Unless the context clearly dictates otherwise, they should be understood as "one or more".
It should be noted that the term "and/or" describes an association between objects and covers three cases: for example, "A and/or B" may mean that A exists alone, that A and B exist simultaneously, or that B exists alone.
It should be noted that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. The scope of the present disclosure is not limited by the order in which the steps of the related embodiments are described, unless otherwise specified.
It should be noted that all actions of acquiring signals, information or data in the present disclosure are performed under the premise of complying with the corresponding data protection regulation policy of the country of the location and obtaining the authorization given by the owner of the corresponding device.
Exemplary method
Fig. 1 is a flowchart illustrating an evaluation method of an automated driving safety verification platform according to an exemplary embodiment, and as shown in fig. 1, the evaluation method of the automated driving safety verification platform includes the following steps.
In step S110, according to a container group identifier carried by the test task, sending the test task to a message queue correspondingly connected to the container group; the container group comprises a first container and a second container, wherein the first container and the second container are used for respectively packaging a scene control service system algorithm and a tested system algorithm; the first container and the second container are communicatively coupled via a communication module.
In step S120, each scene control service system obtains the test task from its corresponding message queue, calls a preset scene-rendering simulation engine (hereinafter, the UE simulation engine) through an interface, and loads the scene data carried in the test task into the UE simulation engine to create a test scene; the test scene is fed back to the corresponding scene control service system. The scene control service system then calls the system-under-test algorithm through an interface and loads it into the corresponding test scene for simulation testing, obtaining a test result.
in step S130, the test result is evaluated according to a preset evaluation index.
The evaluation method of the automatic driving safety verification platform provided by the invention ensures that each task contains scene data and a container-group identifier and that the scene data is distributed to the corresponding message queue according to that identifier, so that when each scene control service system obtains the task data from its own queue, the system-under-test algorithm is loaded into the required test scene for testing. This prevents a scene control service system from obtaining a test scene that is not the one its system-under-test algorithm needs, making the test process more efficient. In addition, the scene control service system and the system under test are packaged in two communicatively connected containers; during test operation, the scene control service system calls the system-under-test algorithm through an interface and loads it into the corresponding test scene for simulation testing. Multiple systems under test can thus run simulation tests of their algorithms independently and in parallel, and one or more systems under test can be tested in one or more simulation environments, solving the problems that the evaluation system cannot run multiple test tasks at the same time and that different function tests must be queued and run sequentially at excessive time cost.
Referring to fig. 2, an example of task execution for the automatic driving safety verification platform: in some embodiments, the evaluation method defines and instantiates the test tasks of the platform services through configuration files and script files. For example, the code that generates test-scene information is compiled into an executable file and added to a test task; a tester only needs to write a configuration file to a given standard and write script files for the different services.
Each test task contains the scene data required by the corresponding system under test together with the corresponding container-group identifier, such as a unique domain ID (domain_id). A plurality of message queues A1–An are each communicatively connected to a scene control service system, so the test tasks of the systems under test are split across n message queues for distributed processing; the system sends each task to the matching queue among A1–An according to the domain_id carried in the task. This arrangement places the scene data each system under test needs into its task queue before testing. Each scene control service system and each system-under-test algorithm are deployed in one-to-one containers; during testing, each system-under-test algorithm is loaded into the test scene it requires, so that the system under test runs in a virtual driving environment module and the target vehicle's driving actions and responses can be verified, achieving virtual verification of automatic driving. The scene control service system may be a PaaS-cloud-platform-based service manager used to allocate and execute virtual test resources and feed back operation data.
To avoid confusion in sending and receiving messages during interaction among the scene control service system, the system-under-test algorithm, and the simulation engine, and to avoid a system-under-test algorithm failing to load into its corresponding test scene for simulation testing, the first container and the second container are communicatively connected via CyberRT communication modules configured as middleware at each end. The scene control service system and the system under test communicate through CyberRT; messages are broadcast in the local area network, and both ends use the same domain_id, ensuring accurate message exchange and preventing cross-talk when messages are sent and received.
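The domain_id matching described above can be modeled minimally as follows: messages are broadcast to every subscriber on the bus, but a subscriber only accepts those whose domain_id equals its own, which keeps the container groups from cross-talking. This is a toy stand-in for illustration only, not the real CyberRT API.

```python
class Bus:
    """Toy broadcast medium standing in for the local area network."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, node):
        self.subscribers.append(node)

    def broadcast(self, domain_id, payload):
        # Every node sees every message, as with LAN broadcast.
        for node in self.subscribers:
            node.receive(domain_id, payload)

class Node:
    """A communication endpoint bound to one domain_id."""

    def __init__(self, domain_id):
        self.domain_id = domain_id
        self.inbox = []

    def receive(self, domain_id, payload):
        if domain_id == self.domain_id:  # accept only same-domain messages
            self.inbox.append(payload)

bus = Bus()
a = Node("domain-1")
b = Node("domain-2")
bus.subscribe(a)
bus.subscribe(b)
bus.broadcast("domain-1", "scene ready")
assert a.inbox == ["scene ready"] and b.inbox == []
```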
In some embodiments, multiple UE containers, each packaging a UE simulation engine and managed by a UE agent, are provided. The UE agent controls access rights to the UE simulation engines in the system, optimizes resource scheduling, and reduces system energy consumption; it is responsible for load balancing and uniform request forwarding for the UE containers and also serves as the interface providing request access to the application. Fig. 3 shows an exemplary flowchart of task operation of the automatic driving safety verification platform provided in this embodiment, and fig. 4 shows the task-processing flowchart of the scene control service system. In the evaluation method provided by this embodiment, step S120, in which each scene control service system calls a preset UE simulation engine through an interface, includes:
step S1201, each scene control service system sends a calling instruction to a UE agent;
optionally, the call instruction may include unique identity information, such as a container name, an address, or a domain_id, which lets the UE agent determine which scene control service system needs the simulation engine and feed the acquired idle UE container address back to it; the instruction further includes an idle-container query command, used by the UE agent to query the use states of the multiple UE containers it proxies.
Step S1202, the UE agent inquires the use states of a plurality of UE containers respectively packaged with the UE simulation engine according to the calling instruction, and feeds back the addresses of the UE containers in the unused state to the corresponding scene control service system;
optionally, the UE agent is responsible for load balancing and uniform request forwarding for each scene control service system and also serves as the interface providing request access to the application. When the UE agent receives the call instruction, it accesses its UE containers in sequence; a UE container that is in use enables application isolation and enters an access-restricted state, so the UE agent cannot access it. If the UE agent can access a UE container, the container is in a non-isolated state, and the agent feeds that container's address back to the corresponding scene control service system according to the unique identity information in the call instruction.
Step S1203, the scene control service system calls a simulation engine through an interface according to the UE container address.
In this embodiment, when a task is run, an available UE container address is obtained through the UE agent: the agent queries the UE-container use-state list to find an idle container and interacts with it. The UE container renders the scene from the scene data and feeds the rendered scene back to the scene control service system, optimizing the use of multiple UE containers.
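The UE-agent flow in steps S1201–S1203 (query use states, hand out an idle container's address, mark it used until the task completes) can be sketched as below. The class and address strings are illustrative assumptions.

```python
class UEAgent:
    """Minimal use-state list for the UE containers an agent proxies."""

    def __init__(self, addresses):
        # use-state list: address -> True while the container is in use
        self.in_use = {addr: False for addr in addresses}

    def acquire(self):
        for addr, busy in self.in_use.items():
            if not busy:
                self.in_use[addr] = True   # S1213: set state to "used"
                return addr                # S1214: return container address
        return None                        # no idle container available

    def release(self, addr):
        self.in_use[addr] = False          # S1217/S1218: back to "unused"

agent = UEAgent(["ue-0:7000", "ue-1:7000"])
first = agent.acquire()
second = agent.acquire()
assert first != second and agent.acquire() is None
agent.release(first)
assert agent.acquire() == first
```

A real deployment would also need the application-isolation check from the text (a busy container refusing access); here that is collapsed into the boolean use-state flag.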
In a specific embodiment, referring to fig. 5, the scene control service system and task-processing timing chart provided in this embodiment, in step S120 the scene control service system calls a preset UE simulation engine through an interface and loads the scene data carried in the test task, together with the system-under-test algorithm, into the UE simulation engine for simulation testing; obtaining a test result includes:
step S1211, each scene control service system obtains the test task corresponding to the message queue A;
step S1212, the UE agent inquires the use states of a plurality of UE containers according to the call instruction of each scene control service system, and obtains an idle UE container;
step S1213, the using state of the corresponding container is set as used;
step S1214, returning the UE container address to the scene control service system;
step S1215, the scenario control service system calls the corresponding UE container according to the received UE container address, loads scenario data carried in the test task to the UE simulation engine to create a test scenario, and feeds the test scenario back to the corresponding scenario control service system; the scene control service system calls the tested system algorithm through an interface, loads the tested system algorithm to the corresponding test scene for simulation test, and obtains the test result;
step S1216, feeding back task completion information to the UE agent;
step S1217, the UE agent updates the UE container use state list;
step S1218, setting the current usage state of the UE container to be unused, and feeding back an update result to the scene control service system.
In some embodiments, referring to fig. 6, for the processing flowchart of the evaluation service task provided in this embodiment, in step S130, the evaluating each test result includes:
s1301, the scene control service systems generate a task to be evaluated carrying an evaluation container address according to a test result and send the task to be evaluated to a message queue to be evaluated;
optionally, after each scene control service system obtains its test result, in order to evaluate the results efficiently this embodiment sets up multiple evaluation service systems, each packaged in its own evaluation container, that can evaluate test results simultaneously; an evaluation index for scoring test results is also preset in each evaluation container, so each container evaluates a preset evaluation task. For example, the evaluation service systems may be set up in one-to-one correspondence with the container groups, each evaluating the test results of one system-under-test algorithm;
in this embodiment, to achieve the above object, a tester may add the evaluation container address to a test task when the test task is compiled, and after the simulation test is completed by each scene control service system, add the evaluation container address to a corresponding test result to generate a to-be-evaluated task and send the to-be-evaluated task to a to-be-evaluated message queue; or is
When the scene control service systems are deployed, after the scene control service systems are set to complete simulation tests, tasks to be evaluated carrying the set evaluation container addresses are generated by default, for example, the test tasks from the evaluation service system A to the evaluation message queue A are set, and after the scene control service system A connected with the message queue A completes the simulation tests, the test results are generated into the addresses of the evaluation containers A carrying the encapsulation evaluation service system A.
S1302, the evaluation agent service obtains the task to be evaluated in the message queue to be evaluated, and calls the evaluation container through an interface according to an evaluation container address carried by the task to be evaluated, the evaluation container is packaged with the evaluation service system and an evaluation index, and the evaluation service system obtains the task to be evaluated in the message queue to be evaluated and evaluates the test result according to the evaluation index.
In this embodiment, each test result is evaluated through a plurality of evaluation service systems, wherein an evaluation container address carried by the task to be evaluated can be added to the test task when a tester writes the test task, and after each scene control service system completes a simulation test, the evaluation container address is added to the corresponding test result to generate a task to be evaluated, and the task to be evaluated is sent to a message queue to be evaluated; and the evaluation agent service calls the corresponding evaluation container to finish test result evaluation according to the address of the evaluation container in the task to be evaluated, so that a plurality of evaluation tasks are simultaneously independently and parallelly carried out.
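The address-based routing just described can be modeled as follows: each to-be-evaluated task carries the address of the evaluation container that packages the matching evaluation service system and its preset indices, and the evaluation agent service dispatches by that address. All names, addresses, and the pass/fail rule are hypothetical.

```python
class EvalContainer:
    """Stand-in for an evaluation container: an address plus preset
    evaluation indices (here, simple upper-bound thresholds)."""

    def __init__(self, address, indices):
        self.address = address
        self.indices = indices

    def evaluate(self, result):
        # Pass/fail of each metric against its preset index.
        return {name: result.get(name, 0.0) <= limit
                for name, limit in self.indices.items()}

class EvalAgent:
    """Stand-in for the evaluation agent service."""

    def __init__(self, containers):
        self.containers = {c.address: c for c in containers}

    def handle(self, task):
        # Route by the evaluation-container address carried in the task.
        container = self.containers[task["eval_address"]]
        return container.evaluate(task["result"])

agent = EvalAgent([EvalContainer("eval-a:9000", {"collision_rate": 0.0})])
report = agent.handle({"eval_address": "eval-a:9000",
                       "result": {"collision_rate": 0.0}})
assert report == {"collision_rate": True}
```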
In other embodiments, to achieve flexible scheduling of the evaluation containers, the evaluation agent service uses a multi-container load-balancing policy to dynamically operate and maintain multiple evaluation containers: it dynamically maintains the number of tasks running in each container, assigns tasks to containers, and reclaims containers, improving the utilization of each evaluation container and ensuring the continuity and reliability of the evaluation service system. Referring to fig. 7, another flowchart of evaluation service task processing provided in this embodiment, in step S130 evaluating each test result includes:
step S1311, enabling each scene control service system to generate the obtained test results into tasks to be evaluated, carrying evaluation indexes, and sending the tasks to be evaluated to a message queue to be evaluated;
optionally, in this embodiment, to achieve flexible scheduling of multiple evaluation containers and their dynamic operation and maintenance under a multi-container load-balancing policy, a tester may add each test task's evaluation-index information to the task when it is written; after each scene control service system completes its simulation test, the evaluation-index information from the corresponding test task is added to the to-be-evaluated task, which is sent to the to-be-evaluated message queue. Alternatively, when the scene control service systems are deployed, they are configured so that, after a simulation test completes, default evaluation-index information is added to the to-be-evaluated task, which is then sent to the to-be-evaluated message queue.
Step S1312, the evaluation agent service queries the multiple evaluation containers in which evaluation service systems are encapsulated, determines the address of an evaluation container able to run the evaluation task, and calls the evaluation service system through an interface according to that address; the evaluation service system obtains the to-be-evaluated task from the to-be-evaluated message queue and evaluates the test result against the evaluation indices carried by the task.
In an embodiment, referring to fig. 8, a sequence diagram of the evaluation service system and task processing provided in this embodiment, in step S1312 the steps of the evaluation proxy service querying the evaluation containers in which the evaluation service systems are encapsulated, determining the address of an evaluation container that can run the evaluation task, and calling the evaluation service system through an interface at that address include:
step S1320, the evaluation proxy service acquires a task to be evaluated from the to-be-evaluated message queue B;
step S1321, querying, among the evaluation containers, those that can currently run an evaluation, according to a preset threshold on the number of tasks to be evaluated that each evaluation container may run;
step S1322, determining the address of the evaluation container C with the fewest running tasks, i.e. the most remaining capacity. For example, if an evaluation container is preset to run at most 10 tasks and 3 tasks are currently running, 7 more tasks can still be executed there. If every evaluation container has 0 remaining capacity, no container can run the evaluation task, and the flow returns to step S1321;
otherwise, the container C with the largest number of remaining executable tasks is selected from all the containers, and the address of container C is returned;
step S1323, the evaluation proxy service asynchronously calls the evaluation service system interface of container C at that address, increments the count of running tasks in container C by 1, and repeats step S1320;
step S1324, when the evaluation service system interface of container C executes, it takes a task to be evaluated out of the to-be-evaluated message queue B and runs the evaluation task;
step S1325, after the evaluation task finishes, the count of running tasks in its container is decremented by 1, a platform service interface is called back, and the corresponding task id is fed back to signal that the evaluation task is complete.
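The container-selection policy in steps S1320–S1325 can be sketched as a small capacity tracker. This is a minimal sketch under stated assumptions: the class and attribute names (`EvalContainer`, `EvalProxy`, `remaining`) are illustrative, and the asynchronous interface call is reduced to returning the chosen container's address.

```python
class EvalContainer:
    def __init__(self, address, max_tasks=10):
        self.address = address
        self.max_tasks = max_tasks   # preset per-container task threshold
        self.running = 0             # tasks currently running (S1323/S1325)

    @property
    def remaining(self):
        return self.max_tasks - self.running


class EvalProxy:
    """Hypothetical sketch of the evaluation proxy's selection policy."""

    def __init__(self, containers):
        self.containers = containers

    def pick_container(self):
        # S1321/S1322: take the container with the most remaining capacity;
        # if even that one is full, no container can run an evaluation now.
        best = max(self.containers, key=lambda c: c.remaining)
        return best if best.remaining > 0 else None

    def dispatch(self):
        c = self.pick_container()
        if c is None:
            return None              # caller loops back to S1321
        c.running += 1               # S1323: one more task marked as running
        return c.address             # address used for the async interface call

    def on_task_done(self, address):
        # S1325: a finished evaluation frees one slot in its container
        for c in self.containers:
            if c.address == address:
                c.running -= 1
                break
```

Choosing the container with the largest remaining capacity is what keeps the load balanced: a freshly reclaimed or idle container is always preferred over one already near its threshold.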
In some embodiments, each test task is given a priority or importance when it is written, for example by setting a queue parameter on the test task whose queue attribute specifies the priority or importance. After a test task is sent to its message queue, the system places it at the position in the queue that matches the set priority or importance; likewise, after a simulation test completes and the test result is sent to the to-be-evaluated message queue, the system places the evaluation task at the position in the queue matching its set priority or importance.
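Placing a task at the queue position matching its priority can be sketched with a binary heap. The class name and the convention "lower value = more urgent" are assumptions for illustration; the sequence counter preserves first-in-first-out order among tasks of equal priority.

```python
import heapq

class PriorityTaskQueue:
    """Sketch: tasks carry a priority set via the task's queue attribute."""

    def __init__(self):
        self._heap = []
        self._seq = 0          # FIFO tie-break for equal priorities

    def put(self, task, priority):
        heapq.heappush(self._heap, (priority, self._seq, task))
        self._seq += 1

    def get(self):
        return heapq.heappop(self._heap)[2]
```

The same structure serves both the test-task queues A1–An and the to-be-evaluated queue B, since both order their entries by the priority carried in the task.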
In some embodiments, the evaluation service evaluates the test result against the set evaluation indexes, for example by comparing the test result with the result the tested algorithm is expected to achieve, and finally produces an evaluation report on whether the test result is qualified (for example, whether the algorithm is safe, compliant, comfortable, etc.).
In some embodiments, testing of the tested system algorithm covers, but is not limited to, the safety, compliance, accuracy and limits of the algorithm, for example setting scene conditions at several different levels to test the algorithm's processing limits, such as the accuracy of visual sensors under different traffic environment factors (weather, vehicle speed, object size, etc.), acceleration performance, maximum speed, and so on. The same or different automatic driving algorithms may be deployed in the tested-system algorithm containers; for deploying the same automatic driving algorithm in multiple tested-system algorithm containers, refer to the following example:
specifically, the same automatic driving algorithm is deployed in two tested-system algorithm containers and used, respectively, to test how the vehicle's front-end sensors identify an obstacle under weather scenes of different visibility with the same obstacle settings, and to test how those sensors identify an obstacle at different obstacle movement speeds under the same weather scene, and so on, so that the test results of the automatic driving algorithm under different scenes and different test parameters can be verified quickly.
Alternatively, the tested system is tested and evaluated from the perspective of automatic driving functional scenarios: during testing of the automatic driving program or algorithm, the corresponding hazard events are identified, and SOTIF risk identification and evaluation are performed to assess whether the SOTIF indicators meet the SOTIF target specification. The SOTIF indicators may include:
1. a target scene is not considered, so the tested system cannot respond correctly to the environment;
2. the functional logic arbitration mechanism and algorithm of the tested system are unreasonable, leading to faulty decisions;
3. the output of the tested system's actuator deviates from the ideal output, making precise control execution difficult.
In summary, the technical solution of this embodiment of the present invention provides a task processing and container management method for an automatic driving safety verification platform. Different scene control service systems and tested systems (with different domain_id values) are divided across multiple message queues for distributing and processing test tasks, so that multiple tested systems can run simulation tests of tested algorithms independently and concurrently, and one or more tested systems can run simulation tests in one or more simulation environments; a scene control service system and its tested system are configured to communicate through CyberRT, ensuring the accuracy of message communication at both ends and preventing message send/receive confusion. In addition, the method of this embodiment formulates a container selection policy for the UE containers and evaluation containers; through the container management and use method, the use state of the containers, the assignment of tasks to containers, and the reclamation of containers are dynamically maintained, improving the utilization of each container and ensuring the continuity and reliability of the scene service and the evaluation service.
Exemplary devices
FIG. 9 is a block diagram of an evaluation apparatus of an automated driving safety verification platform, according to an exemplary embodiment. Referring to fig. 9, the apparatus 200 includes a test task generation module 210, a task distribution module 220, a simulation test module 230, and an evaluation module 240;
the test task generation module 210: configured to generate a test task according to a test requirement, wherein the test task includes scene data, a container group identifier, and a task id; the container group includes a first container encapsulating a scene control service system and a second container encapsulating a tested system algorithm; the first container and the second container are communicatively connected through a communication module;
the task distribution module 220: the message queue is used for sending the test task to a message queue correspondingly connected with the container group according to the container group identification carried by the test task;
the simulation test module 230: configured for each scene control service system to acquire the test task from its corresponding message queue, call a preset UE simulation engine through an interface, load the scene data carried in the test task into the UE simulation engine to create a test scene, and feed the test scene back to the corresponding scene control service system; the scene control service system then calls the tested system algorithm through an interface and loads it into the corresponding test scene for simulation testing to obtain a test result;
the evaluation module 240: and the evaluation module is used for evaluating the test result through a preset evaluation index.
With the evaluation apparatus of the automatic driving safety verification platform described above, the test task generation module 210 generates tasks that each include scene data and a container group identifier, and the task distribution module 220 distributes the corresponding scene data to the corresponding message queue according to that identifier, so that when each scene control service system in the simulation test module 230 fetches the task data from its message queue, it can load the tested system algorithm into the required test scene for testing; the test scene obtained by each scene control service system is exactly the scene the corresponding tested system algorithm needs to test, which makes the test process more efficient. In addition, the scene control service system and the tested system are each encapsulated in one of two communicatively connected containers; during a test run, the scene control service system calls the tested system algorithm through an interface and loads it into the corresponding test scene for simulation testing. Simulation tests of the tested algorithms can thus run independently and in parallel across multiple tested systems, and one or more tested systems can be tested in one or more simulation environments, solving the problems that an evaluation system cannot run multiple test tasks at the same time, that different functional tests must queue and run sequentially, and that the time cost is excessive.
In some embodiments, the test task generation module 210 may define and instantiate a test task of the platform service through a configuration file and a script file, for example by compiling the code that generates the test scene information into an executable file added to the test task; a tester only needs to write the configuration file to a given specification and write the script file for the particular service. The test task includes the scene data the corresponding tested system needs to test and the corresponding container group identifier, such as a unique domain ID (domain_id). A plurality of message queues A1-An are each communicatively connected to a scene control service system, and the test tasks of the tested systems are divided across the n message queues for distribution and processing. The task distribution module 220 sends the task data to the corresponding message queue A1-An according to the domain_id in the test task, so that the scene data each tested system needs is added to its task queue before testing. Each scene control service system and each tested system are deployed in one-to-one correspondence through containers; a virtual test scene is built in the simulation engine, and the simulation test module 230 automatically loads the tested system algorithm into the test scene, in which a target test vehicle carrying the automatic driving algorithm and its test environment are set, to run the simulation test. The scene control service system apparatus may be a PAAS cloud-platform-based service manager used to allocate and execute virtual test resources and to feed back operating data.
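The routing of test tasks to per-group message queues by domain_id can be sketched as follows. The class name `TaskDistributor` and the dictionary-of-lists queue model are assumptions for illustration; a real deployment would use an actual message broker.

```python
from collections import defaultdict

class TaskDistributor:
    """Hypothetical sketch of the task distribution module: each container
    group has a unique domain_id, and a test task is routed to the message
    queue bound to that group (queues A1-An in the text)."""

    def __init__(self):
        self.queues = defaultdict(list)   # domain_id -> message queue

    def submit(self, task):
        # the test task carries scene data, a task id and the group identifier
        self.queues[task["domain_id"]].append(task)

    def fetch(self, domain_id):
        # each scene control service system reads only its own queue
        q = self.queues[domain_id]
        return q.pop(0) if q else None
```

Because each scene control service system only ever reads the queue keyed by its own domain_id, tasks for different tested systems never interleave, which is what allows the groups to test independently and concurrently.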
In some embodiments, to avoid send/receive confusion in the information interaction among the scene control service system, the tested system algorithm and the simulation engine, which could prevent the tested system algorithm from being loaded into its corresponding test scene for simulation testing, the first container and the second container are communicatively connected through CyberRT communication modules provided in each container as communication middleware. A scene control service system and its tested system communicate through CyberRT: messages are broadcast in the local area network, and both ends use the same domain_id during communication, ensuring the accuracy of message communication at the two ends and preventing message send/receive confusion.
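The domain_id isolation described above can be modeled conceptually: messages broadcast on the network are only delivered to peers configured with the same domain_id. This is an illustration of the behavior only, not the real CyberRT API; all names here are hypothetical.

```python
class Bus:
    """Conceptual model of domain_id-scoped broadcast communication."""

    def __init__(self):
        self.subscribers = []   # list of (domain_id, inbox) pairs

    def join(self, domain_id):
        inbox = []
        self.subscribers.append((domain_id, inbox))
        return inbox

    def broadcast(self, domain_id, msg):
        for dom, inbox in self.subscribers:
            if dom == domain_id:      # mismatched domains never see the message
                inbox.append(msg)
```

Giving each container group its own domain_id thus partitions one shared LAN into isolated channels, so concurrent test runs cannot receive each other's messages.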
In some embodiments, a plurality of UE containers encapsulating the UE simulation engines are provided and managed by a UE agent. The UE agent controls access rights to the UE simulation engines in the system, optimizes resource scheduling, and reduces system energy consumption; it is responsible for load balancing and unified request forwarding for the UE containers and also serves as the interface through which applications access them. Referring to fig. 10, a block diagram of an evaluation apparatus of another automated driving safety verification platform provided in this embodiment, the evaluation apparatus further includes:
a plurality of UE containers respectively encapsulating the UE simulation engine;
the UE proxy module 250: configured to manage the use states of the UE containers, query the use state of each UE container according to the call instruction sent by each scene control service system, and feed back the address of a UE container in the unused state to the corresponding scene control service system;
and the scene control service system calls a simulation engine through an interface according to the UE container address.
In an embodiment, the UE agent module 250 queries the use states of a plurality of UE containers according to the call instruction of each scene control service system, feeds back the address of the UE container in an unused state to the corresponding scene control service system, and sets the use state of the corresponding UE container as used; the UE agent module 250 sets the use state of the corresponding UE container as unused according to the test task completion information fed back by the scene control service system.
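The used/unused bookkeeping of the UE proxy module 250 can be sketched as a small state table. The class name `UEAgent` and method names are assumptions for illustration.

```python
class UEAgent:
    """Sketch of the UE proxy module: it tracks whether each UE container is
    in use, hands an unused container's address to a requesting scene control
    service system, and frees the container again on task completion."""

    def __init__(self, addresses):
        self.in_use = {addr: False for addr in addresses}

    def acquire(self):
        for addr, used in self.in_use.items():
            if not used:
                self.in_use[addr] = True   # mark as used before handing out
                return addr
        return None                        # all UE containers are busy

    def release(self, addr):
        # called when the scene control service system reports task completion
        self.in_use[addr] = False
```

Marking the container as used before returning its address is the step that prevents two scene control service systems from being handed the same simulation engine.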
In some embodiments, referring to fig. 11, a block diagram of an evaluation apparatus of another automated driving safety verification platform provided in this embodiment is provided, where this embodiment provides an evaluation apparatus, further including:
message queue to be evaluated: configured to receive and store the tasks to be evaluated generated by the scene control service systems; a task to be evaluated is generated by a scene control service system from a test result and carries the evaluation indexes;
the evaluation module 240: including a plurality of evaluation containers in which evaluation services are encapsulated;
the evaluation agent module 260: configured to query the evaluation containers in which the evaluation services are encapsulated, determine the address of an evaluation container that can run the task to be evaluated, and call the evaluation service through an interface at that address; the evaluation service acquires the task to be evaluated from the to-be-evaluated message queue and evaluates the test result according to the evaluation indexes carried by the task.
In this embodiment, the evaluation services provided by the evaluation module 240 evaluate each test result. The evaluation container address carried by the task to be evaluated may be added to the test task by the tester when the test task is written; after each scene control service system completes its simulation test, the evaluation container address is added to the corresponding test result to generate a task to be evaluated, which is sent to the to-be-evaluated message queue. The evaluation agent module 260 then calls the corresponding evaluation container, according to the evaluation container address in the task, to complete the evaluation of the test result, so that multiple evaluation tasks execute independently and in parallel.
In other embodiments, in order to implement flexible scheduling of each evaluation container, the evaluation agent module 260 uses a multi-container load balancing policy to perform dynamic operation and maintenance on a plurality of evaluation containers, dynamically maintain the number of tasks running in each container, perform assignment of tasks in the containers and recovery of the containers, and ensure continuity and reliability of use of evaluation services while improving the utilization rate of each evaluation container.
In some embodiments, the evaluation agent module 260 queries the evaluation container capable of running an evaluation in a plurality of evaluation containers according to a preset threshold value of the number of tasks to be evaluated that can be run in each of the evaluation containers, and determines the address of the evaluation container with the smallest number of tasks to be evaluated.
The apparatus embodiments of the present disclosure correspond to the technical solutions of the method embodiments described above; the specific operation of each module may be understood with reference to the description in the method embodiments and to figs. 9 to 11, and is not repeated here.
Exemplary electronic device
Fig. 12 is a block diagram illustrating an electronic device 900 in accordance with an example embodiment. The electronic device 900 may be a vehicle controller, a vehicle terminal, a vehicle computer, or other type of electronic device.
Referring to fig. 12, an electronic device 900 may include at least one processor 910 and a memory 920. The processor 910 may execute instructions stored in the memory 920 and is communicatively coupled to the memory 920 via a data bus. In addition to the memory 920, the processor 910 may be communicatively coupled to an input device 930, an output device 940, and a communication device 950 via the data bus.
The processor 910 may be any conventional processor, such as a commercially available CPU. The processor may also include, for example, a graphics processing unit (GPU), a field programmable gate array (FPGA), a system on chip (SOC), an application-specific integrated circuit (ASIC), or a combination thereof.
The memory 920 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
In the disclosed embodiment, the memory 920 stores executable instructions, and the processor 910 may read the executable instructions from the memory 920 and execute the instructions to implement all or part of the steps of the evaluation method of the autopilot safety verification platform described in any of the above exemplary embodiments.
Exemplary computer readable storage Medium
In addition to the above-described methods and apparatuses, exemplary embodiments of the present disclosure may also be a computer program product or a computer-readable storage medium storing the computer program product. The computer program product includes computer program instructions that are executable by a processor to perform all or part of the steps described in any of the above exemplary embodiments.
The computer program product may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++, conventional procedural programming languages such as the "C" programming language or similar languages, and scripting languages such as Python. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
The computer readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the readable storage medium include: static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic or optical disk, or any suitable combination of the foregoing.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements that have been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. The evaluation method of the automatic driving safety verification platform is characterized by comprising the following steps:
generating a test task according to a test requirement, wherein the test task comprises scene data and is sent to a message queue corresponding to a corresponding container group;
sending the test task to a message queue correspondingly connected with the container group according to the container group identification carried by the test task; the container groups are multiple groups, and each container group comprises a first container for respectively packaging a scene control service system and a second container for packaging a tested system algorithm; the first container and the second container are in communication connection through a communication module;
each scene control service system acquires the test task in the corresponding message queue, calls a preset scene rendering simulation engine, loads scene data carried in the test task to the scene rendering simulation engine to create a test scene, and feeds the test scene back to the corresponding scene control service system; the scene control service system calls a tested system algorithm, loads the tested system algorithm to the corresponding test scene for simulation test, and obtains a test result;
and evaluating the test result through a preset evaluation index.
2. The evaluation method of the automated driving safety verification platform according to claim 1, wherein each scene control service system calls a preset scene rendering simulation engine, and the method comprises the following steps:
each scene control service system sends a calling instruction to a scene rendering agent;
the scene rendering agent inquires the use states of a plurality of scene rendering containers respectively packaged with the scene rendering simulation engine according to the calling instruction, and feeds back the addresses of the scene rendering containers in the unused state to the corresponding scene control service system;
and the scene control service system calls a simulation engine according to the scene rendering container address.
3. The method of claim 1, wherein the evaluating each of the test results comprises:
the scene control service systems generate the obtained test results into tasks to be evaluated carrying evaluation indexes and send the tasks to be evaluated to a message queue to be evaluated;
the evaluation agent service inquires a plurality of evaluation containers packaged with evaluation service systems, determines the addresses of the evaluation containers capable of running the evaluation tasks, calls the evaluation service systems according to the addresses of the evaluation containers, and the evaluation service systems acquire the tasks to be evaluated in the message queues to be evaluated and evaluate the test results according to the evaluation indexes carried by the tasks to be evaluated.
4. The method of evaluating an automated driving safety verification platform of claim 1, wherein the evaluating each of the test results comprises:
the scene control service systems generate a task to be evaluated carrying an evaluation container address according to the test result and send the task to be evaluated to a message queue to be evaluated;
and the evaluation agent service acquires the task to be evaluated in the message queue to be evaluated, calls the evaluation container according to an evaluation container address carried by the task to be evaluated, encapsulates the evaluation service system and an evaluation index in the evaluation container, acquires the task to be evaluated in the message queue to be evaluated, and evaluates the test result according to the evaluation index.
5. The evaluation method of the automatic driving safety verification platform according to claim 2, wherein the scene control service system calls a preset scene rendering simulation engine, and loads scene data carried in the test task and the tested system algorithm to the scene rendering simulation engine for simulation test, and obtaining a test result comprises:
each scene control service system acquires the test task in the corresponding message queue;
the scene rendering agent inquires the use states of a plurality of scene rendering containers according to the calling instruction of each scene control service system, feeds back the addresses of the scene rendering containers in the unused state to the corresponding scene control service systems, and sets the use states of the scene rendering containers to be used;
each scene control service system calls the corresponding scene rendering container according to the received address of each scene rendering container, loads scene data carried in the test task to the scene rendering simulation engine to create a test scene, and feeds the test scene back to the corresponding scene control service system; the scene control service system calls the tested system algorithm, loads the tested system algorithm to the corresponding test scene for simulation test, and obtains the test result; and feeding back task completion information to the scene rendering agent, and setting the use state of the corresponding scene rendering container as unused by the scene rendering agent.
6. The evaluation method of the automated driving safety verification platform according to claim 3, wherein the evaluation agent service queries a plurality of evaluation containers packaged with an evaluation service system, determines the addresses of the evaluation containers capable of running the tasks to be evaluated, and calls the evaluation service system according to the addresses of the evaluation containers comprises:
the evaluation agent service inquires the evaluation containers capable of running evaluation in the plurality of evaluation containers according to the preset threshold value of the number of the tasks to be evaluated which can run in each evaluation container, determines the address of the evaluation container with the minimum number of the tasks to be evaluated at present, and calls the evaluation service system according to the address of the evaluation container.
7. The method for evaluating an automated driving safety verification platform according to any one of claims 1 to 6, wherein the algorithms of the tested systems are the same tested algorithm or different tested algorithms.
8. The evaluation device of the automatic driving safety verification platform is characterized by comprising a test task generation module, a task distribution module, a simulation test module and an evaluation module;
the test task generation module: generating a test task according to a test requirement, wherein the test task comprises scene data and is sent to a message queue corresponding to a corresponding container group; the container group comprises a first container and a second container, wherein the first container and the second container are used for respectively packaging a scene control service system and a tested system algorithm; the first container and the second container are in communication connection through a communication module;
the task distribution module: the message queue is used for sending the test task to a message queue correspondingly connected with the container group according to the container group identification carried by the test task;
the simulation test module: the system is used for each scene control service system to obtain the test task corresponding to the message queue, call a preset scene rendering simulation engine, load scene data carried in the test task to the scene rendering simulation engine to create a test scene, and feed the test scene back to the corresponding scene control service system; the scene control service system calls a tested system algorithm, loads the tested system algorithm to the corresponding test scene for simulation test, and obtains a test result;
the evaluation module: and the evaluation module is used for evaluating the test result through a preset evaluation index.
9. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
the processor is configured to read the executable instructions from the memory and execute the instructions to implement the evaluation method of the automated driving safety verification platform of any one of claims 1-7.
10. A computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, perform the steps of the method of evaluating an automated driving safety verification platform according to any of claims 1-7.
CN202211734833.1A 2022-12-31 2022-12-31 Evaluation method, device, equipment and medium for automatic driving safety verification platform Pending CN115964296A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211734833.1A CN115964296A (en) 2022-12-31 2022-12-31 Evaluation method, device, equipment and medium for automatic driving safety verification platform

Publications (1)

Publication Number Publication Date
CN115964296A true CN115964296A (en) 2023-04-14

Family

ID=87352844

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211734833.1A Pending CN115964296A (en) 2022-12-31 2022-12-31 Evaluation method, device, equipment and medium for automatic driving safety verification platform

Country Status (1)

Country Link
CN (1) CN115964296A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116302364A (en) * 2023-05-17 2023-06-23 清华大学 Automatic driving reliability test method, device, equipment, medium and program product
CN116302364B (en) * 2023-05-17 2023-08-15 清华大学 Automatic driving reliability test method, device, equipment, medium and program product

Similar Documents

Publication Publication Date Title
US20190123963A1 (en) Method and apparatus for managing resources of network slice
US20190208009A1 (en) Computing resource discovery and allocation
WO2020052605A1 (en) Network slice selection method and device
CN109684054A (en) Information processing method and device, electronic equipment and memory
US11838384B2 (en) Intelligent scheduling apparatus and method
CN110058894B (en) Method, apparatus and computer program product for executing applications in a hybrid cloud
CN112202639B (en) Performance test method for realizing Internet of vehicles remote control service through LoadRunner tool
CN115964296A (en) Evaluation method, device, equipment and medium for automatic driving safety verification platform
EP3901770A1 (en) Method and device for instantiating virtualized network function
CN111831191A (en) Workflow configuration method and device, computer equipment and storage medium
CN108429783A (en) Electronic device, configuration file method for pushing and storage medium
EP4207707A1 (en) Data transmission system, data transmission method, smart vehicle and device
CN112698952A (en) Unified management method and device for computing resources, computer equipment and storage medium
CN112035344A (en) Multi-scenario test method, device, equipment and computer readable storage medium
CN109788325A (en) Video task distribution method and server
CN103677983A (en) Scheduling method and device of application
CN112948050A (en) Method and device for deploying pod
EP3724776A1 (en) Method, function manager and arrangement for handling function calls
CN108881460B (en) Method and device for realizing unified monitoring of cloud platform
CN114265690A (en) Method and device for realizing remote training
CN112395568A (en) Interface authority configuration method, device, equipment and storage medium
CN109858257A (en) Access control method and device
CN110247783A (en) A kind of scalable appearance policy conflict processing method and processing device
CN110826233B (en) Multi-user traffic simulation platform system
CN115167985A (en) Virtualized computing power providing method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination