CN112667366B - Dynamic scene data importing method, device, equipment and readable storage medium - Google Patents

Dynamic scene data importing method, device, equipment and readable storage medium

Info

Publication number
CN112667366B
Authority
CN
China
Prior art keywords
function
scene
vehicle
data
dynamic scene
Prior art date
Legal status
Active
Application number
CN202110278162.1A
Other languages
Chinese (zh)
Other versions
CN112667366A (en)
Inventor
赵帅
朱向雷
杜志彬
宋文泽
杨永翌
周博林
翟洋
刘应心
侯全杉
胡耘浩
刘光
Current Assignee
Sinotruk Data Co ltd
China Automotive Technology and Research Center Co Ltd
Automotive Data of China Tianjin Co Ltd
Original Assignee
Sinotruk Data Co ltd
China Automotive Technology and Research Center Co Ltd
Automotive Data of China Tianjin Co Ltd
Priority date
Filing date
Publication date
Application filed by Sinotruk Data Co ltd, China Automotive Technology and Research Center Co Ltd, Automotive Data of China Tianjin Co Ltd
Priority to CN202110278162.1A
Publication of CN112667366A
Application granted
Publication of CN112667366B

Landscapes

  • Traffic Control Systems (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The embodiment of the application discloses a method, a device and equipment for importing dynamic scene data and a readable storage medium, and relates to the technical field of unmanned simulation test software. The method comprises the following steps: acquiring attribute information of a target dynamic scene; searching a function matched with the vehicle action information in the attribute information in an action library function table; determining a target real scene corresponding to the target dynamic scene, and acquiring acquisition data of the target real scene; determining data required by the execution of the function from the collected data of the target real scene, and filling the data into the function; and importing the function into a simulator, so that the simulator builds the target dynamic scene by executing the function. The embodiment realizes automatic import of dynamic scene data and automatic generation of dynamic scenes.

Description

Dynamic scene data importing method, device, equipment and readable storage medium
Technical Field
The embodiment of the application relates to a software technology for unmanned simulation testing, in particular to a method, a device, equipment and a readable storage medium for importing dynamic scene data.
Background
The simulation test based on the virtual scene is an important support for intelligent automobile algorithm development and safety verification. Simulation tests usually need to construct a large number of simulation scenes to comprehensively cover natural driving scenes, dangerous scenes and corner scenes, and meet the requirement of sufficient test mileage.
At present, virtual simulation scenes are usually constructed manually. Some simulation tools support generating a scene by importing map information, but the scene generated in this way contains only the static road scene; the dynamic scene still needs to be constructed by hand, which is labor-intensive and error-prone.
Disclosure of Invention
The embodiment of the application provides a method, a device and equipment for importing dynamic scene data and a readable storage medium, so as to realize automatic import of the dynamic scene data and automatic generation of a dynamic scene.
In a first aspect, an embodiment of the present application provides a method for importing dynamic scene data, including:
acquiring attribute information of a target dynamic scene;
searching a function matched with the vehicle action information in the attribute information in an action library function table;
determining a target real scene corresponding to the target dynamic scene, and acquiring acquisition data of the target real scene; the collected data comprises timestamps, vehicle identifications, vehicle speeds, vehicle accelerations, vehicle type information and driving lane identifications collected in a plurality of collecting periods;
determining data required by the execution of the function from the collected data of the target real scene, and filling the data into the function;
importing the function into a simulator, so that the simulator builds the target dynamic scene by executing the function;
the determining data required for executing the function from the collected data of the target real scene comprises:
performing statistical analysis on data acquired in a plurality of acquisition periods in the target real scene to obtain data required by executing the function;
the data required by the line patrol action function execution comprises the execution duration of the line patrol action, the initial speed of the vehicle, the finishing speed of the vehicle and a line patrol lane mark; alternatively, the data required for the execution of the plunge move function includes the length of time the plunge move was executed, the vehicle initial speed, the vehicle end speed, the initial travel lane identification, and the end travel lane identification.
In a second aspect, an embodiment of the present application further provides an importing apparatus of dynamic scene data, including:
the attribute acquisition module is used for acquiring the attribute information of the target dynamic scene;
the searching module is used for searching a function matched with the vehicle action information in the attribute information in an action library function table;
the data acquisition module is used for determining a target real scene corresponding to the target dynamic scene and acquiring the acquired data of the target real scene; the collected data comprises timestamps, vehicle identifications, vehicle speeds, vehicle accelerations, vehicle type information and driving lane identifications collected in a plurality of collecting periods;
the filling module is used for determining data required by the execution of the function from the collected data of the target real scene and filling the data into the function;
the import module is used for importing the function into a simulator so that the simulator can build the target dynamic scene by executing the function;
wherein, when determining data required for executing the function from the collected data of the target real scene, the filling module is specifically configured to: performing statistical analysis on data acquired in a plurality of acquisition periods in the target real scene to obtain data required by executing the function;
the data required by the line patrol action function execution comprises the execution duration of the line patrol action, the initial speed of the vehicle, the finishing speed of the vehicle and a line patrol lane mark; alternatively, the data required for the execution of the plunge move function includes the length of time the plunge move was executed, the vehicle initial speed, the vehicle end speed, the initial travel lane identification, and the end travel lane identification.
In a third aspect, an embodiment of the present application further provides an electronic device, where the electronic device includes:
one or more processors;
a memory for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the method for importing dynamic scene data according to any of the embodiments.
In a fourth aspect, the present application further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method for importing dynamic scene data according to any embodiment.
According to the method, the attribute information of the target dynamic scene is acquired, so that a function matched with the vehicle action information in the attribute information is searched in an action library function table, and therefore the function required for building the dynamic scene is automatically determined without manual judgment; determining a target real scene corresponding to the target dynamic scene, and acquiring acquisition data of the target real scene; and determining data required by function execution from the acquired data of the target real scene, and filling the data into the function, so that the function data is completely supplemented based on the acquired data of the real scene, manual filling is not required, and the data can be directly imported into a simulator, thereby realizing automatic construction of the dynamic scene. In summary, the embodiment provides automatic import of dynamic scene data based on the collected data of the real scene and the attribute information of the dynamic scene, and finally, automatic construction of the dynamic scene is realized, and efficiency and accuracy are improved. In addition, the present embodiment provides new unmanned simulation test software, which promotes the development of the unmanned simulation test industry.
Drawings
Fig. 1 is a flowchart of a first method for importing dynamic scene data according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram illustrating comparison between collected data and attribute information provided in an embodiment of the present application;
fig. 3 is a flowchart of another dynamic scene data importing method according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an apparatus for importing dynamic scene data according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some of the structures related to the present application are shown in the drawings, not all of the structures.
The embodiment of the application provides a first method for importing dynamic scene data, a flowchart of which is shown in fig. 1; the method is applicable to the situation where a dynamic scene is automatically built through unmanned simulation test software. The embodiment improves the application software of the unmanned simulation test industry, namely unmanned simulation test software: an importing tool is added on the basis of the existing simulator, and dynamic scene data is automatically imported into the simulator mainly through this importing tool.
The method may be performed by an importing apparatus of dynamic scene data, which may be constituted by software and/or hardware, and is generally integrated in an electronic device.
With reference to fig. 1, the method provided in this embodiment specifically includes:
and S110, acquiring the attribute information of the target dynamic scene.
The dynamic scene herein is a logical scene, including but not limited to a line patrol scene, a cut-in scene, a cut-out scene, and an overtaking scene. At least one dynamic scene can be selected as the target dynamic scene to be built through the simulator.
The attribute information of the target dynamic scene includes, but is not limited to, vehicle action information, a timestamp, a vehicle identification, vehicle type information, and position information. The vehicle action information is, for example, line patrol, cut-in, cut-out, or overtaking. The timestamp includes a scene start time and a scene end time. The vehicle identification may be a vehicle number that uniquely identifies a vehicle. The vehicle type information is, for example, a car, a commercial vehicle, or a truck. The position information describes the position of a vehicle on the road relative to the target vehicle, for example in front of or behind the target vehicle, and can be represented by a nine-grid (a 3x3 grid), with the target vehicle assumed to occupy the center cell of the nine-grid.
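For concreteness, the following sketch shows one possible in-memory representation of such attribute information and of the nine-grid of relative positions; the class, field names and grid labels are illustrative assumptions and are not part of the patent text.

```python
from dataclasses import dataclass

# Hypothetical nine-grid (3x3) of positions relative to the target vehicle,
# which is assumed to occupy the centre cell.
NINE_GRID = [
    ["front-left", "front",  "front-right"],
    ["left",       "target", "right"],
    ["rear-left",  "rear",   "rear-right"],
]

@dataclass
class DynamicSceneAttributes:
    """Illustrative attribute record for one target dynamic scene."""
    vehicle_action: str   # e.g. "line_patrol", "cut_in", "cut_out", "overtake"
    start_time: float     # scene start timestamp
    end_time: float       # scene end timestamp
    vehicle_id: str       # uniquely identifies the scene vehicle
    vehicle_type: str     # e.g. "car", "commercial_vehicle", "truck"
    position: str         # one of the NINE_GRID cells, e.g. "rear"
```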
And S120, searching a function matched with the vehicle action information in the attribute information in an action library function table.
The action library function table stores a plurality of functions, each of which is matched with one type of vehicle action information.
Optionally, a matching relationship between the vehicle motion information and the function is pre-constructed, and the matching relationship is written into the motion library function table. Based on the above, according to the matching relationship between the vehicle action information and the function, the function matched with the vehicle action information in the attribute information is searched.
According to the embodiment, the vehicle action information is associated with the function through the matching relation, so that the matched function can be automatically and efficiently found according to the vehicle action information.
It should be noted that the function here is a function framework and has no specific data.
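As a minimal sketch of how the action library function table and the lookup could be organized (all names, signatures and mapping keys are assumptions made only for illustration, not the patent's actual implementation):

```python
# Hypothetical function frames: each one only declares the parameters it needs;
# the concrete data is filled in later from the collected data.
def line_patrol_action(duration, start_speed, end_speed, patrol_lane_id):
    return {"action": "line_patrol", "duration": duration,
            "start_speed": start_speed, "end_speed": end_speed,
            "lane": patrol_lane_id}

def cut_in_action(duration, start_speed, end_speed, start_lane_id, end_lane_id):
    return {"action": "cut_in", "duration": duration,
            "start_speed": start_speed, "end_speed": end_speed,
            "start_lane": start_lane_id, "end_lane": end_lane_id}

# Action library function table: vehicle action information -> matching function.
ACTION_LIBRARY = {
    "line_patrol": line_patrol_action,
    "cut_in": cut_in_action,
}

def lookup_function(vehicle_action):
    """Search the action library function table for the matching function."""
    try:
        return ACTION_LIBRARY[vehicle_action]
    except KeyError:
        raise ValueError(f"no function matches vehicle action {vehicle_action!r}")
```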
S130, determining a target real scene corresponding to the target dynamic scene, and acquiring the acquired data of the target real scene.
For convenience of description and distinction, a real scene corresponding to the target dynamic scene is referred to as a target real scene.
Unlike the logical scene, the real scene is a scene that actually occurs in real life, and likewise includes but is not limited to a line patrol scene, a cut-in scene, a cut-out scene, an overtaking scene, and the like; data can be acquired according to the acquisition period of the sensor while the target real scene occurs, so that the collected data of the target real scene is obtained.
The collected data comprises timestamps, vehicle identifications, vehicle speeds, vehicle accelerations, vehicle type information and driving lane identifications collected in a plurality of collecting periods. For example, to simplify the data storage process, the collected data of one collection period transmitted by the sensor may be stored as one row in real time, and the collected data of the next collection period may be stored as the next row, so as to form a plurality of rows of collected data. Each row of collected data comprises a timestamp, a vehicle identifier, a vehicle speed, a vehicle acceleration, vehicle type information and a driving lane identifier corresponding to the collection period.
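One row of collected data can be sketched as follows (field names and units are assumptions mirroring the description above):

```python
from dataclasses import dataclass

@dataclass
class CollectedRow:
    """Data collected in one acquisition cycle of the target real scene."""
    timestamp: float      # acquisition time of this cycle
    vehicle_id: str       # vehicle identification
    speed: float          # vehicle speed, e.g. in m/s
    acceleration: float   # vehicle acceleration, e.g. in m/s^2
    vehicle_type: str     # e.g. "car", "truck"
    lane_id: str          # driving lane identification, e.g. "lane_1"
```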
S140, determining data required by the function execution from the collected data of the target real scene, and filling the data into the function.
The data required for the function execution includes the input data of the function, which can be determined from the parameter names of the function. Optionally, the data required for executing the function includes at least one of an execution duration, a vehicle initial speed, a vehicle end speed, a vehicle acceleration, an initial driving lane mark, an end driving lane mark, vehicle type information, and a patrol lane mark. Furthermore, statistical analysis can be performed on the data acquired in a plurality of acquisition cycles in the target real scene to obtain the data required by the function execution. Statistical analysis includes, but is not limited to, averaging and differencing.
In one example, the vehicle action information is line patrol, and the data required for the execution of the line patrol action function includes the execution duration of the line patrol action, the vehicle initial speed, the vehicle end speed, and a line patrol lane identification. Specifically, the timestamp of the first acquisition cycle of the target real scene is subtracted from the timestamp of the last acquisition cycle to obtain the execution duration. The vehicle speed of the first acquisition cycle is taken as the vehicle initial speed, the vehicle speed of the last acquisition cycle is taken as the vehicle end speed, and the driving lane identification that occurs most frequently across all acquisition cycles of the target real scene (or any one of them, if several tie) is taken as the line patrol lane identification.
In another example, if the vehicle action information is a cut-in, the data required for the execution of the cut-in action function includes the execution duration of the cut-in action, the vehicle initial speed, the vehicle end speed, the initial driving lane identification, and the end driving lane identification. Specifically, the timestamp of the first acquisition cycle of the target real scene is subtracted from the timestamp of the last acquisition cycle to obtain the execution duration. The vehicle speed of the first acquisition cycle is taken as the vehicle initial speed, and the vehicle speed of the last acquisition cycle is taken as the vehicle end speed. Optionally, for the cut-in/cut-out action function, the initial driving lane identification of the vehicle and the driving lane identification of the target vehicle are determined jointly from the position information in the attribute information and the driving lane identification of the first acquisition cycle, while the end driving lane identification of the vehicle is determined from the driving lane identification of the last acquisition cycle. For example, if the attribute information indicates that the host vehicle is located behind the target vehicle and the host vehicle is on lane 2 in the first acquisition cycle, it is determined that the host vehicle is on lane 2 and the target vehicle is on lane 1; and if the host vehicle is on lane 1 in the last acquisition cycle, its end driving lane is determined to be lane 1.
If the execution of a function also requires the vehicle acceleration, it can be obtained by averaging the accelerations over the acquisition cycles; alternatively, the acceleration is calculated from the vehicle speed of the first acquisition cycle, the vehicle speed of the last acquisition cycle, and the execution duration. If the execution of a function also requires the vehicle type, the function may be filled with the vehicle type information common to both the attribute information and the collected data.
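The statistical analysis described in these examples can be sketched as follows; it assumes the CollectedRow records sketched above, ordered by timestamp, and the helper names are illustrative only:

```python
from collections import Counter

def fill_line_patrol(rows):
    """Derive the line-patrol parameters from the rows of one target real scene."""
    duration = rows[-1].timestamp - rows[0].timestamp   # last minus first timestamp
    start_speed = rows[0].speed                         # first acquisition cycle
    end_speed = rows[-1].speed                          # last acquisition cycle
    # lane identification occurring most frequently over all acquisition cycles
    patrol_lane_id = Counter(r.lane_id for r in rows).most_common(1)[0][0]
    return {"duration": duration, "start_speed": start_speed,
            "end_speed": end_speed, "patrol_lane_id": patrol_lane_id}

def fill_cut_in(rows):
    """Derive the cut-in parameters; initial/end lanes come from the first and
    last acquisition cycles, the acceleration from a simple average.
    (The combination with the nine-grid position information is omitted here.)"""
    duration = rows[-1].timestamp - rows[0].timestamp
    start_speed, end_speed = rows[0].speed, rows[-1].speed
    start_lane_id = rows[0].lane_id                     # first acquisition cycle
    end_lane_id = rows[-1].lane_id                      # last acquisition cycle
    mean_acceleration = sum(r.acceleration for r in rows) / len(rows)
    return {"duration": duration, "start_speed": start_speed,
            "end_speed": end_speed, "start_lane_id": start_lane_id,
            "end_lane_id": end_lane_id, "acceleration": mean_acceleration}
```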
S150, importing the function into a simulator, so that the simulator builds the target dynamic scene by executing the function.
The function has completed data filling and can be directly imported into the simulator. The simulator may be an Unreal Engine (UE), which performs automatic scene rendering by executing the function, thereby generating the three-dimensional visualization and animation of the virtual simulation scene described by the collected data.
According to the method, the attribute information of the target dynamic scene is acquired, so that a function matched with the vehicle action information in the attribute information is searched in an action library function table, and therefore the function required for building the dynamic scene is automatically determined without manual judgment; determining a target real scene corresponding to the target dynamic scene, and acquiring acquisition data of the target real scene; and determining data required by function execution from the acquired data of the target real scene, and filling the data into the function, so that the function data is completely supplemented based on the acquired data of the real scene, manual filling is not required, and the data can be directly led into a simulator, thereby realizing automatic construction of the dynamic scene. In summary, the embodiment provides automatic import of dynamic scene data based on the collected data of the real scene and the attribute information of the dynamic scene, and finally, automatic construction of the dynamic scene is realized, and efficiency and accuracy are improved.
In the above embodiment and the following embodiments, determining a target real scene corresponding to the target dynamic scene includes: determining a target real scene consistent with the scene identification information in the attribute information from at least one real scene; the scene identification information includes at least one of a timestamp and a vehicle identification.
In the process of acquiring real scenes, in order to improve acquisition efficiency, data of a plurality of real scenes is acquired continuously, forming multiple rows of collected data for each real scene, as shown in fig. 2. In order to accurately locate the rows of collected data corresponding to the target dynamic scene, both the attribute information and the collected data contain scene identification information used for matching and positioning. As shown in fig. 2, the attribute information of dynamic scene 1 includes timestamps t1-t2 and vehicle 1, and the attribute information of dynamic scene 2 includes timestamps t3-t4 and vehicle 2; it is therefore determined that the collected data between timestamps t1-t2 for vehicle 1 corresponds to dynamic scene 1, and the collected data between timestamps t3-t4 for vehicle 2 corresponds to dynamic scene 2.
It should be noted that the scene identification information serves to uniquely identify a scene, and may be any item of the attribute information or custom information.
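A sketch of this matching step, assuming the DynamicSceneAttributes and CollectedRow classes introduced above:

```python
def rows_for_scene(all_rows, attrs):
    """Select the rows of collected data that belong to one target dynamic scene
    by matching the scene identification information (timestamp range and
    vehicle identification)."""
    return [row for row in all_rows
            if attrs.start_time <= row.timestamp <= attrs.end_time
            and row.vehicle_id == attrs.vehicle_id]
```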
Fig. 3 is a flowchart of another method for importing dynamic scene data according to an embodiment of the present invention, where a verification step and an importing step of static scene file information are added on the basis of the above embodiment. The method comprises the following specific steps:
s310, receiving a dynamic scene file and a real scene file sent by a terminal, wherein the dynamic scene file stores attribute information of at least one dynamic scene, and the real scene file stores collected data of at least one real scene.
Firstly, the terminal receives the collected data of at least one real scene sent by a sensor and generates a corresponding dynamic scene based on each real scene. Then, the attribute information of the at least one dynamic scene is stored into the dynamic scene file, and the collected data of the real scenes is stored into the real scene file. The terminal may be a portable mobile terminal such as a mobile phone or tablet computer, or a vehicle-mounted terminal.
Illustratively, the dynamic scene file and the real scene file are files in Excel format. Each row in the dynamic scene file stores the attribute information of one dynamic scene, specifically field names and the corresponding data; for example, the field names are vehicle action, timestamp, vehicle identification, vehicle type and position, and the corresponding data are line patrol, t1-t2, vehicle 1, truck and lane 1. Each row in the real scene file stores the data collected in one acquisition cycle, which likewise consists of field names and corresponding data, as described in detail in the above embodiments.
And S320, carrying out format check and integrity check on the dynamic scene file and the real scene file.
After receiving the files input by the terminal, format verification is first carried out on the dynamic scene file and the real scene file respectively: the data is scanned and checked against the specified data format to judge its reliability. Then, an integrity check is carried out on the two files, that is, it is checked whether each file contains all the required data types (for example, whether the dynamic scene file contains the field names of vehicle action, timestamp, vehicle identification, vehicle type and position) and whether any data is missing (for example, whether the data corresponding to each field name is present).
If the format check or the integrity check fails, an error is reported and an error message is returned.
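A sketch of such a format and integrity check for the Excel files, using pandas; the required field names are assumptions mirroring the description, and the real tool may check more than this:

```python
import pandas as pd

REQUIRED_SCENE_FIELDS = ["vehicle_action", "timestamp", "vehicle_id",
                         "vehicle_type", "position"]

def check_dynamic_scene_file(path):
    """Format check (file must be readable Excel) and integrity check
    (all field names present, no missing data) for the dynamic scene file."""
    try:
        frame = pd.read_excel(path)               # format check
    except Exception as exc:
        raise ValueError(f"format check failed for {path}: {exc}")
    missing = [name for name in REQUIRED_SCENE_FIELDS if name not in frame.columns]
    if missing:                                   # integrity check: field names
        raise ValueError(f"integrity check failed, missing fields: {missing}")
    if frame[REQUIRED_SCENE_FIELDS].isnull().any().any():
        raise ValueError("integrity check failed: some field values are missing")
    return frame
```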
And S330, acquiring the attribute information of the target dynamic scene.
The target dynamic scene may be at least one dynamic scene selected from a dynamic scene file.
And S340, searching a function matched with the vehicle action information in the attribute information in an action library function table.
And S350, determining a target real scene corresponding to the target dynamic scene, and acquiring the acquired data of the target real scene.
S360, determining data required by the execution of the function from the collected data of the target real scene, and filling the data into the function.
And S370, selecting the static scene file reference information according to the lane identification required by the function execution.
The static scene file reference information provides the static scene on which the dynamic scene is built; optionally, the static scene file reference information includes road network file information and, in some cases, traffic sign file information.
At least one lane identification is required for the function execution: the line patrol action function requires one lane identification, while the cut-in/cut-out action function requires two, and some road networks have only one lane while others have two or more. The traffic signs may also differ with the number of lanes; for example, a two-lane road may carry a traffic sign that either allows or prohibits lane changing. For the static scene file reference information to sufficiently support the execution of the function, it must provide the corresponding number of lanes and the corresponding traffic signs.
On this basis, the number of lanes is determined according to the lane identifications required by the function execution, and static scene file reference information providing the corresponding number of lanes and traffic signs is selected according to that number of lanes.
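A sketch of this selection step; the catalogue structure and the per-action lane requirements are assumptions made for illustration:

```python
def select_static_scene(vehicle_action, static_catalogue):
    """Pick static scene file reference information that offers at least the
    number of lanes required by the lane identifications of the function."""
    lanes_needed = {"line_patrol": 1, "cut_in": 2, "cut_out": 2}.get(vehicle_action, 1)
    for entry in static_catalogue:
        # e.g. entry = {"road_network": "two_lane.xodr", "lanes": 2,
        #               "traffic_signs": "lane_change_allowed"}
        if entry["lanes"] >= lanes_needed:
            return entry
    raise LookupError(f"no static scene reference provides {lanes_needed} lane(s)")
```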
And S380, importing the static scene file reference information and the function into the simulator so that the simulator can build the target dynamic scene by executing the function.
Optionally, at least one of the following three alternative embodiments is used when importing the function into the simulator.
First alternative embodiment: each function of the plurality of functions is written into its own dynamic scene file, and the plurality of dynamic scene files are imported into the simulator according to a set sequence. The set sequence is the sequence in which the simulator constructs each dynamic scene.
The dynamic scene file is an Open Scene (OSC) file that can be recognized by the simulator; the OSC file defines a standard format for simulation test cases, is compatible with different simulation test software, and is specifically a file format for describing the dynamic scene in a driving simulation application.
In this embodiment, one dynamic scene file stores one function. The dynamic scene files are imported into the simulator in the set sequence by running a script, so that the simulator builds the dynamic scenes in the order in which the files are received.
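A sketch of this first alternative under stated assumptions: the serialization is a placeholder (not the simulator's real dynamic scene format), and import_to_simulator stands in for the simulator's import interface, which the patent text does not specify:

```python
import json
import os
import tempfile

def import_in_sequence(filled_functions, import_to_simulator):
    """Write each filled function to its own dynamic scene file and import the
    files in the set sequence (here simply the order of the list)."""
    paths = []
    for index, function_data in enumerate(filled_functions):
        path = os.path.join(tempfile.gettempdir(), f"dynamic_scene_{index}.json")
        with open(path, "w", encoding="utf-8") as handle:
            json.dump(function_data, handle)      # placeholder serialization
        paths.append(path)
    for path in paths:                            # import in the set sequence
        import_to_simulator(path)
```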
Second alternative embodiment: at least two functions are written into one dynamic scene file according to a set sequence, and the dynamic scene file is imported into the simulator. The set sequence is the sequence in which the simulator constructs each dynamic scene. The dynamic scene file is as described in detail above.
In this embodiment, at least two functions are stored in one dynamic scene file, written in order from top to bottom or from left to right. The simulator reads the functions in the same order, from top to bottom or from left to right, and accordingly builds the corresponding dynamic scenes in the set sequence.
Third alternative embodiment: at least two functions and the set sequence of the at least two functions are written into one dynamic scene file, and the dynamic scene file is imported into the simulator; the set sequence is the sequence in which the simulator constructs each dynamic scene. The dynamic scene file is as described in detail above.
In this embodiment, one dynamic scene file stores at least two functions and the set sequence of the at least two functions. Optionally, the set sequence of the at least two functions includes a jump instruction to each function. Specifically, after receiving the dynamic scene file, the simulator executes the jump instruction to the second function located at the file header and jumps to the position of the second function, thereby executing the second function. A jump instruction to the first function is written at the end of the second function; the simulator executes this jump instruction, jumps to the position of the first function, and executes the first function.
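The jump-instruction ordering can be illustrated with the following conceptual sketch; the file layout, key names and the build_dynamic_scene stand-in are assumptions, not the simulator's actual interface:

```python
def build_dynamic_scene(function_data):
    """Stand-in for the simulator executing one filled function."""
    print("building dynamic scene for action:", function_data.get("action"))

def run_with_jump_order(scene_file):
    """Follow the jump instructions: the header jump selects the function to
    execute first, and each function may end with a jump to the next one."""
    current = scene_file["header_jump"]           # jump instruction at the file header
    while current is not None:
        function_data = scene_file["functions"][current]
        build_dynamic_scene(function_data)
        current = function_data.get("jump_to")    # jump instruction at the end of the function

# Example layout: the header jumps to the second function, whose end jumps to the first.
example_scene_file = {
    "header_jump": "function_2",
    "functions": {
        "function_2": {"action": "cut_in", "jump_to": "function_1"},
        "function_1": {"action": "line_patrol"},  # no trailing jump: building stops here
    },
}
```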
The static scene file reference information and the function need to be imported into the simulator synchronously, so that the simulator can build the dynamic scene and the static scene at the same time. Optionally, the static scene file reference information may be automatically written into the dynamic scene file and imported into the simulator together with it.
This embodiment provides three methods for importing the function into the simulator, so that the function is interfaced with the simulator and, at the same time, each dynamic scene is built in the set sequence. In addition, the static scene file reference information is imported into the simulator, so that the dynamic scene data and the static scene data are combined, fully automatic construction of the static scene and the dynamic scene is achieved, and no manual operation is needed in the whole process.
Fig. 4 is a schematic structural diagram of an importing apparatus of dynamic scene data according to an embodiment of the present application, where the embodiment of the present application is suitable for importing dynamic scene data into a simulator. Referring to fig. 4, the apparatus for importing dynamic scene data includes: an attribute acquisition module 410, a lookup module 420, a data acquisition module 430, a population module 440, and an import module 450.
An attribute obtaining module 410, configured to obtain attribute information of a target dynamic scene;
the searching module 420 is used for searching a function matched with the vehicle action information in the attribute information in an action library function table;
a data obtaining module 430, configured to determine a target real scene corresponding to the target dynamic scene, and obtain collected data of the target real scene; the collected data comprises timestamps, vehicle identifications, vehicle speeds, vehicle accelerations, vehicle type information and driving lane identifications collected in a plurality of collecting periods;
a filling module 440, configured to determine data required for executing the function from the collected data of the target real scene, and fill the data into the function;
the importing module 450 is configured to import the function into a simulator, so that the simulator builds the target dynamic scene by executing the function;
wherein, when determining data required for executing the function from the collected data of the target real scene, the filling module is specifically configured to: performing statistical analysis on data acquired in a plurality of acquisition periods in the target real scene to obtain data required by executing the function;
the data required by the line patrol action function execution comprises the execution duration of the line patrol action, the initial speed of the vehicle, the finishing speed of the vehicle and a line patrol lane mark; alternatively, the data required for the execution of the cut-in action function includes the execution duration of the cut-in action, the vehicle initial speed, the vehicle end speed, the initial driving lane identification, and the end driving lane identification.
According to the method, the attribute information of the target dynamic scene is acquired, so that a function matched with the vehicle action information in the attribute information is searched in an action library function table, and therefore the function required for building the dynamic scene is automatically determined without manual judgment; determining a target real scene corresponding to the target dynamic scene, and acquiring acquisition data of the target real scene; and determining data required by function execution from the acquired data of the target real scene, and filling the data into the function, so that the function data is completely supplemented based on the acquired data of the real scene, manual filling is not required, and the data can be directly led into a simulator, thereby realizing automatic construction of the dynamic scene. In summary, the embodiment provides automatic import of dynamic scene data based on the collected data of the real scene and the attribute information of the dynamic scene, and finally, automatic construction of the dynamic scene is realized, and efficiency and accuracy are improved.
Optionally, the action library function table includes a matching relationship between the vehicle action information and the function; the search module is specifically configured to: and searching a function matched with the vehicle action information in the attribute information according to the matching relation between the vehicle action information and the function.
Optionally, when determining the target real scene corresponding to the target dynamic scene, the data obtaining module is specifically configured to: determining a target real scene consistent with the scene identification information in the attribute information from at least one real scene; the scene identification information includes at least one of a timestamp and a vehicle identification.
Optionally, the apparatus further includes a verification module, configured to receive a dynamic scene file and a real scene file sent by a terminal before the attribute information of the target dynamic scene is obtained, where the dynamic scene file stores attribute information of at least one dynamic scene, and the real scene file stores collected data of at least one real scene; carrying out format check and integrity check on the dynamic scene file and the real scene file; wherein the attribute information comprises vehicle action information, a timestamp, a vehicle identification, vehicle type information, and location information.
Optionally, the import module is specifically configured to write each of the plurality of functions into a plurality of dynamic scene files, and import the plurality of dynamic scene files into the simulator according to a set sequence; and/or writing at least two functions into a dynamic scene file according to a set sequence, and importing the dynamic scene file into a simulator; and/or writing at least two functions and the setting sequence of the at least two functions into a dynamic scene file, and importing the dynamic scene file into a simulator; and setting the sequence to be the sequence for constructing each dynamic scene by the simulator.
Optionally, the apparatus further includes an information importing module, configured to import static scene file reference information to the simulator when the function is imported to the simulator.
Optionally, the apparatus further includes a selection module, configured to select the static scene file reference information according to lane identification required by the function execution before the static scene file reference information is imported to the simulator.
The dynamic scene data importing device provided by the embodiment of the application can execute the dynamic scene data importing method provided by any embodiment of the application, and has corresponding functional modules and beneficial effects of the execution method.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, as shown in fig. 5, the electronic device includes a processor 50, a memory 51, an input device 52, and an output device 53; the number of processors 50 in the device may be one or more, and one processor 50 is taken as an example in fig. 5; the processor 50, the memory 51, the input device 52 and the output device 53 in the apparatus may be connected by a bus or other means, which is exemplified in fig. 5.
The memory 51 is a computer-readable storage medium, and can be used to store software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the importing method of dynamic scene data in the embodiment of the present invention (for example, the attribute obtaining module 410, the searching module 420, the data obtaining module 430, the filling module 440, and the importing module 450 in the importing apparatus of dynamic scene data). The processor 50 executes various functional applications of the device and data processing, that is, implements the above-described importing method of dynamic scene data, by executing software programs, instructions, and modules stored in the memory 51.
The memory 51 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the memory 51 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory 51 may further include memory located remotely from the processor 50, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 52 is operable to receive input numeric or character information and to generate key signal inputs relating to user settings and function controls of the apparatus. The output device 53 may include a display device such as a display screen.
The embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the method for importing dynamic scene data according to any embodiment is implemented.
The computer storage media of the embodiments of the present application may take any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present application and the technical principles employed. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, although the present application has been described in more detail with reference to the above embodiments, the present application is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present application, and the scope of the present application is determined by the scope of the appended claims.

Claims (9)

1. A method for importing dynamic scene data, comprising:
acquiring attribute information of a target dynamic scene;
searching a function matched with the vehicle action information in the attribute information in an action library function table;
determining a target real scene corresponding to the target dynamic scene, and acquiring acquisition data of the target real scene; the collected data comprises timestamps, vehicle identifications, vehicle speeds, vehicle accelerations, vehicle type information and driving lane identifications collected in a plurality of collecting periods;
determining data required by the execution of the function from the collected data of the target real scene, and filling the data into the function;
importing the function into a simulator, so that the simulator builds the target dynamic scene by executing the function;
the determining data required for executing the function from the collected data of the target real scene comprises:
performing statistical analysis on data acquired in a plurality of acquisition periods in the target real scene to obtain data required by executing the function;
the data required by the line patrol action function execution comprises the execution duration of the line patrol action, the initial speed of the vehicle, the finishing speed of the vehicle and a line patrol lane mark; or the data required by the execution of the cut-in action function comprises the execution time length of the cut-in action, the initial speed of the vehicle, the finishing speed of the vehicle, the mark of the initial driving lane and the mark of the finishing driving lane;
the importing the function into the simulator includes:
writing at least two functions and the set sequence of the at least two functions into a dynamic scene file, and importing the dynamic scene file into a simulator; the set sequence is the sequence of the simulator for constructing each dynamic scene; the setting sequence of at least two functions comprises jump instructions to each function;
the simulator builds the target dynamic scene by executing the function, and the method comprises the following steps:
after receiving the dynamic scene file, the simulator executes a jump instruction to a second function at the head of the file, and jumps to the position of the second function so as to execute the second function;
and writing a jump instruction to a first function at the end of the second function, executing the jump instruction, jumping to the position of the first function, and executing the first function.
2. The method of claim 1, wherein the table of action library functions includes a matching relationship of vehicle action information to functions;
the step of searching a function matched with the vehicle action information in the attribute information in the action library function table comprises the following steps:
and searching a function matched with the vehicle action information in the attribute information according to the matching relation between the vehicle action information and the function.
3. The method of claim 1, wherein determining the target real scene corresponding to the target dynamic scene comprises:
determining a target real scene consistent with the scene identification information in the attribute information from at least one real scene;
the scene identification information includes at least one of a timestamp and a vehicle identification.
4. The method according to claim 1, further comprising, before said obtaining attribute information of the target dynamic scene:
receiving a dynamic scene file and a real scene file sent by a terminal, wherein the dynamic scene file stores attribute information of at least one dynamic scene, and the real scene file stores collected data of at least one real scene;
carrying out format check and integrity check on the dynamic scene file and the real scene file;
wherein the attribute information comprises vehicle action information, a timestamp, a vehicle identification, vehicle type information, and location information.
5. The method of claim 1, wherein importing the function into a simulator comprises:
writing each function of the plurality of functions into a plurality of dynamic scene files respectively, and importing the plurality of dynamic scene files into a simulator according to a set sequence; and/or,
writing at least two functions into a dynamic scene file according to a set sequence, and importing the dynamic scene file into a simulator.
6. The method according to any of claims 1-5, wherein, when said importing said function into a simulator, further comprising:
importing static scene file reference information into the simulator;
before the importing the static scene file reference information into the simulator, the method further includes:
and selecting static scene file reference information according to the lane identification required by the function execution.
7. An apparatus for importing dynamic scene data, comprising:
the attribute acquisition module is used for acquiring the attribute information of the target dynamic scene;
the searching module is used for searching a function matched with the vehicle action information in the attribute information in an action library function table;
the data acquisition module is used for determining a target real scene corresponding to the target dynamic scene and acquiring the acquired data of the target real scene; the collected data comprises timestamps, vehicle identifications, vehicle speeds, vehicle accelerations, vehicle type information and driving lane identifications collected in a plurality of collecting periods;
the filling module is used for determining data required by the execution of the function from the collected data of the target real scene and filling the data into the function;
the import module is used for importing the function into a simulator so that the simulator can build the target dynamic scene by executing the function;
wherein, when determining data required for executing the function from the collected data of the target real scene, the filling module is specifically configured to: performing statistical analysis on data acquired in a plurality of acquisition periods in the target real scene to obtain data required by executing the function;
the data required by the line patrol action function execution comprises the execution duration of the line patrol action, the initial speed of the vehicle, the finishing speed of the vehicle and a line patrol lane mark; or the data required by the execution of the cut-in action function comprises the execution time length of the cut-in action, the initial speed of the vehicle, the finishing speed of the vehicle, the mark of the initial driving lane and the mark of the finishing driving lane;
when the import module imports the function into the simulator, the import module is specifically configured to: writing at least two functions and the set sequence of the at least two functions into a dynamic scene file, and importing the dynamic scene file into a simulator; the set sequence is the sequence of the simulator for constructing each dynamic scene; the setting sequence of at least two functions comprises jump instructions to each function;
when the simulator builds the target dynamic scene by executing the function, the simulator is specifically configured to: after receiving the dynamic scene file, executing a jump instruction of a file header to a second function, and jumping to the position of the second function so as to execute the second function; and writing a jump instruction to a first function at the end of the second function, executing the jump instruction, jumping to the position of the first function, and executing the first function.
8. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of importing dynamic scene data as recited in any one of claims 1-6.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the method of importing dynamic scene data according to any one of claims 1 to 6.
CN202110278162.1A 2021-03-16 2021-03-16 Dynamic scene data importing method, device, equipment and readable storage medium Active CN112667366B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110278162.1A CN112667366B (en) 2021-03-16 2021-03-16 Dynamic scene data importing method, device, equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110278162.1A CN112667366B (en) 2021-03-16 2021-03-16 Dynamic scene data importing method, device, equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN112667366A CN112667366A (en) 2021-04-16
CN112667366B true CN112667366B (en) 2021-07-20

Family

ID=75399371

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110278162.1A Active CN112667366B (en) 2021-03-16 2021-03-16 Dynamic scene data importing method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112667366B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114415542A (en) * 2022-01-06 2022-04-29 中国第一汽车股份有限公司 Automatic driving simulation system, method, server and medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111338973A (en) * 2020-05-19 2020-06-26 中汽院汽车技术有限公司 Scene-based automatic driving simulation test evaluation service cloud platform and application method
CN111797003A (en) * 2020-05-27 2020-10-20 中汽数据有限公司 Method for building virtual test scene based on VTD software
CN111881519A (en) * 2020-07-31 2020-11-03 广州文远知行科技有限公司 Automatic driving test method and device, computer equipment and storage medium
CN111967123A (en) * 2020-06-30 2020-11-20 中汽数据有限公司 Method for generating simulation test case in simulation test

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10755007B2 (en) * 2018-05-17 2020-08-25 Toyota Jidosha Kabushiki Kaisha Mixed reality simulation system for testing vehicle control system designs

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111338973A (en) * 2020-05-19 2020-06-26 中汽院汽车技术有限公司 Scene-based automatic driving simulation test evaluation service cloud platform and application method
CN111797003A (en) * 2020-05-27 2020-10-20 中汽数据有限公司 Method for building virtual test scene based on VTD software
CN111967123A (en) * 2020-06-30 2020-11-20 中汽数据有限公司 Method for generating simulation test case in simulation test
CN111881519A (en) * 2020-07-31 2020-11-03 广州文远知行科技有限公司 Automatic driving test method and device, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Automatic generation method of concrete scenarios for decision-making and planning system testing; Chen Junyi et al.; Automobile Technology; 2020-10-31; sections 2.1-2.2 and 3.3, Table 2 *

Also Published As

Publication number Publication date
CN112667366A (en) 2021-04-16

Similar Documents

Publication Publication Date Title
EP4224404A1 (en) Simulated traffic scene file generation method and apparatus
CN107944091B (en) Virtual-real combined vehicle networking application scene testing system and method
CN112835806B (en) Simulation test platform, method and storage medium
CN109060370B (en) Method and device for vehicle testing of automatically driven vehicle
CN110995548B (en) Method for testing validity of V2X protocol under boundary working condition
CN113341935A (en) Vehicle testing method, device, testing equipment, system and storage medium
CN111797003A (en) Method for building virtual test scene based on VTD software
CN109115242B (en) Navigation evaluation method, device, terminal, server and storage medium
CN112671487B (en) Vehicle testing method, server and testing vehicle
CN112667366B (en) Dynamic scene data importing method, device, equipment and readable storage medium
WO2021146906A1 (en) Test scenario simulation method and apparatus, computer device, and storage medium
CN114647584A (en) Control method and system for automobile micro environment and readable storage medium
CN115878681A (en) Method and device for acquiring automatic driving data, storage medium and electronic device
CN111368409A (en) Vehicle flow simulation processing method, device, equipment and storage medium
CN107167149B (en) Street view making method and system
CN114415542A (en) Automatic driving simulation system, method, server and medium
CN114427976B (en) Test method, device and system for automatic driving vehicle
CN115543809A (en) Method and device for constructing test scene library of automatic driving function
CN115052267A (en) Microscopic simulation vehicle road cooperative data interaction system
CN115061897B (en) Data generation method and device for Internet of vehicles, electronic equipment and storage medium
Onozawa et al. Self-Driving Software Benchmark for Model-Based Development
CN113467429B (en) Real vehicle scene reinjection system and method based on PCAN-USB and ADAS controller
CN110796024B (en) Automatic driving visual perception test method and device for failure sample
CN112527940B (en) Method and device for generating simulation map, electronic equipment and storage medium
CN111121793B (en) Map generation method and device for unmanned driving and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant