CN113498511A - Test scene simulation method and device, computer equipment and storage medium - Google Patents


Info

Publication number: CN113498511A
Authority: CN (China)
Prior art keywords: field, fields, target, objects, characteristic
Legal status: Pending
Application number: CN202080003153.5A
Other languages: Chinese (zh)
Inventor: Not disclosed
Current Assignee: DeepRoute AI Ltd
Original Assignee: DeepRoute AI Ltd
Application filed by DeepRoute AI Ltd
Publication of CN113498511A

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 17/00 Systems involving the use of models or simulators of said systems
    • G05B 17/02 Systems involving the use of models or simulators of said systems electric
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/583 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content

Abstract

A test scenario simulation method includes: acquiring characteristic information corresponding to each object in a test scene; acquiring a field template, wherein the field template comprises a plurality of candidate fields; determining a target field of each object from the field template according to the characteristic information corresponding to each object, and assigning values to the target field of each object according to the characteristic information corresponding to each object; generating target codes corresponding to the target fields of the objects according to the assigned target fields of the objects, and generating simulation models corresponding to the objects according to the target codes corresponding to the objects; and establishing a simulation scene corresponding to the test scene according to the simulation model corresponding to each object.

Description

Test scene simulation method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of automatic driving technologies, and in particular, to a test scenario simulation method, apparatus, computer device, and storage medium.
Background
With the development of vehicle control technology, unmanned driving technology has emerged. In unmanned driving, a driving route of an unmanned vehicle is automatically planned by an unmanned driving algorithm, and the vehicle is controlled along that route so that it reaches a preset target location. Testing and evaluation of the unmanned driving algorithm are mainly based on simulating real test scenes.
Existing methods for simulating a real test scene typically classify the objects that appear on real roads, such as cars, trucks, bicycles, and pedestrians, and then write corresponding code to simulate each object according to its function. However, objects of different classes may share the same function, so a large amount of repetitive code exists across classes, which makes code generation inefficient.
Disclosure of Invention
Various embodiments of the present application provide a test scenario simulation method, a test scenario simulation apparatus, a computer device, and a storage medium. The technical solution is as follows:
a test scenario simulation method includes:
acquiring characteristic information corresponding to each object in a test scene;
acquiring a field template, wherein the field template comprises a plurality of candidate fields;
determining a target field of each object from a field template according to the characteristic information corresponding to each object, and assigning values to the target fields of each object according to the characteristic information corresponding to each object;
generating target codes corresponding to the target fields of the objects according to the assigned target fields of the objects, and generating simulation models corresponding to the objects according to the target codes corresponding to the objects;
and establishing a simulation scene corresponding to the test scene according to the simulation model corresponding to each object.
In one embodiment, before acquiring the field template, the method further includes:
acquiring historical simulation data, and extracting fields of each historical simulation model from the historical simulation data to obtain a field set corresponding to each historical simulation model; respectively acquiring fields in a field set corresponding to each historical simulation model; counting the repetition rate of each field; and forming a field template according to the fields with the repetition rate larger than the preset threshold value.
In one embodiment, determining a target field of each object from a field template according to the feature information corresponding to each object, and assigning a value to the target field of each object according to the feature information corresponding to each object includes:
determining a characteristic field and a characteristic field value corresponding to each object according to the characteristic information corresponding to each object; matching the candidate fields in the field template with the characteristic fields corresponding to the objects; when the matching is successful, taking the candidate fields successfully matched with the characteristic fields corresponding to the objects as target fields of the objects; and assigning values to the target fields of the objects according to the characteristic field values corresponding to the objects.
In one embodiment, the method further comprises the following steps:
when the matching fails, the characteristic field which fails in matching is used as an update candidate field;
the update candidate field is added to the field template.
In one embodiment, before determining the feature field and the feature field value corresponding to each object according to the feature information corresponding to each object, the method further includes:
and inputting the characteristic information corresponding to each object into a pre-trained deep learning neural network, and outputting the characteristic field and the characteristic field value corresponding to each object.
In one embodiment, before inputting the feature information corresponding to each object into the pre-trained deep learning neural network, the method further includes:
acquiring object sample data; and training the deep learning neural network according to the characteristic information, fields, and field values of the sample objects obtained from the object sample data.
In one embodiment, training the deep learning neural network according to feature information, fields and field values of an object sample obtained from object sample data includes:
acquiring characteristic information of the sample object from the object sample data, and inputting the characteristic information of the sample object into a deep learning neural network for unsupervised training; acquiring a field and a field value corresponding to the characteristic information of the sample object from the object sample data, taking the acquired characteristic information of the sample object as input data of the deep learning neural network, taking the acquired field and field value as the expected output of the deep learning neural network, and performing supervised training on the deep learning neural network.
A test scenario simulation apparatus, comprising:
the acquisition module is used for acquiring characteristic information corresponding to each object in the test scene; acquiring a field template, wherein the field template comprises a plurality of candidate fields;
the object field determining module is used for determining the object field of each object from the field template according to the characteristic information corresponding to each object and assigning values to the object field of each object according to the characteristic information corresponding to each object;
the simulation model generation module is used for generating target codes corresponding to the target fields of the objects according to the assigned target fields of the objects and generating simulation models corresponding to the objects according to the target codes corresponding to the objects;
and the simulation scene establishing module is used for establishing a simulation scene corresponding to the test scene according to the simulation model corresponding to each object.
In one aspect, a computer device is provided, the computer device comprising a processor and a memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of:
acquiring characteristic information corresponding to each object in a test scene;
acquiring a field template, wherein the field template comprises a plurality of candidate fields;
determining a target field of each object from a field template according to the characteristic information corresponding to each object, and assigning values to the target fields of each object according to the characteristic information corresponding to each object;
generating target codes corresponding to the target fields of the objects according to the assigned target fields of the objects, and generating simulation models corresponding to the objects according to the target codes corresponding to the objects;
and establishing a simulation scene corresponding to the test scene according to the simulation model corresponding to each object.
In one aspect, one or more non-volatile storage media are provided that store computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform the steps of:
acquiring characteristic information corresponding to each object in a test scene;
acquiring a field template, wherein the field template comprises a plurality of candidate fields;
determining a target field of each object from a field template according to the characteristic information corresponding to each object, and assigning values to the target fields of each object according to the characteristic information corresponding to each object;
generating target codes corresponding to the target fields of the objects according to the assigned target fields of the objects, and generating simulation models corresponding to the objects according to the target codes corresponding to the objects;
and establishing a simulation scene corresponding to the test scene according to the simulation model corresponding to each object.
According to the test scene simulation method, the test scene simulation device, the computer equipment and the storage medium, the characteristic information corresponding to each object in the test scene is obtained; acquiring a field template, wherein the field template comprises a plurality of candidate fields; determining a target field of each object from the field template according to the characteristic information corresponding to each object, and assigning values to the target field of each object according to the characteristic information corresponding to each object; generating target codes corresponding to the target fields of the objects according to the assigned target fields of the objects, and generating simulation models corresponding to the objects according to the target codes corresponding to the objects; and establishing a simulation scene corresponding to the test scene according to the simulation model corresponding to each object. Thus, the field template includes a plurality of candidate fields, each candidate field having a corresponding code. The characteristic information corresponding to the object can represent the function corresponding to the object, so that the function corresponding to the object can be realized by assigning the target field, the target code is automatically generated according to the target field, and then the simulation model corresponding to the object is obtained, the code does not need to be changed, redundant codes are reduced, the maintenance of the code is reduced, and the code generation efficiency is improved.
Drawings
To better describe and illustrate the embodiments and/or examples of the application disclosed herein, reference may be made to one or more of the accompanying drawings. The additional details or examples used in describing the drawings should not be construed as limiting the scope of the disclosed application, the presently described embodiments and/or examples, or the presently understood best mode of the application.
Fig. 1 is a schematic diagram of an application environment of the test scenario simulation method in one embodiment.
FIG. 2 is a flowchart illustrating a test scenario simulation method according to an embodiment.
Fig. 3 is a flowchart illustrating a step of determining a target field of each object from a field template according to feature information corresponding to each object and assigning a value to the target field of each object according to the feature information corresponding to each object in one embodiment.
Fig. 4 is a flowchart illustrating a test scenario simulation method in another embodiment.
FIG. 5 is a block diagram of a test scenario simulation apparatus according to an embodiment.
Fig. 6 is a schematic diagram of an internal configuration of a server in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like, as used in the embodiments of the present application, may be used herein to describe various elements, but the elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first control may be referred to as a second control, both of which are controls, but which are not the same, without departing from the scope of the present application.
FIG. 1 is a diagram of an application environment of the test scenario simulation method in one embodiment. As shown in fig. 1, the application environment includes a terminal 102 and a server 104, where the terminal 102 may specifically be a desktop terminal or a mobile terminal, and the mobile terminal may specifically be at least one of a mobile phone, a tablet computer, a notebook computer, and the like. The server 104 may be a single server or a cluster of servers, and the terminal 102 and the server 104 communicate via a network.
Specifically, the server 104 obtains feature information corresponding to each object in the test scene from the terminal 102. The server 104 obtains a field template, where the field template includes a plurality of candidate fields, determines a target field of each object from the field template according to the feature information corresponding to each object, and assigns a value to the target field of each object according to the feature information corresponding to each object. The server 104 generates a target code corresponding to the target field of each object according to the assigned target field of each object, generates a simulation model corresponding to each object according to the target code corresponding to each object, and establishes a simulation scene corresponding to the test scene according to the simulation model corresponding to each object.
It is to be understood that the above application scenario is only an example and does not limit the data processing method of the present application; for example, the data processing method provided in the present application may also be executed on a terminal.
FIG. 2 is a flow diagram of a test scenario simulation method in one embodiment. As shown in fig. 2, a test scenario simulation method, which is described by taking the application to the server in fig. 1 as an example, specifically includes:
s202, acquiring characteristic information corresponding to each object in the test scene.
The test scene is the set of surrounding objects and environmental states over a certain period of time while the autonomous vehicle is running. The robustness of the automatic driving algorithm can be tested by running the autonomous vehicle in different test scenarios. A test scenario may be a real scenario or a virtual extreme scenario. Test scenarios include, but are not limited to, school scenarios, pedestrian-dense scenarios, rural scenarios, tunnel scenarios, intersection scenarios (e.g., T-intersection and roundabout scenarios), and the like. An extreme scenario may be an extreme weather scenario, an extreme disaster scenario, or the like, or may be a scenario containing an object with abnormal behavior.
An object is an entity that exhibits corresponding properties or performs corresponding operations over time or in response to events. Objects include, but are not limited to, objects that react to their surroundings according to the road conditions at a given moment and in a given state, and objects that change regularly or do not change at all. Objects that react to their surroundings according to the road conditions at a given moment and state include, but are not limited to, pedestrians, bicycles, small vehicles, large vehicles, and the like. Objects that change regularly or do not change include, but are not limited to, road barriers, traffic lights, traffic signs, and the like.
The feature information is information that describes the features and attributes of an object. Feature information includes, but is not limited to, a description of the object's behavior, position, shape, and the like. For example, the feature information of a pedestrian crossing signal lamp may be: coordinates (x1, y1, z1), height 2 m, the red light flashes at 1 Hz for 15 s and then switches to green, and the green light flashes at 1 Hz for 15 s and then switches to red. The feature information of a bicycle may be: coordinates (x2, y2, z2), body length 1.7 m, width 0.7 m, height 1.0 m, and speed 16 km/h.
Specifically, the server may obtain feature information of each object in the test scene from the terminal. For example, when the test scene is a real scene, the server may receive sensor data uploaded by a sensor of the autonomous vehicle, and determine feature information of each object in the real scene according to the sensor data. When the test scene is an extreme scene, the user can input the characteristic information of each object in the extreme scene at the terminal, and the terminal sends the characteristic information of each object in the extreme scene to the server.
In one embodiment, the autonomous vehicle carries various sensors, such as a lidar, a millimeter-wave radar, and a camera. The sensor data may include pictures collected by the camera; after receiving the pictures, the server identifies them so that the shapes and structures of the objects in the pictures can be recognized. For example, surrounding vehicles, pedestrians, road markings, traffic sign text, traffic lights, and the like can be recognized from the pictures. The sensor data may also include data collected by the lidar. The lidar data supplements the pictures collected by the camera, providing the distance of each object from the lidar and the motion velocity of each object. The feature information of each object in the real scene can then be determined by combining a high-precision map with the data collected by the various sensors.
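As an illustrative sketch of this step, feature information could be assembled from camera detections and lidar measurements along the following lines; the Detection structure, the field names, and the fusion logic are assumptions for illustration, not the disclosed implementation.

```python
# Illustrative sketch only: the detection structure, field names, and fusion
# logic are assumptions, not the disclosed implementation.
from dataclasses import dataclass, field
from typing import Any, Dict, List


@dataclass
class Detection:
    """One object recognized from a camera picture."""
    object_id: str
    category: str                       # e.g. "pedestrian", "bicycle", "traffic_light"
    attributes: Dict[str, Any] = field(default_factory=dict)


def build_feature_info(detections: List[Detection],
                       lidar: Dict[str, Dict[str, float]]) -> Dict[str, Dict[str, Any]]:
    """Merge camera detections with lidar distance/velocity into per-object feature information."""
    feature_info: Dict[str, Dict[str, Any]] = {}
    for det in detections:
        info = dict(det.attributes)
        info["category"] = det.category
        # Lidar data supplements the camera picture with range and motion data.
        info.update(lidar.get(det.object_id, {}))
        feature_info[det.object_id] = info
    return feature_info
```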
S204, a field template is obtained, and the field template comprises a plurality of candidate fields.
The field template is a template consisting of a plurality of candidate fields. The field value corresponding to each candidate field defaults to null. A candidate field is a keyword used to determine a function of an object and how that function is expressed, such as ID (Identity Document), Size, Velocity, and the like. The content to be filled into a candidate field differs according to the description dimensions and manner required by the specific function. By filling candidate fields with the corresponding field values, the corresponding functions can be realized, and objects with those functions can be simulated.
Specifically, the server stores a field template, the field template includes a plurality of preset candidate fields, and each candidate field has a blank default value. Each candidate field is optional to fill, so the field template is highly extensible.
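As a minimal sketch, the field template can be represented as a mapping from candidate fields to blank default values; the specific candidate fields listed below are assumptions chosen to match the examples in this description.

```python
# Minimal sketch of a field template; the candidate fields shown are
# illustrative assumptions, and every default value starts out blank (None).
FIELD_TEMPLATE = {
    "ID": None,               # identity of the simulated object
    "Name": None,             # used to distinguish different simulation models
    "Size": None,             # e.g. (length, width, height) in metres
    "Velocity": None,         # e.g. speed in km/h
    "Coordinates": None,      # e.g. (x, y, z)
    "Color": None,            # e.g. signal-lamp colours
    "SwitchFrequency": None,  # e.g. lamp switching frequency in Hz
}


def new_instance(template: dict) -> dict:
    """Each object to be simulated starts as a copy of the template; candidate
    fields that are never assigned simply keep their blank default value."""
    return dict(template)
```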
In one embodiment, the candidate fields in the field template may be set according to the drive test data. Autonomous vehicles carry various sensors such as laser radar, millimeter wave radar, and cameras. Therefore, when the automatic driving automobile is in the drive test, various sensor data can be obtained through the sensors to form the drive test data. According to the drive test data, a plurality of fields can be determined, and the fields are used as candidate fields to form a field template.
In one embodiment, the candidate fields in the field template may be set according to historical simulation data. The historical simulation data includes a plurality of historical simulation models. A field set corresponding to each historical simulation model is determined from that model. The fields in each field set can all be used as candidate fields to form the field template, which improves the applicability and comprehensiveness of the template. Alternatively, the repetition rate of the fields across all field sets is counted, and the fields are sorted by repetition rate from high to low. Fields ranked lower, i.e., fields with low utilization rates, are filtered out, and the higher-ranked fields are used as candidate fields to form the field template, which saves server resources.
In one embodiment, a corresponding field template can be set for each scene type, which improves the specificity of the field template. Because the feature attributes of objects in scenes of the same type are more similar, while those of objects in scenes of different types are less similar, the corresponding field template can be set according to the type of scene. For example, a school-type scene corresponds to a field template that may be tagged with the label school. When the test scenario is, for instance, a school scenario A, the field template labeled school can be acquired to simulate that test scenario.
S206, determining the target field of each object from the field template according to the characteristic information corresponding to each object, and assigning values to the target field of each object according to the characteristic information corresponding to each object.
Specifically, the server may determine the feature field and feature field value corresponding to each object according to the feature information corresponding to that object. The server can then search the field template for a candidate field identical to the feature field, use the found candidate field as a target field of the object corresponding to the feature field, and assign a value to the target field according to the feature field value corresponding to the feature field. For example, when the feature information of the pedestrian crossing signal lamp is the coordinates (x1, y1, z1), the red light flashing at 1 Hz for 15 s and then switching to green, and the green light flashing at 1 Hz for 15 s and then switching to red, the feature fields of the pedestrian crossing signal lamp can be determined from this feature information to be the coordinates, the color, and the switching frequency. Coordinate, color, and switching frequency fields exist in the field template. Therefore, the target fields of the pedestrian crossing signal lamp are: coordinates, color, and switching frequency, where the field value corresponding to the coordinates is (x1, y1, z1), the field values corresponding to the color are red and green, and the field value corresponding to the switching frequency is 1 Hz.
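The matching and assignment of S206 can be sketched as follows, reusing the pedestrian-crossing signal lamp example above; representing the feature fields and the field template as plain dictionaries is an assumption for illustration only.

```python
# Sketch of S206, assuming feature fields and the field template are plain
# dictionaries; the values mirror the pedestrian-crossing example in the text.
FIELD_TEMPLATE = {"Coordinates": None, "Color": None, "SwitchFrequency": None, "Size": None}


def assign_target_fields(template: dict, feature_fields: dict):
    """Match each feature field against the template; matched candidate fields
    become target fields and receive the feature field value, and unmatched
    feature fields are returned as candidates for updating the template."""
    instance = dict(template)
    unmatched = {}
    for name, value in feature_fields.items():
        if name in instance:        # matching succeeded
            instance[name] = value
        else:                       # matching failed
            unmatched[name] = value
    return instance, unmatched


crossing_signal = {
    "Coordinates": (1.0, 2.0, 0.0),  # (x1, y1, z1)
    "Color": ("red", "green"),
    "SwitchFrequency": 1,            # 1 Hz, switching every 15 s
}
signal_fields, extra = assign_target_fields(FIELD_TEMPLATE, crossing_signal)
```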
In one embodiment, when a candidate field that does not match a feature field is found in the field template, the feature field may be added to the field template as a new candidate field.
In one embodiment, the feature field and the feature field value corresponding to the feature information of the object can be determined according to a pre-trained deep learning neural network. Inputting the characteristic information of the object into a pre-trained deep learning neural network, and outputting a characteristic field and a characteristic field value corresponding to the characteristic information by the pre-trained deep learning neural network.
In one embodiment, when the test scene is a real scene, the corresponding characteristic field and the characteristic field value of the object may be determined through the drive test data.
And S208, generating target codes corresponding to the target fields of the objects according to the assigned target fields of the objects, and generating simulation models corresponding to the objects according to the target codes corresponding to the objects.
In particular, the target field is a definition that conforms to a format specification. After the target field corresponding to the object is assigned according to the characteristic information of the object, the corresponding target code can be generated according to the format specification of the target field, and the simulation model of the object can be generated by running the target code.
In one embodiment, because the simulation models corresponding to the objects are all based on the same field template, the simulation models can be managed conveniently and efficiently. Since the underlying layers of all simulation models are identical when based on the same field template, a fully unified data structure can be used for reading, calling, assigning, and storing. The server can read each target field corresponding to an object in turn, skip all fields that still hold only the blank default value, and automatically generate code for the assigned target fields according to the format specification corresponding to those target fields, thereby obtaining the different simulation models.
In one embodiment, the code may be automatically generated by a code generation tool according to the format specification; that is, the corresponding target code is automatically generated from the target fields. The code generation tool is cross-platform, cross-language, and portable, and the user does not need to write the code manually, which effectively improves code generation efficiency.
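As an illustrative sketch of this generation step, the format specification is assumed here to be a simple block of key/value statements per object; an actual code generation tool would follow its own specification.

```python
# Sketch of S208 under the assumption of a simple key = value format
# specification; fields still holding the blank default value are skipped.
def generate_target_code(object_name: str, assigned_fields: dict) -> str:
    lines = [f"object {object_name} {{"]
    for name, value in assigned_fields.items():
        if value is None:            # blank default: not an assigned target field
            continue
        lines.append(f"    {name} = {value!r};")
    lines.append("}")
    return "\n".join(lines)


print(generate_target_code("crossing_signal", {
    "Coordinates": (1.0, 2.0, 0.0),
    "Color": ("red", "green"),
    "SwitchFrequency": 1,
    "Size": None,                    # never assigned, so no code is generated for it
}))
```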
In one embodiment, the target fields corresponding to each object may include a name field. The name field may be used to identify the object to which each simulation model corresponds, in order to distinguish the simulation models corresponding to different objects.
And S210, establishing a simulation scene corresponding to the test scene according to the simulation model corresponding to each object.
The simulation scene is a multi-dimensional virtual world created by simulation that reflects the changes and interactions of objects in real time, and it can be used to test and evaluate an automatic driving algorithm. Testing the autonomous vehicle in the simulation scene can provide methodological and theoretical guidance for field tests of the autonomous vehicle and can improve the safety of those field tests.
Specifically, the server may establish the corresponding simulation scene according to the simulation models; for example, the simulation scene may include simulation models corresponding to static objects such as lane lines, traffic signs, and sidewalks, and simulation models corresponding to dynamic objects such as bicycles and pedestrians. Further, the automatic driving algorithm can plan a driving route for the autonomous vehicle according to the simulation scene and control the autonomous vehicle to drive in the simulation scene. The reliability of the automatic driving algorithm can then be evaluated based on the actual driving route of the autonomous vehicle.
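For illustration only, the simulation models could be composed into a scene along the following lines; the update() interface and the fixed time step are assumptions, not the disclosed design.

```python
# Illustrative composition of simulation models into a scene; the update()
# interface and the time-stepping scheme are assumptions for illustration.
class SimulationScene:
    def __init__(self):
        self.models = []                 # static and dynamic object models alike

    def add_model(self, model):
        self.models.append(model)

    def step(self, dt: float):
        """Advance dynamic models; static models (lane lines, signs) have no update()."""
        for model in self.models:
            update = getattr(model, "update", None)
            if callable(update):
                update(dt)
```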
The test scene simulation method comprises the steps of obtaining characteristic information corresponding to each object in a test scene; acquiring a field template, wherein the field template comprises a plurality of candidate fields; determining a target field of each object from the field template according to the characteristic information corresponding to each object, and assigning values to the target field of each object according to the characteristic information corresponding to each object; generating target codes corresponding to the target fields of the objects according to the assigned target fields of the objects, and generating simulation models corresponding to the objects according to the target codes corresponding to the objects; and establishing a simulation scene corresponding to the test scene according to the simulation model corresponding to each object. Thus, the field template includes a plurality of candidate fields, each candidate field having a corresponding code. The characteristic information corresponding to the object can represent the function corresponding to the object, so that the function corresponding to the object can be realized by assigning the target field, the target code is automatically generated according to the target field, and then the simulation model corresponding to the object is obtained, the code does not need to be changed, redundant codes are reduced, the maintenance of the code is reduced, and the code generation efficiency is improved.
In one embodiment, before S202, the test scenario simulation method further includes: acquiring historical simulation data, and extracting fields of each historical simulation model from the historical simulation data to obtain a field set corresponding to each historical simulation model; respectively acquiring fields in a field set corresponding to each historical simulation model; counting the repetition rate of each field; and forming a field template according to the fields with the repetition rate larger than the preset threshold value.
The repetition rate refers to the ratio of the number of occurrences of a field to the total number of field sets. The preset threshold may be set in advance according to actual requirements.
Specifically, the server may obtain the historical simulation data from a database, or from other servers. The historical simulation data includes a plurality of historical simulation scenarios, and each historical simulation scenario includes a plurality of historical simulation models. Since each historical simulation model is generated according to at least one field, the server can obtain the field set corresponding to that historical simulation model. The field set corresponding to each historical simulation model includes at least one field, and different historical simulation models may include the same fields or different fields. For example, the field set a corresponding to the historical simulation model A includes field 1, field 2, field 3, and field 4, while the field set b corresponding to the historical simulation model B includes field 1, field 2, field 5, field 6, and field 7. The server can calculate the repetition rate of each field and compare it with the preset threshold. When the repetition rate of a field is greater than the preset threshold, the field is used as a candidate field of the field template. All fields with repetition rates greater than the preset threshold are taken as candidate fields to form the field template.
In this embodiment, the historical simulation data is acquired, and the fields of each historical simulation model are extracted from it to obtain the field set corresponding to each historical simulation model; the fields in each field set are acquired; the repetition rate of each field is counted; and the field template is formed from the fields whose repetition rate is greater than the preset threshold. The repetition rate of a field reflects its importance, and forming the field template from the more important fields makes the template more focused. In addition, filtering out the less important fields saves the server's storage resources.
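A minimal sketch of this template-building step follows; the 0.5 threshold is an assumption, since the description only requires a preset threshold.

```python
# Sketch of building a field template from historical simulation data; the
# repetition rate is occurrences divided by the number of field sets, and the
# 0.5 threshold is an illustrative assumption.
from collections import Counter


def build_field_template(field_sets, threshold=0.5):
    total = len(field_sets)
    counts = Counter(f for fields in field_sets for f in set(fields))
    return {f: None for f, c in counts.items() if c / total > threshold}


# The example from the text: field set a = {1, 2, 3, 4}, field set b = {1, 2, 5, 6, 7};
# only field 1 and field 2 appear in both sets and survive the threshold.
template = build_field_template([
    {"field1", "field2", "field3", "field4"},
    {"field1", "field2", "field5", "field6", "field7"},
])
```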
As shown in fig. 3, in one embodiment, S206 includes:
s302, determining the characteristic field and the characteristic field value corresponding to each object according to the characteristic information corresponding to each object.
S304, matching the candidate fields in the field template with the characteristic fields corresponding to the objects.
And S306, when the matching is successful, taking the candidate fields successfully matched with the characteristic fields corresponding to the objects as the target fields of the objects.
And S308, assigning values to the target fields of the objects according to the characteristic field values corresponding to the objects.
Specifically, the server may determine the feature field and the feature field value corresponding to each object according to the feature information of each object. After the feature field is obtained, the candidate field identical to the feature field can be searched in the field template. And when the candidate field corresponding to the characteristic field is found, the characteristic field is successfully matched. And taking the candidate field successfully matched as the target field corresponding to the characteristic field, namely taking the candidate field as the target field corresponding to the object. And when the characteristic fields corresponding to the object are matched successfully, target fields corresponding to the at least one characteristic field one by one can be determined from the field template. And assigning the target field corresponding to the characteristic field according to the characteristic field value corresponding to the characteristic field.
In one embodiment, S206 further comprises: when the matching fails, the characteristic field which fails in matching is used as an update candidate field; the update candidate field is added to the field template.
Specifically, when no candidate field identical to the feature field is found in the field template, the feature field fails to match. The feature field that failed to match is used as a new candidate field and added to the field template to obtain an updated field template. The object containing that feature field can then determine its corresponding target field from the updated field template and generate the corresponding simulation model according to the target field. In addition, the new candidate field does not affect the candidate fields already in the field template before the update, and the code corresponding to the new candidate field does not affect the code corresponding to those earlier candidate fields.
In one embodiment, S302 includes: and inputting the characteristic information corresponding to each object into a pre-trained deep learning neural network, and outputting the characteristic field and the characteristic field value corresponding to each object.
Specifically, the feature field and the feature field value corresponding to the feature information of the object can be determined according to the pre-trained deep learning neural network. For the deep learning neural network trained in advance, the feature information of the object is set as the input variable of the deep learning neural network, and the feature field value are set as the output variable of the deep learning neural network. Therefore, when the acquired feature information of the object is input into the pre-trained deep learning neural network through the input device, the pre-trained deep learning neural network can calculate and output the corresponding feature field and the feature field value of the object.
In the embodiment, through the application of the deep learning neural network, the characteristic field and the characteristic field value of the object can be predicted under the condition that the characteristic information of the object is obtained, and data support is further provided for establishing a simulation model.
In one embodiment, before inputting the feature information corresponding to each object into a pre-trained deep learning neural network and outputting the feature field and the feature field value corresponding to each object, the method further includes: acquiring object sample data; and training the deep learning neural network according to the characteristic information, fields, and field values of the sample objects obtained from the object sample data.
Specifically, a database stores a plurality of object sample data, which include the feature information, fields, and field values of a plurality of objects. In the deep learning neural network, the input variables and output variables are set in advance: the feature information is set as the input variable of the deep learning neural network, and the fields and field values are set as its output variables. The feature information, fields, and field values of the objects are therefore used respectively as the input and the expected output of the deep learning neural network, and the network is trained on multiple groups of such data. When in use, the trained deep learning neural network can predict output data from input data. Training the deep learning neural network makes its prediction results more accurate.
In one embodiment, training a deep learning neural network according to feature information, fields and field values of a sample object obtained by object sample data includes: acquiring characteristic information of the object according to the object sample data, and inputting the characteristic information of the object into a deep learning neural network for unsupervised training; acquiring a field and a field value corresponding to the characteristic information of the sample object from the object sample data, taking the acquired characteristic information of the sample object as input data of the deep learning neural network, taking the acquired field and field value as expected output of the deep learning neural network, and performing supervised training on the deep learning neural network.
Specifically, the characteristic information values of the objects are fed to the input variables of the deep learning neural network for unsupervised training. After the unsupervised training, supervised training is performed on the deep learning neural network. During supervised training, both the input variables and the expected outputs of the deep learning neural network are provided in full. For example, the feature information of an object is used as an input variable of the deep learning neural network, the corresponding field and field value are used as its expected output, and the deep learning neural network is trained with supervision.
The object sample data stores the characteristic information of a plurality of objects together with the corresponding fields and field values; that is, multiple groups of object data are stored, and each group includes the characteristic information, fields, and field values of an object. However, not every group of object data is complete: some object data may lack fields and field values, and that part of the object data can be used for unsupervised training of the deep learning neural network, which avoids wasting data. Because unsupervised training first develops the feature extraction capability of the deep learning neural network, performing unsupervised training before supervised training improves the training effect and the accuracy of the output data predicted by the trained network.
In one embodiment, the unsupervised training uses a bottom-up training mode. Single-layer neurons can be constructed layer by layer, specifically comprising one input layer, one output layer, and multiple hidden layers. The input layer is at the bottom, and its input variables are the feature information of the object. The output layer is at the top, and its output variables are the feature field and feature field value of the object. The hidden layers sit between the input layer and the output layer, and the number of hidden layers can be set according to actual needs. Unsupervised training proceeds layer by layer, starting from the input layer and working up to the output layer. The parameters of each layer can be tuned with a wake-sleep algorithm, adjusting only one layer at a time, layer by layer. In unsupervised training the expected output is not needed; the purpose is not to predict the output but to learn features of the input. After the unsupervised training, supervised training is performed on the deep learning neural network. The supervised training uses a top-down training mode. On the basis of the per-layer neuron parameters obtained through unsupervised training, a classifier is added to the output layer, and the parameters of each layer are fine-tuned through supervised learning on the complete object data using a gradient descent method.
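As a simplified, hedged sketch of the two stages, an autoencoder-style reconstruction objective stands in below for the layer-wise wake-sleep pretraining described above, followed by supervised fine-tuning with gradient descent; the network sizes and the use of PyTorch are assumptions for illustration.

```python
# Simplified sketch only: autoencoder reconstruction stands in for the
# layer-wise unsupervised stage, and a regression head for the supervised
# stage; the dimensions and the PyTorch framework are assumptions.
import torch
from torch import nn

FEATURE_DIM, HIDDEN_DIM, OUTPUT_DIM = 16, 32, 8     # assumed dimensions

encoder = nn.Sequential(nn.Linear(FEATURE_DIM, HIDDEN_DIM), nn.ReLU())
decoder = nn.Linear(HIDDEN_DIM, FEATURE_DIM)
head = nn.Linear(HIDDEN_DIM, OUTPUT_DIM)            # predicts fields / field values


def unsupervised_stage(features: torch.Tensor, epochs: int = 10, lr: float = 1e-3):
    """Incomplete samples (feature information without fields/field values)
    still train the encoder by reconstructing their own input."""
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(decoder(encoder(features)), features)
        loss.backward()
        opt.step()


def supervised_stage(features: torch.Tensor, targets: torch.Tensor,
                     epochs: int = 10, lr: float = 1e-3):
    """Complete samples fine-tune the encoder and train the output head,
    using the known fields/field values as the expected output."""
    opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(head(encoder(features)), targets)
        loss.backward()
        opt.step()
```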
In a specific embodiment, as shown in fig. 4, a test scenario simulation method is provided, which specifically includes the following steps:
s402, acquiring historical simulation data, and extracting fields of each historical simulation model from the historical simulation data to obtain a field set corresponding to each historical simulation model.
S404, respectively obtaining the fields in the field set corresponding to each historical simulation model.
S406, counting the repetition rate of each field.
And S408, forming a field template according to the fields with the repetition rate larger than the preset threshold value.
And S410, acquiring object sample data.
And S412, acquiring the characteristic information of the sample object from the object sample data, and inputting the characteristic information of the sample object into the deep learning neural network for unsupervised training.
S416, acquiring fields and field values corresponding to the feature information of the sample object from the object sample data, taking the acquired feature information of the sample object as input data of the deep learning neural network, taking the acquired fields and field values as expected output of the deep learning neural network, and performing supervised training on the deep learning neural network.
And S418, acquiring characteristic information corresponding to each object in the test scene.
And S420, inputting the feature information corresponding to each object into a pre-trained deep learning neural network, and outputting the feature field and the feature field value corresponding to each object.
And S422, matching the candidate fields in the field template with the characteristic fields corresponding to the objects.
And S424, when the matching is successful, taking the candidate fields successfully matched with the characteristic fields corresponding to the objects as the target fields of the objects.
And S426, assigning values to the target fields of the objects according to the characteristic field values corresponding to the objects.
S428, generating target codes corresponding to the target fields of the objects according to the assigned target fields of the objects, and generating simulation models corresponding to the objects according to the target codes corresponding to the objects.
And S430, establishing a simulation scene corresponding to the test scene according to the simulation model corresponding to each object.
At the present stage, the objects in a simulated test scene are managed with multiple containers, each container storing one specific class of object. This increases the predefined memory footprint, increases the number of loops and the maintenance difficulty during processing, and requires changing the data-storage part of the code every time a new class of object is added. Alternatively, the objects are managed with a container that holds pointers to a base class, each pointer pointing to an inherited class. When simulating objects in a test scene, this approach requires frequent type conversion between the base class and the inherited classes, and it may suffer from problems such as pointer out-of-bounds access, memory leaks, and system crashes caused by incorrect type conversion at runtime.
With the test scene simulation method described above, a single container storing field-template instances is used for unified management. Each object to be simulated is an instance that conforms to the field template; different functions are implemented by assigning values to different fields, and unused fields are left empty by default. Therefore, even if a new field is added to the field template, the data-storage code does not need to be changed, and the correctness of existing code is not affected. Meanwhile, fields are stored as variables rather than pointers, which effectively reduces the probability of system errors.
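The unified-container idea can be sketched as follows; holding instances by value in one container, rather than in per-category containers or behind base-class pointers, is the point being illustrated, and the helper names are assumptions.

```python
# Sketch of unified management: one container of field-template instances,
# held by value; the helper names are illustrative assumptions.
scene_objects = []                        # a single container for every kind of object


def add_object(template: dict, **assigned_fields) -> dict:
    instance = dict(template)             # a value copy of the template, not a pointer
    instance.update(assigned_fields)
    scene_objects.append(instance)
    return instance


# Adding a new candidate field to the template later does not require changing
# this storage code; existing instances simply keep a blank default for it.
```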
It should be understood that, although the steps in the above flowcharts are shown in an order indicated by the arrows, the steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the above flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and which are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
FIG. 5 is a block diagram of a test scenario simulation apparatus according to an embodiment. As shown in fig. 5, a test scenario simulation apparatus includes an obtaining module 502, a target field determining module 504, a simulation model generating module 506, and a simulation scenario establishing module 508. Wherein:
an obtaining module 502, configured to obtain feature information corresponding to each object in a test scene; a field template is obtained, the field template including a plurality of candidate fields.
And a target field determining module 504, configured to determine a target field of each object from the field template according to the feature information corresponding to each object, and assign a value to the target field of each object according to the feature information corresponding to each object.
And the simulation model generating module 506 is configured to generate a target code corresponding to the target field of each object according to the assigned target field of each object, and generate a simulation model corresponding to each object according to the target code corresponding to each object.
And a simulation scene establishing module 508, configured to establish a simulation scene corresponding to the test scene according to the simulation model corresponding to each object.
In an embodiment, the obtaining module 502 is further configured to obtain historical simulation data, and extract fields of each historical simulation model from the historical simulation data to obtain a field set corresponding to each historical simulation model; respectively acquiring fields in a field set corresponding to each historical simulation model; counting the repetition rate of each field; and forming a field template according to the fields with the repetition rate larger than the preset threshold value.
In one embodiment, the target field determining module 504 is further configured to determine a feature field and a feature field value corresponding to each object according to feature information corresponding to each object; matching the candidate fields in the field template with the characteristic fields corresponding to the objects; when the matching is successful, taking the candidate fields successfully matched with the characteristic fields corresponding to the objects as target fields of the objects; and assigning values to the target fields of the objects according to the characteristic field values corresponding to the objects.
In one embodiment, the target field determination module 504 is further configured to, when the matching fails, take the feature field with the failed matching as the update candidate field; the update candidate field is added to the field template.
In one embodiment, the target field determining module 504 is further configured to input the feature information corresponding to each object into a pre-trained deep learning neural network, and output the feature field and the feature field value corresponding to each object.
In one embodiment, the target field determination module 504 is further configured to obtain object sample data; and training the deep learning neural network according to the characteristic information, the field and the field value of the sample object obtained by the object sample data.
In one embodiment, the target field determining module 504 is further configured to obtain feature information of the object according to the object sample data, and input the feature information of the object into the deep learning neural network for unsupervised training; acquiring a field and a field value corresponding to the characteristic information of the sample object from the object sample data, taking the acquired characteristic information of the sample object as input data of the deep learning neural network, taking the acquired field and field value as expected output of the deep learning neural network, and performing supervised training on the deep learning neural network.
The test scene simulation device acquires the characteristic information corresponding to each object in the test scene; acquiring a field template, wherein the field template comprises a plurality of candidate fields; determining a target field of each object from the field template according to the characteristic information corresponding to each object, and assigning values to the target field of each object according to the characteristic information corresponding to each object; generating target codes corresponding to the target fields of the objects according to the assigned target fields of the objects, and generating simulation models corresponding to the objects according to the target codes corresponding to the objects; and establishing a simulation scene corresponding to the test scene according to the simulation model corresponding to each object. Thus, the field template includes a plurality of candidate fields, each candidate field having a corresponding code. The characteristic information corresponding to the object can represent the function corresponding to the object, so that the function corresponding to the object can be realized by assigning the target field, the target code is automatically generated according to the target field, and then the simulation model corresponding to the object is obtained, the code does not need to be changed, redundant codes are reduced, the maintenance of the code is reduced, and the code generation efficiency is improved.
For the specific limitations of the test scenario simulation apparatus, reference may be made to the limitations of the test scenario simulation method in the foregoing, and details are not repeated here. All or part of the modules in the test scene simulation device can be realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In some embodiments, a computer device is provided, which may be the server 104 in fig. 1, and its internal structure diagram may be as shown in fig. 6. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a test scenario simulation method.
Those skilled in the art will appreciate that the architecture shown in fig. 6 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, the test scenario simulation apparatus provided in the present application may be implemented in the form of a computer program, and the computer program may run on a computer device as shown in fig. 6. The memory of the computer device may store the various program modules constituting the test scenario simulation apparatus, such as the acquisition module, the target field determination module, the simulation model generation module, and the simulation scenario establishment module shown in fig. 5. The computer program constituted by these program modules causes the processor to execute the steps of the test scenario simulation method of the embodiments of the present application described in this specification.
For example, the computer device shown in fig. 6 may perform step S202 through the acquisition module of the test scenario simulation apparatus shown in fig. 5, perform step S206 through the target field determination module, perform step S208 through the simulation model generation module, and perform step S210 through the simulation scenario establishment module.
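Purely as an illustration of how these program modules could be wired to the corresponding method steps, a sketch is given below; the class name, the callable signatures, and the way the modules are passed in are hypothetical and are not prescribed by this specification.

# Hypothetical wiring of the program modules to the method steps; names and signatures
# are assumptions made for illustration only.
class TestScenarioSimulationApparatus:
    def __init__(self, acquisition, target_field_determination,
                 simulation_model_generation, simulation_scenario_establishment):
        self.acquisition = acquisition
        self.target_field_determination = target_field_determination
        self.simulation_model_generation = simulation_model_generation
        self.simulation_scenario_establishment = simulation_scenario_establishment

    def run(self, test_scene):
        # S202: acquire the characteristic information of each object
        # (the acquisition module also obtains the field template).
        characteristic_info, field_template = self.acquisition(test_scene)
        # S206: determine the target fields from the template and assign their values.
        target_fields = self.target_field_determination(characteristic_info, field_template)
        # S208: generate the target code and the simulation model for each object.
        models = self.simulation_model_generation(target_fields)
        # S210: establish the simulation scene from the per-object simulation models.
        return self.simulation_scenario_establishment(models)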
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the test scenario simulation method described above. Here, the steps of the test scenario simulation method may be steps in the test scenario simulation methods of the above embodiments.
In one embodiment, a computer-readable storage medium is provided, in which a computer program is stored, which, when executed by a processor, causes the processor to perform the steps of the test scenario simulation method described above. Here, the steps of the test scenario simulation method may be steps in the test scenario simulation methods of the above embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing the relevant hardware. The computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM), among others.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, any combination of these technical features should be considered within the scope of this specification as long as it contains no contradiction.
The above embodiments merely express several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (16)

  1. A test scenario simulation method is characterized by comprising the following steps:
    acquiring characteristic information corresponding to each object in a test scene;
    obtaining a field template, wherein the field template comprises a plurality of candidate fields;
    determining a target field of each object from the field template according to the characteristic information corresponding to each object, and assigning values to the target fields of each object according to the characteristic information corresponding to each object;
    generating target codes corresponding to the target fields of the objects according to the assigned target fields of the objects, and generating simulation models corresponding to the objects according to the target codes corresponding to the objects;
    and establishing a simulation scene corresponding to the test scene according to the simulation model corresponding to each object.
  2. The method of claim 1, wherein prior to obtaining the field template, the method further comprises:
    acquiring historical simulation data, and extracting fields of each historical simulation model from the historical simulation data to obtain a field set corresponding to each historical simulation model;
    respectively acquiring fields in a field set corresponding to each historical simulation model;
    counting the repetition rate of each field;
    and forming the field template according to the fields with repetition rates larger than a preset threshold value.
  3. The method according to claim 1, wherein the determining the target field of each object from the field template according to the characteristic information corresponding to each object, and assigning values to the target field of each object according to the characteristic information corresponding to each object comprises:
    determining a characteristic field and a characteristic field value corresponding to each object according to the characteristic information corresponding to each object;
    matching the candidate fields in the field template with the characteristic fields corresponding to the objects;
    when the matching is successful, taking the candidate fields successfully matched with the characteristic fields corresponding to the objects as target fields of the objects;
    and assigning values to the target fields of the objects according to the characteristic field values corresponding to the objects.
  4. The method of claim 3, further comprising:
    when the matching fails, taking the characteristic field that fails to match as an update candidate field;
    adding the update candidate field to the field template.
  5. The method according to claim 3, wherein before determining the characteristic field and the characteristic field value corresponding to each object according to the characteristic information corresponding to each object, the method further comprises:
    inputting the characteristic information corresponding to each object into a pre-trained deep learning neural network, and outputting the characteristic field and the characteristic field value corresponding to each object.
  6. The method according to claim 5, wherein before inputting the characteristic information corresponding to each object into the pre-trained deep learning neural network, the method further comprises:
    acquiring object sample data;
    and training the deep learning neural network according to the characteristic information, the field and the field value of the sample object obtained from the object sample data.
  7. The method of claim 6, wherein training the deep learning neural network according to the characteristic information, the field and the field value of the sample object obtained from the object sample data comprises:
    acquiring characteristic information of an object according to the object sample data, and inputting the characteristic information of the object into the deep learning neural network for unsupervised training;
    and acquiring a field and a field value corresponding to the characteristic information of the sample object from the object sample data, taking the acquired characteristic information of the sample object as input data of the deep learning neural network, taking the acquired field and field value as expected output of the deep learning neural network, and performing supervised training on the deep learning neural network.
  8. A test scenario simulation apparatus, comprising:
    the acquisition module is used for acquiring characteristic information corresponding to each object in the test scene; obtaining a field template, wherein the field template comprises a plurality of candidate fields;
    the target field determination module is used for determining the target field of each object from the field template according to the characteristic information corresponding to each object and assigning values to the target field of each object according to the characteristic information corresponding to each object;
    the simulation model generation module is used for generating target codes corresponding to the target fields of the objects according to the assigned target fields of the objects and generating simulation models corresponding to the objects according to the target codes corresponding to the objects;
    and the simulation scene establishing module is used for establishing the simulation scene corresponding to the test scene according to the simulation model corresponding to each object.
  9. The apparatus according to claim 8, wherein the acquisition module is further configured to acquire historical simulation data, extract fields of each historical simulation model from the historical simulation data, and obtain a field set corresponding to each historical simulation model; respectively acquire fields in the field set corresponding to each historical simulation model; count the repetition rate of each field; and form the field template according to the fields with repetition rates greater than a preset threshold value.
  10. The apparatus according to claim 8, wherein the target field determination module is further configured to determine a characteristic field and a characteristic field value corresponding to each object according to the characteristic information corresponding to each object; match the candidate fields in the field template with the characteristic fields corresponding to the objects; when the matching is successful, take the candidate fields successfully matched with the characteristic fields corresponding to the objects as the target fields of the objects; and assign values to the target fields of the objects according to the characteristic field values corresponding to the objects.
  11. The apparatus of claim 10, wherein the target field determination module is further configured to, when the matching fails, take the characteristic field that fails to match as an update candidate field, and add the update candidate field to the field template.
  12. The apparatus of claim 10, wherein the target field determination module is further configured to input the characteristic information corresponding to each object into a pre-trained deep learning neural network, and output the characteristic field and the characteristic field value corresponding to each object.
  13. The apparatus of claim 12, wherein the target field determination module is further configured to obtain object sample data, and train the deep learning neural network according to the characteristic information, the field and the field value of the sample object obtained from the object sample data.
  14. The apparatus according to claim 12, wherein the target field determination module is further configured to obtain the characteristic information of the object from the object sample data and input the characteristic information of the object into the deep learning neural network for unsupervised training; and to acquire, from the object sample data, the field and the field value corresponding to the characteristic information of the sample object, take the acquired characteristic information of the sample object as input data of the deep learning neural network, take the acquired field and field value as the expected output of the deep learning neural network, and perform supervised training on the deep learning neural network.
  15. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 7 when executing the computer program.
  16. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202080003153.5A 2020-01-21 2020-01-21 Test scene simulation method and device, computer equipment and storage medium Pending CN113498511A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/073476 WO2021146906A1 (en) 2020-01-21 2020-01-21 Test scenario simulation method and apparatus, computer device, and storage medium

Publications (1)

Publication Number Publication Date
CN113498511A (en) 2021-10-12

Family

ID=76992629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080003153.5A Pending CN113498511A (en) 2020-01-21 2020-01-21 Test scene simulation method and device, computer equipment and storage medium

Country Status (2)

Country Link
CN (1) CN113498511A (en)
WO (1) WO2021146906A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115828638A (en) * 2023-01-09 2023-03-21 西安深信科创信息技术有限公司 Automatic driving test scene script generation method and device and electronic equipment

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115630106B (en) * 2022-10-28 2023-06-30 上海柯林布瑞信息技术有限公司 Multi-scene parameter receiving method and device based on general format analysis
CN115830255B (en) * 2022-11-28 2023-11-21 北京百度网讯科技有限公司 Simulation scene generation method and device, electronic equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106096192B (en) * 2016-06-27 2019-05-28 百度在线网络技术(北京)有限公司 A kind of construction method and device of the test scene of automatic driving vehicle
CN109446371A (en) * 2018-11-09 2019-03-08 苏州清研精准汽车科技有限公司 A kind of intelligent automobile emulation testing scene library generating method and test macro and method
CN110597086B (en) * 2019-08-19 2023-01-13 深圳元戎启行科技有限公司 Simulation scene generation method, unmanned driving system test method and device
CN110502852B (en) * 2019-08-27 2022-12-13 上海汽车集团股份有限公司 Generation method and generation system of automatic driving simulation test scene

Also Published As

Publication number Publication date
WO2021146906A1 (en) 2021-07-29

Similar Documents

Publication Publication Date Title
US20210101619A1 (en) Safe and scalable model for culturally sensitive driving by automated vehicles
CN113498511A (en) Test scene simulation method and device, computer equipment and storage medium
CN106198049A (en) Real vehicles is at ring test system and method
CN111582189B (en) Traffic signal lamp identification method and device, vehicle-mounted control terminal and motor vehicle
CN112417756A (en) Interactive simulation test system of automatic driving algorithm
CN112115600A (en) Simulation system for automatically driving vehicle
Niranjan et al. Deep learning based object detection model for autonomous driving research using carla simulator
CN112212874A (en) Vehicle track prediction method and device, electronic equipment and computer readable medium
US20220318464A1 (en) Machine Learning Data Augmentation for Simulation
CN110716529A (en) Automatic generation method and device for automatic driving test case
CN113343461A (en) Simulation method and device for automatic driving vehicle, electronic equipment and storage medium
CN110688311A (en) Test case management method and device for automatic driving typical scene
CN111752258A (en) Operation test of autonomous vehicle
US11798225B2 (en) 3D building generation using topology
US11875680B2 (en) Systems and methods for augmenting perception data with supplemental information
CN115795808A (en) Automatic driving decision dangerous scene generation method, system, equipment and medium
King et al. Capturing the Variety of Urban Logical Scenarios from Bird-view Trajectories.
Vlachogiannis et al. Intersense: An XGBoost model for traffic regulator identification at intersections through crowdsourced GPS data
Nikitin et al. Traffic Signs Recognition System Development
CN111983934B (en) Unmanned vehicle simulation test case generation method and system
CN115098079B (en) Radar detection model determination method, system, electronic device and readable storage medium
CN113119996B (en) Trajectory prediction method and apparatus, electronic device and storage medium
CN116718181B (en) Map generation method, map generation device, electronic equipment and storage medium
CN115408822A (en) Threshold-based scene generation method and device and storage medium
CN116665157B (en) Road image processing method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination