CN116561650B - Scene file classification and updating method, device and equipment based on tree structure - Google Patents

Scene file classification and updating method, device and equipment based on tree structure

Info

Publication number
CN116561650B
CN116561650B (application CN202310833543.0A)
Authority
CN
China
Prior art keywords
file
classification
scene
target
tag
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310833543.0A
Other languages
Chinese (zh)
Other versions
CN116561650A (en)
Inventor
陈蔯
李晓婷
刘诗曼
付艳红
王萌
陈硕
胡鑫
陈则毅
赵鹏超
谢卉瑜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongqi Zhilian Technology Co ltd
Original Assignee
Zhongqi Zhilian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongqi Zhilian Technology Co ltd
Priority to CN202310833543.0A
Publication of CN116561650A
Application granted
Publication of CN116561650B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/80 Information retrieval; Database structures therefor; File system structures therefor of semi-structured data, e.g. markup language structured data such as SGML, XML or HTML
    • G06F16/83 Querying
    • G06F16/835 Query processing
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G06F18/24 Classification techniques
    • G06F40/00 Handling natural language data
    • G06F40/10 Text processing
    • G06F40/12 Use of codes for handling textual entities
    • G06F40/14 Tree-structured documents
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention discloses a tree-structure-based method, device and equipment for classifying and updating scene files. The method comprises the following steps: acquiring a scene file to be classified and storing it in a data pool in the cloud; searching all existing classification files for a target classification file that matches the tag file, where the target classification file presents, in the form of a classification tree, the tag data corresponding to each entity object in all existing scene files of the target scene library; the target classification file comprises multiple levels of nodes, each node comprises a classification identifier and the corresponding tag data, and child nodes inherit the tag data of their parent nodes; classification identifiers are added to the scene file, and the scene file is stored in the target scene library with the added identifiers as its classification result. The invention enables automatic classification of scene files, improves the efficiency of scene file management, and facilitates later improvements to scene file retrieval efficiency.

Description

Scene file classification and updating method, device and equipment based on tree structure
Technical Field
The present invention relates to the field of data identification and representation, and in particular, to a method, an apparatus, and a device for classifying and updating a scene file based on a tree structure.
Background
Intelligent connected vehicles, as a product of the convergence of new-generation information technology and transportation, are an important part of China's technology-driven effort to build a strong transportation nation. Before intelligent connected vehicles can be mass-produced, their safety must be verified with simulation software. Intelligent-driving simulation verification is a key step in this safety verification, and scene files are the essential basis for simulation verification, so scene files and intelligent-driving simulation must be combined organically.
At present, a large number of scene files are scattered across the local storage of individual enterprises. Existing scene files therefore cannot be managed in a unified way and suffer from disordered data, blurred boundaries and data redundancy, which makes retrieval during the scene-search stage inefficient for users and seriously hinders the mass production and commercial use of intelligent connected vehicles.
In view of this, the present invention has been made.
Disclosure of Invention
To solve the above technical problems, the invention provides a tree-structure-based scene file classification and updating method, device and equipment. Because a classification file presents, as a classification tree, the tag data corresponding to each entity object in all existing scene files of the corresponding scene library, the tag file can be matched quickly against a target classification file, the scene file can be classified accordingly, and the technical problems of inconvenient scene file management and data redundancy in the prior art are solved.
The embodiment of the invention provides a scene file classification and updating method based on a tree structure, which comprises the following steps:
acquiring a scene file to be classified, and storing the scene file to be classified in a data pool of a cloud;
searching all existing classification files for a target classification file that matches the tag file; the tag file is the file generated by adding tags to the scene file, and the target classification file presents, in the form of a classification tree, the tag data corresponding to each entity object in all existing scene files of the target scene library; the target classification file comprises multiple levels of nodes, each node comprises a classification identifier and the corresponding tag data, and child nodes inherit the tag data of their parent nodes; the target scene library is one of all existing classification scene libraries, and each classification scene library corresponds to one classification file;
in the searching process, determining nodes containing all labels in the label file in the classification file as target nodes, and determining the classification file in which the target nodes are located as a target classification file;
in the target classification file, if the target node has a sub-level node and any tag data of the sub-level node of the target node does not appear in the tag file, adding a classification identifier of the target node to the scene file;
In the target classification file, if a parent node exists in the target node, adding a classification identifier of the target node and a classification identifier of the parent node of the target node to the scene file;
classifying the scene file into a category corresponding to the added classification identifier in the target scene library according to the added classification identifier;
responding to a first updating operation on the target classification file, and, when the type of the first updating operation is deleting the target node and the target node has a parent node, adding a new classification identifier to the scene file so as to reclassify the scene file;
and updating the tag file in response to a second updating operation on the tag file.
The embodiment of the invention provides a tree-structure-based scene file classification and updating device, which comprises:
the acquisition and storage module is used for acquiring the scene files to be classified and storing the scene files to be classified in a data pool of the cloud;
the searching module is used for searching all existing classification files for the target classification file that matches the tag file; the tag file is the file generated by adding tags to the scene file, and the target classification file presents, in the form of a classification tree, the tag data corresponding to each entity object in all existing scene files of the target scene library; the target classification file comprises multiple levels of nodes, each node comprises a classification identifier and the corresponding tag data, and child nodes inherit the tag data of their parent nodes; the target scene library is one of all existing classification scene libraries, and each classification scene library corresponds to one classification file;
The searching module is further used for determining nodes containing all labels in the label file in the classification file as target nodes in the searching process, and determining the classification file in which the target nodes are located as a target classification file;
the first adding module is used for adding the classification identifier of the target node to the scene file if the target node has a sub-level node and any tag data of the sub-level node of the target node does not appear in the tag file in the target classification file;
the second adding module is used for adding the classification identifier of the target node and the classification identifier of the target node's parent node to the scene file if, in the target classification file, the target node has a parent node;
the classification module is used for classifying the scene files into categories corresponding to the added classification identifiers in the target scene library according to the added classification identifiers;
the adding classification module is used for responding to a first updating operation of the target classification file, and adding a new classification identifier for the scene file again when the type of the first updating operation is a deletion target node and the target node has a parent node so as to realize the reclassification of the scene file;
And the updating module is used for responding to a second updating operation of the tag file and updating the tag file.
The embodiment of the invention provides electronic equipment, which comprises:
a processor and a memory;
the processor is configured to execute the steps of the tree structure-based scene file classification and updating method according to any of the embodiments by calling a program or an instruction stored in the memory.
An embodiment of the present invention provides a computer readable storage medium storing a program or instructions for causing a computer to execute the steps of the tree structure-based scene file classification and updating method according to any of the embodiments.
The embodiment of the invention has the following technical effects:
The scene files to be classified are stored centrally in a data pool in the cloud, and the target classification file is determined by matching the tag file against the existing classification files, which are organized as classification trees, so that the scene file can be classified within the target scene library. The invention achieves unified management and classification of scene files on the basis of one data pool and multiple classification scene libraries. On this basis, matching tag files against classification files, updating the tag files and updating the classification files improve classification accuracy and efficiency, avoid the technical problems that scene files cannot be managed in a unified way and suffer from disordered data, blurred boundaries and data redundancy, improve the management efficiency and classification accuracy of scene files, and facilitate later improvements to scene file retrieval efficiency.
Drawings
In order to describe the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. The drawings described below are obviously only some embodiments of the present invention, and a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flowchart of a method for classifying and updating scene files based on a tree structure according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a relationship between a data pool and a categorized scene library in the cloud;
FIG. 3 is a schematic diagram of the structure of a classification tree;
FIG. 4 is a schematic flow chart of searching a target classification file according to an embodiment of the present invention;
fig. 5 is a schematic flow chart of generating a tag file corresponding to a scene file to be classified according to an embodiment of the present invention;
FIG. 6 is a schematic flow chart of generating a target classification file according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a device for classifying and updating scene files based on a tree structure according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention are described clearly and completely below. It will be apparent that the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art from these embodiments without inventive effort fall within the scope of the invention.
The tree-structure-based scene file classification and updating method provided by the embodiments is mainly suitable for classifying and storing scene files before the scene files are screened and used to test target vehicles. The method can be executed by electronic equipment that is either integrated in the vehicle's main controller or independent of the target vehicle.
Example 1:
The tree-structure-based scene file classification and updating method of this embodiment relies on a scene management platform, which is built around the following four types of files: scene files, tag files, tag templates and classification files.
Wherein, the scene file refers to: the OpenDRIVE static scene file and the OpenSCENARIO dynamic scene file can be uploaded in a combined mode or independently.
A tag file, or tag tree, is the file formed by all the tags attached to one scene file after that scene file has been annotated with different tags. The tag file is stored together with the scene file in a database (e.g., a DB database); the scene file and the tag file belong to the same scene.
Tag templates are the label templates in the tag management module of the platform (i.e., the second functional module described below), such as ODD / SOTIF / custom tags; each type of tag template has its own tag tree file stored in the database. It should be noted that the tag tree files are stored in the database as tree structures, but the tag templates corresponding to them serve as reference files for generating the tag tree of each scene: the tag tree of a scene consists of tags selected from the tag templates and inherits the hierarchical relationship of the tag tree file. For example, a tag template may contain 500 tags (the full set), while the tag tree of one scene may use only 30 of them; those 30 tags keep their original hierarchical relationship (a small sketch of this derivation follows below).
A classification file, or classification tree: each class in the classification tree contains parent and child nodes, and each level consists of a level name (i.e., a node name, which can serve as a classification identifier) and a corresponding set of labels. The categories and levels together form the classification tree.
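To make the relationship between a tag template and a per-scene tag tree concrete, the following minimal Java sketch prunes a template tree down to the labels selected for one scene while preserving the template's hierarchy. The class and method names are illustrative assumptions, not part of the patent.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Set;

    // Minimal sketch: prune a label-template tree down to the labels selected
    // for one scene, keeping the template's parent/child hierarchy.
    class LabelNode {
        final String label;
        final List<LabelNode> children = new ArrayList<>();
        LabelNode(String label) { this.label = label; }
    }

    final class TagTreeBuilder {
        // Returns a copy of the template subtree containing only selected labels
        // (plus the ancestors needed to keep them connected), or null if nothing
        // under this node was selected.
        static LabelNode prune(LabelNode template, Set<String> selected) {
            List<LabelNode> keptChildren = new ArrayList<>();
            for (LabelNode child : template.children) {
                LabelNode kept = prune(child, selected);
                if (kept != null) keptChildren.add(kept);
            }
            if (!selected.contains(template.label) && keptChildren.isEmpty()) {
                return null;
            }
            LabelNode copy = new LabelNode(template.label);
            copy.children.addAll(keptChildren);
            return copy;
        }
    }

In this reading, a 500-tag template from which 30 tags are selected yields a 30-node tag tree with the same ancestry as the template.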
The platform mainly comprises four functional modules. The first functional module is used for scene upload and executes the following process: select a tag library, upload the scene file, and store the scene file in the cloud data pool. By storing scene files in the cloud, the platform achieves cloud management of massive scene files and solves the problem that enterprises' local scene libraries are scattered and cannot be managed uniformly.
The second functional module is used for scene annotation and tag template management. The purpose of scene annotation is to generate the tag file corresponding to a scene file through automatic and/or manual annotation, relying on the tag libraries already built into the platform. This embodiment annotates the key features of scene files, allows more scientific and reasonable classification management, solves the problems of disordered data, blurred boundaries and redundant data in current scene libraries, and thereby enables efficient retrieval and application of scene files.
The third functional module is used for scene classification and executes the following process: relying on the existing classification file library in the platform, classification files are matched from that library, the target classification files that match the scene file are listed, and the scene file is then moved into the target scene library corresponding to the target classification file.
The fourth functional module is used for scene file retrieval, specifically screening scene files and then applying them. By retrieving scene files through their tag data, key features can be combined and scene files that meet test and verification requirements can be found quickly, which greatly improves the efficiency of applying scene files.
The functions of the four functional modules are analyzed in detail in the embodiments below and are not expanded here.
Fig. 1 is a flowchart of a method for classifying and updating a scene file based on a tree structure according to an embodiment of the present invention. Referring to fig. 1, the method for classifying and updating scene files based on a tree structure specifically includes the following steps:
s10, acquiring a scene file to be classified, and storing the scene file to be classified in a data pool of a cloud.
In this embodiment, a scene file, or scene data, contains at least one entity object, or node, and the entities may be static and/or dynamic. Static entities include, but are not limited to: nodes of road structures, traffic facilities, temporary traffic changes, traffic participants and the natural environment; dynamic entities include, but are not limited to: nodes corresponding to traffic-light control logic and nodes corresponding to events.
In this embodiment, a scene file is a well-formed, structured, standard simulation scene file, which may be described in a structured extensible markup language. The scene file to be classified, i.e., the uploaded scene file, may be a single OpenX file such as a static road network file (an OpenDRIVE file) or a dynamic scene file (an OpenSCENARIO file), or a combined OpenX file. Uploading single files means uploading the .xodr file and the .xosc file separately, which creates two corresponding scene files. Uploading a combined file means uploading the .xodr file and the .xosc file as one compressed file, which creates a single scene file; the platform then automatically scans the scene file against the tag library, generates the tag file, and prompts the matched target classification files.
Therefore, this embodiment supports uploading single OpenDRIVE or OpenSCENARIO files, and supports uploading an OpenDRIVE plus OpenSCENARIO combination as a zip archive.
On the upload interface, a file-selection icon can be provided; the user clicks the icon and quickly adds the scene file by browsing to its path. After upload, this embodiment can display the name of the uploaded scene file, its format, the name of the uploader, and so on. On this interface, the user can perform operations such as classification and deletion on the scene file.
In this embodiment, one data pool corresponds to a plurality of scene libraries, and because the classification files corresponding to different scene libraries are different, the scene libraries are also referred to as classification scene libraries, and the relationship between the data pool in the cloud and the classification scene libraries is shown in fig. 2. It should be noted that the data in the classification scene library only references the data in the data pool.
The embodiment supports cloud storage and cloud management of the scene files, and meets the application requirements of high concurrency of multiple users. In the process of managing a specific scene file, the embodiment adopts a storage mode of a data pool and a plurality of classified scene libraries. After uploading, the scene files are all stored in the same data pool, then scene classification is carried out according to different classification rules, finally the scene files are stored in different classification scene libraries, and the scene files are displayed in the form of files under folders corresponding to the different classification scene libraries.
For example, when there are 3 classification scene libraries, the platform interface displays folder 1 for classification scene library 1, folder 2 for classification scene library 2 and folder 3 for classification scene library 3. It should be noted that this embodiment may indicate whether a folder contains scene files through different display styles: for example, folder 1 contains 3 scene files and folder 2 contains 1 scene file, so both are shown in a first display style, while folder 3 is empty and is shown in a second display style.
In addition, when the user hovers the mouse over a folder, common folder operations such as forwarding, deleting, exporting and renaming may be displayed on the folder's icon.
In step S10, the scene files to be classified are stored in the data pool of the cloud, and then classified into the target scene library according to the classification rule.
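The one-data-pool / multiple-scene-library storage design described above (each classification scene library only references the files held in the data pool) can be sketched as follows. The class names and the use of in-memory maps are assumptions made for illustration only.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Sketch: scene files are stored once in the cloud data pool; each
    // classification scene library stores only references (scene IDs).
    final class DataPool {
        private final Map<String, byte[]> filesById = new HashMap<>();
        void put(String sceneId, byte[] content) { filesById.put(sceneId, content); }
        byte[] get(String sceneId) { return filesById.get(sceneId); }
    }

    final class ClassificationSceneLibrary {
        private final String name;
        // classification identifier -> IDs of the scene files classified under it
        private final Map<String, List<String>> categories = new HashMap<>();
        ClassificationSceneLibrary(String name) { this.name = name; }
        String name() { return name; }
        void classify(String classificationId, String sceneId) {
            categories.computeIfAbsent(classificationId, k -> new ArrayList<>()).add(sceneId);
        }
    }

Because the libraries hold only scene IDs, moving a scene between libraries or classifying it into several libraries does not duplicate the file itself.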
S30, searching target classification files matched with the tag files from all existing classification files, determining nodes containing all tags in the tag files in the classification files as target nodes in the searching process, and determining the classification files where the target nodes are located as target classification files.
Here, the tag file is the file generated by adding tags to the scene file, and the target classification file presents, in the form of a classification tree, the tag data corresponding to each entity object in all existing scene files of the target scene library; the target classification file comprises multiple levels of nodes, each node comprises a classification identifier and the corresponding tag data, and child nodes inherit the tag data of their parent nodes.
It should be understood that child-level nodes are simply called child levels or subclasses, and parent-level nodes are simply called parents or parent classes. In addition, one node corresponds to one category, so the classification file reflects that the corresponding classification scene library contains multiple categories, that different categories may sit at different levels, and that different classifications may exist at the same level.
In this embodiment, the label inheritance mechanism is essentially the same as inheritance in object-oriented programming and has the following advantages: 1. labels are shared, which reduces the workload of user-defined labels; 2. a subclass is similar to its parent class but may have characteristics of its own: if a scene's labels contain only the parent-class labels, the scene belongs to the parent class, and if the scene has both the parent-class and the subclass labels, it belongs to both the parent class and the subclass; 3. the hierarchy is clear, so scenes can be classified and filtered by parent level and child level, which gives scene classification a stricter standard.
From an implementation point of view, when classifying a scene there is no need to check whether the rule being matched is the deepest level: as long as the scene's tags contain all the tags of one classification rule (i.e., one class), the scene belongs to that classification rule. Because of the inheritance relationship between tags in the classification rules, a scene may belong to several classification rules (i.e., several classes) at once, so this embodiment adopts a specific method to construct the classification tree.
In this method, a tag template set consisting of several tag templates is constructed, whether the templates need to be sorted is determined, root nodes are filtered out, the remaining nodes are grouped by parent node, and child nodes are built recursively. When recursively constructing the child nodes of the tree structure, this embodiment may, in operation, define a root node, group by parent node and decide whether to sort, as sketched below.
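A minimal Java sketch of this construction step is given below, assuming the classification nodes are first read as flat records with an ID, a parent ID, a name and a label list; the record and class names are illustrative, not the patent's data model.

    import java.util.ArrayList;
    import java.util.Comparator;
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    // Sketch: flat node records are grouped by parent ID and the tree's child
    // nodes are built recursively, optionally sorting each sibling group.
    record NodeRecord(String id, String parentId, String name, List<String> labels) {}

    class ClassificationTreeNode {
        final NodeRecord data;
        final List<ClassificationTreeNode> children = new ArrayList<>();
        ClassificationTreeNode(NodeRecord data) { this.data = data; }
    }

    final class ClassificationTreeFactory {
        static List<ClassificationTreeNode> build(List<NodeRecord> records, boolean sort) {
            Map<String, List<NodeRecord>> byParent = records.stream()
                    .collect(Collectors.groupingBy(r -> r.parentId() == null ? "" : r.parentId()));
            return buildChildren("", byParent, sort); // "" stands for the virtual root
        }

        private static List<ClassificationTreeNode> buildChildren(
                String parentId, Map<String, List<NodeRecord>> byParent, boolean sort) {
            List<NodeRecord> group = byParent.getOrDefault(parentId, List.of());
            if (sort) {
                group = group.stream().sorted(Comparator.comparing(NodeRecord::name)).toList();
            }
            List<ClassificationTreeNode> nodes = new ArrayList<>();
            for (NodeRecord r : group) {
                ClassificationTreeNode node = new ClassificationTreeNode(r);
                node.children.addAll(buildChildren(r.id(), byParent, sort));
                nodes.add(node);
            }
            return nodes;
        }
    }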
The target scene library is one of all existing classification scene libraries, and each classification scene library corresponds to one classification file.
Alternatively, the classification scene libraries in this embodiment may be divided by automated-driving function into: an ACC scene library, an LKA scene library, an APA scene library, an ALK scene library, an AEB scene library, and so on.
Optionally, the classification scene library in the present embodiment may be further classified according to scene characteristics, including but not limited to: whether there is an object, road type, weather type, etc.
Optionally, the classification scene library in this embodiment may also be divided according to the behavior of dynamic entities, where such behavior includes, but is not limited to: cut-in, cut-out, speed keeping, lane change, hard acceleration, hard deceleration, and so on.
It should be noted that, the classification tree corresponding to the classification scene library has no fixed classification principle, and the user can define how to classify the scenes in the scene library according to the user's own use requirement and test requirement.
The target classification file is a tree-structured classification tree, and the tag file is a tree-structured tag tree; the difference between them is that the tag tree is the file generated by adding tags to the scene file currently being classified.
S100, in the target classification file, if the target node has a sub-level node and any tag data of the sub-level node of the target node does not appear in the tag file, adding the classification identification of the target node to the scene file.
S110, in the target classification file, if a parent node exists in the target node, adding the classification identification of the target node and the classification identification of the parent node of the target node to the scene file.
It should be understood that a classification file, or classification hierarchy, corresponds to a classification tree (a classification tree contains multiple classifications, i.e., a classification tree contains multiple categories), and a classification file corresponds to a classification scene library. The platform is internally provided with a plurality of classification scene libraries and corresponding classification files.
The classification tree refers to a representation form of classifying all scene files in the same scene library according to a preset method or rule by a user. The automatic classification function relies on a classification tree, as shown in fig. 3, which has multiple levels, with multiple labels in a label library making up each node in the classification. That is, the classification file is made up of a plurality of nodes, each node containing a corresponding tag.
In this embodiment, the nodes of the classification tree have inheritance relationships, because each node has a corresponding parent class and/or subclass. As shown in FIG. 3, class A is a parent node and A-2 is a child node; after saving, a classification file A is generated, which contains the nodes and the labels each node holds. Specifically, subclass A-2 contains the two labels "rainy day" and "4 lanes", and subclass A-2-2 contains the three labels "rainy day", "4 lanes" and "dark", where "rainy day" and "4 lanes" are inherited by subclass A-2-2 from subclass A-2, and "dark" is the label added on subclass A-2-2 itself.
In this embodiment, the tree structure of the classification tree and the inheritance relationship between child and parent levels serve the scene classification rules (i.e., the classification rules): the repeated work of adding labels at every level of the classification is avoided, child levels automatically inherit the labels of their parent level, the user's workload is reduced, and the amount of data stored in the cloud is reduced. This inheritance rule is sketched below.
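The A / A-2 / A-2-2 example can be expressed as a small Java sketch in which a node's effective label set is its parent's effective labels plus its own added labels; the class name and fields are assumptions for illustration.

    import java.util.LinkedHashSet;
    import java.util.Set;

    // Sketch of label inheritance: a child's effective labels are the parent's
    // effective labels plus the labels added on the child itself.
    class CategoryNode {
        final String classificationId;
        final Set<String> ownLabels;   // labels added on this node only
        final CategoryNode parent;     // null for a top-level category

        CategoryNode(String classificationId, Set<String> ownLabels, CategoryNode parent) {
            this.classificationId = classificationId;
            this.ownLabels = ownLabels;
            this.parent = parent;
        }

        Set<String> effectiveLabels() {
            Set<String> labels = new LinkedHashSet<>();
            if (parent != null) labels.addAll(parent.effectiveLabels());
            labels.addAll(ownLabels);
            return labels;
        }
    }

With this sketch, a node built as new CategoryNode("A-2-2", Set.of("dark"), a2), where a2 carries "rainy day" and "4 lanes", reports all three labels from effectiveLabels().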
The tag tree consists of the tags corresponding to the entity objects in the scene file to be classified, and each tag comes from the tag library. In a generated tag tree, the scene class means that the tags define entity objects in the scene file, such as lane type, number of lanes and weather conditions; the non-scene class means that the tags describe other characteristics of the scene, such as risk level, commonality and system failure conditions; the mixed class, consisting of scene and non-scene classes, means the tag tree contains both types of tags.
For example, the scene files for ACC function verification corresponding to a classification file cover expressways and urban roads, and the expressway category can be subdivided further. To implement automatic classification, this embodiment may add labels to each category node, e.g., a "night speed limit" category whose labels include a "dim light" label, and so on.
In summary, after a scene file is uploaded it enters the data pool and a tag file is generated by automatic scanning; the platform matches the tag file against all existing classification files, prompts the classifications the scene file satisfies once matching is complete, and the user can review them on a to-be-processed menu page. For example, the target classification files matched by the tag file of scene file 1 include: classification file-1, classification file-2 and classification file-3. After reviewing them, the user can select which classification scene library to move the scene file into.
S40, classifying the scene file into the category corresponding to the added classification identifier in the target scene library according to the added classification identifier.
In this embodiment, the number of the cloud data pools is one, and the number of the scene libraries is multiple, so that the purpose of this embodiment is to select a target scene library from multiple scene libraries, and then select a category corresponding to the added classification identifier from multiple categories of the target scene library, so as to implement fine-grained classification of the scene file.
In addition, because different categories rely on different tag data, the classification file can establish the classification rules of the corresponding classification scene library through the position of each node in the tree structure, the relationships between nodes, the nodes' classification identifiers and the nodes' tag data. By matching the tag tree of any scene uploaded to the platform against the classification tree, each scene is automatically assigned to a category in the corresponding classification scene library, which automates the whole process from scene file upload to scene classification.
S120, responding to a first updating operation of the target classified file, and when the type of the first updating operation is deleting the target node and the target node has a parent node, adding a new classified identification for the scene file again to realize the reclassification of the scene file.
S130, updating the tag file in response to a second updating operation of the tag file.
This embodiment achieves unified management and classification of scene files on the basis of one data pool and multiple scene libraries. On this basis, matching tag files against classification files, updating the tag files and updating the classification files improve classification accuracy and efficiency, avoid the technical problems that scene files cannot be managed in a unified way and suffer from disordered data, blurred boundaries and data redundancy, improve the management efficiency and classification accuracy of scene files, and facilitate later improvements to scene file retrieval efficiency.
In a specific embodiment, after step S40, the method for classifying and updating scene files based on a tree structure further includes the following steps, wherein:
s50, receiving tag data input by the input device, and screening out a scene file corresponding to the tag data input by the input device from a target scene library when the tag data input by the input device is matched with the target classification file.
S60, testing the target vehicle by using a scene file corresponding to the tag data input by the input equipment, and obtaining a test verification result.
Through the automatic classification of the scene files, the embodiment can improve the management efficiency of the scene files, and on the basis, the embodiment can also improve the retrieval efficiency of the scene files, thereby providing convenience for quick retrieval of users.
This embodiment relies on the platform and provides a service invocation architecture for data calls between the data pool and the classification scene libraries, thereby implementing the tree-structure-based scene file classification and updating method. The service invocation architecture comprises a registry server (Registry), a Java service (Java-Service) and a .NET service; the Java service contains an embedded Java local registry client and a Java application client, and the .NET service contains an embedded .NET local registry client and a .NET application server. The call flow between the services is as follows:
(1) The Java service sends a registration request to the registry server; (2) the registry server sends a subscription request to the .NET service; (3) the registry server sends a notification to the .NET service; (4) the Java service then invokes the .NET service.
The Java-side call flow is as follows: (a) the user uploads the scene file through the Java service. (b) The Java service reads the scene file and performs verification and preprocessing; verification includes, but is not limited to, checking the file format and size limits of the scene file, while preprocessing refers to decompressing the file when it is a zip archive, renaming it when its name clashes with an existing scene file, moving and archiving files, and storing the related data in the DB database server. (c) The scene file and other data (such as the classification files and tag libraries) are serialized, and the Java local registry client communicates with the registry server (Registry) to invoke the registry service and obtain the list of .NET service addresses. (d) The Java local registry client selects a specific service address (or a designated service address) through a load-balancing algorithm and invokes the scene analysis interface. (e) The .NET service parses the scene file and deserializes it to obtain the entity objects. (f) The .NET service matches, through a matching algorithm, the classification file corresponding to the scene file's tag file; the classification file expresses the classifications and labels it contains in the form of a classification tree. (g) The .NET service returns the analysis result to the Java service. (h) The Java service performs final processing on the result, serializes the scene file, the scene information, the matched classification file information and the tag information, and stores them in the DB database server. A small sketch of the Java-side call into the .NET analysis service follows.
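A minimal Java sketch of steps (c)-(d), in which the registry client supplies the .NET service addresses and one address is chosen by a simple load-balancing strategy, is shown below. The interface names, the service name string and the random selection strategy are illustrative assumptions rather than the platform's actual API.

    import java.util.List;
    import java.util.concurrent.ThreadLocalRandom;

    // Sketch of the Java-side call into the .NET scene-analysis service.
    interface RegistryClient {
        List<String> discover(String serviceName);   // returns candidate service addresses
    }

    interface SceneAnalysisClient {
        String parseScene(String address, byte[] serializedSceneAndTagLibrary);
    }

    final class SceneClassificationCaller {
        private final RegistryClient registry;
        private final SceneAnalysisClient client;

        SceneClassificationCaller(RegistryClient registry, SceneAnalysisClient client) {
            this.registry = registry;
            this.client = client;
        }

        String classify(byte[] payload) {
            List<String> addresses = registry.discover("net-scene-analysis");
            // simplest possible load balancing: pick one address at random
            String address = addresses.get(ThreadLocalRandom.current().nextInt(addresses.size()));
            return client.parseScene(address, payload);
        }
    }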
This service architecture guarantees the data calls between the data pool and the classification scene libraries, and thus provides technical support for accurate classification of scene files.
Example 2:
fig. 4 is a schematic flow chart of searching for a target classification file according to an embodiment of the present invention. On the basis of the above embodiment 1, the description is made exemplarily for searching for the target classification file matching the tag file. Referring to fig. 4, the present embodiment includes the following steps, in which:
s301, analyzing the tag file to obtain an analysis result.
It should be understood that the generation of the tag file is described in embodiment 3 below and is not repeated here. The OpenLABEL content in the tag file of a scene file is entirely in JSON format; JSON is a lightweight data-interchange format, a subset based on the ECMAScript specification, that stores and represents data in a text format completely independent of any programming language, which makes it easy for computers to parse and generate. Therefore, in this embodiment, a program can quickly parse the content of the tag file through OpenLABEL and use it to implement scene classification.
In this embodiment, the binary stream of the OpenLABEL file corresponding to the scene file is obtained from the scene ID, the binary stream is converted into a character string according to the encoding, the character string is deserialized into an OpenLABEL object with a JSON tool, and finally the tag information of the OpenLABEL object is extracted with a preset LINQ method to form an array (i.e., the parsing result). A rough Java analogue of this step is sketched below.
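The patent's implementation uses a .NET JSON tool and LINQ; the following Java sketch with Jackson only illustrates the same idea (bytes to string, string to JSON tree, tag values collected into a list). The field name used to locate tags is a simplifying assumption, not the actual OpenLABEL layout.

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    import java.nio.charset.StandardCharsets;
    import java.util.ArrayList;
    import java.util.List;

    // Sketch: turn the stored binary stream back into text, parse the OpenLABEL
    // JSON and collect the textual values of a given tag field into an array.
    final class TagFileParser {
        private static final ObjectMapper MAPPER = new ObjectMapper();

        static List<String> extractTags(byte[] openLabelBytes, String tagFieldName) throws Exception {
            String json = new String(openLabelBytes, StandardCharsets.UTF_8);
            JsonNode root = MAPPER.readTree(json);
            List<String> tags = new ArrayList<>();
            collect(root, tagFieldName, tags);
            return tags;
        }

        private static void collect(JsonNode node, String fieldName, List<String> out) {
            if (node.isObject()) {
                JsonNode value = node.get(fieldName);
                if (value != null && value.isTextual()) out.add(value.asText());
            }
            for (JsonNode child : node) collect(child, fieldName, out); // arrays and object values
        }
    }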
S302, acquiring an existing classification file.
This embodiment obtains the classification files from the front end through a write interface; a classification file comprises multiple levels of nodes, and each node comprises a classification identifier and the corresponding concrete tag data. Since each node corresponds to one classification rule, the node's classification identifier is also called a classification ID or classification rule ID.
S303, searching a target classification file matched with the analysis result from the existing classification files by adopting an iteration method.
The objective of this embodiment is to intersect the results of step S301 and step S302 and, using an iterative method combined with a matching algorithm, determine whether the scene file belongs to the category corresponding to a given node.
It should be noted that when adding a classification rule ID to the scene file, it must be determined whether the target node containing the currently matched tags has a parent node: if it does, the classification ID of the target node's parent is added to the scene file in addition to the target node's own classification ID; if it does not, only the classification ID of the currently matched target node is added.
For example, tag A of scene X is matched against the existing classification rule B by the matching algorithm; if they do not match, the next iteration is performed. If classification rule B matches, it is checked whether B has a child level, and if it does, the classification IDs of B and of B's matching child level are added to scene X. If no child level exists, it is checked whether only B matched; if so, it is checked whether B has a parent level, and if it does, the classification IDs of the parent level and of B are added to scene X. A sketch of this matching loop follows.
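A minimal Java sketch of the matching rule (a scene belongs to a node when the scene's tag set contains all of that node's labels, and matched nodes also contribute their ancestors' classification IDs) is shown below; the node class and method names are assumptions for illustration.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Set;

    // Sketch of the iterative matching step: every node whose (inherited + own)
    // labels are all present in the scene's tags contributes its classification
    // ID, and its ancestors' IDs are added as well.
    final class SceneMatcher {

        static final class Node {
            final String classificationId;
            final Set<String> labels;   // effective labels: inherited plus own
            final Node parent;          // null for a top-level category
            Node(String classificationId, Set<String> labels, Node parent) {
                this.classificationId = classificationId;
                this.labels = labels;
                this.parent = parent;
            }
        }

        static List<String> matchClassificationIds(Set<String> sceneTags, List<Node> allNodes) {
            List<String> ids = new ArrayList<>();
            for (Node node : allNodes) {
                if (!sceneTags.containsAll(node.labels)) continue; // not matched, next iteration
                addOnce(ids, node.classificationId);
                for (Node p = node.parent; p != null; p = p.parent) addOnce(ids, p.classificationId);
            }
            return ids;
        }

        private static void addOnce(List<String> ids, String id) {
            if (!ids.contains(id)) ids.add(id);
        }
    }

Under this reading, a child node whose extra label is absent from the scene simply fails its own subset check, so the scene stays at the parent's classification, which is consistent with step S100 above.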
This embodiment of the application provides a concrete way to match the tag file against the classification files; with it, the target classification file that matches the parsing result can be found accurately among the existing classification files, which improves the efficiency and accuracy of scene file classification.
Example 3:
fig. 5 is a schematic flow chart of generating a tag file corresponding to a scene file to be classified according to an embodiment of the present application. On the basis of the above embodiment 1, generating a tag file includes: s20, according to a preset scene labeling mode, scene labeling is carried out on the scene files to be classified, and a label file corresponding to the scene files to be classified is generated.
In this embodiment, the preset scene annotation modes include an automatic annotation mode and a manual annotation mode. A scene file may be given multiple tags from multiple tag templates, also referred to as a tag library. Because there are very many tags, the platform has a rich built-in tag library covering SOTIF tags, ODD tags, scene tags, accident tags, violation tags and so on, and scene tags can be added automatically to scene files that conform to the OpenX standard. The SOTIF tags comprise 188 tags, the ODD tags comprise 329 tags and the scene tags comprise 522 tags; because of the large number of tag types, they are not listed individually in this embodiment.
As for how the tags were derived: the ODD tags come mainly from expert experience, with reference to the international standard ISO 34503 and the national standard on design operating conditions for automated driving systems; the SOTIF tags come mainly from an understanding of safety of the intended functionality and from expert experience; the scene tags are based on many years of understanding of scenes and are obtained by decomposing and refining each layer of the six-layer scene description framework downward into a tag system.
For example, an ODD tag template, a scene tag template, a SOTIF tag template and other custom tag templates are added to scene A; the user can select multiple pieces of tag data from these templates and annotate the scene with the tag file formed from all selected tag data. This embodiment divides the tag library into three types by tag type: non-scene class, scene class, and a mixed class consisting of the two. The correspondence between tag libraries, annotation modes and tag types is shown in Table 1 below:
Table 1 - Correspondence between tag libraries, annotation modes and tag types
Thus, the preset scene annotation modes are divided into automatic annotation and manual annotation. Scene-class tags can be added either automatically or manually, whereas non-scene-class tags can only be added manually.
When selecting tag templates, the scene tag library is selected by default, and other tag libraries can be chosen by the user as needed. The user can also remove selected tag templates according to actual requirements.
Step S20, in which scene annotation is performed on the scene file to be classified according to the preset annotation mode and the corresponding tag file is generated, is illustrated below by way of example. Referring to FIG. 5, this embodiment comprises the following steps:
s201, splitting a scene file to be classified into an OpenDRIVE file and an OpenSCENARIO file.
In this embodiment, the OpenDRIVE file is referred to as an OpenDRIVE static scene file, and the openscanio file is referred to as an openscanio dynamic scene file.
S202, identifying entity objects in an OpenDRIVE file and entity objects in an OpenSCENARIO file.
It should be understood that the scene file includes the XML-format text of OpenDRIVE and OpenSCENARIO; this embodiment first converts the file content into entity objects so that scene tags can be matched conveniently through the tag templates.
S203, labeling each entity object in the OpenDRIVE file and the OpenSCENARIO file according to the specified label template to obtain label data.
In the embodiment of the application, the appointed label template is an existing label library in a platform selected by a user according to requirements.
S204, carrying out serialization processing on all the tag data, and outputting the tag file in the OpenLABEL format.
In the embodiment of the application, the label file in the OpenLABEL format is called an OpenLABEL file, and the label file is a label tree with a tree structure.
In this embodiment, the purpose of steps S201 to S204 is to automatically annotate standardized simulation scene files (including OpenDRIVE static scene files and OpenSCENARIO dynamic scene files) through semantic analysis and mapping relationships (including extraction algorithms), create a tag tree for each simulation scene file, and store it in a standardized OpenLABEL file. A high-level sketch of this pipeline is given below.
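The following Java sketch expresses steps S201-S204 as a pipeline of interfaces (split, extract entities, match labels, serialize to OpenLABEL); all interface and method names are illustrative assumptions, not the platform's actual API.

    import java.util.List;
    import java.util.Set;

    // Sketch of the annotation pipeline S201-S204.
    interface SceneSplitter {
        record SplitResult(String openDriveXml, String openScenarioXml) {}
        SplitResult split(byte[] uploadedSceneFile);                                // S201
    }

    interface EntityExtractor {
        List<Object> extractEntities(String openDriveXml, String openScenarioXml);  // S202
    }

    interface LabelMatcher {
        Set<String> matchLabels(List<Object> entities, Set<String> labelTemplates); // S203
    }

    interface OpenLabelWriter {
        byte[] serialize(Set<String> labels);                                       // S204, OpenLABEL JSON
    }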
The preset scene annotation mode comprises the following two modes, and the specific analysis is as follows:
(1) Automatic labeling mode
After the scene file is uploaded, the embodiment automatically scans the label of the uploaded scene file according to a label library built in the platform, generates a corresponding OpenLABEL file according to the automatically marked label, and stores the OpenLABEL file in the platform. Optionally, in a scene tag library built in the platform, tags of scene classes are necessary options, tags such as ODD, SOTIF and the like are selectable options, and a user can add the tags according to own needs.
(2) Manual labeling mode
After the scene file is uploaded, a user can manually add the label according to own needs besides automatically adding the label, update the label of the existing OpenLABEL file and store the label in the platform.
As can be seen from the above description, the scene management platform in this embodiment bridges the definitions and frameworks of the OpenLABEL, OpenDRIVE and OpenSCENARIO standards, making the standards highly compatible with one another, enabling scene files to be exchanged and mutually recognized, and supporting simulation across multiple simulators.
Combining this with the Java-side call flow, the scene annotation flow implemented in steps S201-S204 is analyzed again through the following steps (a)-(d):
(a) The user uploads the scene file in XML format; its file data is read, converted into a binary stream and stored in the data pool.
(b) Based on the file data, it is determined whether the scene file conforms to the XML Schema Definition (XSD) file; if not, a prompt is issued.
(c) If yes, the scene file is converted into an entity object.
In this embodiment, the data structures (including classes, interfaces, enumerations, etc.) of OpenDRIVE and OpenSCENARIO are defined in advance according to the XSD files of the OpenX standard, the logical relationships between the functional modules of the OpenX standard are defined, and every attribute of the fields in the XML file is annotated. This embodiment supports OpenDRIVE version 1.6 and version 1.7. In addition, a static XMLConverter class is written in advance; it is responsible for deserializing character strings into entity objects and loads the XML file using System.
On the one hand, this embodiment uses a BinaryReader to read the raw data of the scene file from the binary stream and converts it into a character string according to a preset encoding. On the other hand, the Deserialize method of XMLConverter is used to deserialize the character string into the OpenDRIVE or OpenSCENARIO entity object. This conversion also checks once more whether the imported scene file conforms to the XSD format, and if the check fails, a warning or prompt is issued. A rough Java analogue of this step is sketched below.
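The patent's implementation relies on .NET classes (BinaryReader, XMLConverter); the Java sketch below only illustrates the equivalent idea of XSD validation plus XML deserialization using JAXB. The OpenDriveRoot class stands in for a type generated from the OpenDRIVE XSD and, like the other names, is an assumption of this sketch.

    import jakarta.xml.bind.JAXBContext;
    import jakarta.xml.bind.Unmarshaller;
    import jakarta.xml.bind.annotation.XmlRootElement;

    import javax.xml.XMLConstants;
    import javax.xml.validation.Schema;
    import javax.xml.validation.SchemaFactory;
    import java.io.File;
    import java.io.StringReader;

    // Placeholder for a class generated from the OpenDRIVE XSD.
    @XmlRootElement(name = "OpenDRIVE")
    class OpenDriveRoot {
        // fields omitted in this sketch
    }

    final class SceneDeserializer {
        // Validates the XML against the given XSD and deserializes it into an entity object.
        static OpenDriveRoot toEntity(String sceneXml, File xsdFile) throws Exception {
            Schema schema = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                                         .newSchema(xsdFile);
            Unmarshaller unmarshaller = JAXBContext.newInstance(OpenDriveRoot.class).createUnmarshaller();
            unmarshaller.setSchema(schema); // throws if the scene file violates the XSD
            return (OpenDriveRoot) unmarshaller.unmarshal(new StringReader(sceneXml));
        }
    }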
(d) And labeling the scene file according to the label template selected by the user, and returning the label file.
This step (d) matches the entity objects in the scene with the tags in the tag templates. The raw data in the scene file must be matched against the content defined in the tag template through calculation, unit conversion and similar means. This embodiment provides the following conversion relationships: (1) OpenSCENARIO is matched to the scene classification conforming to the ISO 34504 standard. (2) OpenDRIVE is matched to the scene classification conforming to the ISO 34504 standard. (3) The tag objects output by the previous two steps are serialized into a JsonString with NewtonJson for persistent storage. (4) The OpenLABEL file is output.
For (1), matching OpenSCENARIO to the ISO 34504 scene classification, this embodiment first defines a data structure conforming to the ISO 34504 standard and the tag data structure used for dynamic scene annotation; when defining the enumeration types, note that the ISO 34504 tag data types may be multi-select enumerations.
For example, the tag data structure of a dynamic entity includes the dynamic entity name, the fourth level as the actual tag (e.g., truck, passenger car, etc.), longitudinal activity (example values: keep speed, which the host vehicle has; follow lane; etc.), initial state (target vehicles other than the host vehicle have this item), the participant's role (target vehicles other than the host vehicle have this item), and enhanced conspicuity (filled in at the last level).
Based on this, the partial field mapping of the ISO 34504 dynamic entities and environment to OpenSCENARIO 1.0 is shown in Table 2:
Table 2 - Field mapping of ISO 34504 dynamic entities
For (2) OpenDRIVE matching with the scene classification conforming to the ISO34504 standard, the embodiment first defines a data structure conforming to the ISO34504 standard and a tag data structure adopted by static road network labeling, and the data structure includes the following data by way of example: the type of area available for travel, the geometry of the area available for travel, the road specifications, the area available for travel markers, the edges of the area available for travel, road markings, etc.
Based on this, the partial field mapping of the ISO 34504 static entities to OpenDRIVE versions 1.6 and 1.7 is shown in Table 3:
Table 3 - Field mapping of ISO 34504 static entities
For (4), outputting the OpenLABEL file, this embodiment first defines the OpenLABEL file-format data structure, then defines a bulk TTL file using the RDF Turtle syntax for OpenLABEL to reference, and finally converts the tags generated in the previous steps into the file format defined by the OpenLABEL standard, serializes them into a JSON string with NewtonJson, and writes the string's byte stream into a text file using System.
In steps S201 to S204 of this embodiment, the scene file is annotated through semantic analysis and structural analysis and a tree-structured tag tree is generated, so that the scene can be classified by comparing the tag tree with the classification tree.
This embodiment provides a concrete flow for generating the tag file corresponding to a scene file, which guarantees that an accurate tag file is generated and provides data support for subsequent fast and efficient matching against the classification files.
Example 4:
fig. 6 is a schematic flow chart of generating a target classification file according to an embodiment of the present invention. Before implementing S10 in the above embodiment 1, the scene file classification further includes generating a target classification file, specifically including the following steps, where:
S70, creating a major class and the multi-level subclass nodes it contains.
It should be understood that the major classes are classifications: each major class corresponds to one classification, and each level of subclass node is a child node of the node at the previous level. When subclass nodes are created, this embodiment provides two display modes, mind map and text outline, on the design interface of the classification tree, and supports the user in adding, deleting and editing subclass nodes.
S80, determining label data of all levels of sub-class nodes.
This embodiment provides configuration rules: detailed rules are configured for each subclass node, and labels are added to a subclass node on a rule-configuration page. The user can select and add labels from the existing tag trees in the current tag library, and the resulting combination serves as the detailed rule of that subclass node. The purpose of step S80 is to add labels to each node with the help of the tag library.
S90, organizing the label data of the sub-class nodes at all levels in the form of a classification tree to generate the target classification file.
It should be understood that the multi-level sub-class nodes may also be referred to as multi-level sub-classes. The target classification file consists of the sub-classification names at each level and the labels they contain. In this embodiment, the target classification file is generated when the classification file is stored.
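As a minimal sketch of what steps S70-S90 could produce (the class names, node names and output layout below are assumptions for illustration, not the format actually used by the platform):

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class ClassNode:
    """One node of the classification tree: a classification name plus its own labels."""
    name: str
    labels: List[str] = field(default_factory=list)
    children: List["ClassNode"] = field(default_factory=list)

def to_classification_file(node: ClassNode) -> Dict[str, Any]:
    """Arrange the per-node label data in classification-tree form (S90)."""
    return {
        "classification": node.name,
        "labels": node.labels,
        "children": [to_classification_file(c) for c in node.children],
    }

# S70/S80: create a major class with sub-class nodes and attach label data.
root = ClassNode("major class 1", ["label A"], [
    ClassNode("sub-classification 1", ["label B", "label C", "label D"]),
    ClassNode("sub-classification 2", ["label B", "label R", "label H"]),
])
target_classification_file = to_classification_file(root)   # S90
```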
In this embodiment, storing the classification file means storing it after the label configuration of every node is completed; the stored classification file is then used for the automatic classification of tag files and, in turn, for the automatic classification of scene files.
After the target classification file is generated, the embodiment creates a folder for the target scene library. The folder is initially empty; once the automatic classification of scene files is completed, the scene files classified into the target scene library are displayed under this folder. Therefore, when the tag file matches the target classification file, the scene file can be classified into the target scene library for storage.
The scene management platform provided by the embodiment of the application offers functions such as scene annotation and automatic scene classification, and a user can manage the classification of scene files and design classification trees through a third functional module.
For classification management, the platform provides various settings for user-defined classification trees. For classification management requests such as creating a classification tree, obtaining a classification tree, adding labels to a classification tree, deleting a classification tree and modifying a classification tree, the embodiment provides a back-end service that responds to the requests and stores the related data on a database server.
When creating a classification tree, the embodiment builds the tree according to scene characteristics; if a node has a parent class, the node is mounted under that parent class. Obtaining a classification tree means reading the related classification tree data and reconstructing the tree according to the parent-child relationships. Adding labels to a classification tree means creating a set of labels according to the classification rules. Deleting a classification tree means deleting the tree together with all sub-classes and label sets under it. Modifying a classification tree means modifying the name of the classification file and the hierarchical relationships of the tree.
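A minimal sketch of how such back-end operations might behave is shown below. The function names and the in-memory dictionary are assumptions made for illustration; the embodiment itself stores the data on a database server:

```python
from typing import Dict, List, Optional

# In-memory stand-in for the stored classification nodes:
# node name -> {"parent": Optional[str], "labels": List[str]}
TREE: Dict[str, Dict] = {}

def create_node(name: str, parent: Optional[str] = None) -> None:
    """Create a classification node; if a parent class exists, mount the node under it."""
    TREE[name] = {"parent": parent, "labels": []}

def add_labels(name: str, labels: List[str]) -> None:
    """Attach a label set to a node according to its classification rule."""
    TREE[name]["labels"].extend(labels)

def delete_tree(name: str) -> None:
    """Delete a node together with all sub-classes (and their label sets) under it."""
    for child in [n for n, v in TREE.items() if v["parent"] == name]:
        delete_tree(child)
    TREE.pop(name, None)

create_node("class 1")
create_node("sub-class 1", parent="class 1")
add_labels("sub-class 1", ["label B", "label C"])
delete_tree("class 1")          # removes "class 1", "sub-class 1" and their label sets
```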
The back-end service relies on a switching processor, a classification management interface controller, a classification management data processing component, a classification management data access component, and the like. In addition, the platform has a built-in multi-level, multi-dimensional classification system, which provides technical support for users designing a classified scene library.
In this embodiment, since the tag file and the classification file have a matching relationship, the relationship may no longer hold when either of the two files is updated; in that case the platform re-checks and gives a corresponding prompt. The prompt content includes, but is not limited to: the scene file has been re-matched to another classification file, the scene file no longer matches the original classification file, and so on, and the user may manually move the scene file into or out of a classified scene library.
Combining embodiments 2, 3 and 4, the automatic classification flow of a scene file in this embodiment is as follows:
Step 1, create the tag tree corresponding to the scene file through steps S201-S204.
In step 1, uploading, labeling and storing of scene files can be implemented. Specifically, the OpenDRIVE static scene file and the OpenSCENARIO dynamic scene file are uploaded, the tag template to be used is selected, and the tags are scanned automatically. In this embodiment, a tag tree is generated for each scene file; front-end and back-end data interaction is carried out through JSON files following the OpenLABEL file format, and the results are stored in a data pool.
Step 2, create a classification tree.
Step 2 and step 1 are independent and can be performed in any order or simultaneously. Creating the classification tree also creates a corresponding classified scene library; at this point there is no scene data in the library, and the user can name each level of the classified scene library (the classified scene library is in fact a hierarchical classification database structure organized according to manually defined logic).
In this embodiment, a corresponding classification rule is configured for each category, and a classification tree is formed for each classified scene library by adding labels to the classification rules. The labels used here, like the labels added in step 1, come from the scene tag template.
For example, the classification tree includes node 1 at a first level, node 2, node 3, node 4 at a second level, and nodes 5, node 6, and node 7 at a third level. Wherein node 2, node 3, node 4 are sub-level nodes of node 1, and node 5, node 6, and node 7 are sub-level nodes of node 2.
That is, node 1 is a parent node of nodes 2, 3, 4, and node 2 is a parent node of nodes 5, 6, and 7.
In the first level, the classification identifier and label data of node 1 are class 1 and label A, respectively.
In the second level, the classification identifier and label data of node 2 are sub-classification 1 and label B, label C and label D, respectively; node 2 also inherits label A from node 1. Similarly, the classification identifier and label data of node 3 are sub-classification 2 and label B, label R and label H, respectively; node 3 also inherits label A from node 1. The classification identifier and label data of node 4 are sub-classification 3 and label V, label L and label S, respectively; node 4 also inherits label A from node 1.
In the third level, the classification identifier and label data of node 5 are sub-classification 2-1 and label T, label P and label U, respectively; node 5 inherits the label data of node 3 as well as the label of node 1, so node 5 inherits label B, label R, label H and label A.
Similarly, in the third level, the classification identifier and label data of node 6 are sub-classification 2-2 and label L, label A and label T, respectively; node 6 likewise inherits the label data of node 3 and the label of node 1, so node 6 inherits label B, label R, label H and label A.
Similarly, in the third level, the classification identifier and label data of node 7 are sub-classification 2-3 and label A, label E and label O, respectively; node 7 likewise inherits the label data of node 3 and the label of node 1, so node 7 inherits label B, label R, label H and label A.
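The inheritance described above can be made concrete with a short sketch. The node layout is taken from the example; the helper function is only one possible way to compute the inherited label set, not the platform's actual implementation:

```python
# Classification tree from the example: node name -> (parent, own labels)
NODES = {
    "node 1": (None,     ["A"]),
    "node 2": ("node 1", ["B", "C", "D"]),
    "node 3": ("node 1", ["B", "R", "H"]),
    "node 4": ("node 1", ["V", "L", "S"]),
    "node 5": ("node 3", ["T", "P", "U"]),
    "node 6": ("node 3", ["L", "A", "T"]),
    "node 7": ("node 3", ["A", "E", "O"]),
}

def effective_labels(name: str) -> set:
    """Own labels plus every label inherited from ancestor nodes."""
    labels = set()
    while name is not None:
        parent, own = NODES[name]
        labels.update(own)
        name = parent
    return labels

print(effective_labels("node 5"))   # contains T, P, U, B, R, H, A (set order may vary)
```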
In practical application, the number of levels in the classification tree and the number of nodes contained in each level can be adaptively adjusted according to practical needs, and the embodiment of the application is not limited in particular.
Step 3, judge whether the labels in the tag tree contain all the labels of a last-level node in the classification tree. If yes, execute step 4; if not, end.
In this step, the judgment can be understood as matching, and the matching rule is full matching: the inclusion relationship is satisfied only when the labels in the scene's tag tree contain all the labels of a last-level node of the classification tree (a code sketch of this rule follows the example below).
By matching this inclusion relationship between the tag tree and the classification tree, the platform judges whether the scene uploaded in step 1 belongs to the classified scene library created in step 2, and adds the scene to the corresponding classification/category of that classified scene library.
For example, suppose there are 4 tag trees of scenes to be classified: the tag tree of scene 1 contains only the tag A1, the tag tree of scene 2 contains only the tag A2, the tag tree of scene 3 contains two tags A1 and A2 at the same level, and the tag tree of scene 4 contains 5 tags: A4, A5, A1, A2 and A3. There is only one classification tree, which includes a parent class and four sub-classes: subclass 1, subclass 2, subclass 3 and subclass 4, where subclass 1 further includes subclass 1-1 and subclass 1-2. The label of subclass 1-1 is A1, the labels of subclass 1-2 are A2 and A3, subclasses 2, 3 and 4 have no child nodes, the labels of subclass 2 are A1 and A2, the labels of subclass 3 are A1, A2 and A3, and the labels of subclass 4 are A4 and A5.
The last-level nodes are subclass 1-1, subclass 1-2, subclass 2, subclass 3 and subclass 4, so the classification tree has 5 classifications/categories. The tag A1 in the tag tree of scene 1 is matched against the labels of the 5 categories in turn, and the result is that scene 1 matches subclass 1-1. Repeating the matching operation, the tag A2 in the tag tree of scene 2 is matched against the labels of the 5 categories; since A2 alone does not contain all the labels of any category, scene 2 matches no category of the classification tree and therefore cannot be assigned to the corresponding classified scene library. Scene 3 contains all the labels of subclass 2, and scene 4 contains all the labels of subclass 3 and of subclass 4.
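A compact sketch of this full-match rule, using the scenes and sub-classes of the example above. The data layout is an assumption made for illustration; only the containment test itself is what the embodiment prescribes. Note that a scene may satisfy several categories at once (scene 3, for instance, also contains the single label of subclass 1-1):

```python
# Tag trees of the scenes to be classified, flattened to label sets.
scenes = {
    "scene 1": {"A1"},
    "scene 2": {"A2"},
    "scene 3": {"A1", "A2"},
    "scene 4": {"A4", "A5", "A1", "A2", "A3"},
}

# Effective label sets of the last-level nodes of the classification tree.
categories = {
    "subclass 1-1": {"A1"},
    "subclass 1-2": {"A2", "A3"},
    "subclass 2":   {"A1", "A2"},
    "subclass 3":   {"A1", "A2", "A3"},
    "subclass 4":   {"A4", "A5"},
}

def matching_categories(scene_labels: set) -> list:
    """Full matching: the scene's tag tree must contain ALL labels of a category."""
    return [name for name, required in categories.items() if required <= scene_labels]

for scene, labels in scenes.items():
    print(scene, "->", matching_categories(labels) or "no match")
# scene 1 -> ['subclass 1-1']
# scene 2 -> no match
# scene 3 -> ['subclass 1-1', 'subclass 2']
# scene 4 -> all five categories, including subclass 3 and subclass 4
```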
Step 4, the system shows the scene in a prompt box, indicating that it can be moved into a certain category of the classified scene library corresponding to the classification tree.
Step 5, determine whether the user confirms moving the scene into that category of the classified scene library; if yes, execute step 6; if not, end.
Step 6, classify the scene file into that category of the classified scene library.
It can be seen that, besides generating the tag tree, the key point of this embodiment is how to match the tag tree against the classification tree for automatic scene classification. Both the tag tree and the classification tree serve the automatic classification of scenes, i.e., the main purpose is to realize automatic scene classification. In addition, because of the nature of the tag tree, the tags can also be used for retrieval and filtering.
In a specific embodiment, responding to the first updating operation on the target classification file includes:
S11, receiving a first updating operation on the target classification file, and determining the type of the first updating operation.
For updating the target classification file, the types of the first updating operation include: editing a classification rule, deleting a classification rule, and so on.
Editing a classification rule means: if, after the classification rule is edited, a scene file no longer matches it, the to-be-processed menu page prompts that the scene file no longer matches the new classification file, and the user can manually move the scene file out. Deleting a classification rule means: if scene files exist in the classified scene library corresponding to the classification file to be deleted, then after the classification file is deleted the scene files are moved into the parent class; if no parent class exists, the matching relationship with the classification file is released and re-matching is performed.
When the type of the first updating operation is editing a classification rule, steps S12-S14 are executed.
S12, when the type of the first updating operation is editing a classification rule, judge whether the updated target classification file still matches the target scene library, i.e., whether the new rule affects the classification of the scenes stored there; if it does, execute S14, otherwise execute S13.
S13, if yes, respond to the first updating operation.
It should be appreciated that in response to the first update operation, the original object classification file is updated to a new object classification file.
S14, if not, display a prompt that the target scene library is no longer matched.
In this embodiment, the scene file remains stored in the current target scene library corresponding to the target classification file, and a prompt indicates that the scene file no longer matches the target classification file and can be removed.
When the type of the first updating operation is deleting a target node, the embodiment executes the following steps S15 to S17:
S15, when the type of the first updating operation is deleting a target node, judge whether there is a target scene file in the target scene library corresponding to the target classification file affected by the deletion.
S16, if so, judge whether the scene file can be classified into the class corresponding to the parent node of the target node; if it can, classify the scene file into the class corresponding to the parent node.
S17, if it cannot be so classified, delete the target node and treat the scene file as a scene file to be classified again.
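A sketch of the coping strategy in S15-S17 is given below. The function and field names are assumptions, and the containment test used to decide whether a scene fits the parent class is one possible interpretation of S16; the real platform operates on its database records:

```python
def handle_delete_node(node: dict, scenes_in_library: list) -> None:
    """S15-S17: when a classification node is deleted, relocate its scene files."""
    for scene in scenes_in_library:                    # S15: affected scene files exist?
        parent = node.get("parent")                    # S16: is there a parent class?
        if parent is not None and set(parent["labels"]) <= set(scene["labels"]):
            scene["category"] = parent["name"]         # classify into the parent's category
        else:
            scene["category"] = None                   # S17: becomes a scene file to be classified

# Hypothetical data: node to delete, its parent, and one scene stored under it.
parent = {"name": "subclass 1", "labels": ["A1"]}
node = {"name": "subclass 1-1", "labels": ["A1", "A2"], "parent": parent}
scene = {"name": "scene 9", "labels": ["A1", "A2"], "category": "subclass 1-1"}
handle_delete_node(node, [scene])
print(scene["category"])   # "subclass 1" - moved into the parent class
```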
Through this coping strategy for classification file updates, the embodiment ensures that scene files are stored in the correct location after a classification file is updated.
In a specific embodiment, S130 includes the following steps: S21, responding to a second updating operation on the tag file, and obtaining the updated tag file.
In this embodiment, updating the tag file includes: replacing the scene file, adding a tag template to the scene file, adding a tag, deleting a tag, and so on.
It should be appreciated that tag templates include, for example, a scene tag library and a SOTIF tag library. A tag template can be used to add labels to a classified scene library and also to define classification files; it is a basic resource library whose smallest granularity is the tag. The scene tag library consists of a large number of tags such as cloudy, rainy, 8 lanes, and so on. The SOTIF tag library contains tags such as high risk.
If the type of the second updating operation is replacing the scene file, the embodiment of the application rescans the content of the tag file against all existing tag libraries and judges whether to update the tag file synchronously. If yes, the updated tag file is obtained and step S22 is executed; if not, the platform records the event and prompts the user that the scene file has been replaced and the tag file can be updated, after which the user may update the tag file manually.
After the new and old tag files are compared, if synchronous updating is selected in this embodiment, the user may choose to retain the non-scene tags while the rest of the content is updated to the content of the new tag file; alternatively, the non-scene tags may be discarded and all of the content of the original tag file is replaced by the content of the new tag file.
In summary, in this embodiment, re-checking the replaced scene file includes the following steps:
(a) Read the file data of the new scene file.
(b) Judge, based on the file data, whether the new scene file conforms to the XSD schema; if not, issue a prompt; if yes, go to step (c).
(c) Convert the new scene file into an entity object.
(d) Judge whether non-scene tags are to be retained; if not, execute step (e); if yes, execute step (f).
(e) Re-label the new scene file according to the tag template selected by the user.
(f) Retain the non-scene tags and, at the same time, label the new scene file according to the tag template selected by the user.
After step (e) or (f) is performed, step (g) returns the labeling result and the flow ends.
As can be seen from steps (d) to (f), this embodiment adds a check on whether the existing content retains non-scene tags: if so, the non-scene tags are extracted and temporarily stored according to their tag attributes and are output and displayed together with the re-scanned scene tags; if not, the original tag information is deleted and only the scene information obtained after rescanning is displayed.
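Steps (a)-(g) can be summarised in a short sketch. The helper functions are stubs standing in for the platform's real file reading, XSD validation, deserialization and label-scanning services; their names and return values are assumptions:

```python
from typing import Dict, List

# Placeholder helpers standing in for the platform's real services (assumptions).
def read_file(path: str) -> str:
    return "<OpenSCENARIO/>"                 # stub: pretend to read the new scene file

def conforms_to_xsd(data: str) -> bool:
    return data.startswith("<")              # stub: real code validates against the XSD schema

def to_entity_object(data: str) -> Dict:
    return {"xml": data}                     # stub: real code deserializes the scene

def scan_labels(entity: Dict, template: Dict) -> List[Dict]:
    return [{"attribute": "scene", "value": "rainy"}]   # stub: re-scan with the tag template

def recheck_replaced_scene(path: str, keep_non_scene_tags: bool,
                           template: Dict, old_tags: List[Dict]) -> List[Dict]:
    data = read_file(path)                               # (a)
    if not conforms_to_xsd(data):                        # (b)
        raise ValueError("new scene file does not conform to the XSD schema")
    entity = to_entity_object(data)                      # (c)
    new_tags = scan_labels(entity, template)             # (e)/(f): re-label from the template
    if keep_non_scene_tags:                              # (d)
        kept = [t for t in old_tags if t["attribute"] == "non-scene"]
        new_tags = kept + new_tags                       # (f): keep the non-scene tags as well
    return new_tags                                      # (g): labeling result returned to the user

old = [{"attribute": "non-scene", "value": "high risk"}]
print(recheck_replaced_scene("scene.xosc", True, {}, old))
```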
In this embodiment, if the type of the second updating operation is adding a tag template, adding a tag, deleting a tag, or the like, the scene file is saved, the tag file is updated with the newly added tag template, and step S22 is executed after the update is completed.
After updating the tag file in response to the second updating operation of the tag file in S130, the tree structure-based scene file classifying and updating method further includes:
S22, search for another classification file that matches the updated tag file.
This step can be understood as re-matching the classification files using the updated tag file; after re-matching, judge whether another matching classification file exists, and if not, the flow ends.
S23, when the other classification file is not the target classification file corresponding to the target scene library, display a prompt box asking whether to move the scene file into the new classified scene library.
When another matching classification file exists, the embodiment asks whether the scene file should be moved into the new classified scene library. If not, the scene file remains in the current classified scene library, and a prompt indicates that it no longer matches the current library and can be moved out; if yes, execute step S24.
S24, in response to the selection made in the prompt box, classify the scene file into the new classified scene library, where the new classified scene library is the scene library corresponding to the other classification file.
Through this coping strategy for tag file updates, the embodiment ensures that scene files are stored in the correct location after a tag file is updated.
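A brief sketch of the S22-S24 decision. The data layout and function name are assumptions; the containment rule is the same full-match rule described earlier, and the prompts stand in for the platform's interface dialogs:

```python
from typing import Dict, Optional, Set

def rematch(updated_tags: Set[str], classification_files: Dict[str, Set[str]]) -> Optional[str]:
    """S22: find a classification file whose labels are all contained in the updated tag file."""
    for name, required in classification_files.items():
        if required <= updated_tags:
            return name
    return None

current_library = "subclass 4"
match = rematch({"A1", "A2", "A3"},
                {"subclass 4": {"A4", "A5"}, "subclass 3": {"A1", "A2", "A3"}})
if match is None:
    print("no matching classification file; flow ends")
elif match != current_library:
    print(f"prompt: move the scene file into '{match}'?")   # S23/S24: user decides via the prompt box
else:
    print("still matches the current classified scene library")
```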
After classification files and tag files have been updated, and considering that the amount of data on the platform may become too large to check update prompts in real time, the platform does not immediately prompt about mismatches. Instead, it provides a one-time synchronous update function, i.e., a data synchronization function: when the user clicks the corresponding button, the platform re-matches all tag files against all classification files as a whole, performs the automatic scene re-classification operation, raises an alarm prompt on the interface for any matching relationships that have changed, and the user can then manually move scene files out of or into the classified scene libraries.
In summary, this embodiment provides the whole process of generating the target classification file, the coping strategy for classification file updates, and the coping strategy for tag file updates, which together ensure accurate classification of scene files in various situations.
Example 5:
Fig. 7 is a schematic structural diagram of a device for classifying and updating scene files based on a tree structure according to an embodiment of the present invention. The device of this embodiment may be implemented in software and/or hardware. As shown in Fig. 7, the device for classifying and updating scene files based on a tree structure of this embodiment includes: an acquisition and storage module 51, a searching module 52, a first adding module 53, a second adding module 54, a classification module 55, an adding classification module 56, and an updating module 57. Wherein:
The acquisition and storage module 51 is configured to acquire a scene file to be classified and store the scene file to be classified in a data pool of the cloud.
The searching module 52 is configured to search for the target classification file matching the tag file from all existing classification files. Here, the tag file is a file generated by adding labels to the scene file, and the target classification file is the label data corresponding to each entity object in all existing scene files in the target scene library, organized in the form of a classification tree; the target classification file comprises multiple levels of nodes, each node comprises a classification identifier and corresponding tag data, and a child node inherits the tag data of its parent node; the target scene library is one of all existing classified scene libraries, and each classified scene library corresponds to one classification file.
The searching module 52 is further configured to determine, during the searching process, nodes in the classification file that include all labels in the label file as target nodes, and determine the classification file in which the target nodes are located as target classification file.
The first adding module 53 is configured to add, in the target classification file, a classification identifier of the target node to the scene file if the target node has a child level node and any tag data of the child level node of the target node does not appear in the tag file.
The second adding module 54 is configured to add, in the target classification file, if the target node has a parent node, the classification identifier of the target node and the classification identifier of the parent node of the target node to the scene file.
The classification module 55 is configured to classify the scene file into a category corresponding to the added classification identifier in the target scene library according to the added classification identifier.
The adding classification module 56 is configured to, in response to a first update operation on the target classification file, re-add a new classification identifier to the scene file when the type of the first update operation is that of deleting the target node and the target node has a parent node, so as to implement the re-classification of the scene file.
The updating module 57 is configured to update the tag file in response to a second updating operation on the tag file.
The device for classifying and updating the scene file based on the tree structure provided by the embodiment can be used for executing the method for classifying and updating the scene file based on the tree structure provided by any of the method embodiments, and the implementation principle and technical effects are similar, and are not repeated here.
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 8, the electronic device 400 includes one or more processors 401 and memory 402.
The processor 401 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities and may control other components in the electronic device 400 to perform desired functions.
Memory 402 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM) and/or cache memory (cache), and the like. The non-volatile memory may include, for example, Read-Only Memory (ROM), hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium and may be executed by the processor 401 to implement the tree-structure-based scene file classification and updating method and/or other desired functions of any of the embodiments of the invention described above. Various content such as initial parameters, thresholds, etc. may also be stored in the computer-readable storage medium.
In one example, the electronic device 400 may further include: an input device 403 and an output device 404, which are interconnected by a bus system and/or other forms of connection mechanisms (not shown). The input device 403 may include, for example, a keyboard, a mouse, and the like. The output device 404 may output various information to the outside, including early warning prompt information, braking force, etc. The output device 404 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, etc.
Of course, for simplicity, only some of the components of the electronic device 400 that are relevant to the present invention are shown in Fig. 8; components such as buses and input/output interfaces are omitted. In addition, the electronic device 400 may include any other suitable components depending on the particular application.
In addition to the methods and apparatus described above, embodiments of the present invention may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform the steps of the tree structure based scene file classification and updating method provided by any of the embodiments of the present invention.
The computer program product may write program code for performing operations of embodiments of the present invention in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present invention may also be a computer-readable storage medium, on which computer program instructions are stored, which, when being executed by a processor, cause the processor to perform the steps of the tree-structure-based scene file classification and updating method provided by any embodiment of the present invention.
The computer readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the scope of the present application. As used in this specification, the terms "a," "an," "the," and/or "the" are not intended to be limiting, but rather are to be construed as covering the singular and the plural, unless the context clearly dictates otherwise. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method or apparatus comprising such elements.
It should also be noted that the positional or positional relationship indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the positional or positional relationship shown in the drawings, are merely for convenience of describing the present application and simplifying the description, and do not indicate or imply that the apparatus or element in question must have a specific orientation, be constructed and operated in a specific orientation, and thus should not be construed as limiting the present application. Unless specifically stated or limited otherwise, the terms "mounted," "connected," and the like are to be construed broadly and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present application will be understood in specific cases by those of ordinary skill in the art.
It should be further noted that, the user information and data related to the present application (including, but not limited to, data for analysis, stored data, displayed data, etc.) are all information and data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related country and region, and provide corresponding operation entries for the user to select authorization or rejection.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the essence of the corresponding technical solutions from the technical solutions of the embodiments of the present application.

Claims (9)

1. The scene file classifying and updating method based on the tree structure is characterized by comprising the following steps of:
acquiring a scene file to be classified, and storing the scene file to be classified in a data pool of a cloud;
Searching target classified files matched with the tag files from all existing classified files; the label file is a file generated by adding labels to the scene file, and the target classification file is label data corresponding to each entity object in all the existing scene files in the target scene library in a classification tree mode; the target classification file comprises multiple stages of nodes, each stage of nodes comprises a classification identifier and corresponding tag data, and a child stage of nodes inherits the tag data of a parent stage of nodes; the target scene library is one of all existing classification scene libraries, and each classification scene library corresponds to one classification file;
in the searching process, determining nodes containing all labels in the label file in the classification file as target nodes, and determining the classification file in which the target nodes are located as a target classification file;
in the target classification file, if the target node has a sub-level node and any tag data of the sub-level node of the target node does not appear in the tag file, adding a classification identifier of the target node to the scene file;
In the target classification file, if a parent node exists in the target node, adding a classification identifier of the target node and a classification identifier of the parent node of the target node to the scene file;
classifying the scene file into a category corresponding to the added classification identifier in the target scene library according to the added classification identifier;
responding to a first updating operation of the target classified file, and when the type of the first updating operation is a deletion target node and a father node exists in the target node, adding a new classified identification for the scene file again to realize the reclassification of the scene file;
updating the tag file in response to a second updating operation on the tag file;
generating the tag file includes:
splitting the scene file to be classified into an OpenDRIVE file and an OpenSCENARIO file;
identifying an entity object in the OpenDRIVE file and an entity object in the OpenSCENARIO file;
labeling each entity object in the OpenDRIVE file and the OpenSCENARIO file according to the appointed label template to obtain label data;
And carrying out serialization processing on all the tag data, and outputting a tag file in an OpenLABEL format.
2. The method of claim 1, wherein the locating the object classification file that matches the tag file comprises:
analyzing the tag file to obtain an analysis result;
acquiring an existing classification file;
and searching a target classification file matched with the analysis result from the existing classification files by adopting an iteration method.
3. The method of claim 1, wherein after classifying the scene file in the target scene library, the method further comprises:
receiving tag data input by an input device, and screening a scene file corresponding to the tag data input by the input device from a target scene library when the tag data input by the input device is matched with a target classification file;
and testing the target vehicle by using a scene file corresponding to the tag data input by the input equipment to obtain a test verification result.
4. The method of claim 1, wherein generating the object classification file comprises:
creating a major class and multi-level subclass nodes contained in the major class; each major class corresponds to a classification, and each level of sub-class node is a sub-level node of the previous level of node;
Determining label data of each level of sub-class nodes;
and sorting the label data of all levels of sub-class nodes according to the classification tree mode to generate the target classification file.
5. The method of claim 1, wherein the responding to the first update operation to the object classification file comprises:
receiving a first updating operation on the target classification file, and determining the type of the first updating operation;
when the type of the first updating operation is editing classification rules, judging whether the updated target classification file is matched with the target scene library;
if yes, responding to the first updating operation;
if not, displaying a prompt of not matching the target scene library.
6. The method of claim 1, further comprising, after updating the tag file in response to the second update operation on the tag file:
searching another classification file matched with the updated tag file;
when the other classified file is not the target classified file corresponding to the target scene library, displaying a prompt box for judging whether to move into the new classified scene library;
and responding to the selection operation based on the prompt box, and classifying the scene file in the new classification scene library, wherein the new classification scene library is another scene library corresponding to the another classification file.
7. A scene file classifying and updating device based on a tree structure, comprising:
the acquisition and storage module is used for acquiring the scene files to be classified and storing the scene files to be classified in a data pool of the cloud;
the searching module is used for searching the target classification files matched with the tag files from all the existing classification files; the label file is a file generated by adding labels to the scene file, and the target classification file is label data corresponding to each entity object in all the existing scene files in the target scene library in a classification tree mode; the target classification file comprises multiple stages of nodes, each stage of nodes comprises a classification identifier and corresponding tag data, and a child stage of nodes inherits the tag data of a parent stage of nodes; the target scene library is one of all existing classification scene libraries, each classification scene library corresponds to one classification file, and the label file is generated, and the method comprises the following steps:
splitting the scene file to be classified into an OpenDRIVE file and an OpenSCENARIO file;
identifying an entity object in the OpenDRIVE file and an entity object in the OpenSCENARIO file;
Labeling each entity object in the OpenDRIVE file and the OpenSCENARIO file according to the appointed label template to obtain label data;
all the tag data are subjected to serialization processing, and a tag file in an OpenLABEL format is output;
the searching module is further used for determining nodes containing all labels in the label file in the classification file as target nodes in the searching process, and determining the classification file in which the target nodes are located as a target classification file;
the first adding module is used for adding the classification identifier of the target node to the scene file if the target node has a sub-level node and any tag data of the sub-level node of the target node does not appear in the tag file in the target classification file;
the second adding module is used for adding the classification identifier of the target node and the classification identifier of the parent node of the target node to the scene file if the parent node exists in the target classification file;
the classification module is used for classifying the scene files into categories corresponding to the added classification identifiers in the target scene library according to the added classification identifiers;
The adding classification module is used for responding to a first updating operation of the target classification file, and adding a new classification identifier for the scene file again when the type of the first updating operation is a deletion target node and the target node has a parent node so as to realize the reclassification of the scene file;
and the updating module is used for responding to a second updating operation of the tag file and updating the tag file.
8. An electronic device, the electronic device comprising:
a processor and a memory;
the processor is configured to execute the steps of the tree-structure-based scene file classification and updating method according to any one of claims 1 to 6 by calling a program or instructions stored in the memory.
9. A computer-readable storage medium storing a program or instructions that cause a computer to execute the steps of the tree-structure-based scene file classification and updating method according to any one of claims 1 to 6.
CN202310833543.0A 2023-07-10 2023-07-10 Scene file classification and updating method, device and equipment based on tree structure Active CN116561650B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310833543.0A CN116561650B (en) 2023-07-10 2023-07-10 Scene file classification and updating method, device and equipment based on tree structure

Publications (2)

Publication Number Publication Date
CN116561650A CN116561650A (en) 2023-08-08
CN116561650B true CN116561650B (en) 2023-09-19


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102902811A (en) * 2012-10-19 2013-01-30 北京金和软件股份有限公司 Database design method for quickly generating tree structure
CN111291158A (en) * 2020-01-22 2020-06-16 北京猎户星空科技有限公司 Information query method and device, electronic equipment and storage medium
CN113434603A (en) * 2021-02-07 2021-09-24 支付宝(杭州)信息技术有限公司 Data storage method, device and system based on credible account book database
CN113934742A (en) * 2021-10-26 2022-01-14 冷杉云(北京)科技股份有限公司 Data updating method, node information storage method, electronic device, and medium
WO2023272852A1 (en) * 2021-06-29 2023-01-05 未鲲(上海)科技服务有限公司 Method and apparatus for classifying user by using decision tree model, device and storage medium
CN115687724A (en) * 2021-07-29 2023-02-03 腾讯科技(深圳)有限公司 Information processing method, information processing device, computer equipment and storage medium
CN115956241A (en) * 2020-03-04 2023-04-11 杜一军 System and method for data set management tasks using search trees and tagged data items

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8495085B2 (en) * 2010-09-27 2013-07-23 International Business Machines Corporation Supporting efficient partial update of hierarchically structured documents based on record storage
US8875094B2 (en) * 2011-09-14 2014-10-28 Oracle International Corporation System and method for implementing intelligent java server faces (JSF) composite component generation


