CN117407311A - Test scene generation method and device, electronic equipment and storage medium - Google Patents

Test scene generation method and device, electronic equipment and storage medium

Info

Publication number
CN117407311A
Authority
CN
China
Prior art keywords
scene
parameter
test
parameters
parameter value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311456863.5A
Other languages
Chinese (zh)
Inventor
Niu Shaokai
Yang Ming
Wang Yizhi
Wu Yuhan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Nebula Internet Technology Co ltd
Original Assignee
Beijing Nebula Internet Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Nebula Internet Technology Co ltd filed Critical Beijing Nebula Internet Technology Co ltd
Priority to CN202311456863.5A priority Critical patent/CN117407311A/en
Publication of CN117407311A publication Critical patent/CN117407311A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a test scene generation method, a test scene generation device, electronic equipment and a storage medium. The method comprises the following steps: acquiring the scene type of a test scene and the corresponding original scene parameters; simplifying the original scene parameters to obtain at least one target scene parameter; generating at least one parameter value for each target scene parameter; and combining the parameter values of the target scene parameters to obtain a test scene corresponding to each group of parameter values. The technical scheme of the embodiment of the invention reduces the time consumed for generating test scenes, improves the efficiency and accuracy of test scene generation, and further improves the reliability of the vehicle networking function test.

Description

Test scene generation method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a test scene generation method and device, an electronic device, and a storage medium.
Background
With the development of the vehicle-road cooperation industry, intelligent connected vehicles are gradually moving toward mass production, which places higher requirements on vehicle networking function testing. Mass production requires developing test tasks in batches; meanwhile, given the complexity of vehicle networking function testing and the diversity of scenes, vehicle performance needs to be tested in all respects, and the reliability of the vehicle networking function test can only be ensured through abundant and diversified test scenes.
At present, simple scene creation can only build single scenes manually one by one, that is, the creator manually sets different values of the scene parameters according to the test requirements. However, this manual creation method requires inputting a large number of scene parameters when creating many test scenes, which takes a long time and may introduce errors during input, so the accuracy of the created test scenes is low and the reliability of the vehicle networking function test is difficult to ensure.
Disclosure of Invention
The invention provides a test scene generation method, a device, electronic equipment and a storage medium, which reduce the time consumed by generating a test scene, improve the efficiency and accuracy of generating the test scene, and further improve the reliability of vehicle networking function test.
According to an aspect of the present invention, there is provided a test scenario generation method, including:
acquiring a scene type and corresponding original scene parameters of a test scene;
simplifying the original scene parameters to obtain at least one target scene parameter;
generating at least one parameter value for each target scene parameter;
and combining the parameter values of the target scene parameters to obtain test scenes corresponding to the parameter values of each group.
According to another aspect of the present invention, there is provided a test scene generating apparatus including:
the scene type acquisition module is used for acquiring the scene type of the test scene and the corresponding original scene parameters;
the target scene parameter determining module is used for simplifying the original scene parameters to obtain at least one target scene parameter;
the parameter value generation module is used for generating at least one parameter value of each target scene parameter;
and the test scene generation module is used for combining the parameter values of the target scene parameters to obtain test scenes corresponding to the parameter values of each group.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the test scenario generation method of any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to implement the test scenario generation method according to any one of the embodiments of the present invention when executed.
According to this technical scheme, the scene type of the test scene and the corresponding original scene parameters are acquired, the original scene parameters are simplified to obtain at least one target scene parameter, at least one parameter value of each target scene parameter is generated, and the parameter values of the target scene parameters are combined to obtain a test scene corresponding to each group of parameter values. This solves the problems of the manual creation approach, in which a large number of scene parameters must be input when creating many test scenes, the time spent is long, errors may be introduced during input, the accuracy of the created test scenes is low, and the reliability of the vehicle networking function test is difficult to guarantee. The time consumed for generating test scenes is thereby shortened, the efficiency and accuracy of test scene generation are improved, and the reliability of the vehicle networking function test is further improved.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for describing the embodiments are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained from these drawings by a person skilled in the art without inventive effort.
Fig. 1 is a flowchart of a test scenario generation method according to a first embodiment of the present invention;
fig. 2 is a flowchart of a test scenario generation method according to a second embodiment of the present invention;
fig. 3 is a flowchart of a test scenario generation method according to a second embodiment of the present invention;
fig. 4 is a simplified schematic diagram of a forward collision early warning scene parameter provided according to a second embodiment of the present invention;
fig. 5 is a schematic view of a scenario in which a host vehicle is traveling straight and a remote vehicle is traveling straight according to a second embodiment of the present invention;
fig. 6 is a schematic view of a scenario in which a host vehicle is traveling straight and a remote vehicle is turning left according to a second embodiment of the present invention;
fig. 7 is a schematic structural diagram of a test scene generating device according to a third embodiment of the present invention;
fig. 8 is a schematic structural diagram of an electronic device implementing a test scenario generating method according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a test scenario generation method according to a first embodiment of the present invention. The embodiment of the invention is applicable to the situation of generating the test scene, the method can be executed by the test scene generating device, the test scene generating device can be realized in the form of hardware and/or software, and the test scene generating device can be configured in the electronic equipment carrying the test scene generating function.
Referring to the test scene generation method shown in fig. 1, the method includes:
s110, acquiring the scene type of the test scene and the corresponding original scene parameters.
The testing process of the vehicle networking function test of an intelligent vehicle can include three stages: scene generation, in-the-loop testing, and result analysis. Therefore, in the vehicle networking function test, the selection of the test scene is particularly important. The test scene can be a simulation scene in which the intelligent vehicle performs the vehicle networking function test. The test scene may include the scene type and corresponding scene parameters of the vehicle networking function test of the intelligent vehicle. From the perspective of scene application, test scenes can include a traffic sign and marking recognition and response scene, a traffic signal light recognition and response scene, a front vehicle driving state recognition and response scene, an obstacle recognition and response scene, a pedestrian and non-motor-vehicle recognition and avoidance scene, a following driving scene, a roadside parking scene, an overtaking scene, a merging scene, a forward collision early warning scene, an intersection collision early warning scene, a roundabout collision early warning scene, a red light running early warning scene, and the like. The scene parameters may be used to describe the information of the test scene. The original scene parameters may be all the scene parameters contained in the test scene. In the process of generating a test scene, once the type of the test scene to be generated is determined, the original scene parameters associated with the test scene can be obtained. In addition to the scene parameters necessary for the vehicle networking function test, the original scene parameters also contain redundant scene parameters. Therefore, the original scene parameters need to be simplified to improve the efficiency of generating the test scene.
Specifically, the scene type of the test scene to be generated, as input by the demander of the test scene, can be obtained. Based on the scene type of the test scene, the original scene parameters associated with the test scene are acquired from a database or a local storage space.
S120, simplifying the original scene parameters to obtain at least one target scene parameter.
The target scene parameters may be the scene parameters obtained by simplifying the original scene parameters. By comparison, the target scene parameters are the parameters necessary in the test scene generation process, while the original scene parameters are all the scene parameters of the test scene and include scene parameters that are redundant in the process of generating the test scene. The number of target scene parameters is smaller than the number of original scene parameters.
Specifically, the association degree between the original scene parameters and the test scene can be detected, and the original scene parameters with higher association degree with the test scene are screened to be used as target scene parameters.
Optionally, the original scene parameters may first be de-duplicated, removing repeated scene parameters to obtain candidate scene parameters. The association degree between the candidate scene parameters and the test scene is then detected, and the candidate scene parameters with a higher association degree with the test scene are screened out as the target scene parameters.
S130, generating at least one parameter value of each target scene parameter.
The target scene parameter may correspond to at least one parameter value. Generating parameter values of the target scene parameters is essentially generalizing the target scene parameters, i.e. generalizing the test scene.
Specifically, a random number generation method may be used to randomly generate at least one parameter value of the target scene parameter.
And S140, combining the parameter values of the target scene parameters to obtain test scenes corresponding to the parameter values of each group.
Essentially, the test scene is a combined result of parameter values of different target scene parameters. By combining the parameter values of different target scene parameters, different test scenes can be obtained, and therefore generalization of the test scenes of the same scene type is realized.
Specifically, the parameter values of the target scene parameters can be randomly combined to obtain the test scene corresponding to each group of parameter values.
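As a minimal sketch of this combination step, the following Python snippet enumerates the Cartesian product of the parameter values of each target scene parameter, yielding one test scene per group of parameter values. The parameter names and values are hypothetical examples, not taken from the disclosure.

```python
from itertools import product

# Hypothetical target scene parameters and their generated parameter values
parameter_values = {
    "host_vehicle_speed_kmh": [10, 15, 20],
    "longitudinal_gap_m": [20, 40, 60],
    "lateral_offset_rate": [0.0, 0.25, 0.5],
}

# Each combination of parameter values corresponds to one test scene
names = list(parameter_values)
test_scenes = [
    dict(zip(names, combination))
    for combination in product(*(parameter_values[n] for n in names))
]

print(len(test_scenes))   # 3 * 3 * 3 = 27 generated test scenes
print(test_scenes[0])     # e.g. {'host_vehicle_speed_kmh': 10, 'longitudinal_gap_m': 20, ...}
```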
In the prior art, a test scene is generated by a simple scene creation mode. The simple scene creation mode can only manually create single scenes one by one, namely, a creator can manually set different values of scene parameters according to test requirements. However, the manual creation method requires inputting a large number of scene parameters when creating a large number of test scenes, which takes a long time, and may generate errors in the input process, so that the accuracy of the created test scenes is low, and the reliability of the vehicle networking function test is difficult to ensure. Furthermore, test scenario generation has the problem of homogeneity, i.e. only identical or similar test scenarios can be generated.
According to this technical scheme, the scene type of the test scene and the corresponding original scene parameters are acquired, the original scene parameters are simplified to obtain at least one target scene parameter, at least one parameter value of each target scene parameter is generated, and the parameter values of the target scene parameters are combined to obtain a test scene corresponding to each group of parameter values. This realizes automatic generation of the parameter values of the scene parameters in the test scene and generalization of the test scene. It avoids the problem of the manual creation approach, in which a large number of scene parameters must be input when creating many test scenes and the time spent is long, thereby shortening the generation time of the parameter values and improving the generation efficiency of the test scenes. It also avoids the problems that errors may be introduced during manual input, that the accuracy of the created test scenes is low, and that the reliability of the vehicle networking function test is difficult to ensure. In addition, by simplifying the original scene parameters in the process of generating the test scene, the generation efficiency of the test scene is further improved.
In an alternative embodiment of the present invention, simplifying the original scene parameters to obtain at least one target scene parameter includes: detecting the association degree between the original scene parameters and the test scene, and screening the original scene parameters to obtain alternative scene parameters associated with the test scene; screening out the absolute position parameters among the alternative scene parameters, and converting the screened absolute position parameters into relative position parameters; and determining, as the target scene parameters, the relative position parameters together with the alternative scene parameters other than the absolute position parameters.
The alternative scene parameters may be the original scene parameters that have a degree of association with the test scene. The alternative scene parameters are the result of a preliminary screening of the original scene parameters, and the preliminary screening condition is the degree of association with the test scene. Optionally, the number of alternative scene parameters is less than or equal to the number of original scene parameters, and greater than the number of target scene parameters. The absolute position parameters may be position coordinates in a world coordinate system. The relative position parameters may be relative position coordinates. For example, if the test scene includes a host vehicle and a remote vehicle, the absolute position parameters may include the host vehicle position coordinates and the remote vehicle position coordinates, and the relative position parameter may be the relative position coordinates between the host vehicle and the remote vehicle. In a test scene, the core parameter is essentially the relative position between the two vehicles rather than the absolute position of a single vehicle, so converting the absolute position parameters into relative position parameters is more convenient for generalization of the test scene.
Specifically, a relevance function may be used to calculate the degree of association between each original scene parameter and the test scene. Alternatively, the calculated association degree between an original scene parameter and the test scene may be compared with a preset association degree threshold, and the original scene parameters whose association degree is greater than or equal to the preset threshold may be determined as alternative scene parameters. Optionally, the association degrees between the original scene parameters and the test scene may be sorted in descending order, and the original scene parameters ranked within a preset number may be determined as alternative scene parameters. The preset association degree threshold and the preset number are set in advance by a technician and can be adjusted according to the technician's experience.
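A minimal sketch of this screening step follows; the relevance function itself is not specified in the disclosure, so the association degrees below are assumed to be precomputed, and both the threshold variant and the top-N variant are shown for illustration only.

```python
# Hypothetical association degrees between original scene parameters and the test scene
association = {
    "host_vehicle_speed": 0.92,
    "remote_vehicle_speed": 0.88,
    "weather": 0.15,
    "vehicle_color": 0.05,
}

# Variant 1: keep parameters whose association degree meets a preset threshold
threshold = 0.5
candidates_by_threshold = [p for p, score in association.items() if score >= threshold]

# Variant 2: sort in descending order and keep a preset number of top-ranked parameters
top_n = 2
candidates_by_rank = [
    p for p, _ in sorted(association.items(), key=lambda kv: kv[1], reverse=True)[:top_n]
]

print(candidates_by_threshold)  # ['host_vehicle_speed', 'remote_vehicle_speed']
print(candidates_by_rank)       # ['host_vehicle_speed', 'remote_vehicle_speed']
```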
Alternatively, a pre-trained scene parameter screening model may be used to input the scene type and the original scene parameters of the test scene into the model, and output the alternative scene parameters. The training samples of the scene parameter screening model may include a scene type of the test scene sample, an original scene parameter sample, and a target scene parameter sample. The scene parameter screening model may be, for example, a convolutional neural network model.
Specifically, the absolute position parameters can be screened out from the alternative scene parameters according to the parameter identifier and parameter format, and the lateral and longitudinal distances between the absolute positions are calculated to determine the relative position parameters. The absolute position parameters can then be removed from the alternative scene parameters to obtain the other scene parameters. The other scene parameters and the relative position parameters may jointly be determined as the target scene parameters.
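The conversion from absolute to relative position parameters can be sketched as follows. The coordinate convention (x longitudinal, y lateral) and the definition of the lateral offset rate are assumptions made for illustration and are not taken from the disclosure.

```python
def to_relative(host_xy, remote_xy):
    """Convert two absolute positions into relative position parameters.

    Assumes x is the longitudinal direction and y the lateral direction;
    the lateral offset rate is taken here as lateral offset divided by
    longitudinal distance, purely for illustration.
    """
    x1, y1 = host_xy
    x2, y2 = remote_xy
    longitudinal_distance = x2 - x1          # front-rear vehicle distance l
    lateral_offset = y2 - y1
    lateral_offset_rate = lateral_offset / longitudinal_distance if longitudinal_distance else 0.0
    return {"longitudinal_distance": longitudinal_distance,
            "lateral_offset_rate": lateral_offset_rate}

print(to_relative((0.0, 0.0), (40.0, 1.5)))
# {'longitudinal_distance': 40.0, 'lateral_offset_rate': 0.0375}
```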
According to this scheme, by detecting the association degree between the original scene parameters and the test scene, the relevance of each original scene parameter to the test scene is taken into account and a preliminary screening of the original scene parameters is realized, yielding the alternative scene parameters. By converting the absolute position parameters among the alternative scene parameters into relative position parameters, the original scene parameters are further simplified; at the same time, using relative position parameters facilitates generalization of the test scene.
In an optional embodiment of the invention, after obtaining the test scenario corresponding to each set of parameter values, the method further includes: generating a test scene description file according to the scene type, the target scene parameters and the corresponding parameter values of the test scene; the test scene description file is used for describing the test scene so that a management party of the test scene can review and manage the test scene.
The test scene description file may be a file containing test scene description information and may be used to describe the test scene. The test scene description file may contain the scene type of the test scene, the target scene parameters, and the corresponding parameter values. Optionally, the form of the test scene description file includes, but is not limited to, a table, a document, a picture, and the like. Optionally, the test scene description file may be stored in a database or a local storage space, so that the manager of the test scene can conveniently review and manage the test scene. The manager of the test scene may be the party that provides the test scene to the demander of the test scene. Optionally, when demanders of different test scenes need to generate the same test scene under the same input conditions, the generated test scene description file can be called to generate the new test scene directly and rapidly, which further improves the generation efficiency of the test scene. Meanwhile, when uncertain factors such as system faults affect the generation process of a test scene, the test scene can be generated directly from the test scene description file, improving the fault tolerance of the test scene generation process.
Specifically, the scene type of the test scene, the target scene parameters, and the corresponding parameter values can jointly be used to generate the test scene description file.
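A minimal sketch of writing such a description file as JSON is shown below. The field names and file layout are assumptions, since the disclosure only states that the file contains the scene type, the target scene parameters, and the corresponding parameter values.

```python
import json

def write_scene_description(path, scene_type, scene_parameters):
    """Write a test scene description file (scene type, target scene
    parameters and their parameter values) as JSON."""
    description = {"scene_type": scene_type, "parameters": scene_parameters}
    with open(path, "w", encoding="utf-8") as f:
        json.dump(description, f, ensure_ascii=False, indent=2)
    return description

# Hypothetical example: one generated forward collision early warning scene
write_scene_description(
    "forward_collision_scene_001.json",
    "forward_collision_early_warning",
    {"host_vehicle_speed_kmh": 15, "longitudinal_gap_m": 40, "lateral_offset_rate": 0.0},
)
```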
According to this scheme, generating the test scene description file improves the convenience with which the manager of the test scene reviews and manages the test scene. In addition, the test scene description file can be used to generate the test scene directly, further improving the generation efficiency of the test scene and the fault tolerance of the test scene generation process.
In an alternative embodiment of the invention, the test scene is a vehicle networking simulation test scene; the scene types of the test scene comprise a traffic sign marking recognition and response scene, a traffic signal lamp recognition and response scene, a vehicle following driving scene, a roadside parking scene, a forward collision early warning scene, an intersection collision early warning scene or a red light running early warning scene.
The vehicle networking simulation test scene may be a simulation test scene for the vehicle networking function test. Unlike a live test scene, the test scene in this scheme is a simulated test scene. The scene type of the test scene may be used to distinguish between different test scenes. The scene types of the test scene can include a traffic sign and marking recognition and response scene, a traffic signal light recognition and response scene, a following driving scene, a roadside parking scene, a forward collision early warning scene, an intersection collision early warning scene, or a red light running early warning scene. The traffic sign and marking recognition and response scene can be used to evaluate the vehicle's ability to recognize traffic signs and markings. By way of example, traffic sign and marking recognition and response scenes may include test scenes for speed limit signs, stop-and-yield signs and markings, lane lines, and crosswalk lines; optionally, tests for prohibition, warning, and indication signs and markings may also be added. The traffic signal light recognition and response scene may be used to evaluate the vehicle's ability to recognize traffic signal lights. By way of example, traffic signal light recognition and response scenes may include test scenes for motor vehicle lights, direction indicator lights, non-motor-vehicle lights, crosswalk lights, lane lights, flashing warning lights, and road-railway level crossing lights. The following driving scene may be used to evaluate the vehicle's ability to follow a lead vehicle; for example, following driving scenes may include test scenes for stable following, stop-and-go functions, and platoon driving. The roadside parking scene may be used to evaluate the vehicle's ability to pull over when encountering a driving risk, and may include roadside emergency parking and pulling over within the rightmost lane. The forward collision early warning scene may be a test scene in which a forward collision of the vehicle occurs. The intersection collision early warning scene may be a test scene in which the vehicle collides at an intersection. The red light running early warning scene may be a test scene in which the vehicle runs a red light.
According to this scheme, the test scene is embodied as a vehicle networking simulation test scene, and the scene type of the test scene is embodied as a traffic sign and marking recognition and response scene, a traffic signal light recognition and response scene, a following driving scene, a roadside parking scene, a forward collision early warning scene, an intersection collision early warning scene, or a red light running early warning scene. This realizes batch generation of different test scenes and further improves the efficiency and accuracy of batch test scene generation.
Example two
Fig. 2 is a flowchart of a test scenario generation method according to a second embodiment of the present invention. On the basis of the above embodiment, this embodiment refines the step of generating at least one parameter value of each target scene parameter into: acquiring a parameter value range and a parameter value interval of the target scene parameter; and generating at least one parameter value of each target scene parameter within the parameter value range according to the parameter value interval. This simplifies the way in which the parameter values of the target scene parameters are generated, improves the generation efficiency of the parameter values, and further improves the generalization efficiency of the test scene. For parts of this embodiment not described in detail, reference may be made to the descriptions of the other embodiments.
Referring to the test scenario generation method shown in fig. 2, the method includes:
s210, acquiring the scene type of the test scene and the corresponding original scene parameters.
S220, simplifying the original scene parameters to obtain at least one target scene parameter.
S230, acquiring a parameter value range and a parameter value interval of the target scene parameter.
The parameter value range may be the interval to which the parameter values of the target scene parameter belong. The parameter value interval may be the spacing between parameter values of the target scene parameter. The parameter value range and parameter value interval may be entered by the demander of the test scene, or may be generated in advance by a technician and stored in a database.
Specifically, the parameter value range and parameter value interval of the target scene parameter input by the demander of the test scene can be obtained. Alternatively, the parameter value range and parameter value interval of the target scene parameter may be randomly obtained from the database.
S240, generating at least one parameter value of each target scene parameter within the parameter value range according to the parameter value interval.
Specifically, an initial parameter value may be randomly selected within the parameter value range, and at least one parameter value of the target scene parameter is then generated sequentially according to the parameter value interval.
S250, combining the parameter values of the target scene parameters to obtain test scenes corresponding to the parameter values of each group.
According to this technical scheme, the scene type of the test scene and the corresponding original scene parameters are obtained, the original scene parameters are simplified to obtain at least one target scene parameter, at least one parameter value of each target scene parameter is generated, and the parameter values of the target scene parameters are combined to obtain a test scene corresponding to each group of parameter values. This simplifies the way in which the parameter values of the target scene parameters are generated, improves the generation efficiency of the parameter values, and further improves the generalization efficiency of the test scene.
In an alternative embodiment of the present invention, acquiring the parameter value range and the parameter value interval of the target scene parameter includes: determining the parameter value range and the parameter value interval of the target scene parameter by adopting a preset distribution method.
The preset distribution method can be used to determine the distribution of the parameter values. The preset distribution method may include, for example, an average distribution method, a random number method, a normal distribution method, and the like. For example, the preset distribution method may be the average distribution method, the target scene parameter may be a speed parameter, the parameter value range of the speed parameter may be [10 km/h, 20 km/h], and the parameter value interval may be 5 km/h; the parameter values of the speed parameter may then include 10 km/h, 15 km/h and 20 km/h. As another example, the preset distribution method may be the random number method, the target scene parameter may be a speed parameter, the parameter value range may be [10 km/h, 20 km/h], and the parameter value interval (here, the number of values to generate) may be 3; the parameter values of the speed parameter may then include 11 km/h, 13 km/h and 19 km/h. As another example, the preset distribution method may be the normal distribution method, the target scene parameter may be a speed parameter, the distribution of the speed parameter may be μ = 15 km/h, σ = 5 km/h, and the number of values may be 3; the parameter values of the speed parameter may then include 15 km/h, 12.6 km/h and 20.1 km/h.
Specifically, a preset distribution method such as the average distribution method, the random number method, or the normal distribution method can be adopted to determine the parameter value range and the corresponding parameter value interval of the target scene parameter. At least one parameter value of each target scene parameter may then be generated within the parameter value range at the parameter value interval.
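The three preset distribution methods mentioned above can be sketched as follows. The interpretation of the second argument (a spacing for the average distribution method, a number of values for the random number and normal distribution methods, matching the numeric examples above) is an assumption made for illustration.

```python
import random

def average_distribution(low, high, step):
    """Evenly spaced values within [low, high] at the given spacing."""
    values, v = [], low
    while v <= high + 1e-9:
        values.append(round(v, 3))
        v += step
    return values

def random_distribution(low, high, count):
    """A given number of uniformly random values within [low, high]."""
    return [round(random.uniform(low, high), 1) for _ in range(count)]

def normal_distribution(mu, sigma, count):
    """A given number of values drawn from a normal distribution N(mu, sigma)."""
    return [round(random.gauss(mu, sigma), 1) for _ in range(count)]

print(average_distribution(10, 20, 5))   # [10, 15, 20]
print(random_distribution(10, 20, 3))    # e.g. [11.0, 13.4, 19.2]
print(normal_distribution(15, 5, 3))     # e.g. [15.0, 12.6, 20.1]
```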
According to this scheme, the preset distribution method is introduced to determine the parameter value range and the parameter value interval of the target scene parameter; it also determines the distribution of the parameter values of the target scene parameter, which further improves the accuracy of the generated parameter values.
In an alternative embodiment of the present invention, generating at least one parameter value for each target scene parameter comprises: when the test scene is an early warning test scene, selecting at least one known scene parameter and one unknown scene parameter from all target scene parameters; the early warning test scene is a critical scene with danger early warning; generating at least one parameter value for each known scene parameter; acquiring a parameter relation between known scene parameters and unknown scene parameters; and calculating the parameter value of the unknown scene parameter according to the parameter value and the parameter relation of the known scene parameter.
From the aspect of predictability of the test result, test scenes may include conventional test scenes and early warning test scenes. Optionally, the scene parameters of a conventional test scene are randomly generated, and the test result of a conventional test scene cannot be predicted. The early warning test scene may be a predetermined test scene in which a danger early warning occurs; it may be a critical scene at which the danger early warning is triggered. The known scene parameters may be target scene parameters whose parameter values are randomly generated. The unknown scene parameter may be a target scene parameter whose parameter value is generated by calculation. The number of unknown scene parameters may be 1, and the number of known scene parameters is 1 less than the number of target scene parameters. The parameter relationship between the known scene parameters and the unknown scene parameter may be a predetermined parameter relationship. Optionally, this parameter relationship may be stored in a database in advance.
Specifically, when the test scene is an early warning test scene, that is, a predetermined test scene in which a danger early warning occurs, one unknown scene parameter and at least one known scene parameter may be selected from the target scene parameters. At least one parameter value of each known scene parameter may be randomly generated. The parameter relationship between the known scene parameters and the unknown scene parameter stored in the database may be obtained, and the parameter value of the unknown scene parameter may be calculated from the parameter values of the known scene parameters and the parameter relationship.
According to this scheme, the known scene parameters and the unknown scene parameter are selected from the target scene parameters, the parameter values of the known scene parameters are randomly generated, and the parameter value of the unknown scene parameter is calculated from the parameter values of the known scene parameters and the parameter relationship. This realizes the generation of early warning test scenes, improves the predictability of the test scene generation process and the selectivity of test scenes, and further improves the accuracy of the test scenes.
Fig. 3 is a flow chart of a test scenario generation method. Referring to the test scenario generation method shown in fig. 3, the method includes:
s310, selecting a scene type of the test scene and corresponding original scene parameters.
Specifically, the scene type of the test scene input by the demander of the test scene can be obtained. Based on the scene type of the test scene, the original scene parameters associated therewith in the database or local storage space may be obtained.
S320, determining target scene parameters.
In the process of generating test scenes, the number of scene parameters can be reduced as much as possible according to the scene type of the test scene, on the premise of accurately describing the test scene information, so as to avoid generating unnecessary parameter values of scene parameters and affecting the generation efficiency of the test scenes.
Specifically, the absolute position parameters can be screened from the original scene parameters, and the absolute position parameters obtained by screening are converted into the relative position parameters. Other scene parameters than the absolute position parameter among the original scene parameters, and the relative position parameter may be determined as target scene parameters.
Illustratively, FIG. 4 is a simplified schematic diagram of forward collision warning scene parameters. Table 1 is the original scene parameters before optimization in the forward collision early warning scene. Table 2 is the optimized target scene parameters in the forward collision early warning scene.
TABLE 1 original scene parameters before optimization (Forward Collision early-warning scene)
Table 2 optimized target scenario parameters (Forward Collision early-warning scenario)
As can be seen from tables 1 and 2, the purpose of the scene parameter optimization is to reduce the number of scene parameters in the test scene and reduce the computational complexity. By replacing the absolute position parameters of each vehicle with the relative position parameters between the vehicles, the goal of reducing the number of scene parameters can be achieved.
Taking the forward collision early warning scene as an example, the core parameters of the test scene are the lateral and longitudinal relative position parameters between the vehicles. The relative position parameters may include the longitudinal front-rear vehicle distance l and the lateral position offset rate, while the absolute positions (x1, y1) and (x2, y2) are, by comparison, not important. Therefore, the longitudinal front-rear vehicle distance l and the lateral position offset rate are used as relative position parameters in place of the absolute position parameters (x1, y1) and (x2, y2). When scene generalization is performed, that is, when the parameter values of the target scene parameters are generated, the parameter values of the absolute position parameters of the two vehicles do not need to be generalized; only the parameter values of the longitudinal front-rear vehicle distance l and the lateral position offset rate need to be generalized.
Illustratively, table 3 is the target scene parameters of the optimized partial test scene.
TABLE 3 target scene parameters for optimized partial test scenes
S330, judging whether the test scene is an early warning test scene, if so, executing S340; if not, then S370 is performed.
Specifically, whether the test scene is an early warning test scene can be judged according to the requirements of the demander of the test scene. If yes, executing S340; if not, then S370 is performed.
S340, selecting known scene parameters and an unknown scene parameter from the target scene parameters.
Specifically, one unknown scene parameter and at least one known scene parameter may be selected from the target scene parameters.
S350, selecting a parameter generalization method to generate parameter values of known scene parameters.
The parameter generalization method is the preset distribution method. By way of example, the parameter generalization method may include an average distribution method, a random number method, a normal distribution method, and the like. Different generalization methods correspond to different parameter setting modes, that is, the corresponding parameter value ranges and parameter value intervals are different.
Specifically, a preset distribution method may be adopted to determine the parameter value range and parameter value interval of the known scene parameters. Within the parameter value range, at least one parameter value of each known scene parameter may be generated at the parameter value interval.
Illustratively, table 4 is a parameter generalization method for speed parameters.
Table 4 parameter generalization method for speed parameters
S360, calculating and generating parameter values of unknown scene parameters.
For an early warning test scene intended to trigger an early warning, after the known scene parameters are determined, the unknown scene parameter is calculated and determined automatically by an algorithm, so that the generated early warning test scene can trigger the early warning.
Specifically, a parameter relationship between known scene parameters and unknown scene parameters stored in a database may be obtained. Parameter values for the unknown scene parameters may be calculated based on the parameter values for the known scene parameters and the parameter relationship.
By way of example, fig. 5 is a schematic view of a scenario in which the host vehicle is traveling straight and the remote vehicle is traveling straight, and fig. 6 is a schematic view of a scenario in which the host vehicle is traveling straight and the remote vehicle is turning left. As shown in the figures, when generating an early warning test scene for a collision between the host vehicle and the remote vehicle at an intersection, the known scene parameters can be generated first, namely the host vehicle speed V1, the distance d1 from the host vehicle to the stop line, the host vehicle travel direction (north-south), the remote vehicle speed V2, and the remote vehicle travel direction (east-west); the distance d2 from the remote vehicle to the stop line can then be calculated automatically according to the parameter relationship between the known scene parameters and the unknown scene parameter.
Wherein the parameter relationship between the known scene parameter and the unknown scene parameter is as follows:
wherein d 2 Distance from far car to stop line; d, d 1 The distance from the main vehicle to the parking line is set; d, d 3 Is the lane line width; v (V) 1 Is the distance of the main vehicle; v (V) 2 Is the distance of the far car; d, d 4 Is an intersection shape parameter. Wherein d 3 And d 4 Is a common parameter between different scenarios.
S370, selecting a parameter generalization method to generate a parameter value of the target scene parameter.
Specifically, a preset distribution method may be adopted to determine the parameter value range and parameter value interval of the target scene parameter. At least one parameter value of each target scene parameter may be generated within the parameter value range at the parameter value interval.
S380, generating test scenes in batches.
Specifically, the parameter values of the target scene parameters can be randomly combined to obtain the test scene corresponding to each group of parameter values.
Optionally, after obtaining the test scene corresponding to each set of parameter values, a test scene description file may be generated according to the scene type of the test scene, the target scene parameter and the corresponding parameter value.
The test scene description file can be used for describing the test scene so that a management party of the test scene can review and manage the test scene. The test scene description file contains target scene parameters and corresponding parameter values of the test scene.
For example, table 5 is a test scenario description of an intersection collision warning scenario.
Table 5 test scenario description of intersection collision early warning scenario has files
According to this method, a large number of test scenes can be generated rapidly from a small number of target scene parameters for comprehensive testing of the vehicle networking function. The accuracy of the generated parameter values of the target scene parameters can be guaranteed, and errors that might be introduced by manual input are avoided, while the creation speed is greatly improved, effectively raising the efficiency of the scene creation stage in the vehicle networking function test. By turning the traditional manual creation mode into automatic generation, the test scene generation process becomes faster and more accurate, which is of significant value in advancing the mass production of intelligent connected vehicles.
Example III
Fig. 7 is a schematic structural diagram of a test scene generating device according to a third embodiment of the present invention. The embodiment of the invention is applicable to the situation of generating the test scene, the device can execute the test scene generation method, the device can be realized in the form of hardware and/or software, and the device can be configured in the electronic equipment carrying the test scene generation function.
Referring to the test scene generating apparatus shown in fig. 7, comprising: a scene type acquisition module 710, a target scene parameter determination module 720, a parameter value generation module 730, and a test scene generation module 740. The scene type obtaining module 710 is configured to obtain a scene type of the test scene and a corresponding original scene parameter; the target scene parameter determining module 720 is configured to simplify the original scene parameter to obtain at least one target scene parameter; a parameter value generating module 730, configured to generate at least one parameter value of each target scene parameter; the test scene generating module 740 is configured to combine the parameter values of the target scene parameters to obtain test scenes corresponding to the parameter values of the groups.
According to this technical scheme, the scene type of the test scene and the corresponding original scene parameters are acquired, the original scene parameters are simplified to obtain at least one target scene parameter, at least one parameter value of each target scene parameter is generated, and the parameter values of the target scene parameters are combined to obtain a test scene corresponding to each group of parameter values. This realizes automatic generation of the parameter values of the scene parameters in the test scene and generalization of the test scene. It avoids the problem of the manual creation approach, in which a large number of scene parameters must be input when creating many test scenes and the time spent is long, thereby shortening the generation time of the parameter values and improving the generation efficiency of the test scenes. It also avoids the problems that errors may be introduced during manual input, that the accuracy of the created test scenes is low, and that the reliability of the vehicle networking function test is difficult to ensure. In addition, by simplifying the original scene parameters in the process of generating the test scene, the generation efficiency of the test scene is further improved.
In an alternative embodiment of the present invention, the parameter value generating module 730 includes: the parameter value interval acquisition unit is used for acquiring a parameter value interval and a parameter value interval of the target scene parameter; and the first parameter value generating unit is used for generating at least one parameter value of each target scene parameter according to the parameter value interval in the parameter value interval.
In an alternative embodiment of the present invention, the parameter value interval acquisition unit includes: a parameter value interval acquisition subunit, configured to determine the parameter value range and the parameter value interval of the target scene parameter by adopting a preset distribution method.
In an alternative embodiment of the present invention, the parameter value generating module 730 includes: the scene parameter selection unit is used for selecting at least one known scene parameter and one unknown scene parameter from all target scene parameters when the test scene is an early warning test scene; the early warning test scene is a critical scene with danger early warning; a second parameter value generation unit for generating at least one parameter value for each known scene parameter; a parameter relation acquisition unit for acquiring a parameter relation between a known scene parameter and an unknown scene parameter; and the third parameter value generating unit is used for calculating the parameter value of the unknown scene parameter according to the parameter value and the parameter relation of the known scene parameter.
In an alternative embodiment of the present invention, the target scene parameter determination module 720 includes: the parameter association degree detection unit is used for detecting association degree between original scene parameters and a test scene, and screening each original scene parameter to obtain alternative scene parameters associated with the test scene; the absolute position parameter conversion unit is used for screening absolute position parameters in the alternative scene parameters and converting the absolute position parameters obtained by screening into relative position parameters; and the target scene parameter determining unit is used for determining other scene parameters except the absolute position parameter and the relative position parameter in the alternative scene parameters as target scene parameters.
In an alternative embodiment of the invention, the apparatus further comprises: the scene description file generation module is used for generating a test scene description file according to the scene type, the target scene parameters and the corresponding parameter values of the test scene after obtaining the test scene corresponding to each group of parameter values; the test scene description file is used for describing the test scene so that a management party of the test scene can review and manage the test scene.
In an alternative embodiment of the invention, the test scene is a vehicle networking simulation test scene; the scene types of the test scene comprise a traffic sign marking recognition and response scene, a traffic signal lamp recognition and response scene, a vehicle following driving scene, a roadside parking scene, a forward collision early warning scene, an intersection collision early warning scene or a red light running early warning scene.
The test scene generating device provided by the embodiment of the invention can execute the test scene generating method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the executing method.
In the technical scheme of the embodiments of the present invention, the acquisition, storage, and application of the scene type of the test scene, the corresponding original scene parameters, the parameter value range and parameter value interval of the target scene parameters, the parameter relationship between the known scene parameters and the unknown scene parameter, and so on, all comply with the provisions of the relevant laws and regulations and do not violate public order and good customs.
Example IV
Fig. 8 shows a schematic structural diagram of an electronic device 800 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 8, the electronic device 800 includes at least one processor 801, and a memory such as a Read Only Memory (ROM) 802, a Random Access Memory (RAM) 803, etc., communicatively connected to the at least one processor 801, wherein the memory stores a computer program executable by the at least one processor, and the processor 801 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 802 or the computer program loaded from the storage unit 808 into the Random Access Memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the electronic device 800 can also be stored. The processor 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
Various components in electronic device 800 are connected to I/O interface 805, including: an input unit 806 such as a keyboard, mouse, etc.; an output unit 807 such as various types of displays, speakers, and the like; a storage unit 808, such as a magnetic disk, optical disk, etc.; and a communication unit 809, such as a network card, modem, wireless communication transceiver, or the like. The communication unit 809 allows the electronic device 800 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The processor 801 may be a variety of general and/or special purpose processing components with processing and computing capabilities. Some examples of processor 801 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 801 performs the various methods and processes described above, such as a test scenario generation method.
In some embodiments, the test scenario generation method may be implemented as a computer program, which is tangibly embodied on a computer-readable storage medium, such as the storage unit 808. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 800 via the ROM 802 and/or the communication unit 809. When a computer program is loaded into RAM 803 and executed by processor 801, one or more steps of the test scenario generation method described above may be performed. Alternatively, in other embodiments, the processor 801 may be configured to perform the test scenario generation method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above can be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may be implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), blockchain networks, and the Internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and overcomes the defects of difficult management and weak service scalability found in traditional physical hosts and VPS (Virtual Private Server) services.
It should be appreciated that the various forms of flow shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention can be achieved; no limitation is imposed herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A test scenario generation method, the method comprising:
acquiring a scene type and corresponding original scene parameters of a test scene;
simplifying the original scene parameters to obtain at least one target scene parameter;
generating at least one parameter value for each of the target scene parameters;
and combining the parameter values of the target scene parameters to obtain a test scene corresponding to each group of parameter values.
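For illustration only, the combination step of claim 1 can be read as enumerating the Cartesian product of the generated parameter values. The minimal Python sketch below assumes this reading; every name in it (combine_parameter_values, ego_speed_mps, lead_gap_m) is hypothetical and not taken from the patent.

```python
from itertools import product

def combine_parameter_values(scene_type, value_lists):
    """Combine one value from each target scene parameter into a concrete test scene."""
    names = list(value_lists)
    return [
        {"scene_type": scene_type, **dict(zip(names, combo))}
        for combo in product(*(value_lists[n] for n in names))
    ]

# Example: a car-following scene with two simplified target parameters.
scenes = combine_parameter_values(
    "car_following",
    {"ego_speed_mps": [10.0, 15.0, 20.0], "lead_gap_m": [20.0, 40.0]},
)
print(len(scenes))  # 3 x 2 = 6 candidate test scenes
```

Each dictionary returned here corresponds to one group of parameter values and therefore to one generated test scene.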
2. The method of claim 1, wherein said generating at least one parameter value for each of said target scene parameters comprises:
acquiring a parameter value range and a parameter value interval of the target scene parameter;
and generating the at least one parameter value of each target scene parameter within the parameter value range according to the parameter value interval.
3. The method according to claim 2, wherein the acquiring the parameter value range and the parameter value interval of the target scene parameter comprises:
and determining the parameter value range and the parameter value interval of the target scene parameter by adopting a preset distribution method.
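One possible "preset distribution method" in claims 2 and 3 is equidistant sampling of the value range; the patent does not fix which distribution is used, so the sketch below is only an illustration under that assumption, with hypothetical names.

```python
def generate_parameter_values(lower, upper, step):
    """Enumerate values of one target scene parameter over its value range at a fixed interval."""
    values, v = [], lower
    while v <= upper + 1e-9:  # small tolerance absorbs floating-point drift
        values.append(round(v, 6))
        v += step
    return values

# e.g. ego speed swept from 5 m/s to 20 m/s at a 5 m/s interval
print(generate_parameter_values(5.0, 20.0, 5.0))  # [5.0, 10.0, 15.0, 20.0]
```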
4. The method of claim 1, wherein said generating at least one parameter value for each of said target scene parameters comprises:
when the test scene is an early warning test scene, selecting at least one known scene parameter and one unknown scene parameter from the target scene parameters; the early warning test scene is a critical scene with danger early warning;
generating at least one parameter value for each of the known scene parameters;
acquiring a parameter relation between the known scene parameter and the unknown scene parameter;
and calculating the parameter value of the unknown scene parameter according to the parameter value of the known scene parameter and the parameter relation.
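As an example of the parameter relation in claim 4, a forward collision early warning scene might derive the unknown initial gap from the known speeds through a time-to-collision (TTC) relation. The relation, the 2.7 s threshold and all names below are assumptions for illustration, not values stated in the patent.

```python
def critical_initial_gap(ego_speed_mps, lead_speed_mps, ttc_threshold_s=2.7):
    """Derive the unknown scene parameter (initial gap) from the known parameters via an
    assumed relation: gap = (v_ego - v_lead) * TTC_threshold, i.e. the distance at which
    a forward collision warning should just trigger."""
    closing_speed_mps = ego_speed_mps - lead_speed_mps
    if closing_speed_mps <= 0:
        raise ValueError("no closing speed; the scene is not critical")
    return closing_speed_mps * ttc_threshold_s

print(critical_initial_gap(20.0, 10.0))  # 27.0 (metres)
```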
5. The method of claim 1, wherein said simplifying said original scene parameters to obtain at least one target scene parameter comprises:
detecting the association degree between the original scene parameters and the test scene, and screening the original scene parameters to obtain alternative scene parameters associated with the test scene;
screening absolute position parameters from the alternative scene parameters, and converting the absolute position parameters obtained by screening into relative position parameters;
and determining, as the target scene parameters, the relative position parameters together with the scene parameters in the alternative scene parameters other than the absolute position parameters.
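To illustrate the conversion in claim 5, the sketch below re-expresses an absolute map position relative to the ego vehicle; it assumes an axis-aligned ego frame (no heading rotation) and uses hypothetical names.

```python
def to_relative_position(ego_xy, other_xy):
    """Replace an absolute map position with a position expressed relative to the ego vehicle."""
    return (other_xy[0] - ego_xy[0], other_xy[1] - ego_xy[1])

# An object 30 m ahead and 4 m to the side of the ego vehicle, independent of where
# on the map the scene is placed.
print(to_relative_position((1000.0, 2000.0), (1030.0, 2004.0)))  # (30.0, 4.0)
```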
6. The method according to claim 1, further comprising, after said obtaining said test scenes corresponding to each group of said parameter values:
generating a test scene description file according to the scene type of the test scene, the target scene parameter and the corresponding parameter value; the test scene description file is used for describing a test scene so that a test scene manager can review and manage the test scene.
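Claim 6 leaves the format of the test scene description file open; a minimal sketch, assuming JSON with hypothetical field names, might look like this.

```python
import json

def write_scene_description(path, scene_type, parameter_values):
    """Serialize one generated test scene so that a reviewer can inspect and manage it."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(
            {"scene_type": scene_type, "parameters": parameter_values},
            f, ensure_ascii=False, indent=2,
        )

write_scene_description(
    "fcw_scene_001.json",
    "forward_collision_early_warning",
    {"ego_speed_mps": 20.0, "lead_speed_mps": 10.0, "initial_gap_m": 27.0},
)
```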
7. The method of claim 1, wherein the test scenario is a vehicle networking simulation test scenario; the scene types of the test scene comprise a traffic sign marking recognition and response scene, a traffic signal lamp recognition and response scene, a vehicle following driving scene, a roadside parking scene, a forward collision early warning scene, an intersection collision early warning scene or a red light running early warning scene.
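The scene types enumerated in claim 7 could be represented as a simple enumeration; the identifiers and string values below are illustrative assumptions, not names used by the patent.

```python
from enum import Enum

class SceneType(Enum):
    TRAFFIC_SIGN_RESPONSE = "traffic_sign_marking_recognition_and_response"
    TRAFFIC_LIGHT_RESPONSE = "traffic_signal_light_recognition_and_response"
    CAR_FOLLOWING = "vehicle_following_driving"
    ROADSIDE_PARKING = "roadside_parking"
    FORWARD_COLLISION_WARNING = "forward_collision_early_warning"
    INTERSECTION_COLLISION_WARNING = "intersection_collision_early_warning"
    RED_LIGHT_RUNNING_WARNING = "red_light_running_early_warning"
```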
8. A test scenario generation apparatus, the apparatus comprising:
the scene type acquisition module is used for acquiring the scene type of the test scene and the corresponding original scene parameters;
the target scene parameter determining module is used for simplifying the original scene parameters to obtain at least one target scene parameter;
a parameter value generating module, configured to generate at least one parameter value of each of the target scene parameters;
and the test scene generation module is used for combining the parameter values of the target scene parameters to obtain a test scene corresponding to each group of parameter values.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the test scenario generation method of any one of claims 1-7.
10. A computer readable storage medium storing computer instructions for causing a processor to implement the test scenario generation method of any one of claims 1-7 when executed.
CN202311456863.5A 2023-11-03 2023-11-03 Test scene generation method and device, electronic equipment and storage medium Pending CN117407311A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311456863.5A CN117407311A (en) 2023-11-03 2023-11-03 Test scene generation method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311456863.5A CN117407311A (en) 2023-11-03 2023-11-03 Test scene generation method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117407311A true CN117407311A (en) 2024-01-16

Family

ID=89492385

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311456863.5A Pending CN117407311A (en) 2023-11-03 2023-11-03 Test scene generation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117407311A (en)

Similar Documents

Publication Publication Date Title
CN113408141B (en) Automatic driving test method and device and electronic equipment
CN111739344B (en) Early warning method and device and electronic equipment
WO2021057134A1 (en) Scenario identification method and computing device
US20220076038A1 (en) Method for controlling vehicle and electronic device
CN115909749B (en) Vehicle running road risk early warning method, device, equipment and storage medium
JP2021168174A (en) Method and apparatus for identifying vehicle alignment information, electronic device, roadside device, cloud control platform, storage medium, and computer program product
CN115273477B (en) Intersection driving suggestion pushing method, device, system and electronic equipment
CN116580271A (en) Evaluation method, device, equipment and storage medium for perception fusion algorithm
CN114301938B (en) Vehicle-road cooperative vehicle event determining method, related device and computer program product
EP4206610A1 (en) Map matching method and apparatus, and electronic device and storage medium
CN117407311A (en) Test scene generation method and device, electronic equipment and storage medium
CN114706372A (en) Test method, device, equipment and storage medium
CN115456060A (en) Processing method and device for predicted track
CN115248993A (en) Method and device for detecting reality of simulation scene model and storage medium
CN113962107A (en) Method and device for simulating driving road section, electronic equipment and storage medium
CN114169247A (en) Method, device and equipment for generating simulated traffic flow and computer readable storage medium
CN116401111B (en) Function detection method and device of brain-computer interface, electronic equipment and storage medium
CN118035788A (en) Target vehicle relative position classification method, device, equipment and storage medium
CN117496735A (en) Indication board change prompting method, device, equipment and storage medium
CN117593896A (en) Opposite incoming vehicle early warning method, device, equipment and storage medium
CN116343480A (en) Traffic signal lamp switching point prediction method and device, electronic equipment and storage medium
CN117146797A (en) Method, device, equipment and medium for adjusting virtual lane line of high-precision map intersection
CN116401554A (en) Data classification method, device, equipment and medium
CN116767262A (en) Driving auxiliary line display method, device, equipment and medium
CN116013109A (en) Traffic prompt method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination