CN116150040A - Track data testing method, display decision method, storage medium and electronic equipment - Google Patents

Track data testing method, display decision method, storage medium and electronic equipment

Info

Publication number
CN116150040A
Authority
CN
China
Prior art keywords
scene
data
test
track data
target
Prior art date
Legal status
Granted
Application number
CN202310445419.7A
Other languages
Chinese (zh)
Other versions
CN116150040B (en)
Inventor
张波
何焱
Current Assignee
Jiangsu Zejing Automobile Electronic Co ltd
Original Assignee
Jiangsu Zejing Automobile Electronic Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Zejing Automobile Electronic Co ltd filed Critical Jiangsu Zejing Automobile Electronic Co ltd
Priority to CN202310445419.7A priority Critical patent/CN116150040B/en
Publication of CN116150040A publication Critical patent/CN116150040A/en
Application granted granted Critical
Publication of CN116150040B publication Critical patent/CN116150040B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers
    • G01C21/3626 - Details of the output of route guidance instructions
    • G01C21/365 - Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 - Error detection; Error correction; Monitoring
    • G06F11/36 - Preventing errors by testing or debugging software
    • G06F11/3604 - Software analysis for verifying properties of programs
    • G06F11/3612 - Software analysis for verifying properties of programs by runtime analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 - Error detection; Error correction; Monitoring
    • G06F11/36 - Preventing errors by testing or debugging software
    • G06F11/3664 - Environments for testing or debugging software
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 - Road transport of goods or passengers
    • Y02T10/10 - Internal combustion engine [ICE] based vehicles
    • Y02T10/40 - Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The disclosure relates to the technical field of data processing, in particular to a track data testing method, a display decision method, a storage medium and electronic equipment. The track data testing method comprises the following steps: determining a target test scene and corresponding track data to be evaluated; executing the test scene data corresponding to the target test scene and displaying the track data to be evaluated so as to acquire corresponding first test result data; executing the test scene data corresponding to the target test scene to obtain corresponding second test result data; and comparing the first test result data with the second test result data to determine an evaluation result of the track data to be evaluated according to the test result data comparison result.

Description

Track data testing method, display decision method, storage medium and electronic equipment
Technical Field
The disclosure relates to the technical field of data processing, and in particular relates to a track data testing method, a track data display decision method, a storage medium and electronic equipment.
Background
With the rapid development of HUD (Head Up Display) technology, various types of HUD devices have emerged, capable of displaying content such as navigation data, driving data, track information, shop information and other product information. Taking the AR-HUD as an example, the AR navigation projection generated by the system can be displayed in the driver's viewable area. Due to the complexity and diversity of the driving environment, continuously displaying the navigation line inevitably interferes with the driver's line of sight, affects the display of other important AR-HUD information, and occludes the road conditions of the surrounding environment. Therefore, whether the track data displayed in the AR-HUD can meet the driving requirement and the safety requirement becomes a technical problem to be solved.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The disclosure provides a track data testing method, a track data testing device, a track data display decision method, a track data display decision system, a storage medium and an electronic device, which can accurately evaluate whether track information needs to be displayed in different scenes.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a track data testing method, the method comprising:
determining a target test scene and corresponding track data to be evaluated;
executing the test scene data corresponding to the target test scene and displaying the track data to be evaluated so as to acquire corresponding first test result data; and
executing the test scene data corresponding to the target test scene to obtain corresponding second test result data;
And comparing the first test result data with the second test result data to determine an evaluation result of the track data to be evaluated according to the test result data comparison result.
In some exemplary embodiments, the executing the test scenario data corresponding to the target test scenario and displaying the track data to be evaluated to obtain corresponding first test result data includes:
executing test scene data corresponding to the target test scene in a simulation system, so that a tester can perform simulated driving operations and complete a simulated driving task according to the displayed track data to be evaluated;
and acquiring operation data of a target type during the simulated driving operation by using a preset data acquisition task, so as to obtain the first test result data.
In some exemplary embodiments, the executing the test scenario data corresponding to the target test scenario to obtain corresponding second test result data includes:
executing test scene data corresponding to the target test scene in a simulation system, so that a tester can perform simulated driving operations and complete a simulated driving task;
and acquiring operation data of a target type during the simulated driving operation by using a preset data acquisition task, so as to obtain the second test result data.
In some exemplary embodiments, the method further comprises:
judging whether the target scene meets a track data display strategy for displaying corresponding track data or not;
when the target scene is determined to meet the track data display strategy, configuring the target scene as a scene to be tested;
configuring test scene data and track data according to scene content of a scene to be tested; and
configuring a corresponding scene test task for the scene to be tested, so as to execute the scene test task and obtain a corresponding test result;
the scene test tasks comprise test tasks corresponding to a single vehicle speed value and test tasks corresponding to a preset vehicle speed range.
In some exemplary embodiments, the configuring test scene data and track data according to scene content of a scene to be tested includes:
planning a path according to the scene content of the scene to be tested;
configuring corresponding track data based on a path planning result; and
and configuring a trigger position and an end position of the track data in the scene to be tested.
According to a second aspect of the present disclosure, there is provided a track data testing device comprising:
the scene determining module is used for determining a target test scene and corresponding track data to be evaluated;
The first test data acquisition module is used for executing the test scene data corresponding to the target test scene and displaying the track data to be evaluated so as to acquire corresponding first test result data; and
the second test data acquisition module is used for executing the test scene data corresponding to the target test scene to acquire corresponding second test result data;
and the evaluation result generation module is used for comparing the first test result data with the second test result data so as to determine the evaluation result of the track data to be evaluated according to the test result data comparison result.
According to a third aspect of the present disclosure, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the track data testing method described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to implement the track data testing method described above via execution of the executable instructions.
According to a fifth aspect of the present disclosure, there is provided a display decision method of track data, the method comprising:
Identifying a current scene type corresponding to the current path scene according to the navigation data and the environment information;
judging whether the current scene type belongs to an evaluated target scene type or not;
judging whether alarm type information exists when the current path scene is determined to belong to the target scene type;
and when the alarm type information is determined to be absent and the current display content in the target display area meets the preset display condition, displaying track data corresponding to the current path scene in the target display area.
In some exemplary embodiments, the method further comprises: and when the current path scene is determined not to belong to the target scene type, not displaying track data corresponding to the current path scene.
In some exemplary embodiments, the method further comprises: and when the alarm type information is determined to exist, not displaying the track data corresponding to the current path scene.
In some exemplary embodiments, the method further comprises: and when the alarm type information is determined not to exist, if the priority of the current display content in the target display area is higher than the priority of the track data, not displaying the track data corresponding to the current path scene.
In some exemplary embodiments, the determining whether the current scene type belongs to the evaluated target scene type includes:
comparing the current scene type with the evaluated target scene type, and determining an evaluation result of the matched scene type when the matched scene type exists;
and when the corresponding evaluation result is displayable track data, determining that the current path scene belongs to the target scene type.
In some exemplary embodiments, the method further comprises: the method for testing track data according to any one of the above embodiments is used to evaluate the target scene type in advance to obtain a corresponding scene evaluation result.
According to a sixth aspect of the present disclosure, there is provided a display decision system of track data:
the current scene type identification module is used for identifying the current scene type corresponding to the current path scene according to the navigation data and the environment information;
a scene type comparison module for judging whether the current scene type belongs to an evaluated target scene type;
the alarm information identification module is used for judging whether alarm type information exists or not when the current path scene is determined to belong to the target scene type;
And the current display content identification module is used for displaying track data corresponding to the current path scene in the target display area when the alarm type information does not exist and the current display content in the target display area meets the preset display condition.
According to a seventh aspect of the present disclosure, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described track data display decision method.
According to an eighth aspect of the present disclosure, there is provided an electronic device comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to implement the above-described track data display decision method via execution of the executable instructions.
According to the track data testing method provided by the embodiment of the disclosure, corresponding first test result data is obtained for a target test scene under the condition of displaying the track data to be evaluated, corresponding second test result data is obtained under the condition of no data to be evaluated, and the first test result data and the second test result data are subjected to data comparison, so that the evaluation result of the track data to be evaluated in the target test scene can be obtained according to the data comparison result; the evaluation result can be used for evaluating the necessity of displaying the track data in the scene, and the display value of the track in the scene can be accurately evaluated.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
Fig. 1 schematically illustrates a schematic diagram of a track data testing method in an exemplary embodiment of the present disclosure.
Fig. 2 schematically illustrates a schematic diagram of a system architecture in an exemplary embodiment of the present disclosure.
Fig. 3 schematically illustrates a schematic diagram of a method of configuring a test task in an exemplary embodiment of the present disclosure.
Fig. 4 schematically illustrates a schematic diagram of a display effect of a course lane change guide of an oversized vehicle scene in an exemplary embodiment of the present disclosure.
Fig. 5 schematically illustrates a schematic diagram of a display effect of a route straight guidance after lane change of an oversized vehicle in an exemplary embodiment of the present disclosure.
Fig. 6 schematically illustrates a schematic diagram of a display decision method of track data in an exemplary embodiment of the present disclosure.
Fig. 7 schematically illustrates a composition diagram of a track data testing device in an exemplary embodiment of the present disclosure.
Fig. 8 schematically illustrates a composition diagram of a track data display decision system in an exemplary embodiment of the present disclosure.
Fig. 9 schematically illustrates a composition diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
In the related art, an AR-HUD (augmented reality head-up display) navigation system is a vehicle navigation system that integrates augmented reality technology, head-up display technology and map navigation technology. The system projects the generated AR navigation within a viewable area of the driver and can thereby provide visual and accurate route guidance for the driver. Due to the complexity and diversity of the driving environment, continuously displaying the navigation line inevitably interferes with the driver's line of sight, affects the display of other important AR-HUD information, and occludes the road conditions of the surrounding environment. Therefore, in the face of real and complex road conditions, how to determine when the route line should appear and disappear, and whether displaying the route line threatens driving safety, is a problem for which the related art lacks a corresponding solution.
In order to overcome the disadvantages and shortcomings of the prior art, the present exemplary embodiment provides a track data testing method, which can be used for accurately evaluating the necessity of displaying track data in a preset scene. Referring to fig. 1, the track data testing method may include:
step S11, determining a target test scene and corresponding track data to be evaluated;
Step S12, executing the test scene data corresponding to the target test scene and displaying the track data to be evaluated so as to obtain corresponding first test result data; and
step S13, executing the test scene data corresponding to the target test scene to obtain corresponding second test result data;
and S14, comparing the first test result data with the second test result data to determine an evaluation result of the track data to be evaluated according to the test result data comparison result.
According to the track data testing method provided by the example embodiment, the corresponding first test result data is obtained for the target test scene under the condition of displaying the track data to be evaluated, the corresponding second test result data is obtained under the condition of no track data to be evaluated, and the first test result data and the second test result data are subjected to data comparison, so that the evaluation result of the track data to be evaluated in the target test scene can be obtained according to the data comparison result; the evaluation result can be used for evaluating the necessity of displaying the track data in the scene, and the display value of the track in the scene can be accurately evaluated.
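As an illustration only (not part of the claimed method), the following Python sketch outlines how steps S11 to S14 could be wired together; the function names, the dictionary-based scene representation, and the run_simulation and compare_results placeholders are hypothetical stand-ins for the simulation system, the preset data acquisition task and the comparison logic detailed later in this description.

```python
# Hypothetical sketch of the track data testing flow (steps S11-S14).
# run_simulation() and compare_results() stand in for the simulation
# system, the preset data acquisition task and the comparison logic.

def run_simulation(scene_data, track_data=None):
    """Execute the test scene (optionally displaying the track to be
    evaluated) and return the collected operation data as a dict."""
    raise NotImplementedError   # provided by the simulation environment

def compare_results(first_result, second_result):
    """Compare the two groups of test result data; see the detailed
    comparison sketch further below."""
    raise NotImplementedError

def evaluate_track_data(target_scene):
    scene_data = target_scene["scene_data"]                  # step S11
    track_data = target_scene["track_to_evaluate"]           # step S11
    first_result = run_simulation(scene_data, track_data)    # step S12: track shown
    second_result = run_simulation(scene_data, None)         # step S13: no track
    return compare_results(first_result, second_result)      # step S14
```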
The steps of the track data testing method in this exemplary embodiment will be described in more detail with reference to the accompanying drawings and examples.
In this example embodiment, referring to fig. 3, the track data testing method may include:
step S101, judging whether a target scene meets a track data display strategy for displaying corresponding track data;
step S102, when determining that a target scene meets the track data display strategy, configuring the target scene as a scene to be tested;
step S103, configuring test scene data and track data according to scene content of a scene to be tested; and
step S104, configuring a corresponding scene test task for the scene to be tested, so as to be used for executing the scene test task and obtaining a corresponding test result.
In particular, the above-described method may be applied to a system architecture as shown in fig. 2. As shown in fig. 2, the system architecture may include a terminal device 201 (e.g., a portable computer, although it may be a desktop computer, tablet computer, etc.), a network 202, and a server 203. The network 202 is a medium used to provide a communication link between the terminal device 201 and the server 203. The network 202 may include various connection types, such as wired communication links, wireless communication links, and the like. It should be understood that the number of terminal devices, networks and servers in fig. 2 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, the server 203 may be a server cluster formed by a plurality of servers.
In one embodiment, a user may select a driving scenario as a target scenario using the terminal device 201 and send scenario content description data corresponding to the target scenario to the server 203. When the server 203 receives the scenario content description data, it may first determine whether the track data display policy is satisfied; if so, it may configure a test task for the scenario and, after the task is executed, feed back a corresponding test result to the terminal device.
Specifically, when position information is used for navigation, vehicle driving scenes and track guiding scenes can be predefined for different road conditions, environments and driving processes. For example, guiding scenes of conventional routes can include an intersection guiding scene, a multi-lane guiding lane-change scene, and the like; guiding scenes of complex routes can include a continuous lane-change scene, a high-speed access scene, an overtaking scene, and the like; timely guiding scenes under special conditions can include a scene of guiding an optimal lane route when the current lane is congested, a scene of guiding a new route after the wrong route is taken, and the like. In addition, a tunnel driving scene, a mountain driving scene, and the like may be included. The different driving and guiding scenes can be judged one by one. The track data display strategy may include: judging whether a display requirement exists, for example, whether drivers want specific track data to be displayed in the relevant scene. The track data display strategy may further include: judging whether the track data are suitable for display, for example, according to preset conditions such as the complexity of the intersection and whether the information displayed at the same time interferes with each other. In addition, the track data display strategy may further include: judging whether it is necessary to display the track data; the specific judgment conditions may be configured according to custom rules. Each known or customized scene type can be taken as a target scene in turn to judge whether the track data display strategy is satisfied. When a target scene is judged to satisfy the rules in the track data display strategy, the target scene can be taken as a scene to be tested.
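A minimal sketch of this strategy check is given below for illustration; the predicate names and dictionary fields are assumptions, since the disclosure only states that the judgment conditions are configured by custom rules.

```python
# Hypothetical sketch of judging whether a target scene satisfies the track
# data display strategy. The three checks mirror the description above
# (display requirement, display suitability, display necessity); the field
# names are illustrative only.

def satisfies_display_strategy(scene):
    has_display_demand = scene.get("driver_wants_track", False)      # display requirement
    is_suitable = (scene.get("intersection_complexity", 0) <= 2      # display suitability
                   and not scene.get("info_overlap", False))
    is_necessary = scene.get("guidance_needed", False)               # display necessity
    return has_display_demand and is_suitable and is_necessary

def select_scenes_to_test(candidate_scenes):
    # Scenes that satisfy the strategy are configured as scenes to be tested.
    return [s for s in candidate_scenes if satisfies_display_strategy(s)]
```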
In this example embodiment, the configuring test scene data and track data according to scene content of a scene to be tested includes: planning a path according to the scene content of the scene to be tested; configuring corresponding track data based on a path planning result; and configuring the trigger position and the end position of the track data in the scene to be tested.
Specifically, corresponding test scene data and track data can be configured for each scene to be tested according to its specific scene content; the test scene data may include a planned path start point and end point, a planned path length, a simulated driving speed, simulated road condition data, and the like. Meanwhile, corresponding track data can be constructed according to the planned path data and used to guide driving along the track during testing.
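For illustration, a possible data layout for one configured scene is sketched below; the field names are hypothetical and not taken from the disclosure, which only lists the kinds of data involved (planned path, speed, road conditions, and the trigger and end positions of the track).

```python
# Hypothetical configuration layout for one scene to be tested; the field
# names are illustrative and not part of the disclosure.

from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class TrackData:
    waypoints: List[Tuple[float, float]]    # planned route line points
    trigger_position: Tuple[float, float]   # where display of the track starts
    end_position: Tuple[float, float]       # where display of the track ends

@dataclass
class TestSceneData:
    scene_name: str
    path_start: Tuple[float, float]
    path_end: Tuple[float, float]
    path_length_m: float
    simulated_speed_kmh: float
    road_conditions: Dict[str, str] = field(default_factory=dict)
    track: Optional[TrackData] = None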
Referring to the system architecture shown in fig. 2, for each scenario to be tested selected by the user at the terminal device 201, the server 203 may configure a corresponding scenario test task for each scenario to be tested, and may execute the test task at the server.
For example, the scenario test tasks may include a test task corresponding to a single vehicle speed value, and a test task corresponding to a preset vehicle speed range. That is, for the current target test scenario to be tested, multiple test tasks with different speeds may be configured to obtain multiple sets of results.
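A small, hypothetical helper sketch for generating such tasks is shown below; the task dictionary layout is an assumption made only for illustration.

```python
# Hypothetical helper: configure one test task per single speed value and
# per preset speed range for a scene to be tested, yielding several result groups.

def build_speed_tasks(scene_name, single_speeds_kmh, speed_ranges_kmh):
    tasks = [{"scene": scene_name, "speed_kmh": v} for v in single_speeds_kmh]
    tasks += [{"scene": scene_name, "speed_range_kmh": (low, high)}
              for low, high in speed_ranges_kmh]
    return tasks

# Example: build_speed_tasks("oversized_vehicle", [60, 80], [(60, 90)])
```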
In step S11, a target test scenario is determined, along with corresponding track data to be evaluated.
In this example embodiment, a scene set to be tested may be pre-constructed, where the scene set may include a plurality of scenes to be tested and corresponding scene data. One scenario may be selected from the set as the target test scenario for the current test execution. For example, the current target test scenario may be a complex route scenario, namely the oversized vehicle scenario. According to the specific driving scene, the route information is planned in advance, and the route is displayed in an animated manner in the driving scene picture to guide driving.
In step S12, the test scenario data corresponding to the target test scenario is executed and the track data to be evaluated is displayed, so as to obtain corresponding first test result data.
In this example embodiment, the step S12 may specifically include:
step S121, executing test scene data corresponding to the target test scene in a simulation system, so as to be used for a tester to perform a simulated driving operation according to the displayed track data to be evaluated and complete a simulated driving task;
step S122, performing data acquisition on the operation data of the target type in the simulated driving operation process by using a preset data acquisition task, so as to obtain the first test result data.
Specifically, the driving environment may be simulated for testing using a simulated vehicle control system in which the test scenario data of the target scenario are executed. For example, in the test scenario of the complex route scenario, namely the oversized vehicle scenario, a tester begins to simulate driving a vehicle to complete the overtaking task in the scene; timing starts at the beginning of driving, and during the simulated driving the tester needs to perform driving actions according to the guiding path, such as the path guidance shown in fig. 4.
Meanwhile, operation data in the simulated driving process can be acquired by utilizing a pre-configured data acquisition task. Wherein, the operation data in the simulated driving process may include: any one or a combination of any plurality of direction operation data, brake operation data, accelerator operation data, simulated driving behavior duration data and simulated line data.
For example, data acquisition may be performed using buried points (pre-arranged instrumentation points) in the simulation test environment. For example, the following buried point behaviors are set in a test scenario: steering wheel: the steering wheel is turned; brake: the brake is stepped on; throttle: the throttle is released; route trace: the vehicle driving route overlaps the pre-planned route line; duration: the scene end point is reached. The set buried point behaviors are continuously monitored during the tester's simulated driving to judge whether a buried point behavior occurs. When a buried point behavior occurs during the driving simulation, the vehicle data and scene data at the moment the behavior occurs are collected. When the overtaking operation is completed, the vehicle has returned to the original lane, and the straight-line distance past the overtaken vehicle exceeds a preset length, for example 100 m, the simulated driving can be judged to have ended.
The steering wheel angle may be 0 degrees when the vehicle travels straight ahead, with the left direction negative and the right direction positive; when the steering wheel angle changes, the corresponding buried point is triggered to acquire data. Specifically, the collected data may further include the current position of the vehicle and the time point at which the current behavior occurs. The brake operation data may be a pedal count triggered each time the brake is stepped on, together with the current position of the vehicle and the time point of the behavior. The throttle operation data may be a release count triggered each time the throttle is released, together with the current position of the vehicle and the time point of the behavior. The simulated route data may be the overlap rate between the actual travel route of the vehicle and the pre-planned route line. The simulated driving behavior duration data may be the total duration taken to reach the end point of the test scene, specifically collected as the time point at which the corresponding behavior occurs.
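Purely as an illustration of this instrumentation, the following hypothetical recorder collects the behaviours listed above together with the vehicle position and a timestamp; the class and method names are assumptions and do not correspond to any specific tool.

```python
# Hypothetical buried-point (instrumentation) recorder for the behaviours
# described above. Every triggered event stores the vehicle position and the
# time at which the behaviour occurred.

import time

class BuriedPointRecorder:
    def __init__(self):
        self.events = []   # [(event_name, position, timestamp, extra), ...]

    def record(self, name, position, **extra):
        self.events.append((name, position, time.time(), extra))

    # Convenience wrappers for the behaviours set in the test scene.
    def steering(self, position, angle_deg):      # negative = left, positive = right
        self.record("steering_wheel", position, angle_deg=angle_deg)

    def brake(self, position):
        self.record("brake_pressed", position)

    def throttle_release(self, position):
        self.record("throttle_released", position)

    def scene_finished(self, position, duration_s, route_overlap):
        self.record("scene_end", position,
                    duration_s=duration_s, route_overlap=route_overlap)
```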
After the simulated driving process is finished, the data can be processed. Specifically, the data collected during driving are processed, and the actual driving route is first compared with the pre-planned route line. If the overlap rate of the two routes is greater than or equal to 90%, the test is a valid test and the other collected data are sorted; if the overlap rate is less than 90%, the test is an invalid test, the test is carried out again, and new data are collected.
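A minimal sketch of this 90% validity check is shown below; the point-sampling overlap computation and the one-metre tolerance are assumptions made for illustration.

```python
# Hypothetical validity check: a test run counts as valid only if the actual
# driving route overlaps the pre-planned route line by at least 90%.

def route_overlap_ratio(actual_points, planned_points, tolerance_m=1.0):
    """Fraction of actual sample points lying within tolerance of the plan."""
    def near_plan(p):
        return any(((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5 <= tolerance_m
                   for q in planned_points)
    hits = sum(1 for p in actual_points if near_plan(p))
    return hits / len(actual_points) if actual_points else 0.0

def is_valid_test(actual_points, planned_points):
    return route_overlap_ratio(actual_points, planned_points) >= 0.90
```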
In addition, regarding the range of the data buried points, the current position of the tester's vehicle is taken as a reference point, a statistical distance range is determined according to the requirements of the target scene, and data are collected within this range, so that the uniformity and accuracy of the data are ensured. Taking the position of the host vehicle in the simulated oversized vehicle scene as the reference coordinate, the position 100 m from the large vehicle ahead in the same lane is taken as the starting point of data statistics; after the host vehicle completes the overtaking task, the position in the same lane 100 m past the oversized vehicle is the end point, at which data detection and acquisition stop. In addition, the navigation path matched with the target scene can be generated in advance by an algorithm, triggered at a suitable position, and transmitted to the AR-HUD to display the optimal navigation path for guidance. For example, referring to fig. 4 and 5, display of the path line may be triggered at a distance of 40 m from the large vehicle, the straight path line may be displayed after the lane change, and the return path line may be displayed 40 m after overtaking. In addition, in order to ensure that the vehicle speeds in the with-route test and the no-route test are consistent, several sets of vehicle speed range standards can be set for corresponding simulation tests; in this example the vehicle speeds include the speed before overtaking, the speed during overtaking, and the speed when the vehicle is 100 m past the front vehicle.
In step S13, the test scenario data corresponding to the target test scenario is executed to obtain corresponding second test result data.
In this example embodiment, the step S13 may include:
step S131, executing test scene data corresponding to the target test scene in a simulation system, so as to be used for a tester to perform a simulated driving operation and complete a simulated driving task;
step S132, performing data acquisition on the operation data of the target type in the simulated driving operation process by using a preset data acquisition task to acquire the second test result data.
Specifically, as described in the above embodiments, the driving environment may be simulated by simulation software bound to the simulated driving hardware device. The test scene is the oversized vehicle lane-change scene, and in this test no preset track animation needs to be displayed. The tester drives according to a preset speed value or speed interval, starts to simulate driving the vehicle to complete the overtaking task in the scene, timing starts at the beginning of driving, and the tester may freely plan the driving behavior during driving. Likewise, buried points may be used to collect the test data. The buried-point range may be a statistical distance range determined according to the requirements of the target scene, with the current position of the tester's vehicle as the reference point, and data are collected within this range to ensure the uniformity and accuracy of the data. In this example, the position of the host vehicle in the simulated oversized vehicle scene is taken as the reference coordinate, and the position 100 m from the large vehicle ahead in the same lane is taken as the starting point of data statistics; after the host vehicle completes the overtaking task, the position in the same lane 100 m past the oversized vehicle is the end point, at which data detection and acquisition stop. During the test, the vehicle speed is kept consistent between the with-route test and the no-route test; several sets of vehicle speed range standards can be set for corresponding simulation tests, and in this example the vehicle speeds include the speed before overtaking, the speed during overtaking, and the speed when the vehicle is 100 m past the large vehicle ahead. In the same way, steering wheel data, brake data, throttle data and scene duration data may be collected as the second test result data. The collected first test result data and second test result data may be uploaded to a database.
In step S14, the first test result data and the second test result data are compared, so as to determine an evaluation result of the track data to be evaluated according to the test result data comparison result.
In this example embodiment, after the first test result data of the simulated driving of the target test scene with the track line and the second test result data of the simulated driving without the track line are collected, the two sets of test result data may be compared parameter by parameter. Specifically: 1) the completion times of the simulated scene with and without the track line are compared, and the scene with the shorter duration is smoother, more efficient and more valuable; 2) the steering wheel, brake, throttle and other data are compared, and the safety and vehicle stability of the two sets of data are compared according to the specific scene.
For example, taking the complex route scene, namely the oversized vehicle scene, as an example, the buried point comparison includes the following contents, comparing the parameter information of simulated driving in the oversized vehicle scene with the route line displayed and without the route line displayed: a smaller steering wheel rotation angle indicates that driving in the scene is smoother and the safety coefficient is higher; a smaller number of brake presses indicates that driving in the scene is smoother and the safety coefficient is higher; a smaller number of throttle releases indicates that driving in the scene is smoother and the safety coefficient is higher; a shorter scene completion time indicates that driving in the scene is smoother and more efficient.
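As a hypothetical sketch, this metric-by-metric comparison could look as follows; the metric names are illustrative, and lower values are assumed to indicate smoother, safer driving, as stated above.

```python
# Hypothetical metric-by-metric comparison of the with-route (first) and
# no-route (second) test result data for the oversized vehicle scene.
# Lower values are taken to mean smoother, safer driving.

def compare_results(first, second,
                    metrics=("steering_angle_span", "brake_presses",
                             "throttle_releases", "completion_time_s")):
    summary = {}
    for m in metrics:
        if first[m] < second[m]:
            summary[m] = "with_route_better"
        elif first[m] > second[m]:
            summary[m] = "no_route_better"
        else:
            summary[m] = "equal"
    return summary
```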
In some exemplary embodiments, for the collected steering wheel rotation data, statistical analysis can be performed on the steering wheel rotation angle, the duration of the steering action and the lane changes, so as to determine whether, during overtaking and returning to the original lane, the steering angle follows a relatively stable trend of increasing (or decreasing) at a constant rate, holding for a period of time, and then decreasing (or increasing) at a constant rate. If the trend is relatively stable, the steering during overtaking is relatively smooth; if the trend is unstable, for example the steering angle is held until another lane is reached and then repeatedly corrected back and forth, the steering is less smooth, and the stability coefficient in the driving scene is considered to be lower.
For the acquired brake data, the distribution of the trigger positions of the brake-pressing actions over the whole driving scene can be judged, together with the count of brake presses in each stage of the driving process (the driving process can be divided into four stages: following behind the oversized vehicle, changing to the left fast lane, returning from the left lane to the original lane, and after the lane change is completed). For example, if the frequency of the behavior in the two lane-change stages is far higher than in the other driving stages, the stability and safety coefficient is judged to be lower; if the frequency in the two lane-change stages is similar to that in the other driving stages, the stability and safety coefficient is higher. A reasonable threshold for brake actions in each stage of the scene can also be preset; if the number of brake presses in half or more of the stages is greater than the preset reasonable threshold, the stability coefficient is lower, otherwise the trigger position distribution is judged in the above manner. Likewise, the number of throttle releases can be judged in this manner.
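The per-stage threshold rule described above could be sketched as follows; the stage names and thresholds are assumptions, and the same function could equally be applied to throttle-release counts.

```python
# Hypothetical per-stage analysis of brake (or throttle-release) events for
# the four driving stages described above. Stage names and thresholds are
# assumptions used only for illustration.

STAGES = ("behind_oversized_vehicle", "change_to_left_lane",
          "return_to_original_lane", "after_lane_change")

def stage_stability(event_counts, thresholds):
    """event_counts / thresholds: {stage_name: count} for the four stages.
    Returns False (lower stability) if half or more stages exceed their
    preset reasonable threshold, True otherwise."""
    exceeded = sum(event_counts.get(s, 0) > thresholds.get(s, 0) for s in STAGES)
    return exceeded < len(STAGES) / 2
```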
For the scene completion time, a comparison of the proportion of time required to complete each driving stage can be added. Specifically, the proportional relationship between the durations of the driving stages in an ideal overtaking state can be pre-configured; if the error between the actual stage proportions obtained in the test and the ideal proportional relationship is small, the stability and safety coefficient is higher, otherwise it is lower.
When the data comparison result shows that the number of indicators in which the with-route scene performs better is larger than the number in which the no-route scene performs better, the route line has display value, and it is determined that a route line should be configured for the target scene as the evaluation result; when the number of indicators favoring the with-route scene is less than or equal to the number favoring the no-route scene, the route line has no display value, and it is determined that no route line needs to be configured for the target scene as the evaluation result of the target test scene.
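A hypothetical sketch of this final decision rule follows; the verdict strings are assumed to come from a per-metric comparison such as the one sketched earlier.

```python
# Hypothetical final decision rule: count how many indicators favour the
# with-route run versus the no-route run and derive the evaluation result.

def decide_display_value(metric_verdicts):
    """metric_verdicts: {metric: 'with_route_better' | 'no_route_better' | 'equal'}"""
    with_route = sum(v == "with_route_better" for v in metric_verdicts.values())
    no_route = sum(v == "no_route_better" for v in metric_verdicts.values())
    if with_route > no_route:
        return "configure route line for the target scene"   # route has display value
    return "do not configure route line"                      # no display value

# Example:
# decide_display_value({"completion_time_s": "with_route_better",
#                       "brake_presses": "with_route_better",
#                       "steering_angle_span": "no_route_better"})
# -> "configure route line for the target scene"
```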
According to the track data testing method, corresponding track data to be evaluated are generated in advance for the target test scene, and simulated driving can be carried out under track guidance to obtain the first test result data of various driving operation data during driving; under the same conditions, the tester drives freely without route guidance to obtain the second test result data of various driving operation data during driving; and the two sets of test result data are compared, so that whether it is necessary to display the track data in the target test scene, and whether the track has display value, can be judged as the evaluation result.
In the present exemplary embodiment, a display decision method of track data is provided. Referring to fig. 6, the method may specifically include:
step S61, identifying the current scene type corresponding to the current path scene according to the navigation data and the environment information;
step S62, judging whether the current scene type belongs to the evaluated target scene type;
step S63, judging whether alarm type information exists when the current path scene is determined to belong to the target scene type;
step S64, when it is determined that the alarm type information does not exist and the current display content in the target display area meets the preset display condition, track data corresponding to the current path scene is displayed in the target display area.
In this exemplary embodiment, the method further includes: and when the current path scene is determined not to belong to the target scene type, not displaying track data corresponding to the current path scene.
In this exemplary embodiment, the method further includes: and when the alarm type information is determined to exist, not displaying the track data corresponding to the current path scene.
In this exemplary embodiment, the method further includes: and when the alarm type information is determined not to exist, if the priority of the current display content in the target display area is higher than the priority of the track data, not displaying the track data corresponding to the current path scene.
In this exemplary embodiment, the method further includes: the determining whether the current scene type belongs to the evaluated target scene type includes: comparing the current scene type with the evaluated target scene type, and determining an evaluation result of the matched scene type when the matched scene type exists; and when the corresponding evaluation result is displayable track data, determining that the current path scene belongs to the target scene type.
In this exemplary embodiment, the method further includes: the target scene type is evaluated in advance by using the track data testing method described in the above embodiment to obtain a corresponding scene evaluation result.
Specifically, when the driver uses navigation data, corresponding guidance information may be displayed in the HUD device of the vehicle, for example as shown in fig. 4 and 5, and track information may be displayed in the AR-HUD device to guide the route and assist driving. During navigation, whether the scene of the current path is a predefined and evaluated target scene type can be judged according to real-time position information and external environment information collected in real time, and the corresponding evaluation result is obtained. If the current path scene is an evaluated target scene type, for example a continuous lane-change scene under a complex route, and the corresponding evaluation result is that the track data should be displayed for guidance, it can first be judged whether alarm type information is triggered in the current scene. If alarm type information exists, the alarm prompt is displayed preferentially to ensure driving safety, and the track is not displayed at that time; if no alarm information currently exists, the next step of the flow is entered. For example, alarm class triggers include: vehicle distance warning, front and rear collision warning, pedestrian warning, blind area warning, lane departure warning, and the like.
When it is judged that no alarm information currently exists, the amount of content currently displayed in the display interface of the AR-HUD is judged. Specifically, it may be determined whether the number of items currently displayed in the target display area where the track line is to be displayed is less than three, and whether the displayed contents overlap. If fewer than 3 display elements are present and the design elements do not overlap, the navigation line is displayed in the target display area; otherwise, if the judgment condition is not satisfied, the route line is not displayed. On the display interface of the AR-HUD, the amount of information displayed at the same time needs to be controlled: important information is displayed preferentially to safeguard the user's driving, since otherwise the driver's burden and decision time increase and the risk of dangerous driving rises. The interface should conform to the information layout principle, and the number of design elements presented by the AR-HUD is preferably kept between 1 and 3; meanwhile, information overlap should be avoided, overlapping parts should be displayed according to priority, and the readability of the information should be ensured.
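For illustration only, the decision flow of steps S61 to S64 together with the display conditions above could be sketched as follows; the dictionary-based HUD state and the priority field are assumptions and not part of the disclosure.

```python
# Hypothetical sketch of the display decision: the track line is shown only
# if the current scene belongs to an evaluated target scene type whose result
# allows display, no alarm information is active, fewer than three items are
# currently displayed, nothing overlaps, and no higher-priority content
# occupies the target display area.

def should_display_track(current_scene_type, evaluated_scenes, hud_state):
    result = evaluated_scenes.get(current_scene_type)        # step S62
    if result != "displayable":
        return False
    if hud_state.get("alarm_active", False):                  # step S63
        return False
    shown = hud_state.get("displayed_items", [])              # step S64
    if len(shown) >= 3 or hud_state.get("items_overlap", False):
        return False
    track_priority = hud_state.get("track_priority", 1)
    if any(item.get("priority", 0) > track_priority for item in shown):
        return False
    return True
```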
For example, scenes suitable for displaying a track line may include: 1. guiding scenes of conventional routes, such as intersection guiding scenes and multi-lane guiding lane-change scenes; 2. guiding scenes of complex routes, such as continuous lane-change scenes, high-speed entrance scenes and oversized vehicle scenes; 3. special cases, such as guiding the optimal lane route when the current lane is congested, and guiding a new route after the wrong route is taken. Of course, in other exemplary embodiments of the present disclosure, a driving scenario may be customized by the user, and the display manner of the route line may also be customized.
According to the track data display decision method described above, multiple rounds of judgment are made on the scene type of the current path, whether alarm information exists, the amount of content displayed in the display area and its display positions, and the necessity of displaying the track, and whether to display the corresponding track is finally decided. This ensures that displaying the route line does not affect the display of other important information or interfere with driving.
It is noted that the above-described figures are only schematic illustrations of processes involved in a method according to an exemplary embodiment of the invention, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Further, referring to fig. 7, in this example embodiment, there is further provided a track data testing apparatus 70, which includes: a scene determination module 701, a first test data acquisition module 702, a second test data acquisition module 703, and an evaluation result generation module 704. Wherein,
the scenario determination module 701 may be configured to determine a target test scenario, and corresponding track data to be evaluated.
The first test data obtaining module 702 may be configured to execute the test scenario data corresponding to the target test scenario and display the track data to be evaluated, so as to obtain corresponding first test result data.
The second test data obtaining module 703 may be configured to execute the test scenario data corresponding to the target test scenario to obtain corresponding second test result data.
The evaluation result generation module 704 may be configured to compare the first test result data with the second test result data, so as to determine an evaluation result of the track data to be evaluated according to the test result data comparison result.
Further, referring to fig. 8, in this exemplary embodiment, there is further provided a display decision system 80 for track data, the system including: a current scene type identification module 801, a scene type comparison module 802, an alarm information identification module 803 and a current display content identification module 804. Wherein,
the current scene type identification module 801 may be configured to identify a current scene type corresponding to a current path scene according to navigation data and environment information.
The scene type comparison module 802 may be configured to determine whether the current scene type belongs to an evaluated target scene type.
The alarm information identifying module 803 may be configured to determine whether alarm type information exists when it is determined that the current path scene belongs to the target scene type.
The current display content identification module 804 may be configured to display track data corresponding to the current path scene in the target display area when it is determined that no alarm type information exists and the current display content in the target display area meets a preset display condition.
The specific details of each module in the above-mentioned track data testing device 70 and the track data display decision system 80 are described in detail in the corresponding track data testing method and track data display decision method, so that they will not be described in detail herein.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Fig. 9 shows a schematic diagram of an electronic device suitable for use in implementing embodiments of the invention.
It should be noted that the electronic device 1000 shown in fig. 9 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present disclosure.
As shown in fig. 9, the electronic apparatus 1000 includes a central processing unit (Central Processing Unit, CPU) 1001 that can perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 1002 or a program loaded from a storage section 1008 into a random access Memory (Random Access Memory, RAM) 1003. In the RAM 1003, various programs and data required for system operation are also stored. The CPU 1001, ROM 1002, and RAM 1003 are connected to each other by a bus 1004. An Input/Output (I/O) interface 1005 is also connected to bus 1004.
The following components are connected to the I/O interface 1005: an input section 1006 including a keyboard, a mouse, and the like; an output portion 1007 including a Cathode Ray Tube (CRT), a liquid crystal display (Liquid Crystal Display, LCD), and a speaker; a storage portion 1008 including a hard disk or the like; and a communication section 1009 including a network interface card such as a LAN (Local Area Network ) card, a modem, or the like. The communication section 1009 performs communication processing via a network such as the internet. The drive 1010 is also connected to the I/O interface 1005 as needed. A removable medium 1011, such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, is installed on the drive 1010 as needed, so that a computer program read out therefrom is installed into the storage section 1008 as needed.
In particular, according to embodiments of the present invention, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present invention include a computer program product comprising a computer program loaded on a storage medium, the computer program comprising program code for performing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 1009, and/or installed from the removable medium 1011. When executed by a Central Processing Unit (CPU) 1001, the computer program performs various functions defined in the system of the present application.
Specifically, the electronic device may be an intelligent mobile electronic device such as a mobile phone, a tablet computer or a notebook computer. Alternatively, the electronic device may be an intelligent electronic device such as a desktop computer. Alternatively, the electronic device may be a display device of the HUD.
It should be noted that, the storage medium shown in the embodiments of the present invention may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (Erasable Programmable Read Only Memory, EPROM), flash Memory, an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any storage medium that is not a computer readable storage medium and that can transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present invention may be implemented by software or by hardware, and the described units may also be provided in a processor. In some cases, the names of the units do not constitute a limitation of the units themselves.
It should be noted that, as another aspect, the present application further provides a storage medium, which may be included in an electronic device or may exist alone without being incorporated into the electronic device. The storage medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the methods described in the above embodiments. For example, the electronic device may implement the various steps shown in fig. 1 or fig. 6.
Furthermore, the above-described drawings are only schematic illustrations of processes included in the method according to the exemplary embodiment of the present invention, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (13)

1. A track data testing method, the method comprising:
determining a target test scene and corresponding track data to be evaluated;
executing the test scene data corresponding to the target test scene and displaying the track data to be evaluated, so as to acquire corresponding first test result data;
executing the test scene data corresponding to the target test scene to obtain corresponding second test result data;
and comparing the first test result data with the second test result data to determine an evaluation result of the track data to be evaluated according to the test result data comparison result.
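For readers implementing the comparison step of claim 1 above, the logic can be pictured as comparing the metrics collected with the displayed track data against the metrics from the baseline run. The Python sketch below is illustrative only and is not the claimed implementation; the metric names (lane_deviation, reaction_time), the per-run list format, and the tolerance threshold are the editor's assumptions.

```python
# Illustrative sketch of the comparison step (hypothetical metric names and data format).
from statistics import mean

def evaluate_track_data(first_results, second_results, tolerance=0.05):
    """Compare the run with displayed track data (first_results) against the
    baseline run without it (second_results); both map metric -> list of values."""
    evaluation = {}
    for metric, with_display in first_results.items():
        baseline = second_results.get(metric)
        if baseline is None:
            continue  # metric was not collected in the baseline run
        delta = mean(with_display) - mean(baseline)
        evaluation[metric] = {
            "with_display": mean(with_display),
            "baseline": mean(baseline),
            "delta": delta,
            # assumption: lower deviation / reaction time means the display helped
            "improved": delta < -tolerance,
        }
    return evaluation

# Example: lane-keeping deviation (m) and reaction time (s) per simulated run
first = {"lane_deviation": [0.31, 0.28, 0.30], "reaction_time": [0.82, 0.79, 0.85]}
second = {"lane_deviation": [0.42, 0.40, 0.44], "reaction_time": [1.05, 0.98, 1.10]}
print(evaluate_track_data(first, second))
```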
2. The method according to claim 1, wherein executing the test scene data corresponding to the target test scene and displaying the track data to be evaluated to acquire the corresponding first test result data comprises:
executing the test scene data corresponding to the target test scene in a simulation system, so that a tester carries out a simulated driving operation and completes a simulated driving task according to the displayed track data to be evaluated; and
acquiring operation data of a target type during the simulated driving operation by using a preset data acquisition task, so as to acquire the first test result data.
3. The method according to claim 1, wherein executing the test scene data corresponding to the target test scene to obtain the corresponding second test result data comprises:
executing the test scene data corresponding to the target test scene in a simulation system, so that a tester carries out a simulated driving operation and completes a simulated driving task; and
acquiring operation data of a target type during the simulated driving operation by using a preset data acquisition task, so as to obtain the second test result data.
4. The track data testing method according to claim 1, wherein the method further comprises:
judging whether the target scene meets a track data display strategy for displaying corresponding track data or not;
when the target scene is determined to meet the track data display strategy, configuring the target scene as a scene to be tested;
configuring test scene data and track data according to scene content of a scene to be tested; and
configuring a corresponding scene test task for the scene to be tested, so as to execute the scene test task and obtain a corresponding test result;
wherein the scene test tasks comprise a test task corresponding to a single vehicle speed value and a test task corresponding to a preset vehicle speed range.
5. The track data testing method according to claim 4, wherein configuring the test scene data and the track data according to the scene content of the scene to be tested comprises:
planning a path according to the scene content of the scene to be tested;
configuring corresponding track data based on a path planning result; and
and configuring a trigger position and an end position of the track data in the scene to be tested.
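Claims 4 and 5 above describe building scene test tasks (one per single vehicle speed value plus tasks over a preset speed range) together with trigger and end positions taken from the planned path. A minimal sketch under those assumptions follows; the waypoint format, the speed step, and the field names are hypothetical and only illustrate the configuration idea.

```python
# Sketch of scene test task configuration (hypothetical names and values).
def build_scene_tests(planned_path, single_speed_kph, speed_range_kph, step_kph=10):
    trigger_position = planned_path[0]     # where track data starts being shown
    end_position = planned_path[-1]        # where it stops
    speeds = [single_speed_kph] + list(
        range(speed_range_kph[0], speed_range_kph[1] + 1, step_kph)
    )
    return [
        {
            "speed_kph": v,
            "trigger_position": trigger_position,
            "end_position": end_position,
            "track_points": planned_path,
        }
        for v in speeds
    ]

path = [(0.0, 0.0), (50.0, 0.5), (120.0, 4.0)]   # hypothetical planned waypoints
tasks = build_scene_tests(path, single_speed_kph=60, speed_range_kph=(30, 80))
print(len(tasks), "scene test tasks configured")
```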
6. A display decision method of track data, the method comprising:
identifying a current scene type corresponding to the current path scene according to the navigation data and the environment information;
judging whether the current scene type belongs to an evaluated target scene type or not;
judging whether alarm type information exists when the current path scene is determined to belong to the target scene type;
and when the alarm type information is determined to be absent and the current display content in the target display area meets the preset display condition, displaying track data corresponding to the current path scene in the target display area.
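The decision flow of claim 6 (together with the negative branches of claims 7 to 9 below) reduces to a short sequence of checks. The sketch below is illustrative only; in particular, the assumption that a larger number means higher display priority is the editor's, not the specification's.

```python
# Sketch of the display decision flow (hypothetical inputs and priority convention).
def decide_display(current_scene_type, evaluated_types, alarm_active,
                   current_content_priority, track_priority):
    if current_scene_type not in evaluated_types:
        return False          # scene type was not evaluated as displayable (claim 7)
    if alarm_active:
        return False          # alarm type information takes precedence (claim 8)
    if current_content_priority > track_priority:
        return False          # current display content outranks the track data (claim 9)
    return True               # display track data in the target display area

print(decide_display("highway_ramp", {"highway_ramp", "turn_left"},
                     alarm_active=False,
                     current_content_priority=1, track_priority=2))
```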
7. The display decision method of track data according to claim 6, further comprising:
and when the current path scene is determined not to belong to the target scene type, not displaying track data corresponding to the current path scene.
8. The display decision method of track data according to claim 6, further comprising:
and when the alarm type information is determined to exist, not displaying the track data corresponding to the current path scene.
9. The display decision method of track data according to claim 6, further comprising:
and when the alarm type information is determined not to exist, if the priority of the current display content in the target display area is higher than the priority of the track data, not displaying the track data corresponding to the current path scene.
10. The display decision method of track data according to claim 6, wherein judging whether the current scene type belongs to an evaluated target scene type comprises:
comparing the current scene type with the evaluated target scene type, and determining an evaluation result of the matched scene type when the matched scene type exists;
and when the corresponding evaluation result indicates that the track data can be displayed, determining that the current path scene belongs to the target scene type.
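The matching step of claim 10 amounts to looking the current scene type up in a table of previously evaluated target scene types. The table structure and result labels in the sketch below are hypothetical; only the lookup-and-check pattern is taken from the claim.

```python
# Sketch of matching the current scene type against evaluated target scene types.
EVALUATED_SCENES = {
    "highway_ramp": "displayable",         # evaluation result from the test method
    "urban_intersection": "not_displayable",
}

def belongs_to_target_scene_type(current_scene_type):
    result = EVALUATED_SCENES.get(current_scene_type)
    return result == "displayable"

print(belongs_to_target_scene_type("highway_ramp"))        # True
print(belongs_to_target_scene_type("urban_intersection"))  # False
```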
11. The display decision method of track data according to claim 10, further comprising:
evaluating the target scene type in advance by using the track data testing method according to any one of claims 1 to 5, so as to obtain a corresponding scene evaluation result.
12. A storage medium having stored thereon a computer program which, when executed by a processor, implements the track data testing method of any one of claims 1 to 5 and/or the display decision method of track data of any one of claims 6 to 11.
13. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the track data testing method of any one of claims 1 to 5 and/or the display decision method of track data of any one of claims 6 to 11 via execution of the executable instructions.
CN202310445419.7A 2023-04-24 2023-04-24 Track data testing method, display decision method, storage medium and electronic equipment Active CN116150040B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310445419.7A CN116150040B (en) 2023-04-24 2023-04-24 Track data testing method, display decision method, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN116150040A true CN116150040A (en) 2023-05-23
CN116150040B CN116150040B (en) 2023-07-14

Family

ID=86352910

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310445419.7A Active CN116150040B (en) 2023-04-24 2023-04-24 Track data testing method, display decision method, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN116150040B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111459132A (en) * 2020-03-12 2020-07-28 武汉理工大学 Evaluation method and system for navigation function of ship
CN112307566A (en) * 2020-11-12 2021-02-02 安徽江淮汽车集团股份有限公司 Vehicle simulation test method, device, equipment and storage medium
WO2022151627A1 (en) * 2021-01-18 2022-07-21 广东纳睿雷达科技股份有限公司 Flight track initiation method and system based on target velocity characteristics
CN114415628A (en) * 2021-12-28 2022-04-29 阿波罗智联(北京)科技有限公司 Automatic driving test method and device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
畅言; 郗蕴天; 罗利强: "A New Type of Data Fusion System" (一种新型的数据融合系统), Information & Computer (Theory Edition), no. 07 *

Also Published As

Publication number Publication date
CN116150040B (en) 2023-07-14

Similar Documents

Publication Publication Date Title
CN109520744B (en) Driving performance testing method and device for automatic driving vehicle
CN113642633B (en) Method, device, equipment and medium for classifying driving scene data
Feng et al. Safety assessment of highly automated driving systems in test tracks: A new framework
CN109919347B (en) Road condition generation method, related device and equipment
CN107063711B (en) Method and apparatus for testing unmanned vehicles
Essa et al. Simulated traffic conflicts: do they accurately represent field-measured conflicts?
CN110562258B (en) Method for vehicle automatic lane change decision, vehicle-mounted equipment and storage medium
Nguyen et al. Traffic conflict assessment for non-lane-based movements of motorcycles under congested conditions
CN112819968B (en) Test method and device for automatic driving vehicle based on mixed reality
CN113535569B (en) Control effect determination method for automatic driving
CN111477028B (en) Method and device for generating information in automatic driving
CN114428998A (en) Integrated simulation test and evaluation method and system for automatic driving system
King et al. A taxonomy and survey on validation approaches for automated driving systems
CN112671487B (en) Vehicle testing method, server and testing vehicle
CN113918615A (en) Simulation-based driving experience data mining model construction method and system
CN112816226A (en) Automatic driving test system and method based on controllable traffic flow
CN113867367B (en) Processing method and device for test scene and computer program product
CN117075350B (en) Driving interaction information display method and device, storage medium and electronic equipment
Wang et al. Transferability analysis of the freeway continuous speed model
CN116150040B (en) Track data testing method, display decision method, storage medium and electronic equipment
CN116088538B (en) Vehicle track information generation method, device, equipment and computer readable medium
Twaddle et al. Integration of an external bicycle model in SUMO
CN116847401B (en) Internet of vehicles testing method, device and readable storage medium
CN118296813A (en) Traffic simulation application method, device, equipment and storage medium
Biemelt et al. Subjective evaluation of filter-and optimization-based motion cueing algorithms for a hybrid kinematics driving simulator

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant