CN116773216A - Test method, device, system, storage medium and electronic equipment - Google Patents


Info

Publication number
CN116773216A
CN116773216A (application number CN202310696914.5A)
Authority
CN
China
Prior art date
Legal status
Granted
Application number
CN202310696914.5A
Other languages
Chinese (zh)
Other versions
CN116773216B (en)
Inventor
张涛
李凯
韩雨青
Current Assignee
Jiangsu Zejing Automobile Electronic Co ltd
Original Assignee
Jiangsu Zejing Automobile Electronic Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Zejing Automobile Electronic Co ltd filed Critical Jiangsu Zejing Automobile Electronic Co ltd
Priority to CN202310696914.5A priority Critical patent/CN116773216B/en
Publication of CN116773216A publication Critical patent/CN116773216A/en
Application granted granted Critical
Publication of CN116773216B publication Critical patent/CN116773216B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00 Testing of vehicles
    • G01M17/007 Wheeled or endless-tracked vehicles
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0808 Diagnosing performance data
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 Registering performance data
    • G07C5/085 Registering performance data using electronic data carriers

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • Navigation (AREA)

Abstract

The disclosure provides a test method, apparatus, system, storage medium and electronic device, relating to the field of computer technology. The test method includes: in response to an initial configuration operation on a simulated vehicle, determining a target video from a pre-recorded video library, where the initial configuration operation includes configuring the simulated vehicle's initial longitude and latitude, and the videos in the library were recorded while driving in real scenes; determining, from the target video, the target video frame corresponding to the simulated vehicle's initial longitude and latitude; controlling the target video to start playing from that frame; and controlling the playing progress of the target video in response to control operations on the simulated vehicle. The disclosure can reduce the cost of vehicle function testing and improve testing flexibility.

Description

Test method, device, system, storage medium and electronic equipment
Technical Field
The disclosure relates to the field of computer technology, and in particular to a test method, a test apparatus, a test system, a storage medium and an electronic device.
Background
In testing vehicle-related functions, a drive test with a real vehicle is often required. For example, for the Head-Up Display (HUD) function, some test methods evaluate the quality of the HUD by combining a real vehicle with the virtual content presented while that vehicle is driven on the road.
However, these approaches depend on real-vehicle testing, which is costly and constrained by the actual driving route, and therefore offers poor flexibility.
Disclosure of Invention
The disclosure aims to provide a test method, apparatus, system, storage medium and electronic device that at least partly address the high cost and poor flexibility of vehicle function testing.
According to a first aspect of the present disclosure, there is provided a test method comprising: in response to an initial configuration operation on a simulated vehicle, determining a target video from a pre-recorded video library, where the initial configuration operation includes configuring the simulated vehicle's initial longitude and latitude, and the videos in the library were recorded while driving in real scenes; determining, from the target video, the target video frame corresponding to the simulated vehicle's initial longitude and latitude; controlling the target video to start playing from the target video frame; and controlling the playing progress of the target video in response to control operations on the simulated vehicle.
Optionally, determining the target video from the pre-recorded video library in response to the initial configuration operation includes: determining the simulated vehicle's initial longitude and latitude in response to the initial longitude-and-latitude configuration operation; and determining the target video from the pre-recorded video library based on that initial longitude and latitude.
Optionally, the initial configuration operation further includes a navigation configuration operation for the simulated vehicle. In that case, determining the target video based on the initial longitude and latitude includes: determining a navigation route, in response to the navigation configuration operation, based on the simulated vehicle's initial longitude and latitude; and determining the target video from the pre-recorded video library according to the navigation route.
Optionally, determining the target video based on the initial longitude and latitude includes: determining the simulated vehicle's initial driving direction in response to the initial configuration operation; and determining the target video from the pre-recorded video library based on both the initial longitude and latitude and the initial driving direction.
Optionally, the longitude and latitude corresponding to each video frame of the target video are recorded when that frame is captured. In that case, determining the target video frame corresponding to the simulated vehicle's initial longitude and latitude includes: matching the initial longitude and latitude against the longitude and latitude recorded for each frame of the target video; and selecting the best-matching frame as the target video frame.
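The matching described above can be sketched as a nearest-neighbor search over the per-frame coordinates. This is an illustrative sketch, not the patented implementation; the haversine distance and the function names are assumptions:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def match_target_frame(init_lat, init_lon, frame_coords):
    """frame_coords: list of (frame_number, lat, lon) for the target video.
    Returns the frame number whose recorded position lies closest to the
    simulated vehicle's configured initial longitude and latitude."""
    best_frame, best_dist = None, float("inf")
    for frame_no, lat, lon in frame_coords:
        d = haversine_m(init_lat, init_lon, lat, lon)
        if d < best_dist:
            best_frame, best_dist = frame_no, d
    return best_frame
```

A linear scan is adequate for a single video; a spatial index would be the natural optimization for large libraries.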
Optionally, controlling the playing progress of the target video in response to control operations on the simulated vehicle includes: determining the simulated vehicle's speed from the control operation; and determining the target video's playback speed from the simulated vehicle's speed.
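One plausible mapping from simulated speed to playback speed (a sketch only: the ratio-based mapping and the clamping bounds are assumptions, not stated in the disclosure) is to scale playback by the ratio of the simulated speed to the speed recorded in the track data:

```python
def playback_rate(sim_speed_mps, recorded_speed_mps, min_rate=0.0, max_rate=4.0):
    """Map the simulated vehicle's speed onto a video playback rate.
    The recorded speed comes from the track data captured alongside each
    frame; playing at (simulated / recorded)x keeps the apparent motion
    of the scene consistent with the simulated speed. The rate is clamped
    to a plausible range so extreme inputs cannot stall or race playback."""
    if recorded_speed_mps <= 0:
        # vehicle was stationary when this frame was recorded
        return min_rate if sim_speed_mps <= 0 else max_rate
    rate = sim_speed_mps / recorded_speed_mps
    return max(min_rate, min(max_rate, rate))
```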
Optionally, the test method further comprises: acquiring HUD data while the target video is playing, and sending the HUD data to the HUD bench for HUD display during playback.
Optionally, the test method further comprises: if the simulated vehicle deviates from the route corresponding to the target video, determining the current longitude and latitude; and, when a next video can be determined from the pre-recorded video library based on the current longitude and latitude, controlling the next video to start playing from the frame corresponding to the current longitude and latitude and controlling its playing progress in response to control operations on the simulated vehicle.
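The deviate-and-switch behavior might be sketched as follows. The 30-meter threshold, the equirectangular distance approximation, and all names are illustrative assumptions, not details taken from the disclosure:

```python
import math

def approx_dist_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation; adequate for the short distances
    involved in a route-deviation check."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return 6371000.0 * math.hypot(x, y)

def find_next_video(cur_lat, cur_lon, video_library, threshold_m=30.0):
    """video_library: {video_id: [(lat, lon), ...]} recorded track points.
    Returns (video_id, index of nearest track point) for the first video
    whose track passes within threshold_m of the current position, or
    None if no recorded route covers the simulated vehicle's location."""
    for vid, points in video_library.items():
        dists = [approx_dist_m(cur_lat, cur_lon, la, lo) for la, lo in points]
        i = min(range(len(dists)), key=dists.__getitem__)
        if dists[i] <= threshold_m:
            return vid, i
    return None
```

The returned track-point index identifies the frame from which the next video would start playing.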
According to a second aspect of the present disclosure, there is provided a test apparatus comprising a video determination module, a frame determination module, and a test control module.
Specifically, the video determination module is configured to determine, in response to an initial configuration operation on a simulated vehicle, a target video from a pre-recorded video library, where the initial configuration operation includes configuring the simulated vehicle's initial longitude and latitude and the videos in the library were recorded while driving in real scenes; the frame determination module is configured to determine, from the target video, the target video frame corresponding to the simulated vehicle's initial longitude and latitude; and the test control module is configured to control the target video to start playing from the target video frame and to control its playing progress in response to control operations on the simulated vehicle.
Optionally, the video determination module is configured to: determine the simulated vehicle's initial longitude and latitude in response to the initial longitude-and-latitude configuration operation; and determine the target video from the pre-recorded video library based on that initial longitude and latitude.
Optionally, the initial configuration operation further includes a navigation configuration operation for the simulated vehicle; in this case, determining the target video based on the initial longitude and latitude includes: determining a navigation route, in response to the navigation configuration operation, based on the simulated vehicle's initial longitude and latitude; and determining the target video from the pre-recorded video library according to the navigation route.
Optionally, when determining the target video based on the simulated vehicle's initial longitude and latitude, the video determination module further: determines the simulated vehicle's initial driving direction in response to the initial configuration operation; and determines the target video from the pre-recorded video library based on both the initial longitude and latitude and the initial driving direction.
Optionally, the longitude and latitude corresponding to each video frame of the target video are recorded when that frame is captured; in this case the frame determination module may be configured to: match the simulated vehicle's initial longitude and latitude against the longitude and latitude recorded for each frame of the target video; and select the best-matching frame as the target video frame.
Optionally, the test control module is configured to: determine the simulated vehicle's speed from the control operation on the simulated vehicle; and determine the target video's playback speed from the simulated vehicle's speed.
Optionally, the test control module is further configured to: acquire HUD data while the target video is playing, and send the HUD data to the HUD bench for HUD display during playback.
Optionally, the test control module is further configured to: if the simulated vehicle deviates from the route corresponding to the target video, determine the current longitude and latitude; and, when a next video can be determined from the pre-recorded video library based on the current longitude and latitude, control the next video to start playing from the frame corresponding to the current longitude and latitude and control its playing progress in response to control operations on the simulated vehicle.
According to a third aspect of the present disclosure, there is provided a test system including a test device and a display end.
Specifically, the test device is configured to: determine, in response to an initial configuration operation on a simulated vehicle, a target video from a pre-recorded video library, where the initial configuration operation includes configuring the simulated vehicle's initial longitude and latitude and the videos in the library were recorded while driving in real scenes; determine, from the target video, the target video frame corresponding to the simulated vehicle's initial longitude and latitude; control the target video to start playing from that frame; and control the target video's playing progress in response to control operations on the simulated vehicle. The display end is configured to display the target video played under the test device's control.
Optionally, the test device is further configured to acquire HUD data while the target video is playing. In this case, the test system further includes a HUD bench for receiving the HUD data and performing HUD display based on it during playback of the target video.
Optionally, the test system further comprises: the video recording equipment is used for recording videos in advance in the driving process of different road sections under the real scene so as to generate a video library.
Optionally, the video recording device is further configured to collect track data when recording the video; the track data comprises longitude and latitude corresponding to the recorded video frames.
According to a fourth aspect of the present disclosure, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described test method.
According to a fifth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; the processor is configured to implement the above-described test method via execution of executable instructions.
In the technical solutions provided by some embodiments of the present disclosure, a simulated vehicle is configured and "driven" within a pre-recorded video, and the video's playing progress is controlled in response to control operations on the simulated vehicle. First, because the test targets a simulated vehicle rather than a real one, the test cost is low. Second, because the simulated driving takes place inside pre-recorded video, the test scene corresponds to real, pre-recorded road conditions; compared with purely virtual scenes rendered as, for example, animations, this provides a realistic test environment, and different pre-recorded road conditions can be matched to different types of tests, making the test environment selectable and the choice of scene flexible. Third, the video is not played back on its ordinary timeline; instead, its playing progress is bound to, and driven by, the control operations on the simulated vehicle, which further increases the flexibility of vehicle function testing.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
Fig. 1 schematically illustrates a system architecture diagram of a test system of an embodiment of the present disclosure.
Fig. 2 schematically illustrates a system architecture diagram of a test system of another embodiment of the present disclosure.
Fig. 3 schematically shows a block diagram of a configuration of a test apparatus according to an embodiment of the present disclosure.
Fig. 4 schematically illustrates a block diagram of a simulated driving module of an embodiment of the disclosure.
Fig. 5 schematically illustrates a system architecture diagram of a test system of a further embodiment of the present disclosure.
Fig. 6 schematically shows a block diagram of a video recording apparatus of an embodiment of the present disclosure.
Fig. 7 shows a schematic configuration diagram of a video correspondence file in a video library according to an embodiment of the present disclosure.
Fig. 8 schematically shows a flow chart of a test method of an exemplary embodiment of the present disclosure.
Fig. 9 shows a schematic diagram of a simulated vehicle deployed on a map.
Fig. 10 shows a schematic diagram of a display interface for video playback.
Fig. 11 schematically shows a flow chart of a test procedure of an embodiment of the present disclosure.
Fig. 12 schematically illustrates a block diagram of a testing apparatus according to an exemplary embodiment of the present disclosure.
Fig. 13 schematically illustrates a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only and not necessarily all steps are included. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
The test scheme of the embodiment of the disclosure can be applied to a scene of testing HUD functions, and can also be applied to other test scenes related to vehicle driving, for example, testing acceleration performance, braking performance, automatic avoidance function and/or other intelligent driving functions of a vehicle, and the application scene of the test scheme is not limited by the disclosure.
The display effect of the HUD can be tested in the HUD-function test scenario. The tested content may be divided into AR (Augmented Reality) content and non-AR content. Testing of AR content may, for example, cover the navigation guide line, such as how closely it fits the ground (a "ground feel" test), and/or the accuracy with which POI (Point of Interest) locations are identified. Testing of non-AR content may cover, for example, vehicle speed, speed limit, remaining navigation distance, turn type, and so on.
Fig. 1 schematically illustrates a system architecture diagram of a test system of an embodiment of the present disclosure. Referring to fig. 1, a test system 1 of an embodiment of the present disclosure may include a testing device 11 and a display end 12. The testing device 11 may be disposed in an electronic apparatus such as a PC (Personal Computer) or a server, and the display end 12 may be, for example, a projector or an electronic screen. The testing device 11 may be connected to the display end 12 in a wired or wireless manner; the disclosure does not limit the connection method.
Specifically, the testing device 11 may be configured to: determine, in response to an initial configuration operation on a simulated vehicle, a target video from a pre-recorded video library, where the initial configuration operation includes configuring the simulated vehicle's initial longitude and latitude and the videos in the library were recorded while driving in real scenes; determine, from the target video, the target video frame corresponding to the simulated vehicle's initial longitude and latitude; control the target video to start playing from the target video frame; and control the target video's playing progress in response to control operations on the simulated vehicle. The display end 12 is used to display the target video played under the testing device's control.
In scenarios where the present disclosure is applied to testing a HUD, the test system of the embodiments of the present disclosure may also include a HUD bench.
Referring to fig. 2, a test system 2 of another embodiment of the present disclosure may include a testing device 21, a display end 22, and a HUD bench 23. The operations performed by the testing device 21 and the display end 22 include those performed by the testing device 11 and the display end 12. In addition, the testing device 21 may be connected to the HUD bench 23 by wired or wireless means, communicating, for example, via the gRPC framework.
Furthermore, the testing device 21 may be configured to acquire HUD data while the target video is playing. The HUD bench 23 may receive the HUD data sent by the testing device 21 and perform HUD display based on it while the target video plays on the display end 22. The displayed content includes one or more of the navigation guide line, speed limit, remaining navigation distance, turn type, and POI information.
Fig. 3 schematically shows a block diagram of the testing device 21 of the embodiment of the present disclosure. Referring to fig. 3, the testing device 21 may include a simulated driving module 301, a video playing module 302, and a HUD communication module 303.
The simulated driving module 301 may be used to provide map and POI information, where the map's roles include the initial presentation used for the simulated vehicle's initial configuration operation. The simulated driving module 301 may acquire the geographic position of any place, control the simulated vehicle to advance or reverse, and set the start and end points of navigation. The data it generates includes one or more of the longitude and latitude, direction angle, speed-limit information, curve information, lane information, and POI information within a predetermined range (e.g., 1,000 meters) around the simulated vehicle, all of which can typically be displayed by the HUD.
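The data items listed above could be grouped into a record like the following; the field names and types are illustrative assumptions, not the patent's actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PoiInfo:
    name: str
    poi_type: str   # e.g. "gas_station", "hospital"
    lat: float
    lon: float

@dataclass
class HudData:
    """One sample of the simulated-driving data the module can emit for
    HUD display; optional fields are absent when not applicable."""
    lat: float
    lon: float
    heading_deg: float                   # direction angle
    speed_kph: float
    speed_limit_kph: Optional[int] = None
    turn_type: Optional[str] = None      # e.g. "left", "right", "u_turn"
    lane_count: Optional[int] = None
    pois: List[PoiInfo] = field(default_factory=list)
```

A record of this shape is what the HUD communication module would serialize and send to the bench.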
The video playing module 302 may be configured to perform a process of determining a target video frame, and control the display 22 to play the target video from the target video frame.
The HUD communication module 303 may be configured to send HUD data generated during testing to the HUD bench 23 for visual display, including AR content and/or non-AR content.
Fig. 4 schematically shows a block diagram of the simulated driving module 301 of an embodiment of the present disclosure. Referring to fig. 4, the simulated driving module 301 may include a map engine unit 401, a vehicle control unit 402, and a simulated navigation unit 403.
The map engine unit 401 may be used to display a map and perform basic conversion operations. The map may be a tile map sourced from third-party map or navigation software, or self-developed and rendered; the disclosure does not limit the map's form or source. The map engine unit 401 may also provide a POI search function that finds POIs of a specific type (e.g., gas station, supermarket, restaurant, mall, hospital) within a predetermined range, returning at least the POI name, type, and longitude and latitude, so that the HUD can annotate them by means of AR.
The vehicle control unit 402 may implement control of the simulated vehicle, including but not limited to control of the simulated vehicle speed, direction angle, etc. The simulation data generated during driving can be transmitted to the HUD for HUD display.
The simulated navigation unit 403 may provide a function of setting a start point and an end point, and perform route planning and navigation.
Fig. 5 schematically illustrates a system architecture diagram of a test system of a further embodiment of the present disclosure. Referring to fig. 5, the test system 5 may include a video recording device 50, a testing device 51, a display end 52, and a HUD bench 53. The testing device 51, display end 52, and HUD bench 53 correspond to the testing device 21, display end 22, and HUD bench 23, respectively, and are not described again.
The video recording device 50 may be used to record video in advance during a real-vehicle drive test, with the shooting position at or near the driver's eye point. By recording video in advance while driving over different road segments in real scenes, the video recording device 50 can generate the video library of the embodiments of the present disclosure.
In addition, the video recording device 50 may also collect track data while recording video. The track data includes the vehicle's longitude and latitude, speed, direction angle, turn type, and so on. In this disclosure, video is recorded at the same frequency (for example, 50 Hz) at which track data is collected, and the two are captured simultaneously; each recorded video frame therefore corresponds to one track-data sample.
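Because the two streams share one capture frequency, pairing frames with track samples reduces to index alignment. A sketch follows; the generalization to unequal rates is an assumption beyond what the disclosure states:

```python
def pair_frames_with_track(num_frames, track_samples, video_hz=50, track_hz=50):
    """track_samples: list of track records captured at track_hz.
    Returns, for each frame index, the track record captured at (nearly)
    the same instant. With equal frequencies this is a 1:1 pairing; with
    unequal rates the nearest-in-time sample is chosen."""
    pairs = []
    for i in range(num_frames):
        j = min(round(i * track_hz / video_hz), len(track_samples) - 1)
        pairs.append((i, track_samples[j]))
    return pairs
```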
Fig. 6 schematically illustrates a block diagram of a video recording device 50 of an embodiment of the present disclosure. Referring to fig. 6, the video recording device 50 may include a video encoding unit 601 and a track acquisition unit 602.
The video encoding unit 601 may be used to control the video recording process. The track acquisition unit 602 may be configured to acquire track data during video recording, recording frame numbers, timestamps, longitude and latitude, navigation data, and the like. As an alternative implementation, each frame may be video-encoded in H.264 or H.265 format, with the frame number, timestamp, etc. of each frame recorded in the file package at the same time; the vehicle's high-precision positioning coordinates may be obtained using RTK (Real-Time Kinematic) or PPK (Post-Processed Kinematic) techniques, which the disclosure does not limit.
The package format employed by the present disclosure is shown in fig. 7. Referring to fig. 7, in addition to the real-vehicle video and track data, the file package includes metadata containing description information such as the file version, video duration, positioning mode, navigation route, acquisition frequency, and along-the-way track points. Using this data, the test device 51 can select matching videos from the video library at test time.
In fig. 7, the real-vehicle video is the recorded original video. The track data may be organized as key-value pairs, where the key is a position coordinate (longitude and latitude) and the value may include the corresponding frame number, timestamp, vehicle speed, direction angle, and so on.
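A minimal in-memory model of this key-value track data might look as follows; the exact on-disk layout of a ZJM file is not specified here, so the coordinates and field names are illustrative only:

```python
# key: (latitude, longitude)  ->  value: per-frame record
track_data = {
    (31.23041, 121.47370): {"frame": 0, "ts_ms": 0,  "speed_kph": 42.0, "heading_deg": 88.5},
    (31.23043, 121.47392): {"frame": 1, "ts_ms": 20, "speed_kph": 42.1, "heading_deg": 88.6},
}

def frame_for_position(track, lat, lon):
    """Exact-key lookup; a real implementation would use a
    nearest-neighbor search over the coordinates instead."""
    rec = track.get((lat, lon))
    return None if rec is None else rec["frame"]
```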
For the configuration form of the package as shown in fig. 7, the present disclosure names it as a ZJM file.
The test method of the embodiment of the present disclosure will be described below.
It will be appreciated that, before testing, a test system as described above needs to be built, comprising at least a test device (configured in an electronic apparatus such as a PC) and a display end (such as a projector or electronic screen). The steps of the test method of the present disclosure may be performed by the test device; equivalently, the steps described below may be performed by the electronic apparatus in which the test device is configured.
Fig. 8 schematically shows a flow chart of a test method of an exemplary embodiment of the present disclosure.
Referring to fig. 8, the test method may include the steps of:
S82, in response to an initial configuration operation of the simulated vehicle, determining a target video from a pre-recorded video library; the initial configuration operation of the simulated vehicle includes an initial longitude and latitude configuration operation of the simulated vehicle, and the videos contained in the video library are recorded during driving in a real scene.
According to some embodiments of the present disclosure, the initial configuration operation of the simulated vehicle may include an operation of placing the simulated vehicle on a map. Specifically, the display end may first display a map, which may be a tile map from third-party map or navigation software, which is not limited in this disclosure. Next, the initial configuration operation of the simulated vehicle may be performed based on the map, for example by means of a handle, a keyboard, or a mouse; in addition, in the case that the display end is a touch screen, the initial configuration of the simulated vehicle may be implemented by performing a touch operation on the map. The present disclosure does not limit the form of the configuration operation.
The initial configuration of the simulated vehicle includes at least a latitude and longitude configuration and a direction angle configuration.
The results of the configuration may appear on the map. Fig. 9 shows a schematic diagram of a simulated vehicle deployed on a map. Referring to fig. 9, the simulated vehicle deployed in this scenario, indicated by a navigation mark (or arrow) in the figure, carries at least latitude and longitude information and direction angle information.
In addition to the position and direction angle, the initial configuration of the simulated vehicle may also include a navigation configuration, etc., which is not limited by the present disclosure.
For the process of determining the target video, the initial longitude and latitude of the simulated vehicle can be determined in response to the initial longitude and latitude configuration operation of the simulated vehicle, and then the target video is determined from a pre-recorded video library by combining the initial longitude and latitude of the simulated vehicle.
According to some embodiments of the present disclosure, an initial driving direction of the simulated vehicle may be determined in response to the initial configuration operation of the simulated vehicle, and the target video may be determined from the pre-recorded video library based on the initial longitude and latitude of the simulated vehicle and the initial driving direction. The driving direction may be represented by the direction angle of the present disclosure, that is, the angle relative to true north, together with an orientation such as north-by-west or north-by-east. However, the driving direction may also be represented in other manners, such as an angle relative to due south, due west, or due east, which the present disclosure does not limit.
As described above, from a plurality of pre-stored ZJM files, the ZJM file matching the initial longitude and latitude and the initial driving direction can be screened out based on its metadata; that is, the target video is determined from the video library.
According to other embodiments of the present disclosure, the initial configuration operation of the simulated vehicle further includes a navigation configuration operation of the simulated vehicle. In this case, the navigation route may be determined in response to the navigation configuration operation of the simulated vehicle in combination with the initial longitude and latitude of the simulated vehicle, and the target video may be determined from the pre-recorded video library according to the navigation route.
Specifically, the metadata in each ZJM file includes a navigation route, and the ZJM file corresponding to the navigation configuration operation can be screened out from a plurality of pre-stored ZJM files through a matching operation on the navigation routes, that is, the target video is determined from the video library.
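As a concrete illustration of this metadata-based screening, the following is a minimal sketch; the overlap heuristic, tolerance value, and dict layout are assumptions, not the actual ZJM matching logic:

```python
def _close(p, q, tol_deg=1e-4):
    # crude proximity test on (lat, lon) pairs; tol_deg is an assumed tolerance
    return abs(p[0] - q[0]) <= tol_deg and abs(p[1] - q[1]) <= tol_deg

def route_match_score(recorded_route, planned_route):
    """Fraction of planned-route points found near the recorded route."""
    if not planned_route:
        return 0.0
    hits = sum(1 for p in planned_route
               if any(_close(p, q) for q in recorded_route))
    return hits / len(planned_route)

def select_target_video(zjm_files, planned_route):
    """Return the ZJM file whose metadata route best matches, or None."""
    best, best_score = None, 0.0
    for z in zjm_files:
        s = route_match_score(z["metadata"]["navigation_route"], planned_route)
        if s > best_score:
            best, best_score = z, s
    return best
```

Returning `None` when no route overlaps corresponds to the case where the user must be prompted to reconfigure.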
It should be understood that the video library in embodiments of the present disclosure may contain one or more videos. If no video corresponding to the simulated vehicle can be screened out based on the initial configuration operation of the simulated vehicle, the user may be prompted, for example by text or voice on the map, to perform the initial configuration operation again, i.e., to reconfigure the longitude and latitude and/or the direction angle.
S84, determining a target video frame corresponding to the initial longitude and latitude of the simulated vehicle from the target video.
In the process of pre-recording the target video, track data corresponding to each video frame is collected at the same time, and the track data includes at least the longitude and latitude corresponding to the video frame. Thus, a target video frame may be determined from the target video based on the initial longitude and latitude of the simulated vehicle. The target video frame is one image frame of the target video.
Specifically, the initial longitude and latitude of the simulated vehicle can be matched against the longitude and latitude corresponding to each video frame in the target video, and the video frame with the highest matching degree is determined from the target video as the target video frame.
It should be appreciated that video frames are acquired at time intervals, and the configured initial longitude and latitude of the simulated vehicle do not necessarily coincide exactly with the longitude and latitude at which any video frame was acquired. Therefore, embodiments of the present disclosure compute a matching degree, i.e., a degree of positional similarity, and select the video frame whose position is closest to the initial longitude and latitude from the target video as the target video frame.
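The nearest-frame selection can be sketched with a great-circle distance as the position-similarity metric; the haversine choice and the flat `{(lat, lon): frame_number}` layout are assumptions for illustration:

```python
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def nearest_frame(track, lat, lon):
    """track: {(lat, lon): frame_number}; return the frame number whose
    recorded position is closest to the configured (lat, lon)."""
    return min(track.items(),
               key=lambda kv: haversine_m(lat, lon, kv[0][0], kv[0][1]))[1]
```

For large tracks a spatial index would replace the linear scan, but the selection criterion is the same.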
S86, controlling the target video to start playing from the target video frame, and controlling the playing progress of the target video in response to the control operation aiming at the simulated vehicle.
After the target video frame is determined, the target video may be played starting from the target video frame. At this time, the picture of the display end is switched from the map to the target video frame.
In an exemplary embodiment of the present disclosure, the playing of the target video does not depend on the video's own timeline but on the travel control of the simulated vehicle. That is, the playing progress of the target video is controlled based on the control operation for the simulated vehicle.
In embodiments of the present disclosure, the playing progress may generally be characterized in terms of vehicle speed. Specifically, the speed of the simulated vehicle may be determined according to the control operation for the simulated vehicle, and the playing speed of the target video may be determined according to the speed of the simulated vehicle.
It should be noted that the control of the video playing progress in the present disclosure is real-time control; that is, if the speeds of the simulated vehicle differ at two moments, the real-time video playing speeds at those two moments differ accordingly.
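One possible mapping from simulated vehicle speed to playing speed is a rate multiplier relative to the recorded speed of the matched frame; the ratio rule and the cap below are assumptions for illustration, not the disclosure's exact rule:

```python
def playback_rate(sim_speed_mps, recorded_speed_mps, max_rate=4.0):
    """Playback-rate multiplier: play faster when the simulated vehicle
    outpaces the recorded vehicle, slower when it lags. max_rate caps the
    decoder load; both the ratio rule and the cap are assumptions."""
    if recorded_speed_mps <= 0:
        # recorded vehicle was stopped: pause, or skip ahead at the cap
        return 0.0 if sim_speed_mps <= 0 else max_rate
    return min(max(sim_speed_mps / recorded_speed_mps, 0.0), max_rate)
```

Evaluated per frame, this keeps the on-screen motion consistent with the simulated speed at each moment, matching the real-time control requirement above.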
The geographic location of the simulated vehicle may be provided by a map engine, and the travel trajectory of the simulated vehicle may be determined from its speed and direction. Given the longitude and latitude of the current position, the current speed, and the direction, the position of the simulated vehicle after any time period t can be calculated. Let the speed of the simulated vehicle (in m/s) be denoted speed, the direction angle (the angle relative to true north, in radians) be denoted heading, the driving distance be denoted distance, the current latitude be lat, the current longitude be lon, the latitude after time period t be new_lat, and the longitude after time period t be new_lon. The calculation formulas are as follows:
distance=speed*t
new_lat=lat+(distance*cos(heading))/DISTANCE_METER_PER_DEGREE
new_lon=lon+(distance*sin(heading))/DISTANCE_METER_PER_DEGREE
where DISTANCE_METER_PER_DEGREE may be set to a fixed value, namely the distance corresponding to a one-degree difference in latitude, which may be taken as approximately 111319.488 meters.
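The dead-reckoning update above can be sketched directly; note that the new longitude is computed from the current longitude lon (function and constant names below follow the text):

```python
import math

DISTANCE_METER_PER_DEGREE = 111319.488  # meters per one degree of latitude

def advance_position(lat, lon, speed, heading, t):
    """Dead-reckon the simulated vehicle's position after t seconds.
    speed is in m/s; heading is the direction angle in radians from true north."""
    distance = speed * t
    new_lat = lat + (distance * math.cos(heading)) / DISTANCE_METER_PER_DEGREE
    new_lon = lon + (distance * math.sin(heading)) / DISTANCE_METER_PER_DEGREE
    return new_lat, new_lon
```

As in the formulas, a single meters-per-degree constant is used for both axes; a more exact longitude update would scale by cos(lat), which the disclosure's fixed-value formulation omits.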
In addition, in the process of playing the target video, HUD data can be acquired and sent to the HUD rack, so that HUD display can be performed while the target video is playing. As indicated above, the HUD display content includes AR content and/or non-AR content, specifically including one or more of a route line, POI identification, vehicle speed, road speed limit, road name, navigation remaining distance, curve type, and curve remaining distance.
Fig. 10 schematically shows an effect diagram of video playback in combination with HUD display. Referring to fig. 10, the display end may display information such as a track, a current speed, a curve type, a curve remaining distance, a POI (such as XX mall), and the like while displaying the video frame.
In addition, in the process of playing the target video, if the simulated vehicle deviates from the route corresponding to the target video, the current longitude and latitude, namely the longitude and latitude of the deviation position, is determined. Subsequently, the above-described process of selecting a video from the video library and selecting a video frame from the video is performed again. Specifically, when the next video is determined from the pre-recorded video library in combination with the current longitude and latitude, the next video is controlled to start playing from the video frame corresponding to the current longitude and latitude, and the playing progress of the next video is controlled in response to the control operation for the simulated vehicle.
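The deviation check can be sketched as a distance test against the current video's track points; the flat-earth approximation and the 30 m threshold are assumptions, not values from the disclosure:

```python
import math

M_PER_DEG = 111319.488  # meters per degree of latitude

def off_route(lat, lon, route_points, threshold_m=30.0):
    """True when the simulated vehicle is farther than threshold_m from every
    track point of the current video (i.e., it has deviated from the route)."""
    for rlat, rlon in route_points:
        dy = (lat - rlat) * M_PER_DEG
        dx = (lon - rlon) * M_PER_DEG * math.cos(math.radians(rlat))
        if math.hypot(dx, dy) <= threshold_m:
            return False
    return True
```

When this returns True, the current (lat, lon) becomes the query for re-selecting a video and a starting frame from the library.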
The test procedure of the embodiment of the present disclosure will be described below with reference to fig. 11.
In step S1102, the electronic device reads the ZJM file list.
In step S1104, the electronic device loads the metadata list.
In step S1106, the electronic device acquires data simulating a vehicle.
In step S1108, the electronic device determines whether the initial configuration operation includes a configuration navigation operation. If navigation is configured, step S1110 is performed; if navigation is not configured, step S1112 is performed.
In step S1110, the electronic device traverses the routes in all ZJM metadata and selects the ZJM file with the highest matching degree with the simulated vehicle route. The simulated vehicle route is the route planned after navigation is configured.
In step S1112, the electronic device traverses the routes in all ZJM metadata, for example selecting the first ZJM file containing the simulated vehicle's current location.
In step S1114, the electronic device loads track data of the ZJM file.
In step S1116, the electronic device acquires the position coordinates of the simulated vehicle, that is, acquires the longitude and latitude of the simulated vehicle.
In step S1118, the electronic apparatus selects the frame number matching the position coordinates in step S1116 from the track data in step S1114.
In step S1120, the electronic device decodes the video frame and renders it.
In step S1122, the electronic device determines whether the video has finished playing. If so, the flow ends; if not, step S1124 is performed.
In step S1124, the electronic device determines whether the simulated vehicle has currently deviated from the ZJM route determined in step S1110 or step S1112. If so, the flow returns to step S1106; if not, it returns to step S1116.
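The flow of steps S1102 to S1124 can be condensed into a small loop; the dict-based ZJM representation, exact-match positions, and helper names are all illustrative assumptions, not the disclosure's implementation:

```python
def pick_zjm(zjm_files, sim_route, sim_pos):
    """S1108-S1112: prefer route matching when navigation is configured;
    otherwise take the first file whose track contains the current position."""
    if sim_route:
        return max(zjm_files,
                   key=lambda z: len(set(z["route"]) & set(sim_route)),
                   default=None)
    for z in zjm_files:
        if sim_pos in z["track"]:
            return z
    return None

def run_test(zjm_files, sim_positions, sim_route=None):
    """S1106-S1124: select a ZJM file, then render the frame matching each
    successive simulated position, re-selecting on deviation."""
    rendered = []
    zjm = pick_zjm(zjm_files, sim_route, sim_positions[0])
    for pos in sim_positions:
        if zjm is None or pos not in zjm["track"]:        # S1124: deviation
            zjm = pick_zjm(zjm_files, sim_route, pos)     # back to S1106
        if zjm is None:
            break
        rendered.append(zjm["track"][pos])                # S1118-S1120
    return rendered
```

A real implementation would use nearest-point matching rather than exact dictionary keys, but the control flow mirrors Fig. 11.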
It should be noted that although the steps of the methods in the present disclosure are depicted in the accompanying drawings in a particular order, this does not require or imply that the steps must be performed in that particular order, or that all illustrated steps be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform, etc.
Further, in this example embodiment, a test apparatus is also provided.
Fig. 12 schematically shows a block diagram of a testing device of an exemplary embodiment of the present disclosure. Referring to fig. 12, a test apparatus 120 according to an exemplary embodiment of the present disclosure may include a video determination module 1211, a frame determination module 1213, and a test control module 1215.
Specifically, the video determination module 1211 may be configured to determine the target video from a pre-recorded video library in response to an initial configuration operation of the simulated vehicle; the initial configuration operation of the simulated vehicle includes an initial longitude and latitude configuration operation of the simulated vehicle, and the videos contained in the video library are recorded during driving in a real scene. The frame determination module 1213 may be used to determine, from the target video, a target video frame corresponding to the initial longitude and latitude of the simulated vehicle. The test control module 1215 may be used to control the target video to start playing from the target video frame and to control the playing progress of the target video in response to a control operation for the simulated vehicle.
According to an example embodiment of the present disclosure, the video determination module 1211 may be configured to perform: determining the initial longitude and latitude of the simulated vehicle in response to the initial longitude and latitude configuration operation of the simulated vehicle; and determining the target video from a pre-recorded video library by combining the initial longitude and latitude of the simulated vehicle.
According to an exemplary embodiment of the present disclosure, the initial configuration operation of the simulated vehicle further includes a navigation configuration operation of the simulated vehicle. In this case, in determining the target video from the pre-recorded video library in combination with the initial longitude and latitude of the simulated vehicle, the video determination module 1211 may be configured to perform: determining a navigation route in response to the navigation configuration operation of the simulated vehicle in combination with the initial longitude and latitude of the simulated vehicle; and determining the target video from the pre-recorded video library according to the navigation route.
In accordance with an exemplary embodiment of the present disclosure, the process of the video determination module 1211 determining the target video from the pre-recorded video library in conjunction with simulating the initial longitude and latitude of the vehicle may be further configured to perform: determining an initial driving direction of the simulated vehicle in response to an initial configuration operation of the simulated vehicle; and determining the target video from a pre-recorded video library based on the initial longitude and latitude and the initial driving direction of the simulated vehicle.
According to an exemplary embodiment of the present disclosure, the longitude and latitude corresponding to each video frame in the target video is recorded when the frame is acquired. In this case, the frame determination module 1213 may be configured to perform: matching the initial longitude and latitude of the simulated vehicle against the longitude and latitude corresponding to each video frame in the target video; and determining the video frame with the highest matching degree from the target video as the target video frame.
According to an example embodiment of the present disclosure, the test control module 1215 may be configured to perform: determining a speed of the simulated vehicle according to a control operation for the simulated vehicle; and determining the playing speed of the target video according to the speed of the simulated vehicle.
According to an example embodiment of the present disclosure, the test control module 1215 may be further configured to perform: in the process of playing the target video, HUD data are acquired; the HUD data is sent to the HUD stage for HUD display during the target video playback.
According to an example embodiment of the present disclosure, the test control module 1215 may be further configured to perform: if the simulated vehicle deviates from the route corresponding to the target video, determining the current longitude and latitude; and under the condition that the next video is determined from the pre-recorded video library by combining the current longitude and latitude, controlling the next video to start playing from the video frame corresponding to the current longitude and latitude, and controlling the playing progress of the next video in response to the control operation aiming at the simulated vehicle.
Since each functional module of the test device according to the embodiment of the present disclosure is the same as that in the system and method embodiments described above, a detailed description thereof is omitted herein.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification is also provided. In some possible implementations, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification, when the program product is run on the terminal device.
The program product for implementing the above-described method according to the embodiments of the present disclosure may employ a portable compact disc read-only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical disk, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system."
An electronic device 1300 according to such an embodiment of the present disclosure is described below with reference to fig. 13. The electronic device 1300 shown in fig. 13 is merely an example and should not be construed to limit the functionality and scope of use of embodiments of the present disclosure in any way.
As shown in fig. 13, the electronic device 1300 is embodied in the form of a general purpose computing device. The components of the electronic device 1300 may include, but are not limited to: the at least one processing unit 1310, the at least one memory unit 1320, a bus 1330 connecting the different system components (including the memory unit 1320 and the processing unit 1310), and a display unit 1340.
Wherein the storage unit stores program code that is executable by the processing unit 1310 such that the processing unit 1310 performs steps according to various exemplary embodiments of the present disclosure described in the above section of the "exemplary method" of the present specification. For example, the processing unit 1310 may perform various steps of the test methods of the disclosed embodiments.
The storage unit 1320 may include readable media in the form of volatile storage units, such as Random Access Memory (RAM) 13201 and/or cache memory 13202, and may further include Read Only Memory (ROM) 13203.
The storage unit 1320 may also include a program/utility 13204 having a set (at least one) of program modules 13205, such program modules 13205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 1330 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1300 may also communicate with one or more external devices 1400 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 1300, and/or any device (e.g., router, modem, etc.) that enables the electronic device 1300 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1350. Also, the electronic device 1300 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, for example, the Internet, through a network adapter 1360. As shown, the network adapter 1360 communicates with other modules of the electronic device 1300 over the bus 1330. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 1300, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are only schematic illustrations of processes included in the method according to the exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (15)

1. A method of testing, comprising:
responding to initial configuration operation of the simulated vehicle, and determining a target video from a pre-recorded video library; the initial configuration operation of the simulated vehicle comprises initial longitude and latitude configuration operation of the simulated vehicle, and the video contained in the video library is recorded in the driving process under the real scene;
determining a target video frame corresponding to the initial longitude and latitude of the simulated vehicle from the target video;
and controlling the target video to start playing from the target video frame, and controlling the playing progress of the target video in response to the control operation for the simulated vehicle.
2. The test method of claim 1, wherein determining the target video from the pre-recorded video library in response to the initial configuration operation of the simulated vehicle comprises:
determining the initial longitude and latitude of the simulated vehicle in response to the initial longitude and latitude configuration operation of the simulated vehicle;
and determining the target video from a pre-recorded video library by combining the initial longitude and latitude of the simulated vehicle.
3. The test method of claim 2, wherein the initial configuration operation of the simulated vehicle further comprises a navigational configuration operation of the simulated vehicle; wherein, determining the target video from the pre-recorded video library in combination with the initial longitude and latitude of the simulated vehicle comprises:
responding to the navigation configuration operation of the simulated vehicle, and determining a navigation route by combining the initial longitude and latitude of the simulated vehicle;
and determining the target video from a pre-recorded video library according to the navigation route.
4. The method of testing of claim 2, wherein determining the target video from the pre-recorded video library in combination with the initial longitude and latitude of the simulated vehicle comprises:
determining an initial driving direction of the simulated vehicle in response to an initial configuration operation of the simulated vehicle;
and determining a target video from a pre-recorded video library based on the initial longitude and latitude and the initial driving direction of the simulated vehicle.
5. The test method according to claim 1, wherein the latitude and longitude corresponding to each video frame is recorded when each video frame in the target video is collected; wherein determining, from the target video, a target video frame corresponding to an initial longitude and latitude of the simulated vehicle comprises:
matching the initial longitude and latitude of the simulated vehicle with the longitude and latitude corresponding to each video frame in the target video;
and determining the video frame with the highest matching degree from the target video as a target video frame.
6. The test method according to claim 1, wherein controlling the progress of playing of the target video in response to a control operation for the simulated vehicle comprises:
determining a speed of the simulated vehicle according to a control operation for the simulated vehicle;
and determining the playing speed of the target video according to the speed of the simulated vehicle.
7. The test method according to any one of claims 1 to 6, further comprising:
in the process of playing the target video, HUD data are acquired;
and sending the HUD data to a HUD rack so as to display the HUD in the process of playing the target video.
8. The test method of claim 1, wherein the test method further comprises:
if the simulated vehicle deviates from the route corresponding to the target video, determining the current longitude and latitude;
and under the condition that the next video is determined from the prerecorded video library by combining the current longitude and latitude, controlling the next video to start playing from the video frame corresponding to the current longitude and latitude, and controlling the playing progress of the next video in response to the control operation for the simulated vehicle.
9. A test device, comprising:
the video determining module is used for determining a target video from a pre-recorded video library in response to an initial configuration operation of a simulated vehicle; wherein the initial configuration operation of the simulated vehicle comprises an initial longitude and latitude configuration operation of the simulated vehicle, and the videos contained in the video library are videos recorded during driving in a real scene;
the frame determining module is used for determining a target video frame corresponding to the initial longitude and latitude of the simulated vehicle from the target video;
and the test control module is used for controlling the target video to start playing from the target video frame, and controlling the playing progress of the target video in response to a control operation for the simulated vehicle.
10. A test system, comprising:
the testing device is used for: determining a target video from a pre-recorded video library in response to an initial configuration operation of a simulated vehicle, wherein the initial configuration operation of the simulated vehicle comprises an initial longitude and latitude configuration operation of the simulated vehicle, and the videos contained in the video library are videos recorded during driving in a real scene; determining, from the target video, a target video frame corresponding to the initial longitude and latitude of the simulated vehicle; controlling the target video to start playing from the target video frame; and controlling the playing progress of the target video in response to a control operation for the simulated vehicle;
and the display end is used for displaying the target video played based on the control of the testing device.
11. The test system of claim 10, wherein the test device is further configured to obtain HUD data during the playing of the target video;
Wherein the test system further comprises:
and the HUD rack is used for receiving the HUD data and carrying out HUD display based on the HUD data in the process of playing the target video.
12. The test system of claim 10, wherein the test system further comprises:
and the video recording device is used for pre-recording videos during driving on different road sections in a real scene, so as to generate the video library.
13. The test system of claim 12, wherein the video recording device is further configured to collect track data while recording the video;
the track data comprises longitude and latitude corresponding to the recorded video frames.
14. A storage medium having stored thereon a computer program, which when executed by a processor implements the test method of any of claims 1 to 8.
15. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to implement the test method of any one of claims 1 to 8 via execution of the executable instructions.
CN202310696914.5A 2023-06-12 2023-06-12 Test method, device, system, storage medium and electronic equipment Active CN116773216B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310696914.5A CN116773216B (en) 2023-06-12 2023-06-12 Test method, device, system, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN116773216A true CN116773216A (en) 2023-09-19
CN116773216B CN116773216B (en) 2024-05-31

Family

ID=87992329


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101546437A (en) * 2009-04-24 2009-09-30 长安大学 Method for establishing simulation experiment platform of road traffic virtual identifier marking
CN106157572A (en) * 2015-04-21 2016-11-23 惠州市德赛西威汽车电子股份有限公司 The method of testing of automobile active safety early warning system and test device
CN106534734A (en) * 2015-09-11 2017-03-22 腾讯科技(深圳)有限公司 Method and device for playing video and displaying map, and data processing method and system
CN109187048A (en) * 2018-09-14 2019-01-11 盯盯拍(深圳)云技术有限公司 Automatic Pilot performance test methods and automatic Pilot performance testing device
CN109580248A (en) * 2018-12-05 2019-04-05 北京汽车集团有限公司 Smartway road test simulator
CN110097799A (en) * 2019-05-23 2019-08-06 重庆大学 Virtual driving system based on real scene modeling
CN111563185A (en) * 2020-03-19 2020-08-21 平安城市建设科技(深圳)有限公司 Video picture display method, device, terminal and storage medium based on GIS system
CN112489522A (en) * 2020-11-17 2021-03-12 北京三快在线科技有限公司 Method, device, medium and electronic device for playing simulation scene data
CN113361059A (en) * 2020-03-05 2021-09-07 北京京东乾石科技有限公司 Vehicle simulation scene generation method and related equipment
CN113567778A (en) * 2021-06-30 2021-10-29 南京富士通南大软件技术有限公司 Scene-based real-vehicle automatic testing method for vehicle-mounted information entertainment system
CN115393619A (en) * 2022-09-06 2022-11-25 北京赛目科技有限公司 Recommendation method and device for simulation scene in automatic driving and electronic equipment
CN116137113A (en) * 2023-04-20 2023-05-19 眉山中车制动科技股份有限公司 Heavy-duty train model driving system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant