CN114089890A - Vehicle driving simulation method, device, storage medium and program product - Google Patents


Info

Publication number
CN114089890A
CN114089890A (application CN202111110041.2A)
Authority
CN
China
Prior art keywords: driving, panoramic video, vehicle, environment, current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111110041.2A
Other languages
Chinese (zh)
Inventor
王丹
王晓彤
高佩文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Chengshi Wanglin Information Technology Co Ltd
Original Assignee
Beijing Chengshi Wanglin Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Chengshi Wanglin Information Technology Co Ltd
Priority to CN202111110041.2A
Publication of CN114089890A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 Simulators for teaching or training purposes
    • G09B9/02 Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/04 Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles

Abstract

Embodiments of the present application provide a vehicle driving simulation method, device, storage medium, and program product. During simulated driving, an interior live-action image of a user-selected target vehicle, captured from the driver's-seat viewpoint, can be rendered into a panoramic video, likewise captured from the driver's-seat viewpoint, corresponding to an actual driving environment selected by the user, thereby realizing simulated driving of the target vehicle and providing an interactive driving experience. Because the simulated driving environment is the actual driving environment at the current position of the electronic terminal, its road conditions and surroundings match the user's current real driving environment. The actual driving environment is embodied by the driver's-seat panoramic video and the driving vehicle by the driver's-seat interior image of the target vehicle, so the driver feels a greater sense of realism, improving the immersion and overall experience of viewing vehicles online.

Description

Vehicle driving simulation method, device, storage medium and program product
Technical Field
The present application relates to the field of internet technologies, and in particular, to a method, device, storage medium, and program product for simulating driving of a vehicle.
Background
With the continuing development and popularization of internet technology and terminal devices, more and more application software (apps) is installed and used. In some application scenarios, a user may view or book a vehicle online through such software.
In existing online vehicle-viewing technology, a user can examine the interior and exterior appearance of a vehicle through photos or videos, but this cannot produce an immersive, interactive viewing experience. How to give users such an interactive experience through technical means has therefore become a problem of ongoing research for those skilled in the art.
Disclosure of Invention
Aspects of the present application provide a vehicle driving simulation method, device, storage medium, and program product for realizing an interactive driving experience and improving the user's vehicle-viewing experience.
An embodiment of the present application provides a vehicle driving simulation method. A graphical user interface is provided through an electronic terminal and displays a driving-vehicle selection control and a driving-environment selection control; the driving-vehicle selection control is used to select the simulated driving vehicle, and the driving-environment selection control is used to select among at least one actual driving environment, at the current position acquired by the electronic terminal, for the simulated driving vehicle.
The method comprises the following steps:
in response to detecting a trigger of the driving-vehicle selection control, acquiring an interior live-action image of the selected target vehicle, the interior live-action image being obtained from a live-action photograph of the target vehicle's interior taken from the driver's-seat viewpoint; and
in response to detecting a trigger of the driving-environment selection control, acquiring a panoramic video corresponding to at least one actual driving environment at the current position of the electronic terminal, each such actual driving environment being obtained from a panoramic video of an actually driven external environment shot from the driver's-seat viewpoint, with no road intersection between the start and end positions of the actually driven external environment corresponding to each panoramic video; and
displaying, on the graphical user interface, the rendering effect of the target vehicle's interior live-action image in the current panoramic video, so as to simulate driving the target vehicle in the actual driving environment corresponding to the current panoramic video.
An embodiment of the present application further provides a vehicle driving simulation device. A graphical user interface is provided through the device and displays a driving-vehicle selection control and a driving-environment selection control; the driving-vehicle selection control is used to select the simulated driving vehicle, and the driving-environment selection control is used to select among at least one actual driving environment, at the current position acquired by the device, for the simulated driving vehicle.
The device comprises:
an acquisition module, configured to acquire, in response to a trigger of the driving-vehicle selection control, an interior live-action image of the selected target vehicle, the image being obtained from a live-action photograph of the target vehicle's interior taken from the driver's-seat viewpoint; and
further configured to acquire, in response to a trigger of the driving-environment selection control, the panoramic video corresponding to at least one actual driving environment at the current position acquired by the device, each such environment being obtained from a panoramic video of an actually driven external environment shot from the driver's-seat viewpoint, with no road intersection between the start and end positions of the actually driven external environment corresponding to each panoramic video; and
a display module, configured to display, on the graphical user interface, the rendering effect of the target vehicle's interior live-action image in the current panoramic video, so as to simulate driving the target vehicle in the actual driving environment corresponding to the current panoramic video.
An embodiment of the present application further provides an electronic device, comprising a memory, a processor, and a display component. The electronic device provides a graphical user interface through the display component; the interface displays a driving-vehicle selection control for selecting the simulated driving vehicle and a driving-environment selection control for selecting among at least one actual driving environment, at the current position acquired by the electronic device, for the simulated driving vehicle.
The memory is used to store a computer program.
The processor, coupled to the memory and the display component, executes the computer program to perform the steps of the vehicle driving simulation method described above.
Embodiments of the present application further provide a computer-readable storage medium storing computer instructions which, when executed by one or more processors, cause the one or more processors to perform the steps of the vehicle driving simulation method described above.
An embodiment of the present application further provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of the vehicle driving simulation method described above.
In the embodiments of the present application, a driving-vehicle selection control is provided so the user can autonomously choose the simulated driving vehicle; the actual driving environments at the current position are acquired based on the current position of the electronic terminal; and a driving-environment selection control is provided so the user can autonomously choose among those acquired actual driving environments. During simulated driving, the interior live-action image of the user-selected target vehicle, at the driver's-seat viewpoint, can be rendered into the driver's-seat panoramic video corresponding to the user-selected actual driving environment, realizing simulated driving of the target vehicle and providing an interactive driving experience. On the one hand, because the simulated driving environment is the actual driving environment at the terminal's current position, its road conditions and surroundings are the user's current real driving environment; on the other hand, the actual driving environment is embodied by the driver's-seat panoramic video and the driving vehicle by the driver's-seat interior image of the target vehicle, so the driver feels a greater sense of realism, improving the immersion and overall experience of viewing vehicles online.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a schematic flow chart of a method for simulating driving of a vehicle according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a graphical user interface provided by an embodiment of the present application;
FIG. 3 is a schematic view of rendering effects of simulated driving of a vehicle according to an embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of a vehicle driving simulation device provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In an embodiment of the present application, in order to provide an interactive driving experience, a driving-vehicle selection control is provided so the user can autonomously choose the simulated driving vehicle; the actual driving environments at the current position are acquired based on the current position of the electronic terminal; and a driving-environment selection control is provided so the user can autonomously choose among those acquired actual driving environments. During simulated driving, the interior live-action image of the user-selected target vehicle, at the driver's-seat viewpoint, can be rendered into the driver's-seat panoramic video corresponding to the user-selected actual driving environment, realizing simulated driving of the target vehicle and providing an interactive driving experience. On the one hand, because the simulated driving environment is the actual driving environment at the terminal's current position, its road conditions and surroundings are the user's current real driving environment; on the other hand, the actual driving environment is embodied by the driver's-seat panoramic video and the driving vehicle by the driver's-seat interior image of the target vehicle, so the driver feels a greater sense of realism, improving the immersion and overall experience of viewing vehicles online.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
It should be noted that: like reference numerals refer to like objects in the following figures and embodiments, and thus, once an object is defined in one figure or embodiment, further discussion thereof is not required in subsequent figures and embodiments.
Fig. 1 is a schematic flow chart of a vehicle driving simulation method according to an embodiment of the present application. As shown in fig. 1, the method includes:
101. In response to detecting a trigger of the driving-vehicle selection control, acquire an interior live-action image of the selected target vehicle. The interior live-action image is obtained from a live-action photograph of the target vehicle's interior taken from the driver's-seat viewpoint.
102. In response to detecting a trigger of the driving-environment selection control, acquire the panoramic video corresponding to at least one actual driving environment at the current position acquired by the electronic terminal.
Each actual driving environment at that position is obtained from a panoramic video of an actually driven external environment shot from the driver's-seat viewpoint, and no road intersection exists between the start and end positions of the actually driven external environment corresponding to each panoramic video.
103. Display, on the graphical user interface, the rendering effect of the target vehicle's interior live-action image in the current panoramic video, so as to simulate driving the target vehicle in the actual driving environment corresponding to the current panoramic video.
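Steps 101-103 can be sketched as a small event-driven flow. The class and method names below are illustrative assumptions, not part of the disclosure; real implementations would fetch images and videos from a server and render them, whereas this sketch only tracks the selections.

```python
# Illustrative sketch of steps 101-103; all names are assumptions, not from the patent.

class DrivingSimulator:
    def __init__(self, interior_images, panoramic_videos):
        # interior_images: vehicle id -> driver's-seat interior image
        # panoramic_videos: environment id -> driver's-seat panoramic video
        self.interior_images = interior_images
        self.panoramic_videos = panoramic_videos
        self.interior = None
        self.panorama = None

    def on_vehicle_selected(self, vehicle_id):
        # Step 101: driving-vehicle selection control triggered.
        self.interior = self.interior_images[vehicle_id]

    def on_environment_selected(self, env_id):
        # Step 102: driving-environment selection control triggered.
        self.panorama = self.panoramic_videos[env_id]

    def render_frame(self):
        # Step 103: interior image rendered over the current panorama frame.
        if self.interior is None or self.panorama is None:
            return None
        return {"background": self.panorama, "overlay": self.interior}
```

Nothing is rendered until both controls have been triggered, matching the method's two acquisition steps preceding the display step.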
In this embodiment, the graphical user interface may be provided by an electronic terminal. As shown in fig. 2, the graphical user interface is for displaying a driving vehicle selection control and a driving environment selection control. The driving vehicle selection control is used for selecting a simulated driving vehicle, and the driving environment selection control is used for selecting at least one actual driving environment of the simulated driving vehicle at the current position acquired by the electronic terminal.
In the embodiment of the present application, the specific implementation form of the electronic terminal is not limited. Optionally, the electronic terminal may be a mobile phone, a tablet computer, a personal computer, a wearable device, or the like. The wearable device may be a head-mounted Virtual Reality (VR) device, such as VR glasses or a VR headset, or a head-mounted Augmented Reality (AR) device, such as AR glasses or an AR headset.
The current position P of the electronic terminal is the terminal's real-time positioning information. The embodiment of the present application does not limit the granularity of P: optionally, it may be the city, district, or street where the terminal is currently located, or the geographic location carried in point-of-interest (POI) information, etc.
In the present embodiment, at least one actual driving environment at the current position P may be acquired based on P. Each actual driving environment may be obtained from a panoramic video, shot from the driver's-seat viewpoint, of the actually driven external environment, i.e. the environment outside an environment-acquisition vehicle while that vehicle is driven through the actual driving environment.
In this embodiment, the user may trigger the driving-vehicle selection control to select the simulated driving vehicle. The specific way in which the user triggers the control is not limited. In some embodiments, the user may click the driving-vehicle selection control to trigger it; in other embodiments, the user may trigger it by voice; in still other embodiments, eye tracking may follow the user's gaze, and a trigger of the driving-vehicle selection control is deemed detected when the gaze dwells on the control for at least a preset duration.
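The gaze-dwell trigger described above could be implemented as in the following sketch; the dwell threshold and sample format are assumptions, since the patent only specifies "staying time greater than or equal to a preset duration".

```python
# Hypothetical gaze-dwell trigger: the control fires once the user's gaze has
# rested on it continuously for at least `dwell_s` seconds. Names and the
# 1.5 s default are illustrative assumptions.

def gaze_triggered(samples, control, dwell_s=1.5):
    """samples: chronological list of (timestamp_seconds, control_id_under_gaze)."""
    start = None
    for t, target in samples:
        if target == control:
            if start is None:
                start = t  # gaze just arrived on the control
            if t - start >= dwell_s:
                return True
        else:
            start = None  # gaze left the control; dwell resets
    return False
```

Note that looking away resets the dwell timer, so only a continuous gaze counts as a trigger.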
For the electronic terminal, in step 101, an interior live-action image of the selected target vehicle may be acquired in response to detecting a trigger of the driving-vehicle selection control. The interior live-action image is obtained from a live-action photograph of the target vehicle's interior taken from the driver's-seat viewpoint, and may include live-action views, from that viewpoint, of the steering wheel, the center console, the instrument panel, and so on.
In this embodiment, the user may also select the simulated driving environment by triggering the driving-environment selection control. For implementations of this trigger, reference may be made to the description of triggering the driving-vehicle selection control, which is not repeated here. In the embodiment of the present application, the simulated driving environment is presented as the panoramic video corresponding to a real driving environment.
Accordingly, for the electronic terminal, in step 102, in response to detecting a trigger of the driving-environment selection control, the panoramic video A corresponding to at least one actual driving environment at the terminal's current position may be acquired. Each actual driving environment at that position is obtained from a panoramic video of an actually driven external environment shot from the driver's-seat viewpoint. Different current positions yield different numbers of actual driving environments; the exact number is determined by whether the actually driven external environment at the current position contains a road intersection and, if so, by how many roads meet there. For example, if there is no road intersection at the current position P of the electronic terminal, there may be one actual driving environment; if there is a road intersection and it is a three-way intersection, there may be three; and so on.
For the actually driven external environment of each panoramic video at the current position P of the electronic terminal, no road intersection exists between its start position and end position.
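The counting rule above reduces to a small function: one environment when there is no intersection, otherwise one per branch at the intersection. This is a restatement of the example in the text, with illustrative names.

```python
# Number of selectable actual driving environments at position P, per the
# rule in the text: no intersection -> 1; an intersection -> one environment
# per meeting road. Function name and signature are assumptions.

def environment_count(has_intersection, branch_count=0):
    return branch_count if has_intersection else 1
```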
In the embodiment of the present application, a panoramic camera may be used in advance to shoot a panoramic video of the real driving scene at a fixed moving speed. During shooting, the panoramic camera is mounted at the driver's seat, at a height consistent with a driver's eye level in the vehicle. A sphere model can then be created in virtual space, and the captured panorama attached to the inside of the sphere by editing the sphere model's shader. Further, a vehicle body model can be placed at the center of the sphere scene to serve as the simulated driving vehicle, and a virtual camera placed at the driver position of the body model, at a height consistent with the driver's eye level on the body model, to render the panoramic video. The viewing angle of the panoramic video can be adjusted by adjusting the virtual camera's angle.
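Attaching the panorama to the inside of a sphere amounts to the standard equirectangular mapping: each view direction from the sphere center corresponds to one (u, v) texture coordinate. The sketch below shows that mapping; it is textbook math, not code from the patent, and the axis convention (y up, -z forward) is an assumption.

```python
import math

# Minimal sketch of sampling an equirectangular panorama attached to the inside
# of a sphere: a unit view direction from the sphere center maps to (u, v)
# texture coordinates in [0, 1]. Axis convention (y up, -z forward) is assumed.

def direction_to_uv(x, y, z):
    """Unit view direction -> equirectangular (u, v) texture coordinates."""
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)            # longitude
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi    # latitude
    return u, v
```

Looking straight ahead samples the center of the panorama; turning the virtual camera simply changes the direction vector, which is how adjusting the camera angle adjusts the viewing angle of the panoramic video.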
After the interior live-action image of the target vehicle at the driver's-seat viewpoint and the corresponding driver's-seat panoramic video A of the actual driving environment at the terminal's current position are obtained in steps 101 and 102 respectively, in step 103, as shown in fig. 3, the rendering effect of the interior image in the current panoramic video A may be displayed on the graphical user interface, so as to simulate driving the target vehicle in the actual driving environment corresponding to video A. Here, the current panoramic video refers to the panoramic video frame currently displayed on the graphical user interface.
Specifically, the panoramic video A acquired in step 102 may be played on the graphical user interface, and during playback the interior live-action image of the target vehicle is rendered onto the current panoramic frame. As video A plays, the driving environment in it advances frame by frame, presenting the visual effect of the environment moving as the vehicle moves, thereby simulating driving the target vehicle in the actual driving environment corresponding to the current panoramic video.
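The per-frame rendering in step 103 is essentially compositing: where the interior image is opaque (dashboard, steering wheel) it covers the panorama, and where it is transparent (windows, windshield) the panorama frame shows through. The pure-Python sketch below uses RGBA tuples for clarity; a real renderer would do this on the GPU, and all names are illustrative.

```python
# Sketch of step 103's per-frame compositing. Both images are lists of rows of
# (r, g, b, a) tuples of equal size; alpha is 0 (transparent window area) or
# 255 (opaque interior). Illustrative only.

def composite(panorama_frame, interior):
    out = []
    for pano_row, int_row in zip(panorama_frame, interior):
        row = []
        for pano_px, int_px in zip(pano_row, int_row):
            # Opaque interior pixels hide the panorama; transparent ones reveal it.
            row.append(int_px if int_px[3] == 255 else pano_px)
        out.append(row)
    return out
```

Running this once per decoded panorama frame gives the effect described: a fixed interior with the driving environment moving behind the glass.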
In this embodiment, a driving-vehicle selection control is provided so the user can autonomously choose the simulated driving vehicle; the actual driving environments at the current position are acquired based on the current position of the electronic terminal; and a driving-environment selection control is provided so the user can autonomously choose among those acquired actual driving environments. During simulated driving, the interior live-action image of the user-selected target vehicle, at the driver's-seat viewpoint, can be rendered into the driver's-seat panoramic video corresponding to the user-selected actual driving environment, realizing simulated driving of the target vehicle and providing an interactive driving experience. On the one hand, because the simulated driving environment is the actual driving environment at the terminal's current position, its road conditions and surroundings are the user's current real driving environment; on the other hand, the actual driving environment is embodied by the driver's-seat panoramic video and the driving vehicle by the driver's-seat interior image of the target vehicle, so the driver feels a greater sense of realism, improving the immersion and overall experience of viewing vehicles online.
In the embodiment of the present application, no road intersection exists between the start and end positions of the actually driven external environment corresponding to each panoramic video. In some embodiments, a road intersection exists at the end position of that external environment; in actual driving, the user must then choose the next road at the intersection, for example controlling the vehicle to go straight, turn left, or turn right. To simulate the user's driving in an actual driving environment, when the panoramic video acquired in step 102 has played to its end position, the panoramic video of the actual driving environment corresponding to the next road selected by the user needs to be joined to it.
During simulated driving, the user can operate the electronic terminal to send a direction indication signal indicating the action to take at the intersection, such as going straight, turning left, or turning right. How the user operates the terminal depends on its implementation form. In some embodiments, the electronic terminal is a mobile phone, tablet computer, or the like, and the user sends the direction indication signal by rotating the device: turning the phone to the left sends a left-turn indication to the phone, while keeping its orientation unchanged sends a straight-ahead indication, and so on. The electronic terminal can read the rotation angle, a vector angle, acquired by its gyroscope; determine the terminal's rotation direction from that angle; and generate a direction indication signal containing the navigation action indicated for the target vehicle.
In other embodiments, the electronic terminal is a head-mounted display device, and the user sends the direction indication signal by turning the head: turning the head to the left sends a left-turn indication to the device, while keeping a forward gaze sends a straight-ahead indication, and so on. The head-mounted display device likewise reads the rotation angle, a vector angle, acquired by its gyroscope; determines the rotation direction from it; and generates a direction indication signal containing the navigation action indicated for the target vehicle.
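Both variants, rotating a handheld device or turning a headset, reduce to thresholding the gyroscope's yaw reading into one of three navigation actions. The sketch below assumes a degree threshold and a sign convention (negative yaw = left) that the patent does not specify.

```python
# Hedged sketch of converting a gyroscope yaw reading into a direction
# indication signal. The 30-degree threshold and the convention that negative
# yaw means a left turn are assumptions, not stated in the patent.

def direction_signal(yaw_degrees, threshold=30.0):
    if yaw_degrees <= -threshold:
        return "left"
    if yaw_degrees >= threshold:
        return "right"
    return "straight"   # orientation essentially unchanged
```

A real implementation would also debounce the reading and sample it only as the current panoramic video approaches its end position.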
For the electronic terminal, in response to a direction indication signal received when the current panoramic video A has played to its corresponding end position, the next panoramic video B to which the simulation should switch may be determined from that signal. Video B is the panoramic video corresponding to the next actual driving environment, joined at the end position determined by the direction indication signal; it is shot, from the driver's-seat viewpoint, of an actually driven external environment that connects to the end position of the current panoramic video.
Further, the next panoramic video B can be obtained, and the rendering effect of the target vehicle's interior live-action image in video B displayed on the graphical user interface, so as to simulate driving the target vehicle in the actual driving environment corresponding to video B. For a specific implementation of displaying that rendering effect, reference may be made to the description of step 103, which is not repeated here.
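Selecting video B amounts to a lookup in a road graph keyed by the current segment and the direction signal, since each panoramic video covers one intersection-free segment. The graph below is an invented example; segment names and the table itself are illustrative only.

```python
# Sketch of choosing the next panoramic video at an intersection: each
# panoramic video covers one road segment with no intersection inside it, so
# (current segment, direction signal) determines the segment whose start
# connects to the current segment's end. The road graph here is illustrative.

NEXT_SEGMENT = {
    ("A", "straight"): "B1",
    ("A", "left"): "B2",
    ("A", "right"): "B3",
}

def next_panoramic_video(current_segment, signal):
    # Returns None when no segment connects in that direction.
    return NEXT_SEGMENT.get((current_segment, signal))
```

When the lookup succeeds, playback switches to the returned video and rendering continues exactly as in step 103.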
Considering that a user needs to adjust the steering of a steering wheel to adjust the driving direction of the vehicle when actually driving, for simulating the real driving effect of the user, the user can send a corresponding direction indicating signal to the electronic terminal according to the road trend in the current panoramic video A. If the road trend in the current panoramic video A is a left turn, a left turn indication signal and the like can be sent to the electronic terminal. For the implementation of sending the direction indication signal to the electronic terminal, reference may be made to the contents of the related embodiments described above, and details are not described herein again.
For the electronic terminal, in response to receiving the direction indication signal, the steering of the steering wheel in the internal live-action image is adjusted according to the direction indication signal, so that the steering of the steering wheel in the internal live-action image corresponds to the road trend in the current panoramic video.
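As a minimal illustration of the steering-wheel adjustment described above (the rotation angles and signal names are assumptions; the patent specifies no values):

```python
# Hypothetical sketch: rotating the steering-wheel layer of the internal
# live-action image to match the direction indication signal. The angles
# (degrees, negative = counter-clockwise) are illustrative assumptions.
STEER_ANGLE_DEG = {"left": -90.0, "straight": 0.0, "right": 90.0}

def steering_wheel_angle(direction_signal: str) -> float:
    """Angle by which the steering wheel in the internal live-action image
    should be rotated for the given direction signal."""
    return STEER_ANGLE_DEG.get(direction_signal, 0.0)
```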
In the present embodiment, the driving speed of the simulated driving may be adjusted in order to simulate the vehicle speed. The user may send a driving-speed adjustment signal to the electronic terminal. In some embodiments, the electronic terminal is provided with a speed adjustment button, or the graphical user interface is provided with a speed adjustment control; the user may adjust the simulated driving speed via either of them. For the electronic terminal, the driving speed of the simulated driving may be determined in response to receiving the driving-speed adjustment signal. The adjustment signal may carry the driving speed directly, or a gear stage (also referred to as a gear) corresponding to the driving speed. For the embodiment in which the adjustment signal carries a gear stage, the simulated driving speed corresponding to that gear stage may be looked up in a preset correspondence between gear stages and simulated driving speeds, and taken as the driving speed of the simulated driving.
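A sketch of resolving the simulated driving speed from the adjustment signal, covering both the direct-speed case and the gear-stage case; the gear-to-speed table is a hypothetical preset correspondence:

```python
# Illustrative preset correspondence between gear stages and simulated
# driving speeds (km/h); the values are assumptions, not the patent's.
GEAR_TO_SPEED_KMH = {1: 15, 2: 30, 3: 45, 4: 60, 5: 80}

def driving_speed_from_signal(signal: dict) -> float:
    """Resolve the simulated driving speed from a speed-adjustment signal
    that carries either a speed directly or a gear stage."""
    if "speed_kmh" in signal:
        return float(signal["speed_kmh"])
    return float(GEAR_TO_SPEED_KMH[signal["gear"]])
```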
Furthermore, the playing speed of the current panoramic video can be adjusted according to the driving speed of the simulated driving; the playing speed of the current panoramic video is positively correlated with the driving speed of the simulated driving, that is, the faster the simulated driving speed, the faster the panoramic video is played. Optionally, a correspondence between simulated driving speeds and panoramic-video playing speeds may be preset; the current simulated driving speed can then be matched against this correspondence to determine the playing speed of the current panoramic video A, and the playing speed of the current panoramic video A can be adjusted accordingly.
Further, with the adjusted playing speed, the rendering effect of the internal live-action image of the target vehicle in the current panoramic video can be displayed on the graphical user interface, so as to simulate the driving speed of the target vehicle in the actual driving environment corresponding to the current panoramic video A. In this way, the graphical user interface displays the rendering effect of the internal live-action image of the target vehicle in the speed-adjusted current panoramic video, and the user can experience adjusting the driving speed in the actual driving environment.
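The positive correlation between simulated driving speed and playback speed can be sketched as a simple linear mapping; the reference capture speed below is an assumption (if the panorama was recorded while driving at 40 km/h, simulating 40 km/h plays at 1.0x):

```python
# Assumed speed at which the capture vehicle recorded the panoramic video.
CAPTURE_SPEED_KMH = 40.0

def playback_rate(sim_speed_kmh: float) -> float:
    """Map the simulated driving speed to a video playback-rate multiplier.
    The mapping is monotonically increasing, i.e. positively correlated."""
    return max(0.0, sim_speed_kmh / CAPTURE_SPEED_KMH)
```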
During actual driving, a user may look around, for example glancing at a rearview mirror or out of a window. In an actual driving environment, as the observation angle of the user's eyes changes, the environmental information seen also changes. In order to simulate this driving effect, the user can send an observation-angle adjustment signal to the electronic terminal. In some embodiments, the user may adjust the observation angle by sliding on the graphical user interface; accordingly, the electronic terminal may determine that an observation-angle adjustment signal has been received in response to detecting the sliding operation on the graphical user interface. In some head-mounted display devices, eye-tracking technology may be used to track the direction of the user's eyeballs to determine the observation angle; accordingly, the user can adjust the observation angle by rotating the eyeballs, and the head-mounted display device can determine that an observation-angle adjustment signal has been received when it detects that the eyeball observation angle changes.
Further, a target viewpoint toward which the virtual camera adjusts its viewing direction may be determined in response to receiving the observation-angle adjustment signal. The virtual camera is the virtual camera arranged at the driving seat of the vehicle body model when the panoramic video is acquired.
Optionally, the sliding direction on the graphical user interface may be obtained from the sliding operation on the graphical user interface; the adjustment angle of the observation angle is determined according to the sliding direction; and the target viewpoint toward which the virtual camera adjusts its viewing direction is then determined according to the adjustment angle of the observation angle.
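A sketch of turning a slide gesture into an adjusted target viewpoint (yaw/pitch) for the virtual camera; the per-pixel sensitivities and the pitch clamp are illustrative assumptions:

```python
# Assumed slide sensitivities: degrees of view rotation per pixel of slide.
YAW_DEG_PER_PX = 0.2    # horizontal slide turns the view left/right
PITCH_DEG_PER_PX = 0.2  # vertical slide turns the view up/down

def adjust_viewpoint(yaw_deg: float, pitch_deg: float,
                     dx_px: float, dy_px: float) -> tuple:
    """Return the new (yaw, pitch) target viewpoint after a slide of
    (dx_px, dy_px) pixels; pitch is clamped so the view cannot flip over."""
    yaw = (yaw_deg + dx_px * YAW_DEG_PER_PX) % 360.0
    pitch = min(89.0, max(-89.0, pitch_deg + dy_px * PITCH_DEG_PER_PX))
    return yaw, pitch
```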
Alternatively, for a head-mounted display device, eye-tracking technology can be used to track the direction of the user's eyeballs and determine the user's observation angle; in response to an observation-angle adjustment signal generated when the user's eyeballs adjust the observation angle, the user's observation angle is determined as the target viewpoint toward which the virtual camera adjusts its viewing direction. The target viewpoint refers to the observation point of the virtual camera, specifically the observation angle of the virtual camera.
Further, the current panoramic video A may be perspective-projected according to the target viewpoint, so as to determine, from the current panoramic video A, the driving video observed from the target viewpoint of the virtual camera. The driving video observed from the target viewpoint can then be projected onto the graphical user interface and played there, thereby simulating the change in environmental information seen as the observation angle of the user's eyes changes. In the embodiment of the application, the driving video played on the graphical user interface can also be projected onto other display surfaces, such as a wall or a curtain, or onto other display devices in communication connection with the electronic terminal, such as a display screen or a television.
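Assuming the panoramic frames are stored in the common equirectangular format (the patent does not specify one), the perspective projection toward a target viewpoint can be sketched as a standard equirectangular-to-rectilinear resampling; nearest-neighbour sampling is used to keep the sketch short:

```python
import numpy as np

def project_view(frame, yaw_deg, pitch_deg, fov_deg=90.0, out_w=640, out_h=360):
    """Perspective-project one equirectangular panoramic frame toward the
    target viewpoint (yaw/pitch in degrees), producing the rectilinear view
    the virtual camera at the driving seat would observe."""
    pano_h, pano_w = frame.shape[:2]
    f = (out_w / 2) / np.tan(np.radians(fov_deg) / 2)  # focal length, pixels
    # Ray for every output pixel in camera coordinates (x right, y down, z forward).
    xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2,
                         np.arange(out_h) - out_h / 2)
    dirs = np.stack([xs, ys, np.full_like(xs, f, dtype=float)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # Rotate rays by pitch (about x) then yaw (about y) toward the viewpoint.
    p, y = np.radians(pitch_deg), np.radians(yaw_deg)
    rx = np.array([[1, 0, 0], [0, np.cos(p), -np.sin(p)], [0, np.sin(p), np.cos(p)]])
    ry = np.array([[np.cos(y), 0, np.sin(y)], [0, 1, 0], [-np.sin(y), 0, np.cos(y)]])
    d = dirs @ (ry @ rx).T
    # Back to spherical coordinates, then to equirectangular pixel indices.
    lon = np.arctan2(d[..., 0], d[..., 2])       # longitude in [-pi, pi]
    lat = np.arcsin(np.clip(d[..., 1], -1, 1))   # latitude in [-pi/2, pi/2]
    u = ((lon / np.pi + 1) / 2 * (pano_w - 1)).astype(int)
    v = ((lat / (np.pi / 2) + 1) / 2 * (pano_h - 1)).astype(int)
    return frame[v, u]  # nearest-neighbour sample of the panorama
```

Applying this per frame, with (yaw, pitch) supplied by the viewpoint-adjustment logic above, yields the driving video observed from the target viewpoint; a production system would typically do the same resampling on the GPU with bilinear filtering.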
It should be noted that the various simulated driving scenes shown in the vehicle simulated driving method are only exemplified by taking the current panoramic video as the panoramic video a, and are not limited thereto. That is, the simulated driving scene illustrated in the above embodiments can be adapted to a panoramic video corresponding to any actual driving environment, such as the next panoramic video B following the panoramic video a in the above embodiments, and so on.
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subject of steps 101 and 102 may be device a; for another example, the execution subject of step 101 may be device a, and the execution subject of step 102 may be device B; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 101, 102, etc., are merely used for distinguishing different operations, and the sequence numbers do not represent any execution order per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel.
Accordingly, embodiments of the present application also provide a computer readable storage medium storing computer instructions, which, when executed by one or more processors, cause the one or more processors to perform the steps of the vehicle simulated driving method described above.
Embodiments of the present application further provide a computer program product, which includes a computer program that, when executed by a processor, causes the processor to implement the steps in the vehicle simulated driving method. The embodiments of the present application do not limit the specific implementation form of the computer program product. In some embodiments, the computer program product may be implemented as an application with an online simulated driving function running on the electronic device, or the like. For example, it may be implemented as a ride-hailing APP, a car rental APP, an online car-purchasing APP, a used-car trading APP, or the like.
In addition to the vehicle simulated driving method, the embodiment of the application also provides a vehicle simulated driving device. The following provides an exemplary description of a vehicle driving simulation device provided in an embodiment of the present application.
In the embodiment of the application, the vehicle driving simulation device can provide a graphical user interface through the device, the graphical user interface is used for displaying a driving vehicle selection control and a driving environment selection control, the driving vehicle selection control is used for selecting a simulated driving vehicle, and the driving environment selection control is used for selecting at least one actual driving environment of the simulated driving vehicle at the current position acquired by the electronic terminal.
As shown in fig. 4, the vehicle driving simulation apparatus includes: an acquisition module 40a and a display module 40 b. The obtaining module 40a is configured to, in response to detecting a trigger of a driving vehicle selection control, obtain an internal live-action map of the selected target vehicle; the internal live-action map is obtained according to a live-action map shot for the internal live-action of the target vehicle under the view angle of the driving seat; and in response to the triggering of the driving environment selection control, acquiring the panoramic video corresponding to at least one actual driving environment at the current position acquired by the device, wherein each actual driving environment at the current position acquired by the device is acquired according to the panoramic video shot for each actually driven external environment at the driving position view angle, and no road intersection exists between the starting position and the ending position of the actually driven external environment corresponding to each panoramic video.
And the display module 40b is used for displaying the rendering effect of the internal real scene image of the target vehicle in the current panoramic video on the graphical user interface so as to simulate the driving of the target vehicle in the actual driving environment corresponding to the current panoramic video.
In some embodiments, the vehicle simulated driving apparatus further comprises: a determination module 40 c. And a road intersection is arranged at the end position of the external environment of the actual driving corresponding to the current panoramic video. The determining module 40c is configured to determine, according to the direction indication signal, a next panoramic video to which the simulation target vehicle should be switched in response to the direction indication signal received when the current panoramic video is played to the corresponding end position, where the next panoramic video is a panoramic video corresponding to a next actual driving environment following the end position determined according to the direction indication signal. Accordingly, an obtaining module 40a is configured to obtain a next panoramic video; and the display module 40b is used for displaying the rendering effect of the internal real scene image of the target vehicle in the next panoramic video on the graphical user interface so as to simulate the driving of the target vehicle in the actual driving environment corresponding to the next panoramic video.
In other embodiments, the determining module 40c is further configured to: in response to receiving the adjustment signal for the travel speed, a travel speed for the simulated driving is determined.
Correspondingly, the vehicle driving simulation device further comprises: and an adjustment module 40 d. The adjusting module 40d is used for adjusting the playing speed of the current panoramic video according to the driving speed of the simulated driving; the playing speed of the current panoramic video is positively correlated with the driving speed of the simulated driving. The display module 40b is configured to display, on the graphical user interface, a rendering effect of the internal real view of the target vehicle in the current panoramic video after the playing speed is adjusted, so as to simulate a driving speed of the target vehicle in an actual driving environment corresponding to the current panoramic video.
In some embodiments, the internal live view comprises: and (3) a live-action image shot for the steering wheel of the target vehicle under the view angle of the driving position. The adjusting module 40d is further configured to adjust, in response to receiving the direction indicating signal, the steering direction of the steering wheel in the internal live-action image according to the direction indicating signal, so that the steering direction of the steering wheel in the internal live-action image corresponds to the road direction in the current panoramic video.
In still other embodiments, the determining module 40c is further configured to determine a target viewpoint of the virtual camera for adjusting the viewing direction in response to receiving the viewing perspective adjustment signal.
Optionally, the vehicle driving simulation device further comprises: and a projection module 40 e. The projection module 40e is configured to perform perspective projection on the current panoramic video according to the target viewpoint, so as to determine a driving video observed by the target viewpoint from the current panoramic video; and projecting the driving video observed by the target viewpoint onto a graphical user interface so as to play the driving video observed by the target viewpoint of the virtual camera on the graphical user interface.
Among them, fig. 4 only shows some modules schematically, which does not mean that the vehicle driving simulation apparatus must include all the modules shown in fig. 4, nor does it mean that the vehicle driving simulation apparatus can include only the components shown in fig. 4.
The vehicle simulated driving device provided by this embodiment can provide a driving vehicle selection control for the user to autonomously select a vehicle to simulate driving; acquire the actual driving environments at the current position of the device; and provide a driving environment selection control for the user to autonomously select among the acquired actual driving environments at the current position. During simulated driving, the internal live-action image of the user-selected target vehicle at the driving-seat viewing angle can be rendered into the panoramic video, at the driving-seat viewing angle, corresponding to the user-selected actual driving environment, thereby realizing simulated driving of the target vehicle and providing an interactive driving experience. On the one hand, because the simulated driving environment is the actual driving environment at the current position of the device, the road conditions and surroundings of the simulation match the user's current real environment. On the other hand, because the actual driving environment is embodied by a panoramic video shot at the driving-seat viewing angle and the driven vehicle is embodied by the internal live-action image of the target vehicle at the driving-seat viewing angle, the driver gets a more realistic feeling, which enhances the immersion of viewing vehicles online and improves the user's online vehicle-viewing experience.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 5, the electronic apparatus includes: a memory 50a, a processor 50b and a display component 50 c. The electronic device provides a graphical user interface through the display component 50c, the graphical user interface is used for displaying a driving vehicle selection control and a driving environment selection control, the driving vehicle selection control is used for selecting a simulated driving vehicle, and the driving environment selection control is used for selecting at least one actual driving environment of the simulated driving vehicle at the current position acquired by the electronic device.
A memory 50a for storing a computer program.
The processor 50b is coupled to the memory 50a and the display component 50c for executing computer programs for: in response to detecting the triggering of the driving vehicle selection control, acquiring an internal live-action image of the selected target vehicle; the method comprises the steps that an internal live-action image is obtained according to a live-action image shot for the internal live-action of a target vehicle under a driving position visual angle, a panoramic video corresponding to at least one actual driving environment under the current position of the electronic equipment is obtained in response to the triggering of a detected driving environment selection control, wherein each actual driving environment under the current position of the electronic equipment is obtained according to a panoramic video shot for each actual driving external environment under the driving position visual angle, and no road intersection exists between the starting position and the ending position of the actual driving external environment corresponding to each panoramic video; and displaying the rendering effect of the internal real scene image of the target vehicle in the current panoramic video on the graphical user interface through the display component 50c so as to simulate the driving of the target vehicle in the actual driving environment corresponding to the current panoramic video.
In some embodiments, a road intersection exists at the end position of the external environment of actual driving corresponding to the current panoramic video. The processor 50b is further configured to: responding to a direction indicating signal received when the current panoramic video is played to a corresponding end position, determining a next panoramic video to which the simulation target vehicle should be switched according to the direction indicating signal, wherein the next panoramic video is a panoramic video corresponding to a next actual driving environment which is subsequently connected to the end position determined according to the direction indicating signal; the next panoramic video is obtained and the rendering effect of the interior live view of the target vehicle in the next panoramic video is displayed on the graphical user interface by the display component 50c to simulate the driving of the target vehicle in the actual driving environment corresponding to the next panoramic video.
In other embodiments, the processor 50b is further configured to: determining a driving speed of the simulated driving in response to receiving the adjustment signal of the driving speed; adjusting the playing speed of the current panoramic video according to the driving speed of the simulated driving; the playing speed of the current panoramic video is positively correlated with the driving speed of the simulated driving; and displaying the rendering effect of the internal real scene image of the target vehicle in the current panoramic video after the playing speed is adjusted on the graphical user interface by utilizing the adjusted playing speed so as to simulate the driving speed of the target vehicle in the actual driving environment corresponding to the current panoramic video.
In still other embodiments, the internal scene graph comprises: a live-action image shot for the steering wheel of the target vehicle at the view angle of the driving seat; the processor 50b is further configured to: and responding to the received direction indication signal, and adjusting the steering direction of the steering wheel in the internal real scene image according to the direction indication signal so as to enable the steering direction of the steering wheel in the internal real scene image to correspond to the road trend in the current panoramic video.
In some other embodiments, the processor 50b is further configured to: determining a target viewpoint of the virtual camera for adjusting the observation direction in response to receiving the observation angle adjustment signal; performing perspective projection on the current panoramic video according to the target viewpoint so as to determine a driving video observed by the target viewpoint from the current panoramic video; and projecting the driving video observed by the target viewpoint onto a graphical user interface so as to play the driving video observed by the target viewpoint of the virtual camera on the graphical user interface. In some optional embodiments, as shown in fig. 5, the electronic device may further include: communication component 50d, power component 50e, audio component 50f, and the like. Only some of the components are shown schematically in fig. 5, and it is not meant that the electronic device must include all of the components shown in fig. 5, nor that the electronic device can include only the components shown in fig. 5.
The electronic device provided by this embodiment can provide a driving vehicle selection control for the user to autonomously select a vehicle to simulate driving; acquire the actual driving environments at the current position of the electronic device; and provide a driving environment selection control for the user to autonomously select among the acquired actual driving environments at the current position. During simulated driving, the internal live-action image of the user-selected target vehicle at the driving-seat viewing angle can be rendered into the panoramic video, at the driving-seat viewing angle, corresponding to the user-selected actual driving environment, thereby realizing simulated driving of the target vehicle and providing an interactive driving experience. On the one hand, because the simulated driving environment is the actual driving environment at the current position of the electronic device, the road conditions and surroundings of the simulation match the user's current real environment. On the other hand, because the actual driving environment is embodied by a panoramic video shot at the driving-seat viewing angle and the driven vehicle is embodied by the internal live-action image of the target vehicle at the driving-seat viewing angle, the driver gets a more realistic feeling, which enhances the immersion of viewing vehicles online and improves the user's online vehicle-viewing experience.
In embodiments of the present application, the memory is used to store computer programs and may be configured to store other various data to support operations on the device on which it is located. Wherein the processor may execute a computer program stored in the memory to implement the corresponding control logic. The memory may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
In the embodiments of the present application, the processor may be any hardware processing device that can execute the above method logic. Alternatively, the processor may be a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or a Micro Controller Unit (MCU); programmable devices such as Field-Programmable Gate Arrays (FPGAs), Programmable Array Logic devices (PALs), Generic Array Logic devices (GALs), or Complex Programmable Logic Devices (CPLDs) may also be used; or an Advanced RISC Machine (ARM) processor, a System on Chip (SoC), or the like, but it is not limited thereto.
In embodiments of the present application, the communication component is configured to facilitate wired or wireless communication between the device in which it is located and other devices. The device in which the communication component is located can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, 4G, 5G or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may also be implemented based on Near Field Communication (NFC) technology, Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, or other technologies.
In the embodiment of the present application, the display assembly may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display assembly includes a touch panel, the display assembly may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
In embodiments of the present application, a power supply component is configured to provide power to various components of the device in which it is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
In embodiments of the present application, the audio component may be configured to output and/or input audio signals. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals. For example, for devices with language interaction functionality, voice interaction with a user may be enabled through an audio component, and so forth.
It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A vehicle simulated driving method, wherein a graphical user interface is provided through an electronic terminal, the graphical user interface being configured to display a driving vehicle selection control and a driving environment selection control, the driving vehicle selection control being used to select a simulated driving vehicle, and the driving environment selection control being used to select at least one actual driving environment, at the current position acquired by the electronic terminal, for the simulated driving vehicle;
the method comprises the following steps:
in response to detecting the triggering of the driving vehicle selection control, acquiring an internal live-action image of the selected target vehicle, wherein the internal live-action image is obtained from a live-action image photographed of the interior of the target vehicle from the driver's seat viewing angle; and
in response to detecting the triggering of the driving environment selection control, acquiring a panoramic video corresponding to at least one actual driving environment at the current position of the electronic terminal, wherein the panoramic video for each actual driving environment at the current position of the electronic terminal is obtained from a panoramic video photographed of each actually driven external environment from the driver's seat viewing angle, and no road intersection exists between the start position and the end position of the actually driven external environment corresponding to each panoramic video;
and displaying, on the graphical user interface, the rendering effect of the internal live-action image of the target vehicle in the current panoramic video, so as to simulate driving of the target vehicle in the actual driving environment corresponding to the current panoramic video.
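The rendering step above, displaying the internal live-action image within the current panoramic video, can be sketched as an alpha blend, assuming the interior image is stored with transparency at the windshield and windows (all function and variable names below are illustrative, not from the specification):

```python
import numpy as np

def composite_interior(panorama_frame: np.ndarray, interior_rgba: np.ndarray) -> np.ndarray:
    """Blend the interior live-action image (RGBA, transparent at the window
    regions) over one RGB frame of the panoramic driving video, so the actual
    driving environment shows through the windshield and windows."""
    alpha = interior_rgba[..., 3:4].astype(np.float32) / 255.0  # per-pixel opacity
    interior_rgb = interior_rgba[..., :3].astype(np.float32)
    frame = panorama_frame.astype(np.float32)
    out = alpha * interior_rgb + (1.0 - alpha) * frame          # standard "over" blend
    return out.astype(np.uint8)
```

Evaluating this blend for every frame keeps the cabin fixed on screen while the exterior environment moves, which is the rendering effect the claim describes.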
2. The method of claim 1, wherein a road intersection exists at the end position of the actually driven external environment corresponding to the current panoramic video, the method further comprising:
in response to a direction indication signal received when the current panoramic video is played to its end position, determining, according to the direction indication signal, the next panoramic video to which the simulated driving of the target vehicle is to be switched, wherein the next panoramic video is the panoramic video corresponding to the next actual driving environment that follows the end position in the direction determined by the direction indication signal;
acquiring the next panoramic video; and
displaying, on the graphical user interface, the rendering effect of the internal live-action image of the target vehicle in the next panoramic video, so as to simulate driving of the target vehicle in the actual driving environment corresponding to the next panoramic video.
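Because each panoramic video ends where a road intersection begins, the switch described in claim 2 reduces to a lookup in a route graph keyed by the direction indication signal. A minimal sketch with hypothetical segment names and direction labels (none of which appear in the specification):

```python
from typing import Optional

# Hypothetical route graph: at the end intersection of each road segment's
# panoramic video, a direction indication signal selects the next segment.
ROUTE_GRAPH = {
    "segment_A": {"left": "segment_B", "straight": "segment_C"},
    "segment_C": {"right": "segment_A"},
}

def next_panoramic_video(current_segment: str, direction: str) -> Optional[str]:
    """Return the identifier of the next panoramic video to switch to, or
    None if no segment continues in the indicated direction."""
    return ROUTE_GRAPH.get(current_segment, {}).get(direction)
```

The player would then fetch the returned video and resume rendering from its start position, giving a seamless turn through the intersection.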
3. The method of claim 1, further comprising:
in response to receiving a driving speed adjustment signal, determining the driving speed of the simulated driving;
adjusting the playing speed of the current panoramic video according to the driving speed of the simulated driving, wherein the playing speed of the current panoramic video is positively correlated with the driving speed of the simulated driving;
and displaying, on the graphical user interface, the rendering effect of the internal live-action image of the target vehicle in the current panoramic video at the adjusted playing speed, so as to simulate the driving speed of the target vehicle in the actual driving environment corresponding to the current panoramic video.
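The positive correlation between playing speed and simulated driving speed in claim 3 can be realized by scaling playback by the ratio of the simulated speed to the speed at which the footage was recorded; a clamped linear mapping is one illustrative choice (the capture speed and clamp bounds below are assumptions, not values from the specification):

```python
def playback_rate(driving_speed_kmh: float,
                  capture_speed_kmh: float = 40.0,  # assumed recording speed
                  min_rate: float = 0.25,
                  max_rate: float = 4.0) -> float:
    """Map the simulated driving speed to a video playback rate.
    Playing at (driving_speed / capture_speed)x keeps the apparent motion
    consistent with the speed the panorama was recorded at; clamping avoids
    playback rates the video player cannot sustain."""
    rate = driving_speed_kmh / capture_speed_kmh
    return max(min_rate, min(max_rate, rate))
```

The mapping is monotonically non-decreasing in the driving speed, which satisfies the positive-correlation requirement of the claim.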
4. The method according to any one of claims 1-3, wherein the internal live-action image comprises a live-action image photographed of the steering wheel of the target vehicle from the driver's seat viewing angle; the method further comprising:
in response to receiving a direction indication signal, adjusting the steering of the steering wheel in the internal live-action image according to the direction indication signal, so that the steering of the steering wheel in the internal live-action image corresponds to the road direction in the current panoramic video.
5. The method according to any one of claims 1-3, further comprising:
in response to receiving an observation angle adjustment signal, determining a target viewpoint to which the observation direction of the virtual camera is adjusted;
performing perspective projection on the current panoramic video according to the target viewpoint, so as to determine, from the current panoramic video, the driving video observed from the target viewpoint;
and projecting the driving video observed from the target viewpoint onto the graphical user interface, so as to play, on the graphical user interface, the driving video observed from the target viewpoint of the virtual camera.
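The perspective projection in claim 5 amounts to mapping each viewing ray of the virtual camera to a pixel of the panorama frame. A minimal sketch of that direction-to-pixel lookup, assuming the panoramic video is stored in equirectangular form (an assumption; the claim does not fix the projection):

```python
def equirect_lookup(yaw_deg: float, pitch_deg: float,
                    width: int, height: int) -> tuple:
    """Map a viewing direction (yaw/pitch in degrees) to the pixel of an
    equirectangular panorama frame lying on that ray.  Yaw 0 is the frame's
    left edge; pitch +90 is straight up, -90 straight down."""
    u = (yaw_deg % 360.0) / 360.0      # longitude -> horizontal fraction
    v = (90.0 - pitch_deg) / 180.0     # latitude  -> vertical fraction
    return int(u * (width - 1)), int(v * (height - 1))
```

A renderer would evaluate this lookup (with interpolation) for every pixel of the output view, with the rays centered on the yaw and pitch of the target viewpoint, yielding the driving video observed from that viewpoint.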
6. A vehicle simulated driving apparatus, wherein a graphical user interface is provided by the apparatus, the graphical user interface being configured to display a driving vehicle selection control for selecting a simulated driving vehicle and a driving environment selection control for selecting at least one actual driving environment of the simulated driving vehicle at a current location obtained by the apparatus;
the apparatus comprises:
an acquisition module, configured to, in response to the triggering of the driving vehicle selection control, acquire an internal live-action image of the selected target vehicle, wherein the internal live-action image is obtained from a live-action image photographed of the interior of the target vehicle from the driver's seat viewing angle; and
in response to the triggering of the driving environment selection control, acquire a panoramic video corresponding to at least one actual driving environment at the current position acquired by the apparatus, wherein the panoramic video for each actual driving environment at the current position acquired by the apparatus is obtained from a panoramic video photographed of each actually driven external environment from the driver's seat viewing angle, and no road intersection exists between the start position and the end position of the actually driven external environment corresponding to each panoramic video;
and a display module, configured to display, on the graphical user interface, the rendering effect of the internal live-action image of the target vehicle in the current panoramic video, so as to simulate driving of the target vehicle in the actual driving environment corresponding to the current panoramic video.
7. The apparatus of claim 6, wherein a road intersection exists at the end position of the actually driven external environment corresponding to the current panoramic video, the apparatus further comprising:
a determining module, configured to, in response to a direction indication signal received when the current panoramic video is played to its end position, determine, according to the direction indication signal, the next panoramic video to which the simulated driving of the target vehicle is to be switched, wherein the next panoramic video is the panoramic video corresponding to the next actual driving environment that follows the end position in the direction determined by the direction indication signal;
the acquisition module being further configured to acquire the next panoramic video; and
the display module being further configured to display, on the graphical user interface, the rendering effect of the internal live-action image of the target vehicle in the next panoramic video, so as to simulate driving of the target vehicle in the next actual driving environment corresponding to the next panoramic video.
8. An electronic device, comprising a memory, a processor, and a display component; wherein the electronic device provides a graphical user interface through the display component, the graphical user interface being configured to display a driving vehicle selection control and a driving environment selection control, the driving vehicle selection control being used to select a simulated driving vehicle, and the driving environment selection control being used to select at least one actual driving environment, at the current position acquired by the electronic device, for the simulated driving vehicle;
the memory being configured to store a computer program; and
the processor, coupled to the memory and the display component, being configured to execute the computer program to perform the steps of the method of any one of claims 1-5.
9. A computer-readable storage medium having stored thereon computer instructions, which, when executed by one or more processors, cause the one or more processors to perform the steps of the method of any one of claims 1-5.
10. A computer program product, comprising a computer program which, when executed by a processor, implements the steps of the method of any one of claims 1-5.
CN202111110041.2A 2021-09-18 2021-09-18 Vehicle driving simulation method, device, storage medium and program product Pending CN114089890A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111110041.2A CN114089890A (en) 2021-09-18 2021-09-18 Vehicle driving simulation method, device, storage medium and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111110041.2A CN114089890A (en) 2021-09-18 2021-09-18 Vehicle driving simulation method, device, storage medium and program product

Publications (1)

Publication Number Publication Date
CN114089890A true CN114089890A (en) 2022-02-25

Family

ID=80296161

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111110041.2A Pending CN114089890A (en) 2021-09-18 2021-09-18 Vehicle driving simulation method, device, storage medium and program product

Country Status (1)

Country Link
CN (1) CN114089890A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030063133A1 (en) * 2001-09-28 2003-04-03 Fuji Xerox Co., Ltd. Systems and methods for providing a spatially indexed panoramic video
US9568995B1 (en) * 2015-12-30 2017-02-14 Thunder Power Hong Kong Ltd. Remote driving with a virtual reality system
CN109887372A (en) * 2019-04-16 2019-06-14 北京中公高远汽车试验有限公司 Driving training analogy method, electronic equipment and storage medium
CN110109552A (en) * 2019-05-23 2019-08-09 重庆大学 Virtual driving scene modeling method based on true environment

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030063133A1 (en) * 2001-09-28 2003-04-03 Fuji Xerox Co., Ltd. Systems and methods for providing a spatially indexed panoramic video
US9568995B1 (en) * 2015-12-30 2017-02-14 Thunder Power Hong Kong Ltd. Remote driving with a virtual reality system
CN106896909A (en) * 2015-12-30 2017-06-27 昶洧新能源汽车发展有限公司 Using the long-range driving of virtual reality system
CN109887372A (en) * 2019-04-16 2019-06-14 北京中公高远汽车试验有限公司 Driving training analogy method, electronic equipment and storage medium
CN110109552A (en) * 2019-05-23 2019-08-09 重庆大学 Virtual driving scene modeling method based on true environment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LU HONG; DING SHIMIN; PAN XIAOLONG; LI GANG: "Driving behavior model of vehicle agents in a virtual driving environment", COMPUTER AND MODERNIZATION, no. 06, 15 June 2010 (2010-06-15) *

Similar Documents

Publication Publication Date Title
US11262835B2 (en) Human-body-gesture-based region and volume selection for HMD
US11484790B2 (en) Reality vs virtual reality racing
US9778815B2 (en) Three dimensional user interface effects on a display
US11024083B2 (en) Server, user terminal device, and control method therefor
US9734633B2 (en) Virtual environment generating system
KR102155001B1 (en) Head mount display apparatus and method for operating the same
US11710310B2 (en) Virtual content positioned based on detected object
KR20170013737A (en) Head mount display apparatus and method for operating the same
EP3264228A1 (en) Mediated reality
US10931926B2 (en) Method and apparatus for information display, and display device
EP3422151A1 (en) Methods, apparatus, systems, computer programs for enabling consumption of virtual content for mediated reality
CN113946259B (en) Vehicle information processing method and device, electronic equipment and readable medium
CN114089890A (en) Vehicle driving simulation method, device, storage medium and program product
KR20180055637A (en) Electronic apparatus and method for controlling thereof
KR20230124363A (en) Electronic apparatus and method for controlling thereof
EP3712747A1 (en) Indicator modes
KR102206958B1 (en) display device
CN117478931A (en) Information display method, information display device, electronic equipment and storage medium
CN116499489A (en) Man-machine interaction method, device, equipment and product based on map navigation application
CN117376591A (en) Scene switching processing method, device, equipment and medium based on virtual reality
CN115158344A (en) Display method, display device, vehicle, and medium
CN117435039A (en) Information display method, device, storage medium and equipment
CN112286355A (en) Interactive method and system for immersive content

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination