CN114210051A - Vehicle control method, apparatus, device, and storage medium in a virtual scene - Google Patents

Vehicle control method, apparatus, device, and storage medium in a virtual scene

Info

Publication number
CN114210051A
CN114210051A (application CN202111536095.5A)
Authority
CN
China
Prior art keywords
virtual
virtual vehicle
orientation
vehicle
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111536095.5A
Other languages
Chinese (zh)
Inventor
邓颖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202111536095.5A priority Critical patent/CN114210051A/en
Publication of CN114210051A publication Critical patent/CN114210051A/en
Legal status: Pending (current)

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/422 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/803 Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8017 Driving on land or water; Flying

Abstract

The application provides a vehicle control method, apparatus, device, and computer-readable storage medium for a virtual scene. The method includes: presenting, in an interface of a virtual scene, a virtual vehicle carrying a virtual object; while the virtual vehicle is being driven, in response to a steering instruction for the virtual vehicle, controlling the virtual vehicle to turn by a corresponding angle in the direction indicated by the steering instruction; and, in response to the steered orientation of the virtual vehicle deviating from its driving path, adjusting the orientation of the virtual vehicle so that the adjusted orientation is consistent with the direction of the driving path at the vehicle's current position. The method can improve the accuracy of virtual-vehicle driving in the virtual scene and the efficiency of controlling the virtual vehicle.

Description

Vehicle control method, apparatus, device, and storage medium in a virtual scene
Technical Field
The present application relates to the field of virtualization and human-computer interaction technologies, and in particular, to a vehicle control method and apparatus in a virtual scene, an electronic device, a computer program product, and a computer-readable storage medium.
Background
With the development of computer technology, electronic devices can realize increasingly rich and vivid virtual scenes. A virtual scene is a digital scene that a computer renders through digital communication technology. In a virtual scene, a user can obtain a fully virtualized experience (for example, virtual reality) or a partially virtualized experience (for example, augmented reality) in vision, hearing, and other senses, and can also interact with various interactive objects in the scene, or control interactions among objects in the scene, to obtain feedback.
In the related art, in a mobile game, when a player drives a virtual vehicle and turns left or right by tapping direction keys, the vehicle's steering is prone to large deviations because of the vehicle's internal structure. The player then has to tap the left-turn and right-turn keys repeatedly to correct the vehicle's heading, which demands considerable skill from the player, requires many operations, and yields low adjustment accuracy.
Disclosure of Invention
The embodiments of the application provide a vehicle control method and apparatus in a virtual scene and a computer-readable storage medium, which can improve the accuracy of virtual-vehicle driving in the virtual scene and the efficiency of controlling the virtual vehicle.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a vehicle control method in a virtual scene, which comprises the following steps:
presenting, in an interface of a virtual scene, a virtual vehicle carrying a virtual object;
while the virtual vehicle is being driven, in response to a steering instruction for the virtual vehicle, controlling the virtual vehicle to turn by a corresponding angle in the direction indicated by the steering instruction;
and, in response to the steered orientation of the virtual vehicle deviating from its driving path, adjusting the orientation of the virtual vehicle so that the adjusted orientation is consistent with the direction of the driving path at the vehicle's current position.
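As an illustration only (the patent discloses no source code), the core deviation-check-and-realign step could be sketched as follows. The degree-based heading representation, the function names, and the tolerance value are all assumptions of this sketch, not part of the patented method.

```python
def angle_to(path_direction_deg, vehicle_heading_deg):
    """Signed smallest angle (degrees) from the vehicle heading to the path direction."""
    return (path_direction_deg - vehicle_heading_deg + 180.0) % 360.0 - 180.0

def reset_orientation(vehicle_heading_deg, path_direction_deg, tolerance_deg=2.0):
    """If the steered heading deviates from the driving-path direction by more than
    the tolerance, snap it back so the adjusted orientation is consistent with the
    direction of the driving path at the vehicle's current position."""
    deviation = angle_to(path_direction_deg, vehicle_heading_deg)
    if abs(deviation) > tolerance_deg:
        return path_direction_deg % 360.0  # adjusted orientation matches the path
    return vehicle_heading_deg             # already consistent; leave unchanged
```

For example, a vehicle heading 30° on a path running at 90° would be realigned to 90°, while a 1° deviation would be left alone.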
An embodiment of the present application provides a vehicle control device in a virtual scene, including:
the presenting module is configured to present, in an interface of a virtual scene, a virtual vehicle carrying a virtual object;
the steering module is configured to, while the virtual vehicle is being driven, respond to a steering instruction for the virtual vehicle by controlling the virtual vehicle to turn by a corresponding angle in the direction indicated by the steering instruction;
the adjusting module is configured to, in response to the steered orientation of the virtual vehicle deviating from its driving path, adjust the orientation of the virtual vehicle so that the adjusted orientation is consistent with the direction of the driving path at the vehicle's current position.
In the above solution, the adjusting module is further configured to receive a direction reset instruction for the virtual vehicle;
and responding to the direction resetting instruction, and adjusting the orientation of the virtual vehicle, so that the orientation of the virtual vehicle after adjustment is consistent with the orientation of the driving path of the current position of the virtual vehicle.
In the above scheme, the adjusting module is further configured to present a direction resetting control in the interface of the virtual scene;
and when the direction resetting control is in an activated state, responding to the triggering operation aiming at the direction resetting control, and receiving a direction resetting instruction aiming at the virtual carrier.
In the above scheme, the adjusting module is further configured to present pattern drawing indication information in an interface of the virtual scene, where the pattern drawing indication information is used to indicate that a target pattern is drawn to trigger the direction resetting instruction;
and receiving a direction resetting instruction for the virtual carrier when the drawn pattern is matched with the target pattern in response to the pattern drawing operation based on the pattern drawing instruction information.
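The text does not specify how the drawn pattern is matched against the target pattern; one plausible sketch reduces the stroke to a coarse direction sequence and compares it with the target's sequence. The function names, the four-direction quantization, and the use of screen coordinates (y increasing downward) are all assumptions of this illustration.

```python
def quantize_direction(dx, dy):
    # Map one stroke segment to a coarse direction; L/R dominate when |dx| >= |dy|.
    # Screen coordinates are assumed, so positive dy means "down".
    return ('R' if dx > 0 else 'L') if abs(dx) >= abs(dy) else ('D' if dy > 0 else 'U')

def matches_target(points, target_sequence):
    """True when the drawn stroke's direction sequence (consecutive duplicates
    collapsed) equals the target pattern's sequence, e.g. "RD" for an
    L-shaped stroke going right and then down."""
    seq = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        d = quantize_direction(x1 - x0, y1 - y0)
        if not seq or seq[-1] != d:
            seq.append(d)
    return seq == list(target_sequence)
```

A matching drawing would then trigger the direction reset instruction for the virtual vehicle.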
In the above scheme, the adjusting module is further configured to present, in the interface of the virtual scene, at least one steering control for controlling the virtual vehicle to steer;
receiving a steering instruction for the virtual vehicle in response to a triggering operation for the steering control.
In the above scheme, the adjusting module is further configured to respond to a pressing operation for the steering control, and when a pressing duration of the pressing operation reaches a duration threshold or a pressing pressure reaches a pressure threshold, control the corresponding steering control to be in a suspended state;
in response to the steering control being in the hovering state being dragged to a first area, a direction reset instruction for the virtual vehicle is received.
In the above scheme, the adjusting module is further configured to control the steering control dragged to the first area to return to the initial position.
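A minimal sketch of this press-hover-drag interaction follows. The threshold values, event-handler names, and the boolean "first area" hit test are illustrative assumptions; the text specifies neither the thresholds nor an event API.

```python
from dataclasses import dataclass

PRESS_DURATION_THRESHOLD = 0.5  # seconds; assumed value
PRESS_PRESSURE_THRESHOLD = 0.8  # normalized force; assumed value

@dataclass
class SteeringControl:
    hovering: bool = False  # the "suspended state" of the control

    def on_press(self, duration, pressure):
        # Enter the suspended state once either the duration or the pressure
        # threshold is reached.
        if duration >= PRESS_DURATION_THRESHOLD or pressure >= PRESS_PRESSURE_THRESHOLD:
            self.hovering = True
        return self.hovering

    def on_drag_to_first_area(self):
        # Dragging a suspended control into the first area fires the direction
        # reset instruction; the control then returns to its initial position.
        if self.hovering:
            self.hovering = False  # control snaps back to the initial position
            return "direction_reset"
        return None
```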
In the above scheme, the adjusting module is further configured to present sliding indication information corresponding to the steering control, where the sliding indication information is used to trigger the direction resetting instruction when the steering control is indicated to slide to the second area;
based on the sliding indication information, a sliding operation for the steering control is received, and when the steering control is slid to the second area, a direction resetting instruction for the virtual vehicle is received.
In the above scheme, the adjusting module is further configured to obtain the number of times of triggering a direction resetting instruction for the virtual vehicle when the steering instruction for the virtual vehicle is received again and the steered direction of the virtual vehicle deviates from the driving path of the virtual vehicle again;
and when the triggering times reach a time threshold value, automatically adjusting the orientation of the virtual carrier, so that the orientation of the adjusted virtual carrier is consistent with the orientation of the driving path of the current position of the virtual carrier.
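The trigger-count logic above can be sketched as below; the default threshold value and the class name are assumptions for illustration.

```python
class ResetCounter:
    """Tracks how many times the player has manually triggered a direction
    reset; once a threshold is reached, orientation is adjusted automatically."""
    def __init__(self, count_threshold=3):  # assumed threshold value
        self.count_threshold = count_threshold
        self.triggers = 0

    def record_manual_reset(self):
        self.triggers += 1

    def should_auto_adjust(self):
        # Past the threshold, realignment no longer waits for a manual reset.
        return self.triggers >= self.count_threshold
```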
In the above scheme, the adjusting module is further configured to predict, through a neural network model, the number of times that the direction reset instruction is triggered within a first time period from a current time to obtain a prediction result;
and when the prediction result shows that the number of times of triggering the direction resetting instruction is greater than a number threshold value in the first time period, periodically adjusting the orientation of the virtual vehicle in the first time period, so that the orientation of the adjusted virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
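The text calls for a neural-network model here. Purely to illustrate the surrounding control flow, the sketch below substitutes a simple rate extrapolation as a placeholder predictor: estimate the number of reset triggers over the first time period, then compare against the count threshold. All names and the extrapolation method are assumptions, not the patented model.

```python
class TriggerRatePredictor:
    """Placeholder for the neural-network predictor: extrapolates the recent
    manual-reset rate to estimate trigger counts over a future horizon."""
    def __init__(self, window):
        self.window = window   # seconds of history kept
        self.timestamps = []

    def record(self, now):
        self.timestamps.append(now)
        # Drop events that have aged out of the window.
        self.timestamps = [t for t in self.timestamps if now - t <= self.window]

    def predicted_count(self, horizon):
        # Resets per second over the window, extrapolated across the horizon.
        return len(self.timestamps) / self.window * horizon

def should_periodically_adjust(predicted_count, count_threshold):
    # Periodic orientation adjustment kicks in when the predicted number of
    # reset triggers over the first time period exceeds the threshold.
    return predicted_count > count_threshold
```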
In the above scheme, the adjusting module is further configured to control the virtual vehicle to be in a stable-orientation state within a second time period from the current time;
when a steering instruction for the virtual vehicle is received in the process that the virtual vehicle is in a stable orientation state, the virtual vehicle is controlled to steer according to the direction indicated by the steering instruction, and the orientation of the virtual vehicle after steering is consistent with the orientation of the driving path of the current position of the virtual vehicle.
In the above scheme, the adjusting module is further configured to obtain an orientation adjusting authority of the current login account for the virtual vehicle;
when the current login account is determined to have the orientation adjustment right for the virtual vehicle, automatically adjusting the orientation of the virtual vehicle, so that the adjusted orientation of the virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
In the above scheme, the presenting module is further configured to present a direction reset switch in the interface of the virtual scene; accordingly,
in the above solution, the adjusting module is further configured to control to start a direction reset mode for the virtual vehicle in response to an on instruction for the direction reset switch;
and automatically adjusting the orientation of the virtual vehicle in the direction resetting mode, so that the orientation of the virtual vehicle after adjustment is consistent with the orientation of the driving path of the current position of the virtual vehicle.
In the above scheme, the adjusting module is further configured to control the virtual object to equip the target virtual item in response to an equipment instruction for the target virtual item;
controlling the virtual object to throw the target virtual item towards the interactive object in response to a throwing instruction aiming at the target virtual item in the running process of the interactive object carrying the target virtual carrier;
when the target virtual prop is thrown into the sensing range of the target virtual vehicle, controlling the target virtual vehicle to be in a direction-reset-disabled state;
the direction-reset-disabled state keeps the orientation of the target virtual vehicle unchanged even when the steered orientation of the target virtual vehicle deviates from its driving path and a direction reset instruction for adjusting that orientation is triggered.
In the above scheme, the adjusting module is further configured to determine a circle center position corresponding to the driving path;
determining a tangent line taking the current position of the virtual carrier as a tangent point based on the determined circle center position;
and adjusting the orientation of the virtual carrier, so that the orientation of the virtual carrier after adjustment is parallel to the direction of the tangent line.
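The tangent computation described in this scheme can be sketched in code. The circle centre would come from the path data; the `clockwise` flag, not mentioned in the text, is an assumption added to pick the direction of travel.

```python
import math

def tangent_heading(center, position, clockwise=False):
    """Heading in degrees of the tangent to the circular driving path at the
    vehicle's current position, used as the adjusted orientation."""
    # Direction of the radius from the circle centre to the tangent point.
    radial = math.atan2(position[1] - center[1], position[0] - center[0])
    # The tangent is perpendicular to the radius; the sign picks the travel direction.
    offset = -math.pi / 2 if clockwise else math.pi / 2
    return math.degrees(radial + offset) % 360.0
```

For a vehicle at (1, 0) on a path centred at the origin, the counter-clockwise tangent heading is 90°, i.e. the adjusted orientation is parallel to the tangent at the current position.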
In the above scheme, the adjusting module is further configured to receive an interactive operation of an interactive object to the virtual vehicle by using a virtual item in a process of controlling the virtual vehicle to travel;
and when the orientation of the virtual vehicle deviates from the driving path of the virtual vehicle as a result of the interactive operation, adjusting the orientation of the virtual vehicle so that the adjusted orientation of the virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
An embodiment of the present application provides an electronic device, including:
a memory for storing executable instructions;
and the processor is used for realizing the vehicle control method in the virtual scene provided by the embodiment of the application when the executable instructions stored in the memory are executed.
An embodiment of the present application provides a computer-readable storage medium storing executable instructions which, when executed by a processor, implement the vehicle control method in a virtual scene provided in the embodiments of the present application.
The present application provides a computer program product, which includes a computer program or instructions, and when the computer program or instructions are executed by a processor, the method for controlling a vehicle in a virtual scene provided in the present application is implemented.
The embodiment of the application has the following beneficial effects:
By applying the embodiments of the application, when the steered orientation of the virtual vehicle deviates from its driving path while driving, the orientation is adjusted to stay consistent with the direction of the driving path at the vehicle's current position. This improves the accuracy of vehicle driving in the virtual scene and the efficiency of controlling the virtual vehicle.
Drawings
Fig. 1 is a schematic diagram illustrating an architecture of a vehicle control system 100 in a virtual scene according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of an electronic device 500 for implementing a vehicle control method in a virtual scene according to an embodiment of the present disclosure;
fig. 3 is a flowchart illustrating a vehicle control method in a virtual scene according to an embodiment of the present disclosure;
FIG. 4 is a schematic view of a steering control provided by an embodiment of the present application;
fig. 5 is a schematic view illustrating an orientation adjustment of a virtual vehicle according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a direction reset control in different states according to an embodiment of the present application;
FIGS. 7A-7B are schematic diagrams of patterns provided by embodiments of the present application;
FIG. 8 is a schematic diagram of a steering control in a hovering state according to an embodiment of the present application;
FIG. 9 is a schematic view of a sliding operation provided by an embodiment of the present application;
FIG. 10 is a schematic view of a voice entry interface provided by an embodiment of the present application;
fig. 11 is a flowchart illustrating a virtual vehicle control method according to an embodiment of the present disclosure;
fig. 12 is a schematic view illustrating a virtual vehicle direction resetting method according to an embodiment of the present application;
fig. 13 is a schematic view of a virtual vehicle in a steady state orientation according to an embodiment of the present disclosure;
fig. 14 is a schematic view illustrating an orientation of an automatically adjusting virtual vehicle according to an embodiment of the present disclosure;
FIG. 15 is a schematic diagram of a direction reset switch provided in an embodiment of the present application;
FIG. 16 is a flow chart of a direction reset disable state provided by an embodiment of the present application;
fig. 17 is a flowchart of a virtual vehicle orientation adjustment method according to an embodiment of the present application;
FIG. 18 is a schematic diagram of adjusting the orientation of a virtual vehicle along a tangent line according to an embodiment of the present application;
fig. 19 is a schematic steering diagram of a virtual vehicle according to an embodiment of the present application;
fig. 20 is a schematic view illustrating a virtual vehicle direction resetting method according to an embodiment of the present application;
fig. 21 is a flowchart of a vehicle control method in a virtual scene according to an embodiment of the present disclosure.
Detailed Description
In order to make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be regarded as limiting the present application; all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
Where the terms "first", "second", and "third" appear in the specification, they are used merely to distinguish between similar items and do not indicate a particular ordering of items. It is to be understood that "first", "second", and "third" may be interchanged in a particular order or sequence where appropriate, so that the embodiments of the application described herein can be practiced in an order other than that illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before further detailed description of the embodiments of the present application, terms and expressions referred to in the embodiments of the present application will be described, and the terms and expressions referred to in the embodiments of the present application will be used for the following explanation.
1) Client: an application program running in the terminal to provide various services, such as an instant-messaging client or a video-playback client.
2) "In response to": indicates the condition or state on which a performed operation depends. When the condition or state on which an operation depends is satisfied, the one or more operations performed may occur in real time or with a set delay; unless otherwise specified, there is no restriction on the order in which the operations are executed.
3) Virtual scene: the virtual scene displayed (or provided) when an application program runs on the terminal. The virtual scene may be a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene; the dimension of the virtual scene is not limited in the embodiments of the present application. For example, a virtual scene may include sky, land, ocean, and so on; the land may include environmental elements such as deserts and cities; and a user may control virtual objects to perform activities within the virtual scene, including but not limited to: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. The virtual scene may be displayed from a first-person perspective (e.g., the player plays the virtual object in the game from the player's own viewpoint); it may be displayed from a third-person perspective (e.g., the player follows the virtual object in the game to play); it may also be displayed from a bird's-eye view; and these perspectives can be switched arbitrarily.
Taking the display of the virtual scene from the first-person perspective as an example, the virtual scene displayed in the human-computer interaction interface may be determined as follows: according to the viewing position and field angle of the virtual object in the complete virtual scene, determine the field-of-view area of the virtual object, and present the portion of the complete virtual scene that lies within that field-of-view area; that is, the displayed virtual scene may be a partial virtual scene relative to the panoramic virtual scene. Because the first-person perspective has the strongest visual impact for the user, it can produce an immersive sense of being personally present during operation. Taking the display of the virtual scene from the bird's-eye view as an example, the interface of the virtual scene presented in the human-computer interaction interface may include: in response to a zoom operation on the panoramic virtual scene, presenting in the interface the partial virtual scene corresponding to the zoom operation; that is, the displayed virtual scene may be a partial virtual scene relative to the panoramic virtual scene. This improves the user's operability during operation and the efficiency of human-computer interaction.
4) Virtual objects, the appearance of various people and objects in the virtual scene that can interact, or movable objects in the virtual scene. The movable object can be a virtual character, a virtual animal, an animation character, etc., such as: characters, animals, plants, oil drums, walls, stones, etc. displayed in the virtual scene. The virtual object may be an avatar in the virtual scene that is virtual to represent the user. The virtual scene may include a plurality of virtual objects, each virtual object having its own shape and volume in the virtual scene and occupying a portion of the space in the virtual scene.
Alternatively, the virtual object may be a user character controlled through operations on the client, an Artificial Intelligence (AI) configured in the virtual-scene battle through training, or a Non-Player Character (NPC) configured for interaction in the virtual scene. Alternatively, the virtual object may be a virtual character engaging in adversarial interaction in the virtual scene. Optionally, the number of virtual objects participating in the interaction in the virtual scene may be preset, or may be dynamically determined according to the number of clients participating in the interaction.
Taking a shooting game as an example, the user may control the virtual object to free-fall, glide, or open a parachute to descend in the sky of the virtual scene, or to run, jump, or crawl on land, and may also control the virtual object to swim, float, or dive in the ocean. The user may also control the virtual object to ride a vehicle-type virtual prop to move in the virtual scene; for example, the vehicle-type virtual prop may be a virtual car, a virtual aircraft, or a virtual yacht. The user may further control the virtual object to engage in adversarial interaction with other virtual objects through attack-type virtual props; for example, such a prop may be a virtual mech, a virtual tank, or a virtual fighter jet. The above scenarios are merely examples, and the embodiments of the present application are not limited thereto.
5) Scene data, representing various features that objects in the virtual scene are exposed to during the interaction, may include, for example, the location of the objects in the virtual scene. Of course, different types of features may be included depending on the type of virtual scene; for example, in a virtual scene of a game, scene data may include a time required to wait for various functions arranged in the virtual scene (depending on the number of times the same function can be used within a certain time), and attribute values indicating various states of a game character may include, for example, a life value (also referred to as a red amount), a magic value (also referred to as a blue amount), a state value, a blood amount, and the like.
Based on the above explanations of terms and terms related to the embodiments of the present application, the following describes a vehicle control system in a virtual scene provided by the embodiments of the present application. Referring to fig. 1, fig. 1 is a schematic diagram of an architecture of a vehicle control system 100 in a virtual scene provided in an embodiment of the present application, in order to support an exemplary application, terminals (exemplary terminals 400-1 and 400-2 are shown) are connected to a server 200 through a network 300, where the network 300 may be a wide area network or a local area network, or a combination of the two, and data transmission is implemented using a wireless or wired link.
The terminal (such as the terminal 400-1 and the terminal 400-2) is used for sending an acquisition request of scene data of the virtual scene to the server 200 based on the view interface receiving the triggering operation of entering the virtual scene;
the server 200 is configured to receive an acquisition request of scene data, and return the scene data of a virtual scene to the terminal in response to the acquisition request;
terminals (such as the terminal 400-1 and the terminal 400-2), configured to receive the scene data of the virtual scene, render a picture of the virtual scene based on the received scene data, and present the picture of the virtual scene on a graphical interface (the graphical interface 410-1 and the graphical interface 410-2 are shown for example); the picture of the virtual scene may also present the interaction environment of virtual objects, a virtual vehicle carrying the virtual object, an interaction object, a virtual vehicle carrying the interaction object, a steering control for controlling steering of the virtual vehicle, and the like, and the content presented in the picture of the virtual scene is rendered based on the returned scene data of the virtual scene.
In practical application, the server 200 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a Content Delivery Network (CDN), a big data and artificial intelligence platform, and the like. The terminals (e.g., terminal 400-1 and terminal 400-2) may be, but are not limited to, a smart phone, a tablet computer, a laptop computer, a desktop computer, a smart speaker, a smart television, a smart watch, and the like. The terminals (e.g., terminal 400-1 and terminal 400-2) and the server 200 may be directly or indirectly connected through wired or wireless communication, and the application is not limited thereto.
In actual applications, the terminals (including the terminal 400-1 and the terminal 400-2) are installed and run with applications supporting virtual scenes. The application program may be any one of a First-Person shooter game (FPS), a third-Person shooter game, a driving game with steering operation as a dominant behavior, a Multiplayer Online tactical sports game (MOBA), a Two-dimensional (2D) game application, a Three-dimensional (3D) game application, a virtual reality application program, a Three-dimensional map program, a simulation program, or a Multiplayer gunfight survival game. The application may also be a stand-alone application, such as a stand-alone 3D game program.
Taking an electronic game scene as an exemplary scene, a user may operate on the terminal in advance, and after detecting the user's operation, the terminal may download a game configuration file of the electronic game, where the game configuration file may include the application program, interface display data, virtual scene data, and the like of the electronic game, so that the user can call the game configuration file when logging in to the electronic game on the terminal to render and display an electronic game interface. The user may perform a touch operation on the terminal; after detecting the touch operation, the terminal may determine game data corresponding to the touch operation, and render and display the game data, where the game data may include virtual scene data, behavior data of virtual objects in the virtual scene, and the like.
In practical application, a terminal (including the terminal 400-1 and the terminal 400-2) receives a trigger operation for entering a virtual scene based on a view interface, and sends an acquisition request of scene data of the virtual scene to the server 200; the server 200 receives the acquisition request of the scene data, responds to the acquisition request, and returns the scene data of the virtual scene to the terminal; the method comprises the steps that a terminal receives scene data of a virtual scene, renders pictures of the virtual scene based on the scene data, and presents a virtual carrier carrying a virtual object (namely a virtual character corresponding to a user logging in the electronic game) in an interface of the virtual scene;
Further, in the process of controlling the virtual vehicle to travel, the terminal responds to a steering instruction for the virtual vehicle and controls the virtual vehicle to steer by a corresponding angle in the direction indicated by the steering instruction; and, in response to the steered orientation of the virtual vehicle deviating from the travel path of the virtual vehicle, the terminal adjusts the orientation of the virtual vehicle so that the adjusted orientation is consistent with the orientation of the travel path at the current position of the virtual vehicle. In this way, the accuracy of vehicle travel in the virtual scene can be improved, and the efficiency of vehicle control in the virtual scene can be improved.
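The steer-then-realign loop described above can be sketched minimally as follows. This is not code from the patent; the function names, the fixed steering step, and the tolerance are illustrative assumptions.

```python
import math

def steer(heading_deg, direction, step_deg=15.0):
    """Turn the vehicle heading by a fixed step.

    Hypothetical convention: direction is -1 for a left turn, +1 for a right turn.
    """
    return (heading_deg + direction * step_deg) % 360.0

def realign(heading_deg, path_heading_deg, tolerance_deg=1.0):
    """If the post-steer heading deviates from the travel-path heading at the
    vehicle's current position, snap it back to the path heading."""
    # Signed smallest difference between the two headings, in (-180, 180].
    diff = (heading_deg - path_heading_deg + 180.0) % 360.0 - 180.0
    if abs(diff) > tolerance_deg:
        return path_heading_deg
    return heading_deg

# A right turn takes the vehicle off a path heading of 90 degrees,
# and the realignment step restores it.
h = steer(90.0, +1)   # 105.0
h = realign(h, 90.0)  # 90.0
```

The modular arithmetic in `realign` avoids the wrap-around error a naive subtraction would make near 0/360 degrees.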
The embodiments of the present application can also be implemented by means of Cloud Technology (Cloud Technology), which refers to a hosting Technology for unifying series resources such as hardware, software, and network in a wide area network or a local area network to implement data calculation, storage, processing, and sharing.
Cloud technology is a general term for network technology, information technology, integration technology, management platform technology, application technology, and the like applied based on a cloud computing business model; it can form a resource pool that is used on demand and is flexible and convenient. Cloud computing technology will become an important support, because the background services of technical network systems require a large amount of computing and storage resources.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an electronic device 500 for implementing a vehicle control method in a virtual scene according to an embodiment of the present disclosure. In practical applications, the electronic device 500 may be a server or a terminal shown in fig. 1, and the electronic device 500 is taken as the terminal shown in fig. 1 as an example to explain the electronic device implementing the vehicle control method in the virtual scene according to the embodiment of the present application, where the electronic device 500 provided in the embodiment of the present application includes: at least one processor 510, memory 550, at least one network interface 520, and a user interface 530. The various components in the electronic device 500 are coupled together by a bus system 540. It is understood that the bus system 540 is used to enable communications among the components. The bus system 540 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 540 in fig. 2.
The Processor 510 may be an integrated circuit chip having Signal processing capabilities, such as a general purpose Processor, a Digital Signal Processor (DSP), or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like, wherein the general purpose Processor may be a microprocessor or any conventional Processor, or the like.
The user interface 530 includes one or more output devices 531 enabling presentation of media content, including one or more speakers and/or one or more visual display screens. The user interface 530 also includes one or more input devices 532, including user interface components to facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 550 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 550 optionally includes one or more storage devices physically located remote from processor 510.
The memory 550 may comprise volatile memory or nonvolatile memory, and may also comprise both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a Random Access Memory (RAM). The memory 550 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 550 can store data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 551 including system programs for processing various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and processing hardware-based tasks;
a network communication module 552 for communicating to other computing devices via one or more (wired or wireless) network interfaces 520, exemplary network interfaces 520 including: bluetooth, wireless compatibility authentication (WiFi), and Universal Serial Bus (USB), etc.;
a presentation module 553 for enabling presentation of information (e.g., a user interface for operating peripherals and displaying content and information) via one or more output devices 531 (e.g., a display screen, speakers, etc.) associated with the user interface 530;
an input processing module 554 to detect one or more user inputs or interactions from one of the one or more input devices 532 and to translate the detected inputs or interactions.
In some embodiments, the vehicle control device in the virtual scene provided in the embodiments of the present application may be implemented in software. Fig. 2 illustrates a vehicle control device 555 in the virtual scene stored in the memory 550, which may be software in the form of programs, plug-ins, and the like, and includes the following software modules: a presentation module 5551, a steering module 5552, and an adjustment module 5553. These modules are logical, and thus may be arbitrarily combined or further split according to the functions implemented; the functions of the individual modules will be explained below.
In other embodiments, the vehicle control Device in the virtual scene provided in this Application may be implemented by combining software and hardware, and as an example, the vehicle control Device in the virtual scene provided in this Application may be a processor in the form of a hardware decoding processor, which is programmed to execute the vehicle control method in the virtual scene provided in this Application, for example, the processor in the form of the hardware decoding processor may employ one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field Programmable Gate Arrays (FPGAs), or other electronic elements.
Based on the above description of the vehicle control system and the electronic device in the virtual scene provided in the embodiments of the present application, the vehicle control method in the virtual scene provided in the embodiments of the present application is described below. In some embodiments, the vehicle control method in the virtual scene provided by the embodiments of the present application may be implemented by a server or a terminal alone, or implemented by a server and a terminal in cooperation. In some embodiments, the terminal or the server may implement the vehicle control method in the virtual scene provided by the embodiments of the present application by running a computer program. For example, the computer program may be a native program or a software module in an operating system; can be a local (Native) Application program (APP), i.e. a program that needs to be installed in an operating system to run, such as a client supporting a virtual scene, e.g. a game APP; or may be an applet, i.e. a program that can be run only by downloading it to the browser environment; but also an applet that can be embedded into any APP. In general, the computer programs described above may be any form of application, module or plug-in.
The vehicle control method in the virtual scene provided in the embodiment of the present application is described below by taking a terminal embodiment as an example. Referring to fig. 3, fig. 3 is a schematic flowchart of a vehicle control method in a virtual scene according to an embodiment of the present application, where the vehicle control method in the virtual scene according to the embodiment of the present application includes:
in step 101, the terminal presents a virtual carrier carrying a virtual object in an interface of a virtual scene.
In actual implementation, an application client supporting a virtual scene may be installed on the terminal, and when a user opens the application client on the terminal and the terminal runs the application client, the terminal presents an interface of the virtual scene (such as a driving game scene), and presents a virtual carrier carrying a virtual object in the interface of the virtual scene. The virtual vehicle can assist the virtual object to move in the virtual scene, and common virtual vehicles include a virtual vehicle, a virtual ship, a virtual airplane, and the like. In practical applications, the virtual object may be an avatar in a virtual scene corresponding to a user account currently logged in the application client, for example, the virtual object may be a virtual object controlled by a user entering a driving game or a simulated virtual scene, and of course, the virtual scene may further include other virtual objects or interactive objects, which may be controlled by other users or controlled by a robot program.
In step 102, in the process of controlling the virtual vehicle to run, in response to a steering command for the virtual vehicle, the virtual vehicle is controlled to steer by a corresponding angle according to a direction indicated by the steering command.
In practical application, in the process of controlling the virtual vehicle carrying the virtual object to run by the terminal, after receiving a steering command for the virtual vehicle, the terminal can control the virtual vehicle to steer at a corresponding angle according to the direction indicated by the steering command.
In some embodiments, the terminal may receive the steering instruction for the virtual vehicle by: the terminal presents at least one steering control for controlling the virtual carrier to steer in an interface of a virtual scene; in response to a triggering operation for the steering control, a steering instruction for the virtual vehicle is received.
In actual implementation, the steering controls presented in the virtual interface may be a plurality of steering controls (e.g., a left-turn control and a right-turn control), or a specific steering control suitable for the virtual environment in which the current virtual vehicle is located; for example, only a left-turn control is presented in virtual scene A.
Illustratively, taking a game scene as an example, referring to fig. 4, fig. 4 is a schematic view of a steering control provided by an embodiment of the present application. In the figure, in the interface of the virtual scene, a virtual vehicle "ship" (indicated by the number 1 in the figure) carrying the virtual object and in a driving state, and two steering controls (a left-turning control and a right-turning control indicated by the number 2) are presented. The terminal can respond to the click operation aiming at the left turning control and control the virtual carrier 'ship' to execute the left turning operation; the terminal can also respond to the click operation aiming at the right turning control and control the virtual carrier 'ship' to execute the right turning operation.
In step 103, in response to the steered orientation of the virtual vehicle deviating from the driving path of the virtual vehicle, the orientation of the virtual vehicle is adjusted so that the adjusted orientation of the virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
In actual implementation, in response to the steered orientation of the virtual vehicle deviating from the travel path of the virtual vehicle, the terminal may adjust the orientation of the virtual vehicle so that the adjusted orientation is consistent with the orientation of the travel path at the current position of the virtual vehicle. The orientation adjustment may be triggered in two ways: the terminal may adjust the orientation of the virtual vehicle in response to a direction reset instruction for the virtual vehicle; or the terminal may automatically adjust the orientation of the virtual vehicle upon detecting that the steered orientation deviates from the travel path, provided that the conditions for automatic adjustment are satisfied. The conditions for automatic adjustment may relate to the functional authority the player has, the player's current level, the number of historical adjustments made by the player, and the like.
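The conditions for automatic adjustment listed above (functional authority, player level, historical adjustment count) amount to a simple eligibility predicate. The following is a hedged sketch; the field names, the level cutoff, and the adjustment cap are all hypothetical, since the text does not fix concrete values.

```python
def auto_adjust_allowed(player, max_history_adjustments=3):
    """Hypothetical predicate: may the terminal adjust the vehicle
    orientation automatically for this player?

    'player' is a dict with assumed keys; thresholds are illustrative.
    """
    return (player.get("has_auto_reset_permission", False)  # functional authority
            and player.get("level", 0) >= 5                  # current level
            and player.get("adjustments_used", 0) < max_history_adjustments)
```

Keeping the predicate separate from the adjustment itself lets the same check gate both the automatic path and, if desired, the display of the manual reset control.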
A manner of adjusting the orientation of the virtual vehicle by the terminal in response to the direction resetting instruction is described, in some embodiments, referring to fig. 5, fig. 5 is a flowchart of an orientation adjustment manner of the virtual vehicle provided in an embodiment of the present application, and based on fig. 3, step 103 may be implemented by step 1031a to step 1032 a.
In step 1031a, the terminal receives a direction reset instruction for the virtual vehicle.
In practical applications, after receiving the direction reset instruction for the virtual vehicle, the terminal may adjust the orientation of the virtual vehicle so that the virtual vehicle keeps consistent with the orientation of the travel path at its current position. The direction reset instruction may be triggered in a plurality of ways, mainly active triggering by the player and automatic detection by the terminal. In active triggering, the player triggers the direction reset instruction upon finding that, after the terminal receives a steering instruction and steers the virtual vehicle, the steered orientation of the virtual vehicle deviates from its travel path. In passive triggering, the terminal automatically triggers a direction reset instruction for the virtual vehicle to adjust its orientation upon detecting that the steered orientation of the virtual vehicle deviates from the travel path.
The manner in which the player actively triggers the direction reset instruction will be described. In some embodiments, the terminal presents a direction reset control in an interface of the virtual scene; and when the direction resetting control is in an activated state, the terminal responds to the triggering operation of the player aiming at the direction resetting control and receives a direction resetting instruction aiming at the virtual carrier.
In practical implementation, the direction reset control has a cooling period (e.g., 10 seconds), during which it is in a cooling state; when the cooling period ends, the direction reset control is switched from the cooling state to the activated state. While the control remains in the cooling state, the player cannot operate it. It should be noted that the terminal may present the direction reset control in the cooling state and in the activated state with different display styles in the interface of the virtual scene; the specific form and display style of the direction reset control are not limited in the present application.
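The cooling/activated behaviour of the reset control is a small two-state machine. A minimal sketch follows; the class name and tick-based timing are assumptions, and the 10-second cooldown is only the example value from the text.

```python
class DirectionResetControl:
    COOLDOWN_SECONDS = 10.0  # example cooling period from the text

    def __init__(self):
        self.cooldown_left = 0.0

    @property
    def state(self):
        # "cooling" while the cooldown timer runs, "active" otherwise.
        return "cooling" if self.cooldown_left > 0 else "active"

    def tick(self, dt):
        """Advance the cooldown timer by dt seconds of game time."""
        self.cooldown_left = max(0.0, self.cooldown_left - dt)

    def trigger(self):
        """Fire the direction reset. Returns True and starts the cooling
        period only when the control is in the active state."""
        if self.state != "active":
            return False
        self.cooldown_left = self.COOLDOWN_SECONDS
        return True
```

Representing the state as a derived property of the timer (rather than a separate flag) makes it impossible for the display style and the actual availability to disagree.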
Exemplarily, taking a virtual scene as a driving game scene as an example, referring to fig. 6, fig. 6 is a schematic diagram of direction resetting controls in different states provided in an embodiment of the present application, in the diagram, a virtual vehicle "boat" (indicated by reference numeral 1 in the figure) carrying a virtual object and a direction resetting control (a display style of the direction resetting control in a cooling state indicated by reference numeral 2 in the figure) are presented in a game scene interface a, and during normal driving of the virtual vehicle, the direction resetting control may be in a cooling state, and when a terminal responds to a steering instruction of a player for the virtual vehicle "boat", the virtual vehicle is controlled to steer (left steering or right steering, etc.); after finding that the orientation of the virtual vehicle after being turned deviates from the driving path of the virtual vehicle, the player may activate the direction resetting control, so that the direction resetting control is switched from the cooling state to the activation state (in the figure, the number 3 indicates a display style of the direction resetting control in the activation state), and after the direction resetting control is activated, the terminal receives a direction resetting instruction for the current virtual vehicle.
In some embodiments, the terminal may further implement the triggering operation for the direction reset instruction by: the terminal presents pattern drawing indication information in an interface of a virtual scene, wherein the pattern drawing indication information is used for indicating drawing of a target pattern to trigger the direction resetting instruction; the terminal responds to the pattern drawing operation based on the pattern drawing indication information, receives a direction reset instruction aiming at the virtual carrier when the drawn pattern is matched with the target pattern, and executes subsequent operation aiming at the virtual carrier based on the received direction reset instruction.
In actual implementation, when the terminal detects that the orientation of the virtual vehicle deviates, the terminal may present pattern drawing indication information in the virtual interface to prompt the player to draw a corresponding pattern in the virtual interface according to the pattern drawing indication information. When the pattern drawn by the player matches the target pattern indicated by the pattern drawing indication information, the player is regarded as having successfully triggered the direction reset instruction for the virtual vehicle. The player may draw directly on the pattern drawing indication information, or may draw the target pattern corresponding to the pattern drawing indication information in a predetermined drawing area in the virtual interface. It should be noted that the pattern drawing indication information may be a pattern of any shape.
Fig. 7A-7B are schematic diagrams of pattern drawing provided by an embodiment of the present application. Referring to fig. 7A, in the game scene interface A, a virtual vehicle "boat" (indicated by reference numeral 1 in the figure) carrying the virtual object is presented; when the terminal responds to a steering instruction for the virtual vehicle (the player clicks the left-turn or right-turn control indicated by reference numeral 2 in the figure), the virtual vehicle is controlled to steer (turn left, turn right, or the like). After the player finds that the steered orientation of the virtual vehicle deviates from its travel path, pattern drawing indication information may be presented (the "heart-shaped" pattern indicated by reference numeral 3 in the figure; the pattern drawing indication information may be any figure). The player may draw directly on the "heart shape" (as shown in fig. 7A, drawing along the arrow direction indicated in the "heart shape" until the drawing is completed), or may draw a heart-shaped pattern in a pattern drawing area (indicated by reference numeral 4 in the figure) in the virtual interface as shown in fig. 7B. After the drawing is completed, the terminal determines whether the pattern drawn by the player is consistent with the target pattern indicated by the pattern drawing indication information; the terminal may also calculate, through a neural network model, the similarity between the pattern drawn by the player and the target pattern, and when the similarity reaches a similarity threshold, it can be determined that the player has successfully triggered the direction reset instruction for the virtual vehicle, whereupon the terminal receives the direction reset instruction and executes subsequent operations for the virtual vehicle.
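The text leaves the similarity computation to a neural network model; as a stand-in, a much simpler geometric similarity over normalised stroke points illustrates the threshold-based decision. Everything here (point-list representation, normalisation, the 0.85 threshold) is an assumption for illustration only.

```python
import math

def stroke_similarity(drawn, target):
    """Crude geometric stand-in for the neural-network similarity in the
    text: compare two strokes point-by-point after normalising each to a
    unit bounding box. Both inputs are equal-length lists of (x, y)."""
    def normalise(pts):
        xs, ys = [p[0] for p in pts], [p[1] for p in pts]
        w = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
        return [((x - min(xs)) / w, (y - min(ys)) / w) for x, y in pts]
    a, b = normalise(drawn), normalise(target)
    mean_d = sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)
    return max(0.0, 1.0 - mean_d)

def triggers_reset(drawn, target, threshold=0.85):
    """The drawn pattern triggers the direction reset instruction when
    its similarity to the target pattern reaches the threshold."""
    return stroke_similarity(drawn, target) >= threshold
```

In practice the two strokes would first be resampled to the same number of points; the comparison itself, and the final thresholded decision, are the part this sketch shows.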
In some embodiments, when the steering instruction for the virtual vehicle is triggered according to the steering control, the terminal may further implement a triggering operation for the direction resetting instruction by: the terminal responds to the pressing operation aiming at the steering control, and when the pressing time of the pressing operation reaches a time threshold or the pressing pressure reaches a pressure threshold, the corresponding steering control is controlled to be in a suspension state; in response to the steering control in the hovering state being dragged to the first area, a direction reset instruction for the virtual vehicle is received.
In actual implementation, in order to reduce the operation complexity for the player or the screen occupation of the controls, when the steering control is presented in the virtual interface, the direction reset instruction may be triggered by operations other than a click on the steering control. The virtual vehicle is controlled to execute different operations according to the type of operation performed on the steering control. For example, when a click operation is performed on the steering control, the terminal controls the virtual vehicle to execute the steering operation; when a pressing operation (i.e., pressing or long-pressing the steering control) is performed, the terminal controls the steering control to switch from the fixed state to the hovering state, at which point the steering control can be moved. The player then drags the hovering steering control to a first area (which may be a fixed area delimited in the virtual interface); when the steering control is located in the first area, the terminal receives the direction reset instruction for the virtual vehicle and executes subsequent operations for the virtual vehicle based on the direction reset instruction. In practical applications, the terminal may present the steering control in the fixed state and in the hovering state with different display styles in the virtual interface, and the delimitation of the first area may be set according to actual conditions; the display style of the steering control and the specific delimitation of the first area are not limited in the embodiments of the present application.
For example, referring to fig. 8, fig. 8 is a schematic view of a steering control in a hovering state provided in an embodiment of the present application. In the figure, a virtual vehicle "car" carrying the virtual object and steering controls (the left-turn key and the right-turn key indicated by reference numeral 1 in the figure) are presented in the game interface. After finding that the steered orientation of the virtual vehicle deviates from its travel path, the player performs a pressing operation (or a long-pressing operation) on the steering control; when the pressing duration reaches a duration threshold (e.g., 2 s; the duration threshold may be preset according to actual conditions), the steering control is switched from the fixed state to the hovering state and a first area is presented (indicated by reference numeral 2 in the figure). The player drags the steering control to the first area, and the terminal may then receive the direction reset instruction and perform subsequent operations based on it.
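The press-to-hover and drag-to-area logic above reduces to two small checks: a threshold test that changes the control's state, and a point-in-rectangle test on the drop position. The thresholds and the rectangle representation below are illustrative assumptions.

```python
def control_state(press_duration, press_force,
                  duration_threshold=2.0, force_threshold=0.8):
    """A press whose duration or force reaches its threshold switches the
    steering control from the fixed state to the hovering state."""
    if press_duration >= duration_threshold or press_force >= force_threshold:
        return "hovering"
    return "fixed"

def dropped_in_first_area(pos, area):
    """area is an axis-aligned rectangle (x0, y0, x1, y1); releasing a
    hovering control inside it triggers the direction reset instruction."""
    x0, y0, x1, y1 = area
    return x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1
```

The same `dropped_in_first_area` test also covers the slide-to-second-area variant described later, with a different target rectangle.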
In some embodiments, after the terminal responds to the direction reset instruction triggered based on the steering control, and the orientation of the virtual vehicle is adjusted, the normal steering function of the steering control can be restored by that the terminal controls the steering control dragged to the first area to be restored to the initial position.
In practical implementation, after the terminal responds to the direction resetting instruction, the pressing operation on the steering control is released, so that the pressed steering control is restored to the initial position, and the released steering control is controlled to be switched from the suspension state to the fixed state.
In some embodiments, the terminal may further trigger the direction reset instruction by a pressing operation and a sliding operation based on the steering control, which are implemented as follows: the terminal presents sliding indication information corresponding to the steering control, wherein the sliding indication information is used for triggering a direction reset instruction when the steering control is indicated to slide to the second area; and receiving a sliding operation aiming at the steering control based on the sliding indication information, and receiving a direction resetting instruction aiming at the virtual vehicle when the steering control is slid to the second area.
In actual implementation, the terminal receives a pressing operation performed by the player on the steering control and obtains the pressing duration or pressure of the pressing operation; when the pressing duration reaches a duration threshold or the pressure reaches a pressure threshold, sliding indication information may be presented in the interface to instruct the player to slide the steering control to the second area and thereby trigger the direction reset instruction for the virtual vehicle, and the terminal executes subsequent operations according to the received direction reset instruction. The second area may be the same area as the first area, or the two may be different areas. In this embodiment, the state of the steering control does not change, i.e., the steering control always remains in the fixed state.
For example, referring to fig. 9, fig. 9 is a schematic view of a sliding operation provided in an embodiment of the present application. In the virtual interface corresponding to the game scene, a virtual vehicle "boat" carrying the virtual object and two steering controls are presented. After finding that the steered orientation of the virtual vehicle deviates from its travel path, the player performs a pressing operation on a steering control; when the pressing duration reaches a duration threshold (e.g., 2 s), sliding indication information "move to reset" is presented in the virtual interface together with a second area (indicated by reference numeral 1 in the figure). The player then performs the sliding operation, and the terminal obtains the position of the sliding operation; when the position of the sliding operation falls within the second area, the terminal receives the direction reset instruction for the virtual vehicle and performs subsequent operations based on it.
In some embodiments, the terminal may also trigger a direction reset instruction for the virtual vehicle by: the terminal presents a voice input function item in an interface of a virtual scene; and receiving a direction reset instruction in a voice form aiming at the virtual vehicle based on the voice input function item.
In actual implementation, the terminal may receive a direction reset instruction for the virtual vehicle through the voice input function item presented in the virtual interface. It should be noted that the direction reset instruction in voice form may be drawn from a voice password list preset by the terminal: the player can trigger the direction reset instruction only by speaking a voice password in the list. This speeds up the terminal's parsing of the voice password and increases the efficiency of triggering the direction reset instruction.
For example, referring to fig. 10, fig. 10 is a schematic view of a voice input interface provided in an embodiment of the present application. In the virtual interface, a virtual vehicle "boat" carrying a virtual object is presented. After the terminal performs a steering operation on the "boat", the player finds that the moving virtual vehicle deviates from its normal moving path. At this time, the player may press the voice input function item (or voice input icon, indicated by reference number 1 in the figure) in the virtual interface and speak a password such as "reset"; the terminal receives the voice input, analyzes it, and triggers a direction reset instruction. The terminal may analyze the voice input based on a neural network model that predicts the content of the voice input, recognizes the password spoken by the player, and generates the corresponding direction reset instruction.
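The preset-password matching described above can be sketched as a simple membership test. The password list contents and function name below are illustrative assumptions; the patent does not specify them.

```python
# Illustrative sketch: match recognized speech against a preset voice
# password list. The list contents are assumptions.
VOICE_PASSWORDS = {"reset", "straighten", "direction reset"}

def match_voice_command(recognized_text: str) -> bool:
    """Trigger the direction reset instruction only when the recognized
    text is one of the preset voice passwords; restricting recognition to
    a small closed list is what keeps parsing fast."""
    return recognized_text.strip().lower() in VOICE_PASSWORDS
```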
In step 1032a, the terminal responds to the direction resetting instruction, and adjusts the orientation of the virtual vehicle, so that the orientation of the adjusted virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
In some embodiments, referring to fig. 11, fig. 11 is a flowchart illustrating a virtual vehicle control method provided in an embodiment of the present application, and based on fig. 3, after the terminal performs step 103 to adjust the orientation of the virtual vehicle, the terminal may further continue to perform the steps illustrated in fig. 11.
And 104a, when the terminal receives the steering instruction for the virtual vehicle again and the orientation of the steered virtual vehicle again deviates from the driving path of the virtual vehicle, acquiring the number of times the direction reset instruction for the virtual vehicle has been triggered.
In actual implementation, the terminal responds to the direction reset instruction and may adjust the orientation of the virtual vehicle multiple times. When the terminal receives a steering instruction again, the orientation of the virtual vehicle can be automatically adjusted according to the relationship between the historical adjustment count and a count threshold. The terminal obtains the historical adjustment count (namely, the historical trigger count of the direction reset instruction) for the orientation of the virtual vehicle within a preset time period, and compares the historical adjustment count with the count threshold.
And 105a, when the trigger count reaches the count threshold, the terminal automatically adjusts the orientation of the virtual vehicle, so that the orientation of the adjusted virtual vehicle is consistent with the orientation of the driving path at the current position of the virtual vehicle.
In actual implementation, when the historical trigger count of the direction reset instruction reaches a preset count threshold, the terminal automatically adjusts the orientation of the virtual vehicle. In practical applications, in a driving game in which steering operation is the dominant behavior, in order to ensure the fairness and reasonableness of the game, the terminal's automatic adjustment of the orientation of the virtual vehicle is conditionally restricted; for example, the player is freely allocated a number of automatic adjustments corresponding to the player's level, or may purchase additional adjustments by payment.
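Steps 104a and 105a above can be sketched as a count over a sliding time window. The window length, threshold, and timestamp representation below are illustrative assumptions.

```python
# Sketch of the count-threshold logic of steps 104a/105a; the threshold,
# the window length, and the timestamp list format are assumptions.
TRIGGER_COUNT_THRESHOLD = 3

def should_auto_adjust(reset_timestamps, now, window=60.0,
                       threshold=TRIGGER_COUNT_THRESHOLD):
    """Count direction-reset triggers inside the preset time window and
    decide to auto-adjust once the count reaches the threshold."""
    recent = [t for t in reset_timestamps if now - t <= window]
    return len(recent) >= threshold
```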
In some embodiments, referring to fig. 12, fig. 12 is a schematic view of a virtual vehicle direction resetting method provided in an embodiment of the present application, and based on fig. 3, after the terminal performs step 103 to adjust the orientation of the virtual vehicle, the terminal may further continue to perform the steps shown in fig. 12.
And 104b, the terminal predicts, through a neural network model, the number of times the direction reset instruction will be triggered within a first time period from the current time, obtaining a prediction result.

And 105b, when the prediction result indicates that the number of times the direction reset instruction is triggered within the first time period is greater than a count threshold, periodically adjusting the orientation of the virtual vehicle within the first time period, so that the orientation of the adjusted virtual vehicle is consistent with the orientation of the driving path at the current position of the virtual vehicle.
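The decision in steps 104b and 105b can be sketched as follows. The prediction model is deliberately stubbed to a plain count argument here; the threshold, period length, and adjustment interval are all illustrative assumptions, not the patent's trained network.

```python
# Minimal sketch of steps 104b/105b: if a (stubbed) model predicts the
# reset instruction will be triggered more than `count_threshold` times in
# the coming first time period, schedule periodic orientation adjustments.
# All numeric values are assumptions.
def plan_periodic_adjustments(predicted_count, count_threshold=3,
                              period=30.0, interval=5.0):
    """Return the time offsets (from now, in seconds) at which the
    orientation should be re-aligned with the driving path during the
    first time period; an empty list means no periodic adjustment."""
    if predicted_count <= count_threshold:
        return []
    return [t * interval for t in range(1, int(period // interval) + 1)]
```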
In practical implementation, the terminal may collect sample driving data of each sample virtual vehicle in each sample virtual scene, collect sample scene data of each sample virtual scene, and construct training samples from the collected sample driving data, sample scene data and direction reset data. Taking the training samples as the input of the neural network model to be trained and the direction reset counts of the virtual vehicles as annotation data, the terminal trains the neural network model to obtain a trained model. The trained model then predicts the number of times the direction reset instruction will be triggered within a first time period from the current time; when the prediction result indicates that this number is greater than the count threshold, the terminal periodically adjusts the orientation of the virtual vehicle within the first time period.

In some embodiments, referring to fig. 13, fig. 13 is a schematic view illustrating that the virtual vehicle provided in the embodiment of the present application is in a stable orientation state, and based on fig. 3, after the terminal performs the orientation adjustment on the virtual vehicle in step 103, the terminal may further continue to perform the steps illustrated in fig. 13.
And 104c, the terminal controls the virtual vehicle to be in a stable state in a second time period from the current time.
The second time period may be set in advance according to actual conditions. For example, the time point after the terminal adjusts the orientation of the virtual vehicle in response to the direction reset instruction is T1, and the preset second time period is T, whose unit may be hours, minutes, seconds, milliseconds, and the like. That is, during the time period T from the time point T1, the orientation of the virtual vehicle is in a stable state, i.e., a state of being aligned with the orientation of the driving path at the current position of the virtual vehicle. Within the time period T, the terminal forcibly keeps the virtual vehicle in the stable orientation state.
And 105c, in the process that the virtual vehicle is in the stable orientation state, when a steering command for the virtual vehicle is received, controlling the virtual vehicle to steer according to the direction indicated by the steering command, wherein the orientation of the steered virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
In actual implementation, the terminal may control the virtual vehicle to be in a stable state, and when the virtual vehicle is in the stable state, and receives the steering command for the virtual vehicle again, directly control the virtual vehicle to steer without deviating.
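The stable-orientation window of steps 104c and 105c can be sketched as follows. The representation of headings as plain numbers and the function signature are illustrative assumptions.

```python
# Sketch of the forced stable-orientation window (steps 104c/105c).
# Headings are plain degree values here; T and the names are assumptions.
def orientation_after_steer(now, reset_time, T, path_heading, steered_heading):
    """Within T of the last direction reset, the terminal forces the
    vehicle to stay aligned with the driving path even when steered;
    after T has elapsed, normal steering applies."""
    if now - reset_time <= T:
        return path_heading      # stable state: orientation pinned to path
    return steered_heading       # window over: steering takes effect
```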
In some embodiments, referring to fig. 14, fig. 14 is a schematic view of an orientation of an automatically adjusted virtual vehicle provided in an embodiment of the present application, and based on fig. 3, step 103 in the figure may be implemented by steps 1031b to 1032 b.
And step 1031b, the terminal acquires the orientation adjustment authority of the current login account for the virtual carrier.
In practical applications, accounts of different levels often have different function permissions for the same virtual scene application. Therefore, in actual implementation, the terminal can obtain the orientation adjustment permission of the currently logged-in account for the virtual vehicle. To increase user stickiness of the virtual scene application, lower-level users or new users can be granted some function permissions normally reserved for high-level users, which improves their engagement with the virtual scene application.
For example, in the driving game A in which steering operation is the dominant behavior, a new user often loses interest in the game because the control manner of the virtual vehicle is unfamiliar. Therefore, in order to increase the user's preference for the driving game A, the orientation adjustment permission for the virtual vehicle may be assigned to a new user who registers for the driving game A or to a user at a lower level. It should be noted that, in order to ensure fairness among users at different levels, a preset number of direction adjustments may be provided for a new user or a low-level user; after these are used up, the user needs to purchase further uses by payment.
And 1032b, when the current login account is determined to have the orientation adjustment right for the virtual vehicle, automatically adjusting the orientation of the virtual vehicle, so that the orientation of the adjusted virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
In actual implementation, when the current login account has the orientation adjustment authority for the virtual vehicle, the terminal can automatically adjust the orientation of the virtual vehicle when detecting that the orientation of the virtual vehicle deviates from the normal driving path.
For example, assuming that a new user U of the driving game A is given 5 opportunities to automatically adjust the orientation of the virtual vehicle, the user U may decide whether to use automatic adjustment whenever the orientation of the virtual vehicle needs to be adjusted; each use of automatic adjustment reduces the remaining number of uses, until the 5 opportunities are used up. At this point, if the user's level has not changed, the user U no longer has the orientation adjustment permission, and when the orientation of the virtual vehicle needs to be adjusted again, the user U can purchase additional automatic adjustments by payment.
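The per-account quota described above can be sketched as a small counter. The quota of 5 free uses matches the example; the class and method names are assumptions.

```python
# Illustrative sketch of the per-account automatic-adjustment quota.
# The quota of 5 follows the user-U example; the structure is an assumption.
class AdjustmentQuota:
    def __init__(self, free_uses=5):
        self.remaining = free_uses

    def try_auto_adjust(self) -> bool:
        """Consume one automatic-adjustment use if any remain; once the
        quota is exhausted the user must purchase further uses."""
        if self.remaining > 0:
            self.remaining -= 1
            return True
        return False
```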
In some embodiments, the terminal may also initiate automatic adjustment of the orientation of the virtual vehicle by: the terminal presents a direction reset switch in an interface of a virtual scene; and controlling to start a direction reset mode for the virtual vehicle in response to an opening instruction for the direction reset switch. Correspondingly, the terminal can also adjust the orientation of the virtual vehicle by: and in the direction resetting mode, the terminal automatically adjusts the orientation of the virtual carrier, so that the orientation of the adjusted virtual carrier is consistent with the orientation of the driving path of the current position of the virtual carrier.
In actual implementation, the terminal can further start a direction reset mode for the virtual vehicle based on a direction reset switch arranged in the virtual scene interface, and when the virtual vehicle is in the direction reset mode, the terminal can automatically adjust the orientation of the virtual vehicle.
For example, referring to fig. 15, fig. 15 is a schematic view of a direction reset switch provided in an example of the present application, in the view, a virtual vehicle "ship" carrying a virtual object is presented in an interface of a virtual scene, and a direction reset switch (numbered 1 in the figure) is presented, a player clicks the direction reset switch, a terminal controls to turn on a direction reset mode for the virtual vehicle, and in the direction reset mode, when the terminal detects that the orientation of the virtual vehicle deviates, the orientation of the virtual vehicle may be automatically adjusted, so that the orientation of the adjusted virtual vehicle coincides with the orientation of the driving path at the current position of the virtual vehicle.
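The direction reset switch and mode described above can be sketched as a toggled flag consulted on every deviation check. The class and field names are illustrative assumptions.

```python
# Sketch of the direction-reset switch/mode (fig. 15); names are assumptions.
class DirectionResetSwitch:
    def __init__(self):
        self.mode_on = False   # reset mode off until the player toggles it

    def toggle(self):
        """Player clicks the switch: turn the direction reset mode on/off."""
        self.mode_on = not self.mode_on

    def resolve_heading(self, deviated: bool, path_heading, current_heading):
        """In reset mode, a detected deviation is corrected automatically
        back to the driving-path orientation; otherwise nothing changes."""
        if self.mode_on and deviated:
            return path_heading
        return current_heading
```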
In some embodiments, the terminal may further control the virtual vehicle of the opposing party to be in a direction reset disabled state in the following manner. Referring to fig. 16, fig. 16 is a flowchart of the direction reset disabled state provided by the embodiments of the present application, described with reference to the steps shown in fig. 16.
Step 201, the terminal responds to an equip instruction for the target virtual item, and controls the virtual object to equip the target virtual item.
Illustratively, for game application A, there are players B and C. During the course of the game, in response to player B's equip instruction for a virtual item (e.g., a "virtual pistol"), the terminal controls player B's virtual object to equip the virtual pistol.
Step 202, in the process that the interactive object of the virtual object carries the target virtual carrier to run, the terminal responds to the throwing instruction aiming at the target virtual item, and controls the virtual object to throw the target virtual item towards the interactive object.
Continuing the above example, player B holds a virtual pistol and drives a virtual vehicle, ship B, to chase player C, who also holds a virtual pistol and drives a virtual vehicle, ship C. With player B as the home party and player C as the opponent, the terminal controls player B to launch a virtual item "bullet" toward player C.
Step 203, when the target virtual item is thrown into the sensing range of the target virtual vehicle, the terminal controls the target virtual vehicle to be in a direction reset disabled state. The direction reset disabled state is used for keeping the orientation of the target virtual vehicle unchanged when the orientation of the steered target virtual vehicle deviates from the traveling path of the target virtual vehicle and a direction reset instruction for adjusting the orientation of the target virtual vehicle is triggered.
In the above example, when the virtual item "bullet" launched by player B enters the sensing range of player C's virtual vehicle and hits it, the terminal controls player C's virtual vehicle, ship C, to be in the direction reset disabled state, keeping the orientation of ship C at the moment it was hit unchanged. That is, after ship C is hit, player C no longer has the direction reset permission for ship C, and no function item or control related to direction resetting is presented in the virtual interface of player C's terminal.
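The disabled state of steps 201-203 can be sketched as a flag that makes reset requests no-ops. Hit detection is stubbed; the class and method names are illustrative assumptions.

```python
# Sketch of the direction-reset disabled state (steps 201-203).
# Hit detection is assumed to have already happened; names are assumptions.
class TargetVehicle:
    def __init__(self):
        self.reset_disabled = False

    def on_hit_by_item(self):
        """Item entered the sensing range and hit: enter the disabled state."""
        self.reset_disabled = True

    def request_direction_reset(self, path_heading, current_heading):
        """A triggered reset instruction realigns the vehicle with the path
        unless it is in the disabled state, in which case the (possibly
        deviated) orientation is kept unchanged."""
        return current_heading if self.reset_disabled else path_heading
```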
In some embodiments, referring to fig. 17, fig. 17 is a schematic view of a virtual vehicle orientation adjustment method provided in the embodiment of the present application, and based on fig. 3, step 103 in the figure can be further implemented through steps 1031c to 1033 c.
And step 1031c, the terminal determines the circle center position corresponding to the driving path.
For example, referring to fig. 18, fig. 18 is a schematic diagram illustrating the adjustment of the orientation of the virtual vehicle according to a tangent line provided by the embodiment of the present application, in which reference numeral 1 shows the normal traveling path of the virtual vehicle, and reference numeral 2 shows a possible traveling path of the virtual vehicle after it performs the steering operation at position A. Because the virtual vehicle deviates at position A, the circle center position O corresponding to the driving path at position A can be obtained, and a circle is drawn with O as the origin and the length of OA as the radius, where A is a point on the circumference.
And 1032c, the terminal determines a tangent line taking the current position of the virtual carrier as a tangent point based on the determined circle center position.
In the above example, a tangent line AB of the circle centered at O, with the current position A of the virtual vehicle as the tangent point, is obtained, together with the line AC tangent to the deviated path 2 at A; the angle ∠BAC is then the angle by which the virtual vehicle has deviated at A.
Step 1033c, the terminal adjusts the orientation of the virtual vehicle so that the orientation of the adjusted virtual vehicle is parallel to the tangential direction.
In the above example, the terminal adjusts the orientation of the virtual vehicle, rotating it by the angle ∠BAC so that it becomes parallel to the tangent AB.
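The geometry of steps 1031c-1033c can be sketched as follows: the corrected heading is the tangent direction at the current position, which is perpendicular to the radius OA. The names mirror fig. 18; representing headings as degrees and the tie-breaking rule are assumptions.

```python
# Geometry sketch for steps 1031c-1033c: with circle center O and current
# position A, the straightened heading is the tangent direction at A
# (perpendicular to the radius OA). Heading-as-degrees is an assumption.
import math

def tangent_heading(center, position, current_heading):
    """Return the tangent direction at `position` (degrees) closest to the
    vehicle's current heading, i.e. the straightened orientation parallel
    to the driving path at the current position."""
    ox, oy = center
    ax, ay = position
    radial = math.degrees(math.atan2(ay - oy, ax - ox))  # direction of OA
    # Two tangent directions exist, 180 degrees apart; keep the one nearer
    # the current heading so the vehicle keeps travelling the same way.
    t1, t2 = (radial + 90) % 360, (radial - 90) % 360
    diff = lambda a, b: min(abs(a - b), 360 - abs(a - b))
    return t1 if diff(t1, current_heading) <= diff(t2, current_heading) else t2
```

For a vehicle at (1, 0) on a circle centered at the origin, the radius points along 0 degrees, so the two tangents are 90 and 270 degrees, and the one closer to the current heading is chosen.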
In some embodiments, the terminal may also implement the directional reset for the virtual vehicle by: the method comprises the steps that in the process of controlling the virtual vehicle to run, a terminal receives interactive operation of an interactive object on the virtual vehicle by using a virtual item; and when the orientation of the virtual vehicle deviates from the running path of the virtual vehicle as a result of the interactive operation, adjusting the orientation of the virtual vehicle so that the orientation of the adjusted virtual vehicle is consistent with the orientation of the running path of the current position of the virtual vehicle.
According to the method and the device, when the orientation of the virtual vehicle carrying the virtual object deviates from the driving path of the virtual vehicle, the orientation of the virtual vehicle can be adjusted in a mode of active triggering of a user or automatic detection of a terminal; when a user actively triggers a direction reset instruction aiming at the virtual carrier, the participation degree of the user can be improved; the terminal is controlled to adjust the virtual carrier according to the orientation adjustment authority of the user, so that the interest degree of the user in the application of the virtual scene can be improved, and the fairness of the application of the virtual scene for users of different grades can be ensured; and the terminal responds to the direction resetting instruction, adjusts the orientation of the virtual carrier and ensures that the orientation of the virtual carrier can be consistent with the orientation of the driving path of the current position of the virtual carrier. Therefore, the control of the player on the virtual carrier can be facilitated, and the running accuracy of the virtual carrier is guaranteed.
Next, an exemplary application of the embodiment of the present application in a practical application scenario will be described.
With the development of technology and the improvement of mobile device performance, human-computer interfaces inevitably evolve toward more convenient and efficient control modes, giving users a more pleasant way to interact. In a mobile phone game, when a player driving a virtual vehicle clicks the left and right steering keys to turn the virtual vehicle left or right, the turning amplitude of the vehicle head is large, so the rotation of the virtual vehicle easily deviates too far and the virtual vehicle departs from its normal driving path. In this case, the player often tries to level the driving route of the virtual vehicle by quickly clicking the right steering key after clicking the left steering key. However, in the related art, clicking the left and right steering keys cannot achieve the effect of leveling the vehicle's route.
Based on this, the embodiment of the present application provides a vehicle control method in a virtual scene, which, in a game application deployed on a mobile terminal, enables the virtual vehicle to be quickly straightened by operating the steering controls (the on-screen joystick). Thus, when driving the virtual vehicle, the player can straighten the virtual vehicle's line by performing the relevant operations on the left and right steering keys; the operation is quick and convenient, and the user experience is better. Straightening the virtual vehicle's line can be regarded as resetting the direction of the virtual vehicle when its orientation deviates from its normal driving route.
Taking a terminal as a mobile phone and a virtual scene as a game scene as an example, a player opens a game application deployed in the mobile phone to present a virtual carrier carrying a virtual object. During the running process of the virtual vehicle, the player can control the virtual vehicle to run towards the left side direction by controlling a left steering key in the game interface, and can also control the virtual vehicle to run towards the right side direction by controlling a right steering key in the game interface.
For example, referring to fig. 19, fig. 19 is a schematic view of turning a virtual vehicle provided in the embodiment of the present application. In the figure, a virtual vehicle "boat" carrying the virtual object corresponding to the player is driving; when the player sees that the "boat" needs to turn left, the player may click the left steering key in the game interface, and the terminal (or the game server), detecting that the left steering key has been triggered, controls the virtual vehicle "boat" to turn left. However, in practical applications, due to the large size and overall structure of the virtual vehicle "boat", it tends to deviate from the normal driving path during steering.
When the player finds that the orientation of the virtual vehicle "boat" has deviated, the direction of the virtual vehicle can be reset or corrected through other operations based on the steering key (the clicking operation on the steering key is the steering operation for the virtual vehicle; operations other than clicking, such as pressing, can be used to give the steering key other functions), so that the virtual vehicle can run normally.
For example, referring to fig. 20, fig. 20 is a schematic view of a virtual vehicle direction resetting manner provided in the embodiment of the present application. In the figure, when the player finds that the virtual vehicle "boat" has deviated, the player may long-press the left (or right) steering key. When the duration of the long press reaches a duration threshold (for example, 2s), a prompt message such as "move here to straighten the vehicle head" or "move here to perform direction resetting" is presented in the game interface. The player may then perform a sliding operation; the sliding position is determined, and when the sliding position falls into the target area where direction resetting can be performed, a direction reset instruction is triggered. After receiving the direction reset instruction for the virtual vehicle, the terminal (or the server) corrects the direction of the virtual vehicle, straightening it in the corresponding direction (aligning the vehicle head with the vehicle body) and keeping the driving direction of the virtual vehicle parallel to the road surface.
In some embodiments, referring to fig. 21, fig. 21 is a flowchart of a vehicle control method in a virtual scene provided in an embodiment of the present application. A game application is deployed on a mobile terminal (mobile phone); the game application is started, step 301 is executed, and the vehicle state is entered, that is, a virtual vehicle (such as a ship, a car, etc.) carrying a virtual object is presented in the game interface, the virtual vehicle being in a driving state. Then, when the player finds that the virtual vehicle needs to turn, step 302 is executed to operate a steering key, that is, the player can trigger a steering instruction for the virtual vehicle by clicking the left or right steering key, and the terminal controls the virtual vehicle to execute the steering operation in response to the steering instruction. Next, the player may execute step 303 to determine whether the vehicle body is deviated and needs to be straightened, that is, whether the orientation of the virtual vehicle after the steering operation deviates from its normal traveling path. If not, the virtual vehicle continues to travel normally; if so, the player may execute step 304 and long-press the steering key. When the duration of the long press reaches a preset duration threshold, step 305 is executed to trigger a vehicle-body straightening reminder, that is, as shown in fig. 20, a prompt message "move here to straighten the vehicle head" is presented in the game interface. Then, the player may execute step 306 to decide whether to perform the straightening operation. If not, no straightening is performed and the virtual vehicle travels normally; if so, step 307 is executed to move the finger to the prompt area to straighten the vehicle body, that is, the player performs a sliding operation (moving the finger), and when the final position of the sliding operation falls into the prompt area, the straightening operation for the vehicle body is triggered, that is, a direction reset instruction is triggered; the terminal adjusts the virtual vehicle in response to the direction reset instruction to straighten it, and one straightening operation for the virtual vehicle is finished.
In practical implementation, the operation of straightening (or direction resetting) the virtual vehicle mainly includes the following steps: 1. first, detect and judge whether the virtual vehicle has performed a steering operation; 2. after the virtual vehicle executes the steering operation, long-press the steering key to trigger the straightening operation; 3. judge whether straightening is needed according to the vehicle body line; 4. after judging that the virtual vehicle needs to be straightened, drag the finger to the straightening area to accomplish the vehicle-body straightening operation. In this process, after the terminal processor collects the event data, the gesture can be classified as a long press or a slide, and the control unit executes the corresponding operation after receiving the processing result of the processor.
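The long-press-then-drag flow above can be sketched as a small gesture state machine. The event shapes, threshold, and state names are illustrative assumptions, not the patent's event model.

```python
# Hypothetical state machine for the long-press-then-drag straightening
# flow; events, threshold, and state names are assumptions.
class StraightenGesture:
    IDLE, PRESSING, AWAIT_DRAG = range(3)

    def __init__(self, long_press_s=2.0):
        self.state = self.IDLE
        self.long_press_s = long_press_s

    def on_event(self, event, value=None):
        """Feed ("press", duration), ("drag_to_area", in_area) or
        ("release",) events; return True when the vehicle-body
        straightening (direction reset) should be triggered."""
        if event == "press":
            # A long enough press arms the drag-to-straighten step.
            self.state = (self.AWAIT_DRAG
                          if value >= self.long_press_s else self.PRESSING)
        elif event == "drag_to_area" and self.state == self.AWAIT_DRAG:
            self.state = self.IDLE
            return bool(value)   # trigger only if the drag landed in area
        elif event == "release":
            self.state = self.IDLE
        return False
```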
The embodiment of the application has wide application in a driving game scene taking steering operation as a leading behavior, can quickly and efficiently finish the straightening operation of the virtual carrier which is originally realized by a plurality of steps, can improve the operation efficiency, can obviously improve the operation experience of a player, and provides more game fun for the player.
It is understood that, in the embodiments of the present application, the data related to the user information and the like need to be approved or approved by the user when the embodiments of the present application are applied to specific products or technologies, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related countries and regions.
Continuing with the exemplary structure of the vehicle control device 555 in the virtual scene provided in the embodiments of the present application implemented as a software module, in some embodiments, as shown in fig. 2, the software module in the vehicle control device 555 in the virtual scene stored in the memory 550 may include:
a presenting module 5551, configured to present, in an interface of a virtual scene, a virtual vehicle carrying a virtual object;
a steering module 5552, configured to, in a process of controlling the virtual vehicle to travel, in response to a steering command for the virtual vehicle, control the virtual vehicle to steer by a corresponding angle according to a direction indicated by the steering command;
an adjusting module 5553, configured to adjust the orientation of the virtual vehicle in response to the steered orientation of the virtual vehicle deviating from the driving path of the virtual vehicle, so that the adjusted orientation of the virtual vehicle coincides with the orientation of the driving path of the current position of the virtual vehicle.
In some embodiments, the adjusting module is further configured to receive a direction reset instruction for the virtual vehicle; and responding to the direction resetting instruction, and adjusting the orientation of the virtual vehicle, so that the orientation of the virtual vehicle after adjustment is consistent with the orientation of the driving path of the current position of the virtual vehicle.
In some embodiments, the adjusting module is further configured to present a direction reset control in the interface of the virtual scene; and when the direction resetting control is in an activated state, responding to the triggering operation aiming at the direction resetting control, and receiving a direction resetting instruction aiming at the virtual carrier.
In some embodiments, the adjusting module is further configured to present pattern drawing indication information in an interface of the virtual scene, where the pattern drawing indication information is used to indicate that a target pattern is drawn to trigger the direction resetting instruction; and receiving a direction resetting instruction for the virtual carrier when the drawn pattern is matched with the target pattern in response to the pattern drawing operation based on the pattern drawing instruction information.
In some embodiments, the adjusting module is further configured to present, in the interface of the virtual scene, at least one steering control for controlling the virtual vehicle to steer; receiving a steering instruction for the virtual vehicle in response to a triggering operation for the steering control.
In some embodiments, the adjusting module is further configured to, in response to a pressing operation on the steering control, control the corresponding steering control to be in a floating state when a pressing duration of the pressing operation reaches a duration threshold or a pressing pressure magnitude reaches a pressure threshold; in response to the steering control being in the hovering state being dragged to a first area, a direction reset instruction for the virtual vehicle is received.
In some embodiments, the adjustment module is further configured to control the steering control dragged to the first area to return to an initial position.
In some embodiments, the adjusting module is further configured to present sliding indication information corresponding to the steering control, where the sliding indication information indicates that sliding the steering control to a second area triggers the direction reset instruction; and receive, based on the sliding indication information, a sliding operation on the steering control, and receive a direction reset instruction for the virtual vehicle when the steering control slides into the second area.
In some embodiments, the adjusting module is further configured to, when a steering instruction for the virtual vehicle is received again and the steered orientation of the virtual vehicle again deviates from the driving path of the virtual vehicle, obtain the number of times the direction reset instruction for the virtual vehicle has been triggered; and when the number of triggers reaches a count threshold, automatically adjust the orientation of the virtual vehicle so that the adjusted orientation of the virtual vehicle is consistent with the orientation of the driving path at the current position of the virtual vehicle.
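A possible shape for this count-threshold switch from manual to automatic resets is sketched below (a minimal Python illustration; the class name and default threshold are assumptions, not taken from the patent):

```python
class ResetCounter:
    """Counts manual direction resets for a virtual vehicle; once the count
    reaches the threshold, later deviations are corrected automatically."""

    def __init__(self, count_threshold=3):  # threshold value is illustrative
        self.count_threshold = count_threshold
        self.count = 0

    def record_manual_reset(self):
        """Called each time the player triggers the direction reset."""
        self.count += 1

    def should_auto_reset(self):
        """True once the manual resets have reached the threshold."""
        return self.count >= self.count_threshold
```

In use, the game would call `record_manual_reset()` whenever the player issues a reset, and consult `should_auto_reset()` the next time the vehicle's orientation deviates from the path.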
In some embodiments, the adjusting module is further configured to predict, through a neural network model, the number of times the direction reset instruction will be triggered within a first time period from the current time, to obtain a prediction result; and when the prediction result indicates that the number of times the direction reset instruction will be triggered within the first time period is greater than a number threshold, periodically adjust the orientation of the virtual vehicle within the first time period so that the adjusted orientation of the virtual vehicle is consistent with the orientation of the driving path at the current position of the virtual vehicle.
In some embodiments, the adjusting module is further configured to control the virtual vehicle to be in an orientation-stable state during a second time period from the current time; when a steering instruction for the virtual vehicle is received while the virtual vehicle is in the orientation-stable state, the virtual vehicle is controlled to steer in the direction indicated by the steering instruction, with the orientation of the virtual vehicle after steering remaining consistent with the orientation of the driving path at the current position of the virtual vehicle.
In some embodiments, the adjusting module is further configured to obtain an orientation adjustment permission of a current login account for the virtual vehicle; when the current login account is determined to have the orientation adjustment right for the virtual vehicle, automatically adjusting the orientation of the virtual vehicle, so that the adjusted orientation of the virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
In some embodiments, the presenting module is further configured to present a direction reset switch in the interface of the virtual scene;
accordingly, in some embodiments, the adjusting module is further configured to control to start a direction reset mode for the virtual vehicle in response to an on command for the direction reset switch; and automatically adjusting the orientation of the virtual vehicle in the direction resetting mode, so that the orientation of the virtual vehicle after adjustment is consistent with the orientation of the driving path of the current position of the virtual vehicle.
In some embodiments, the adjusting module is further configured to control the virtual object to equip a target virtual item in response to an equipment instruction for the target virtual item; control the virtual object to throw the target virtual item toward an interactive object in response to a throwing instruction for the target virtual item while the interactive object is driving a target virtual vehicle; and, when the target virtual item is thrown into the sensing range of the target virtual vehicle, control the target virtual vehicle to be in a direction-reset-forbidden state; the direction-reset-forbidden state keeps the orientation of the target virtual vehicle unchanged when the steered orientation of the target virtual vehicle deviates from the driving path of the target virtual vehicle and a direction reset instruction for adjusting the orientation of the target virtual vehicle is triggered.
In some embodiments, the adjusting module is further configured to determine a circle center position corresponding to the driving path; determining a tangent line taking the current position of the virtual carrier as a tangent point based on the determined circle center position; and adjusting the orientation of the virtual carrier, so that the orientation of the virtual carrier after adjustment is parallel to the direction of the tangent line.
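For a circular driving path, the circle-center/tangent adjustment described above can be sketched as follows (a minimal Python illustration under the assumption of a 2-D path; the function name and the `clockwise` parameter are illustrative, not from the patent):

```python
import math

def tangent_heading(center, position, clockwise=False):
    """Heading (radians) of the tangent to a circular path at `position`.

    The tangent at a point on a circle is perpendicular to the radius from
    `center` to that point; `clockwise` selects which of the two opposite
    tangent directions matches the vehicle's direction of travel.
    """
    rx = position[0] - center[0]
    ry = position[1] - center[1]
    radial = math.atan2(ry, rx)
    # Rotate the radial direction by +/-90 degrees to get the tangent.
    return radial - math.pi / 2 if clockwise else radial + math.pi / 2
```

For instance, at position `(1, 0)` on a unit circle centered at the origin, counter-clockwise travel gives a tangent heading of pi/2 (pointing along +y); the adjusted vehicle orientation is then set parallel to this tangent.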
In some embodiments, the adjusting module is further configured to receive an interactive operation of an interactive object on the virtual vehicle using a virtual item in a process of controlling the virtual vehicle to travel; and when the orientation of the virtual vehicle deviates from the driving path of the virtual vehicle as a result of the interactive operation, adjusting the orientation of the virtual vehicle so that the adjusted orientation of the virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the vehicle control method in a virtual scene according to the embodiments of the present application.
Embodiments of the present application provide a computer-readable storage medium storing executable instructions, which when executed by a processor, will cause the processor to execute a vehicle control method in a virtual scene provided by embodiments of the present application, for example, the vehicle control method in the virtual scene shown in fig. 3.
In some embodiments, the computer-readable storage medium may be a memory such as an FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disc, or CD-ROM, or may be any of various devices including one of the above memories or any combination thereof.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may, but need not, correspond to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts in a Hypertext Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, subprograms, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
In summary, the embodiments of the present application can be widely applied to driving game scenes in which steering is the main operation, so that the vehicle-straightening operation that originally required multiple steps can be completed quickly and efficiently. This improves operation efficiency, significantly improves the player's operation experience, and provides players with more enjoyment of the game.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (20)

1. A method for controlling vehicles in a virtual scene, the method comprising:
presenting a virtual carrier carrying a virtual object in an interface of a virtual scene;
in the process of controlling the virtual vehicle to run, responding to a steering instruction for the virtual vehicle, and controlling the virtual vehicle to steer at a corresponding angle according to the direction indicated by the steering instruction;
and adjusting the orientation of the virtual vehicle in response to the fact that the steered orientation of the virtual vehicle deviates from the driving path of the virtual vehicle, so that the adjusted orientation of the virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
2. The method of claim 1, wherein the adjusting the orientation of the virtual vehicle such that the orientation of the virtual vehicle after the adjustment is consistent with the orientation of the driving path of the current position of the virtual vehicle comprises:
receiving a direction reset instruction for the virtual vehicle;
and responding to the direction resetting instruction, and adjusting the orientation of the virtual vehicle, so that the orientation of the virtual vehicle after adjustment is consistent with the orientation of the driving path of the current position of the virtual vehicle.
3. The method of claim 2, wherein the receiving a direction reset instruction for the virtual vehicle comprises:
presenting a direction reset control in an interface of the virtual scene;
and when the direction resetting control is in an activated state, responding to the triggering operation aiming at the direction resetting control, and receiving a direction resetting instruction aiming at the virtual carrier.
4. The method of claim 2, wherein the receiving a direction reset instruction for the virtual vehicle comprises:
presenting pattern drawing indication information in an interface of the virtual scene, wherein the pattern drawing indication information indicates that a target pattern is to be drawn to trigger the direction reset instruction;
and, in response to a pattern drawing operation performed based on the pattern drawing indication information, receiving a direction reset instruction for the virtual vehicle when the drawn pattern matches the target pattern.
5. The method of claim 2, further comprising:
presenting, in an interface of the virtual scene, at least one steering control for controlling the virtual vehicle to steer;
receiving a steering instruction for the virtual vehicle in response to a triggering operation for the steering control.
6. The method of claim 5, wherein the receiving a direction reset instruction for the virtual vehicle comprises:
in response to a pressing operation on the steering control, controlling the corresponding steering control to be in a floating state when the pressing duration of the pressing operation reaches a duration threshold or the pressing pressure reaches a pressure threshold;
receiving a direction reset instruction for the virtual vehicle in response to the steering control in the floating state being dragged to a first area.
7. The method of claim 6, wherein after the adjusting the orientation of the virtual vehicle, the method further comprises:
controlling the steering control dragged to the first area to return to an initial position.
8. The method of claim 5, wherein the receiving a direction reset instruction for the virtual vehicle comprises:
presenting sliding indication information corresponding to the steering control, wherein the sliding indication information indicates that sliding the steering control to a second area triggers the direction reset instruction;
receiving, based on the sliding indication information, a sliding operation on the steering control, and receiving a direction reset instruction for the virtual vehicle when the steering control slides into the second area.
9. The method according to claim 2, wherein after adjusting the orientation of the virtual vehicle so that the adjusted orientation of the virtual vehicle coincides with the orientation of the travel path of the current position of the virtual vehicle, the method further comprises:
when a steering instruction for the virtual vehicle is received again, and the steered orientation of the virtual vehicle again deviates from the driving path of the virtual vehicle, obtaining the number of times the direction reset instruction for the virtual vehicle has been triggered;
and when the number of triggers reaches a count threshold, automatically adjusting the orientation of the virtual vehicle so that the adjusted orientation of the virtual vehicle is consistent with the orientation of the driving path at the current position of the virtual vehicle.
10. The method according to claim 2, wherein after adjusting the orientation of the virtual vehicle so that the adjusted orientation of the virtual vehicle coincides with the orientation of the travel path of the current position of the virtual vehicle, the method further comprises:
predicting the number of times of triggering the direction reset instruction in a first time period from the current moment through a neural network model to obtain a prediction result;
and when the prediction result shows that the number of times of triggering the direction resetting instruction is greater than a number threshold value in the first time period, periodically adjusting the orientation of the virtual vehicle in the first time period, so that the orientation of the adjusted virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
11. The method according to claim 2, wherein after adjusting the orientation of the virtual vehicle in response to the direction reset command so that the adjusted orientation of the virtual vehicle coincides with the orientation of the travel path of the current position of the virtual vehicle, the method further comprises:
controlling the virtual vehicle to be in an orientation-stable state during a second time period from the current time;
when a steering instruction for the virtual vehicle is received while the virtual vehicle is in the orientation-stable state, controlling the virtual vehicle to steer in the direction indicated by the steering instruction, with the orientation of the virtual vehicle after steering remaining consistent with the orientation of the driving path at the current position of the virtual vehicle.
12. The method of claim 1, wherein the adjusting the orientation of the virtual vehicle such that the orientation of the virtual vehicle after the adjustment is consistent with the orientation of the driving path of the current position of the virtual vehicle comprises:
acquiring orientation adjustment permission of a current login account for the virtual vehicle;
when the current login account is determined to have the orientation adjustment right for the virtual vehicle, automatically adjusting the orientation of the virtual vehicle, so that the adjusted orientation of the virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
13. The method of claim 1, further comprising:
presenting a direction reset switch in an interface of the virtual scene;
controlling to start a direction reset mode for the virtual vehicle in response to an on command for the direction reset switch;
the adjusting the orientation of the virtual vehicle so that the orientation of the virtual vehicle after the adjustment is consistent with the orientation of the driving path of the current position of the virtual vehicle comprises:
and automatically adjusting the orientation of the virtual vehicle in the direction resetting mode, so that the orientation of the virtual vehicle after adjustment is consistent with the orientation of the driving path of the current position of the virtual vehicle.
14. The method of claim 1, further comprising:
controlling the virtual object to equip a target virtual item in response to an equipment instruction for the target virtual item;
controlling the virtual object to throw the target virtual item toward an interactive object in response to a throwing instruction for the target virtual item while the interactive object is driving a target virtual vehicle;
when the target virtual item is thrown into the sensing range of the target virtual vehicle, controlling the target virtual vehicle to be in a direction-reset-forbidden state;
wherein the direction-reset-forbidden state keeps the orientation of the target virtual vehicle unchanged when the steered orientation of the target virtual vehicle deviates from the driving path of the target virtual vehicle and a direction reset instruction for adjusting the orientation of the target virtual vehicle is triggered.
15. The method of claim 1, wherein the adjusting the orientation of the virtual vehicle such that the orientation of the virtual vehicle after the adjustment is consistent with the orientation of the driving path of the current position of the virtual vehicle comprises:
determining the circle center position corresponding to the driving path;
determining a tangent line taking the current position of the virtual carrier as a tangent point based on the determined circle center position;
and adjusting the orientation of the virtual carrier, so that the orientation of the virtual carrier after adjustment is parallel to the direction of the tangent line.
16. The method of claim 1, further comprising:
receiving interactive operation of an interactive object on the virtual vehicle by using a virtual item in the process of controlling the virtual vehicle to run;
and when the orientation of the virtual vehicle deviates from the driving path of the virtual vehicle as a result of the interactive operation, adjusting the orientation of the virtual vehicle so that the adjusted orientation of the virtual vehicle is consistent with the orientation of the driving path of the current position of the virtual vehicle.
17. A vehicle control device in a virtual scene, characterized in that the device comprises:
the presentation module is used for presenting a virtual carrier carrying a virtual object in an interface of a virtual scene;
the steering module is used for responding to a steering instruction aiming at the virtual vehicle in the process of controlling the virtual vehicle to run and controlling the virtual vehicle to steer at a corresponding angle according to the direction indicated by the steering instruction;
the adjusting module is used for responding to the deviation of the direction of the virtual vehicle after the virtual vehicle turns to the driving path of the virtual vehicle, and adjusting the direction of the virtual vehicle so that the direction of the virtual vehicle after the adjustment is consistent with the direction of the driving path of the current position of the virtual vehicle.
18. An electronic device, characterized in that the electronic device comprises:
a memory for storing executable instructions;
a processor, configured to execute the executable instructions stored in the memory to implement the vehicle control method in the virtual scene according to any one of claims 1 to 16.
19. A computer-readable storage medium storing executable instructions, wherein the executable instructions when executed by a processor implement the vehicle control method in a virtual scene according to any one of claims 1 to 16.
20. A computer program product comprising a computer program or instructions which, when executed by a processor, implement the vehicle control method in a virtual scene of any one of claims 1 to 16.
CN202111536095.5A 2021-12-15 2021-12-15 Carrier control method, device, equipment and storage medium in virtual scene Pending CN114210051A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111536095.5A CN114210051A (en) 2021-12-15 2021-12-15 Carrier control method, device, equipment and storage medium in virtual scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111536095.5A CN114210051A (en) 2021-12-15 2021-12-15 Carrier control method, device, equipment and storage medium in virtual scene

Publications (1)

Publication Number Publication Date
CN114210051A true CN114210051A (en) 2022-03-22

Family

ID=80702569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111536095.5A Pending CN114210051A (en) 2021-12-15 2021-12-15 Carrier control method, device, equipment and storage medium in virtual scene

Country Status (1)

Country Link
CN (1) CN114210051A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116013130A (en) * 2023-01-19 2023-04-25 扬州浩海蓝生海洋装备有限公司 VR simulation-based deep sea Christmas tree operation method and system
CN116013130B (en) * 2023-01-19 2024-01-30 扬州浩海蓝生海洋装备有限公司 VR simulation-based deep sea Christmas tree operation method and system

Similar Documents

Publication Publication Date Title
CN112090069B (en) Information prompting method and device in virtual scene, electronic equipment and storage medium
JP2023538962A (en) Virtual character control method, device, electronic device, computer-readable storage medium, and computer program
US20220266136A1 (en) Method and apparatus for state switching in virtual scene, device, medium, and program product
CN112416196B (en) Virtual object control method, device, equipment and computer readable storage medium
CN112121434B (en) Interaction method and device of special effect prop, electronic equipment and storage medium
CN112121414B (en) Tracking method and device in virtual scene, electronic equipment and storage medium
CN112402963B (en) Information sending method, device, equipment and storage medium in virtual scene
JP7391448B2 (en) Virtual object control method, device, equipment, storage medium and computer program product
CN112057860B (en) Method, device, equipment and storage medium for activating operation control in virtual scene
CN112402959A (en) Virtual object control method, device, equipment and computer readable storage medium
CN113633964A (en) Virtual skill control method, device, equipment and computer readable storage medium
CN112138385B (en) Virtual shooting prop aiming method and device, electronic equipment and storage medium
CN114210051A (en) Carrier control method, device, equipment and storage medium in virtual scene
US20230330525A1 (en) Motion processing method and apparatus in virtual scene, device, storage medium, and program product
CN113703654B (en) Camouflage processing method and device in virtual scene and electronic equipment
CN113018862B (en) Virtual object control method and device, electronic equipment and storage medium
CN113144617B (en) Control method, device and equipment of virtual object and computer readable storage medium
CN112717403B (en) Virtual object control method and device, electronic equipment and storage medium
CN114296597A (en) Object interaction method, device, equipment and storage medium in virtual scene
CN114146414A (en) Virtual skill control method, device, equipment, storage medium and program product
CN113769379A (en) Virtual object locking method, device, equipment, storage medium and program product
CN113769396B (en) Interactive processing method, device, equipment, medium and program product of virtual scene
CN112891930B (en) Information display method, device, equipment and storage medium in virtual scene
WO2024060924A1 (en) Interaction processing method and apparatus for virtual scene, and electronic device and storage medium
CN114146413A (en) Virtual object control method, device, equipment, storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK; Ref legal event code: DE; Ref document number: 40069402; Country of ref document: HK