CN111659116A - Virtual vehicle control method, device, equipment and medium - Google Patents


Info

Publication number
CN111659116A
CN111659116A (application number CN202010627561.XA)
Authority
CN
China
Prior art keywords
virtual
prop
vehicle
virtual prop
virtual vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010627561.XA
Other languages
Chinese (zh)
Inventor
刘智洪
梁超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010627561.XA
Publication of CN111659116A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/52: Controlling the output signals based on the game progress, involving aspects of the displayed game scene
    • A63F13/426: Processing input control signals of video game devices by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F13/44: Processing input control signals of video game devices, involving timing of operations, e.g. performing an action within a time slot
    • A63F13/537: Controlling the output signals based on the game progress, involving additional visual information provided to the game scene, using indicators, e.g. showing the condition of a game character on screen
    • A63F13/803: Special adaptations for executing a specific game genre or game mode; driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • A63F13/837: Special adaptations for executing a specific game genre or game mode; shooting of targets
    • A63F2300/303: Features of games using an electronically generated display, characterized by output arrangements for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/308: Details of the user interface
    • A63F2300/64: Methods for processing data by generating or executing the game program, for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F2300/8076: Features of games specially adapted for executing a specific type of game; shooting

Abstract

The application discloses a method, apparatus, device, and medium for controlling a virtual vehicle, relating to the field of virtual environments. The method comprises the following steps: displaying a virtual environment screen, the virtual environment screen comprising a virtual vehicle equipped with a first virtual prop, wherein the first virtual prop is used for creating a defensive effect against an attack by a second virtual prop, and the second virtual prop is used for attacking the virtual vehicle; in response to a use instruction for the first virtual prop, controlling the virtual vehicle to use the first virtual prop; and in response to the second virtual prop attacking the virtual vehicle, controlling the first virtual prop to defend against the second virtual prop. The method can improve the efficiency of human-computer interaction when a user controls an aircraft to evade a missile.

Description

Virtual vehicle control method, device, equipment and medium
Technical Field
The present disclosure relates to the field of virtual environments, and in particular, to a method, an apparatus, a device, and a medium for controlling a virtual vehicle.
Background
In an application program based on a three-dimensional virtual environment, such as a first-person shooter game, a user can control a virtual character in the virtual environment to pilot an aircraft and move through the virtual environment.
In the related art, an RPG (Rocket-Propelled Grenade) weapon is provided. A virtual character on the ground can use the RPG weapon to lock onto an aircraft in the air and launch a guided missile; a missile locked onto the aircraft will chase it until the missile hits the aircraft or collides with another obstacle. Therefore, when the aircraft piloted by a virtual character is locked by a missile, the aircraft will inevitably be hit unless the user exploits complex terrain and controls the aircraft to perform high-difficulty maneuvers (such as multiple rapid turns) that cause the missile to strike an obstacle.
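The pursuit behaviour described in the related art (a missile that keeps steering toward the aircraft until it reaches it) can be sketched in a few lines of Python. This is an illustrative toy model rather than the patent's implementation; the tick loop, speeds, and coordinates are all hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class Missile:
    x: float
    y: float
    speed: float = 3.0  # distance covered per tick; faster than the target

def step_pursuit(missile: Missile, target_x: float, target_y: float) -> bool:
    """Advance the missile one tick toward its locked target.
    Returns True when the missile reaches the target this tick."""
    dx, dy = target_x - missile.x, target_y - missile.y
    dist = math.hypot(dx, dy)
    if dist <= missile.speed:  # close enough to hit this tick
        missile.x, missile.y = target_x, target_y
        return True
    missile.x += missile.speed * dx / dist
    missile.y += missile.speed * dy / dist
    return False

# The missile keeps chasing even though the target keeps moving away.
m = Missile(0.0, 0.0)
tx, ty = 10.0, 0.0
hit = False
for _ in range(20):
    hit = step_pursuit(m, tx, ty)
    if hit:
        break
    tx += 1.0  # the target flees at 1 unit per tick, slower than the missile
```

Because the missile is faster than the fleeing target, the chase always ends in a hit unless something else (such as an obstacle) intervenes, which is exactly the situation the disclosure sets out to address.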
In the related art, when a user controls an aircraft to evade a missile, the maneuver depends on terrain advantages and highly difficult piloting. The conditions and operations required to evade a missile are too demanding, and the efficiency of human-computer interaction is low.
Disclosure of Invention
The embodiments of the present application provide a method, apparatus, device, and medium for controlling a virtual vehicle, which can simplify the operation of controlling an aircraft to evade a missile and improve the efficiency of human-computer interaction. The technical solution is as follows:
in one aspect, a method for controlling a virtual vehicle is provided, where the method includes:
displaying a virtual environment screen, the virtual environment screen comprising a virtual vehicle equipped with a first virtual prop, wherein the first virtual prop is used for creating a defensive effect against an attack by a second virtual prop, and the second virtual prop is used for attacking the virtual vehicle;
in response to a use instruction for the first virtual prop, controlling the virtual vehicle to use the first virtual prop; and
in response to the second virtual prop attacking the virtual vehicle, controlling the first virtual prop to defend against the second virtual prop.
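As a rough illustration of these three steps, the following Python sketch models the vehicle state: the vehicle comes equipped with the first virtual prop, uses it on instruction, and lets it absorb the second virtual prop's attack. The class, field names, and damage value are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class VirtualVehicle:
    health: int = 100
    smoke_active: bool = False  # whether the first virtual prop is in use

    def use_first_prop(self) -> None:
        """Step 2: respond to the use instruction for the first virtual prop."""
        self.smoke_active = True

    def on_attacked(self, missile_damage: int) -> int:
        """Step 3: when the second virtual prop (the missile) hits, the
        first prop absorbs the attack; otherwise the vehicle takes damage.
        Returns the damage actually applied to the vehicle."""
        if self.smoke_active:
            return 0
        self.health -= missile_damage
        return missile_damage

vehicle = VirtualVehicle()              # step 1: the vehicle, prop equipped
vehicle.use_first_prop()                # step 2: use instruction received
damage_taken = vehicle.on_attacked(40)  # step 3: missile hits, prop defends
```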
In another aspect, an apparatus for controlling a virtual vehicle is provided, the apparatus including:
a display module, configured to display a virtual environment screen, the virtual environment screen comprising a virtual vehicle equipped with a first virtual prop, wherein the first virtual prop is used for creating a defensive effect against an attack by a second virtual prop, and the second virtual prop is used for attacking the virtual vehicle;
a control module, configured to control, in response to a use instruction for the first virtual prop, the virtual vehicle to use the first virtual prop;
the control module being further configured to control, in response to the second virtual prop attacking the virtual vehicle, the first virtual prop to defend against the second virtual prop.
In another aspect, a computer device is provided, which includes a processor and a memory, in which at least one instruction, at least one program, a set of codes, or a set of instructions is stored, which is loaded and executed by the processor to implement the control method of a virtual vehicle as described above.
In another aspect, a computer-readable storage medium is provided, in which at least one instruction, at least one program, a set of codes, or a set of instructions is stored, which is loaded and executed by a processor to implement the method for controlling a virtual vehicle as described above.
In another aspect, embodiments of the present application provide a computer program product or a computer program, which includes computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the control method of the virtual vehicle provided in the above-mentioned optional implementation manner.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
By equipping the aircraft with bulletproof smoke (the first virtual prop), when the aircraft is hit by a missile (the second virtual prop) after using the bulletproof smoke, the smoke defends against the missile and protects the aircraft from being shot down. When a user piloting the aircraft finds that it is locked onto or chased by a missile, the user does not need to perform complex flight maneuvers to shake off the missile; the user only needs to control the aircraft to use the bulletproof smoke. When the missile hits the aircraft, the bulletproof smoke provides a defense, absorbing the missile's damage and protecting the aircraft. This simplifies the operation of evading a pursuing missile, increases the escape probability of an aircraft locked by a missile, and improves the efficiency of human-computer interaction when the user evades missiles.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of a terminal provided in an exemplary embodiment of the present application;
FIG. 2 is a block diagram of a computer system provided in an exemplary embodiment of the present application;
fig. 3 is a schematic user interface diagram of a control method of a virtual vehicle according to an exemplary embodiment of the present application;
fig. 4 is a schematic user interface diagram of a control method of a virtual vehicle according to another exemplary embodiment of the present application;
fig. 5 is a schematic user interface diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 6 is a schematic user interface diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 7 is a schematic user interface diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 8 is a schematic user interface diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 9 is a schematic user interface diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 10 is a flowchart of a method for controlling a virtual vehicle according to another exemplary embodiment of the present application;
FIG. 11 is a schematic view of a camera model corresponding to a perspective provided by another exemplary embodiment of the present application;
fig. 12 is a flowchart of a method for controlling a virtual vehicle according to another exemplary embodiment of the present application;
fig. 13 is a schematic user interface diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 14 is a schematic diagram of a crash box of a method for controlling a virtual vehicle according to another exemplary embodiment of the present application;
fig. 15 is a schematic diagram of a crash box of a method for controlling a virtual vehicle according to another exemplary embodiment of the present application;
fig. 16 is a schematic diagram of a detection ray of a control method of a virtual vehicle according to another exemplary embodiment of the present application;
fig. 17 is a flowchart of a method for controlling a virtual vehicle according to another exemplary embodiment of the present application;
fig. 18 is a flowchart of a method for controlling a virtual vehicle according to another exemplary embodiment of the present application;
fig. 19 is a block diagram of a control apparatus of a virtual vehicle according to another exemplary embodiment of the present application;
fig. 20 is a block diagram of a terminal provided in an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application are briefly described:
virtual environment: is a virtual environment that is displayed (or provided) when an application is run on the terminal. The virtual environment may be a simulated world of a real world, a semi-simulated semi-fictional world, or a purely fictional world. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application. The following embodiments are illustrated with the virtual environment being a three-dimensional virtual environment.
Virtual character: a movable object in the virtual environment. The movable object may be a virtual person, a virtual animal, an anime character, and so on, such as the characters, animals, plants, oil drums, walls, and stones displayed in a three-dimensional virtual environment. Optionally, the virtual character is a three-dimensional model created based on skeletal animation technology. Each virtual character has its own shape and volume in the three-dimensional virtual environment and occupies part of the space of the three-dimensional virtual environment. Illustratively, a virtual character has a life value; when the life value drops to zero, the virtual character can no longer move in the virtual world. The life value is the criterion for determining whether the virtual character can still act in the virtual world, and may also be referred to as a health value, a health bar (red bar), and the like.
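The life-value rule in the definition above (a character can act only while its life value is above zero) can be expressed as a minimal sketch; the function names and the clamping behaviour are assumptions made for illustration.

```python
def can_act(life_value: float) -> bool:
    """A virtual character can continue to move in the virtual world
    only while its life value is above zero."""
    return life_value > 0

def apply_damage(life_value: float, damage: float) -> float:
    """Reduce the life value, clamping at zero; at zero the character's
    activity in the virtual world ends."""
    return max(0.0, life_value - damage)

hp = apply_damage(100.0, 120.0)  # an attack larger than the remaining life
```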
First-Person Shooter (FPS) game: a shooting game that a user plays from a first-person perspective, in which the virtual environment screen is the screen obtained by observing the virtual environment from the perspective of a first virtual character. In the game, at least two virtual characters compete in a single-round battle mode in the virtual environment. A virtual character survives in the virtual environment by avoiding attacks launched by other virtual characters and/or dangers present in the virtual environment (such as a poison circle, a swamp, or a bomb). When a virtual character's life value in the virtual environment reaches zero, its life in the virtual environment ends, and the virtual characters that ultimately survive are the winners. Optionally, each client may control one or more virtual characters in the virtual environment, with the moment the first client joins the battle taken as the start time and the moment the last client exits the battle taken as the end time. Optionally, the competitive mode of the battle may include a solo battle mode, a two-player team battle mode, or a multi-player team battle mode; the battle mode is not limited in the embodiments of the present application.
UI (User Interface) controls: any visual control or element that can be seen on the user interface of the application, for example, controls such as pictures, input boxes, text boxes, buttons, and labels. Some UI controls respond to user operations; for example, the user triggers a use control to control the virtual vehicle to use the first virtual prop. The UI controls referred to in the embodiments of the present application include, but are not limited to: a use control, a firing control, and an aiming control.
The method provided by the present application can be applied to an application program having a virtual environment and virtual characters. Illustratively, an application that supports a virtual environment is one in which the user can control the movement of a virtual character within the virtual environment. By way of example, the methods provided herein may be applied to any one of: a Virtual Reality (VR) application program, an Augmented Reality (AR) program, a three-dimensional map program, a military simulation program, a virtual reality game, an augmented reality game, a First-Person Shooter (FPS) game, a Third-Person Shooter (TPS) game, a Multiplayer Online Battle Arena (MOBA) game, and a Strategy Game (SLG).
Illustratively, a game in the virtual environment is composed of one or more maps of game worlds, the virtual environment in the game simulates a scene of a real world, a user can control a virtual character in the game to perform actions such as walking, running, jumping, shooting, fighting, driving, attacking other virtual characters by using virtual weapons, and the like in the virtual environment, the interactivity is strong, and a plurality of users can form a team on line to perform a competitive game.
In some embodiments, the application may be a shooting game, a racing game, a role-playing game, an adventure game, a sandbox game, a tactical competition game, a military simulation program, or the like. The client can support at least one of the Windows, Apple, Android, iOS, and Linux operating systems, and clients on different operating systems can interconnect and intercommunicate. In some embodiments, the client is a program adapted to a mobile terminal with a touch screen.
In some embodiments, the client is an application developed based on a three-dimensional engine, such as the three-dimensional engine being a Unity engine.
The terminal in the present application may be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on. A client supporting a virtual environment is installed and run on the terminal, such as a client of an application supporting a three-dimensional virtual environment. The application program may be any one of a Battle Royale (BR) game, a virtual reality application program, an augmented reality program, a three-dimensional map program, a military simulation program, a third-person shooter game, a first-person shooter game, and a multiplayer online battle arena game. Alternatively, the application may be a stand-alone application, such as a stand-alone 3D game program, or a network online application.
Fig. 1 is a schematic structural diagram of a terminal according to an exemplary embodiment of the present application. As shown in fig. 1, the terminal includes a processor 101, a touch screen 102, and a memory 103.
The processor 101 may be at least one of a single-core processor, a multi-core processor, an embedded chip, and a processor having instruction execution capabilities.
The touch screen 102 may be a general touch screen or a pressure-sensitive touch screen. A general touch screen can register a press or slide operation applied to the touch screen 102; a pressure-sensitive touch screen can additionally measure the degree of pressure applied to the touch screen 102.
The memory 103 stores programs executable by the processor 101. Illustratively, the memory 103 stores a virtual environment program A, an application program B, an application program C, a touch and pressure sensing module 18, and a kernel layer 19 of an operating system. The virtual environment program A is an application program developed based on the three-dimensional virtual environment module 17. Optionally, the virtual environment program A includes, but is not limited to, at least one of a game program, a virtual reality program, a three-dimensional map program, and a three-dimensional presentation program developed with the three-dimensional virtual environment module (also referred to as a virtual environment module) 17. For example, when the operating system of the terminal is an Android operating system, the virtual environment program A is developed using the Java programming language and the C# language; for another example, when the operating system of the terminal is the iOS operating system, the virtual environment program A is developed using the Objective-C programming language and the C# language.
The three-dimensional virtual environment module 17 is a module that supports multiple operating system platforms. Schematically, the three-dimensional virtual environment module may be used for program development in multiple fields, such as game development, Virtual Reality (VR), and three-dimensional maps. The specific type of the three-dimensional virtual environment module 17 is not limited in the embodiments of the present application; in the following embodiments, the three-dimensional virtual environment module 17 is described as a module developed using the Unity engine as an example.
The touch (and pressure) sensing module 18 is a module for receiving touch events (and pressure touch events) reported by the touch screen driver 191. Optionally, the touch sensing module may lack a pressure sensing function and not receive pressure touch events. A touch event includes the type of the touch event and coordinate values; the types of touch events include, but are not limited to, a touch start event, a touch move event, and a touch end event. A pressure touch event includes a pressure value and the coordinate values of the pressure touch event. The coordinate values indicate the touch position of the touch operation on the display screen. Optionally, an abscissa axis is established along the horizontal direction of the display screen and an ordinate axis along the vertical direction, yielding a two-dimensional coordinate system.
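As a sketch of the event structure just described (an event type plus screen coordinates, with an extra pressure value for pressure touch events), the following Python fragment models the two kinds of events. The type names and log format are hypothetical illustrations, not the terminal's actual API.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class TouchType(Enum):
    START = "touch start"
    MOVE = "touch move"
    END = "touch end"

@dataclass
class TouchEvent:
    kind: TouchType
    x: int                           # abscissa: horizontal axis of the screen
    y: int                           # ordinate: vertical axis of the screen
    pressure: Optional[float] = None  # set only for pressure touch events

def describe(event: TouchEvent) -> str:
    """Render an event the way the sensing module might log it."""
    if event.pressure is not None:
        return f"pressure {event.pressure} at ({event.x}, {event.y})"
    return f"{event.kind.value} at ({event.x}, {event.y})"

msg = describe(TouchEvent(TouchType.START, 120, 480))
```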
Illustratively, the kernel layer 19 includes a touch screen driver 191 and other drivers 192. The touch screen driver 191 is a module for detecting (pressure) touch events; when the touch screen driver 191 detects such an event, it transmits the event to the touch (and pressure) sensing module 18.
Other drivers 192 may be drivers associated with the processor 101, drivers associated with the memory 103, drivers associated with network components, drivers associated with sound components, and the like.
Those skilled in the art will appreciate that the foregoing is merely a general illustration of the structure of the terminal. A terminal may have more or fewer components in different embodiments. For example, the terminal may further include a gravitational acceleration sensor, a gyro sensor, a power supply, and the like.
Fig. 2 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 200 includes a terminal 210 and a server cluster 220.
A client 211 supporting a virtual environment is installed and run on the terminal 210; the client 211 may be an application supporting a virtual environment. When the terminal runs the client 211, a user interface of the client 211 is displayed on the screen of the terminal 210. The client may be any one of an FPS game, a TPS game, a military simulation program, a MOBA game, a tactical competition game, and an SLG game. In this embodiment, the client being an FPS game is taken as an example. The terminal 210 is used by a first user 212, who uses it to control a first virtual character located in the virtual environment to perform activities; the first virtual character may be referred to as the master virtual character of the first user 212. The activities of the first virtual character include, but are not limited to: adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the first virtual character is a virtual person, such as a simulated human character or an anime character.
The device types of the terminal 210 include: at least one of a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, a laptop portable computer, and a desktop computer.
Only one terminal is shown in fig. 2, but there are a plurality of other terminals 240 in different embodiments. In some embodiments, at least one other terminal 240 corresponds to a developer. A development and editing platform for the client of the virtual environment is installed on the other terminal 240; the developer can edit and update the client there and transmit the updated client installation package to the server cluster 220 through a wired or wireless network, and the terminal 210 can download the client installation package from the server cluster 220 to update the client.
The terminal 210 and the other terminals 240 are connected to the server cluster 220 through a wireless network or a wired network.
The server cluster 220 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. Server cluster 220 is used to provide background services for clients that support a three-dimensional virtual environment. Optionally, the server cluster 220 undertakes primary computing work and the terminals undertake secondary computing work; or, the server cluster 220 undertakes the secondary computing work, and the terminal undertakes the primary computing work; or, the server cluster 220 and the terminal perform cooperative computing by using a distributed computing architecture.
Optionally, the terminal and the server are both computer devices.
In one illustrative example, server cluster 220 includes a server 221 and a server 226, where the server 221 includes a processor 222, a user account database 223, a combat service module 224, and a user-oriented Input/Output Interface (I/O Interface) 225. The processor 222 is configured to load instructions stored in the server 221 and to process data in the user account database 223 and the combat service module 224; the user account database 223 is used for storing data of the user accounts used by the terminal 210 and the other terminals 240, such as the profile images of the user accounts, the nicknames of the user accounts, the combat power indexes of the user accounts, and the service areas where the user accounts are located; the combat service module 224 is used for providing a plurality of combat rooms for users to fight in; and the user-facing I/O interface 225 is used to establish communication with the terminal 210 through a wireless network or a wired network to exchange data.
With reference to the above descriptions of the virtual environment and of the implementation environment, a method for controlling a virtual vehicle according to an embodiment of the present application is described, taking as an example that the execution subject of the method is a client running on the terminal shown in fig. 1. The terminal runs an application program that supports the virtual environment.
The application provides an exemplary embodiment of applying the control method of the virtual vehicle to the FPS game.
A variety of virtual vehicles are provided in the virtual environment, including flying virtual vehicles such as fighter jets, airplanes, and helicopters. The user can control the master virtual character to find a flying virtual vehicle in the virtual environment; for example, as shown in fig. 3, the user controls the master virtual character 301 to find the helicopter 302 in the virtual environment. As shown in fig. 4, when the user controls the master virtual character 301 to approach the helicopter 302, UI controls for entering the helicopter 302 are displayed on the user interface: a driver's seat control 303 and a passenger seat control 304. By triggering the driver's seat control 303, the user controls the master virtual character 301 to enter the driver's seat of the helicopter 302 and pilot the helicopter 302 through the virtual environment.
Illustratively, as shown in fig. 5, a piloting interface for the master virtual character piloting the helicopter is provided, and the virtual environment picture in the piloting interface is a picture obtained by observing the virtual environment from the perspective of the helicopter 302. The piloting interface also includes UI controls for piloting the helicopter 302: a move control 305, an ascend control 306, a descend control 307, a use control 308, and a parachute control 309. The move control 305, the ascend control 306, and the descend control 307 are used to control the movement of the helicopter 302 in the virtual environment. The use control 308 is used to control the helicopter 302 to use bulletproof smoke. The parachute control 309 is used to control the master virtual character to leave the helicopter 302: when the helicopter is airborne, the master virtual character leaves the helicopter and parachutes; when the helicopter is on the ground, the master virtual character simply leaves the helicopter.
For example, virtual weapons with a pursuit function, such as missiles and rocket launchers, are also provided in shooting games. Generally, such a virtual weapon consists of two parts: a launching prop and a projectile. Using a virtual weapon with a pursuit function involves three steps: aiming, locking, and shooting. For example, as shown in fig. 6, when the launching prop 402 is used by another virtual character 401, an aiming box 403 of the launching prop 402 is displayed on the user interface. The user adjusts the viewing angle of the other virtual character 401 so that the aiming box 403 of the launching prop 402 aims at the helicopter to be attacked and stays on it; during the stay, the launching prop 402 automatically searches for targets within the aiming box 403 and locks one of them. When a target is locked, it is marked with a locking box 404; for example, as shown in fig. 6, the launching prop locks the helicopter with the locking box 404. Illustratively, the user may change the target locked by the launching prop 402 by adjusting the position of the aiming box 403. Illustratively, it takes time for the launching prop 402 to lock onto a target; therefore, the user needs to aim at the target of interest with the aiming box 403 and stay there for a while before the launching prop can lock onto the target.
For example, the projectile can be launched from the launching prop in two states: an aiming state and a locked state. A projectile launched in the aiming state cannot chase a target and only flies along the aiming direction; a projectile launched in the locked state chases the locked target until it hits the target or another obstacle. When the launching prop has locked a target (the helicopter), the user controls the launching prop to launch the projectile (the second virtual prop), and the projectile chases the locked target. For example, the projectile can be guided to hit the target by adjusting its flight direction according to the position of the target. When the launching prop has not locked a target, the user controls the launching prop to launch the projectile, and the projectile cannot chase a target and is launched only along the current aiming direction.
Since a flying virtual vehicle is large, moves through the air, and encounters few obstacles there, a virtual character on the ground can easily lock onto it using a virtual weapon with a pursuit function. Once the virtual vehicle is locked by such a virtual weapon, it has difficulty escaping the pursuit and is shot down with high probability. In order to improve the safety of the flying virtual vehicle in the air, this embodiment equips the flying virtual vehicle with a first virtual prop for defending against missiles, for example, bulletproof smoke, a protective cover, heat source bombs, an electronic jamming prop, anti-radiation missiles, or a decoy prop, so as to improve the defensive capability of the flying virtual vehicle.
For example, when another virtual character uses a virtual weapon to lock onto the helicopter piloted by the master virtual character controlled by the first user, as shown in fig. 7, a prompt 310 that the helicopter 302 is locked is displayed on the first user's user interface and the use control 308 is highlighted, prompting the first user to trigger the use control 308 and use the first virtual prop to defend against missiles. Taking bulletproof smoke as an example of the first virtual prop: after the helicopter uses the bulletproof smoke, as shown in fig. 8, bulletproof smoke 311 is generated around the helicopter 302. If another virtual character then launches a missile at the helicopter, the bulletproof smoke 311 defends against the missile and protects the helicopter from the attack.
Illustratively, using bulletproof smoke protects the helicopter for a period of time, after which the smoke automatically dissipates. Illustratively, the use of bulletproof smoke is limited by a cooldown time and a number of uses. For example, each flying virtual vehicle can use bulletproof smoke only three times in one match, and after each use the smoke can be used again only after a cooldown time of 1 minute.
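The cooldown and use-count limits described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class name, the injectable clock, and the default numbers (three uses per match, 60-second cooldown) are assumptions taken from the example in the text.

```python
import time

class DefenseProp:
    """Illustrative sketch of a defensive prop with limited uses and a cooldown."""

    def __init__(self, max_uses=3, cooldown=60.0, clock=time.monotonic):
        self.max_uses = max_uses
        self.cooldown = cooldown          # seconds between uses
        self.uses_left = max_uses
        self._clock = clock               # injectable for testing
        self._last_used = None            # timestamp of the most recent use

    def can_use(self):
        if self.uses_left <= 0:
            return False                  # all uses for this match are spent
        if self._last_used is not None:
            if self._clock() - self._last_used < self.cooldown:
                return False              # still cooling down
        return True

    def use(self):
        """Consume one use; returns True if the prop was activated."""
        if not self.can_use():
            return False
        self.uses_left -= 1
        self._last_used = self._clock()
        return True
```

The clock is passed in rather than read globally so the limit logic can be driven by the game loop's own time source.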
For example, if the flying virtual vehicle has exhausted its bulletproof smoke and is then locked and chased by a virtual weapon, as shown in fig. 9, the user may control the master virtual character 301 to parachute out of the flying virtual vehicle, so as to prevent the missile from damaging the master virtual character 301 when it hits the virtual vehicle.
Fig. 10 is a flowchart of a method for controlling a virtual vehicle according to an exemplary embodiment of the present application. The execution subject of the method is exemplified by a client running on the terminal shown in fig. 1, the client being a client supporting a virtual environment, and the method includes at least the following steps.
Step 501, displaying a virtual environment picture, wherein the virtual environment picture includes a virtual vehicle, the virtual vehicle is equipped with a first virtual prop, the first virtual prop is used for producing a defense effect against the attack of a second virtual prop, and the second virtual prop is used for attacking the virtual vehicle.
Illustratively, after a battle starts, the client displays a user interface for the battle, which includes a virtual environment picture. Illustratively, the user interface may also include UI controls positioned over the virtual environment picture. Illustratively, the user interface of the battle may further include: a team-forming interface for forming a team with friends, a matching interface for matching the virtual character with other virtual characters, a loading interface for loading the information of the match, and the like.
Illustratively, after the match starts, the user controls the master virtual character to move in the virtual environment, and the client observes the virtual environment from the perspective of the master virtual character to acquire the virtual environment picture. The master virtual character in this embodiment is the virtual character controlled by the client. For example, when the master virtual character drives a virtual vehicle, the client observes the virtual environment from the perspective of the virtual vehicle or of the master virtual character to acquire the virtual environment picture.
For example, the virtual environment picture in step 501 is a picture acquired by observing the virtual environment from the perspective of the virtual vehicle or the master virtual character.
The perspective is an observation angle when the virtual character or the virtual vehicle is observed in the virtual environment from a first person perspective or a third person perspective.
Optionally, taking the viewing angle of a virtual character as an example, in the embodiment of the present application the viewing angle is the angle at which the virtual character is observed through a camera model in the virtual environment.
Optionally, the camera model automatically follows the virtual character in the virtual environment; that is, when the position of the virtual character in the virtual environment changes, the camera model changes along with it, and the camera model always stays within a preset distance of the virtual character in the virtual environment. Optionally, the relative position of the camera model and the virtual character does not change during the automatic following.
The camera model is a three-dimensional model positioned around the virtual character in the virtual environment. When a first-person perspective is adopted, the camera model is positioned near or at the head of the virtual character. When a third-person perspective is adopted, the camera model may be located behind the virtual character and bound to it, or located at any position a preset distance away from the virtual character, and the virtual character in the virtual environment can be observed from different angles through the camera model. Optionally, when the third-person perspective is a first-person over-the-shoulder perspective, the camera model is located behind the virtual character (for example, behind the head and shoulders of the virtual character). Optionally, in addition to the first-person and third-person perspectives, the viewing angle includes other perspectives, such as a top-down perspective; when a top-down perspective is used, the camera model may be positioned over the head of the virtual character, and the virtual environment is observed from an aerial view. Optionally, the camera model is not actually displayed in the virtual environment, i.e. the camera model does not appear in the virtual environment picture displayed by the user interface.
Taking as an example the case where the camera model is located at any position a preset distance away from the virtual character: optionally, one virtual character corresponds to one camera model, and the camera model may rotate with the virtual character as the rotation center. For example, the camera model rotates around any point of the virtual character as the rotation center; during the rotation the camera model not only turns but also translates, and the distance between the camera model and the rotation center remains constant. That is, the camera model moves on the surface of a sphere centered on the rotation center, where any point of the virtual character may be the head, the torso, or any point around the virtual character, which is not limited in the embodiment of the present application. Optionally, when the camera model observes the virtual character, the center of the camera model's viewing angle points in the direction from the point on the sphere where the camera model is located toward the center of the sphere.
Optionally, the camera model may also observe the virtual character at a preset angle in different directions of the virtual character.
Referring to fig. 11, schematically, a point in the virtual character 11 is determined as the rotation center 12, and the camera model rotates around the rotation center 12. Optionally, the camera model is configured with an initial position, which is a position above and behind the virtual character (for example, behind the head). Illustratively, as shown in fig. 11, the initial position is position 13, and when the camera model rotates to position 14 or position 15, the direction of the camera model's viewing angle changes with the rotation.
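The orbiting camera described above can be sketched with standard spherical coordinates: the camera stays at a fixed distance from the rotation center, and its view direction always points from its position on the sphere toward the center. The formula is an illustrative assumption; the patent does not prescribe one.

```python
import math

def camera_pose(center, radius, yaw, pitch):
    """Return (position, view_direction) for a camera orbiting `center`.

    yaw and pitch are angles in radians. The distance from the camera to
    the center is always `radius`, so as the angles change the camera
    moves on the surface of a sphere centered on the rotation center.
    """
    cx, cy, cz = center
    x = cx + radius * math.cos(pitch) * math.sin(yaw)
    y = cy + radius * math.sin(pitch)
    z = cz + radius * math.cos(pitch) * math.cos(yaw)
    pos = (x, y, z)
    # View direction: from the camera's point on the sphere toward the center.
    d = (cx - x, cy - y, cz - z)
    n = math.sqrt(sum(c * c for c in d))
    view = tuple(c / n for c in d)
    return pos, view
```

Because the radius is constant, rotating the camera changes both its position (a translation on the sphere) and its view direction, matching the behavior described for positions 13, 14, and 15 in fig. 11.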
Optionally, the virtual environment displayed in the virtual environment picture includes: stairs, ladders, rock-climbing areas, mountains, flat ground, rivers, lakes, seas, deserts, marshes, quicksand, sky, plants, buildings, and vehicles.
Illustratively, the virtual environment picture includes a virtual vehicle driven by the master virtual character. Illustratively, the virtual vehicle is a flying vehicle, a land vehicle, or a water vehicle. For example, flying vehicles include fighter jets, airplanes, helicopters, airships, hot-air balloons, paragliders, rockets, and the like; land vehicles include cars, tanks, skateboards, and the like; water vehicles include boats, aircraft carriers, submarines, naval vessels, and the like. The virtual vehicle may also be an amphibious vehicle, or one that travels on land, on water, and in the air. For example, after the match starts, the virtual vehicle is set at a random position in the virtual environment, or fixed at a designated position in the virtual environment. After entering the match, the master virtual character can search the virtual environment for a virtual vehicle and, after finding one, enter it and drive it through the virtual environment.
For example, the virtual vehicle has a state value that is reduced when the virtual vehicle is attacked, and a virtual character cannot drive the virtual vehicle when its state value is below a threshold. For example, when the virtual vehicle is a flying vehicle that is attacked in the air and its state value falls below the threshold, the virtual vehicle may crash.
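The state value described above can be sketched as follows; the class name, field names, and numbers are illustrative assumptions, not taken from the patent.

```python
class VirtualVehicle:
    """Minimal sketch: attacks lower the state value; below the
    threshold the vehicle can no longer be driven (a flying vehicle
    would crash)."""

    def __init__(self, state_value=100, threshold=0):
        self.state_value = state_value
        self.threshold = threshold

    def take_hit(self, damage):
        # Attacks reduce the state value; it never goes below zero.
        self.state_value = max(0, self.state_value - damage)

    def drivable(self):
        # At or below the threshold the vehicle is disabled.
        return self.state_value > self.threshold
```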
For example, taking the virtual vehicle as a flying vehicle, the master virtual character may drive the virtual vehicle to move freely through the three-dimensional space of the virtual environment. Illustratively, the virtual vehicle is equipped with a virtual prop, for example, a first virtual prop. For example, the virtual vehicle is equipped with virtual props that are used, launched, or released by the virtual vehicle, and these virtual props can only be used, launched, or released from the virtual vehicle. For example, any virtual character located in the virtual vehicle may control the virtual vehicle to use its equipped virtual props. Illustratively, the virtual vehicle includes a driver's seat and passenger seats, and the use of a virtual prop can be controlled by the virtual character in the driver's seat or by a virtual character in a passenger seat. That is, the master virtual character may be driving or riding in the virtual vehicle.
Illustratively, the first virtual prop is a defensive virtual prop equipped on the virtual vehicle. Illustratively, the first virtual prop is used to defend against a second virtual prop having a pursuit capability. Illustratively, the second virtual prop is a virtual prop for attacking the virtual vehicle. For example, the second virtual prop may be a virtual weapon such as a missile, bomb, electromagnetic pulse, firearm, mine, or grenade. Illustratively, the second virtual prop is a virtual prop with a pursuit function; for example, it may be a missile, rocket, bullet, shell, or bomb with a pursuit function. For example, the second virtual prop may be at least one of a surface-to-air missile, a man-portable air-defense missile, an air-to-air missile, a glide bomb, and a guided projectile.
For example, after being used, the first virtual prop may provide a protective layer for the virtual vehicle; when the virtual vehicle is hit by the second virtual prop, the protective layer absorbs the attack of the second virtual prop and protects the virtual vehicle. For example, the first virtual prop may be a virtual prop that, after use, generates at least one of bulletproof smoke, a protective cover, an energy wave, and a protective layer.
For example, the first virtual prop may also protect the virtual vehicle from the attack of the second virtual prop by other means. Depending on the state in which a second virtual prop with a pursuit function is used, first virtual props with different functions can be used to defend against its attack.
For example, when the second virtual prop has a pursuit function, it may be launched in an aiming state or in a locked state. When the second virtual prop is launched in the aiming state, it has no pursuit capability and flies out along the aiming direction; when the second virtual prop is launched in the locked state, it can chase the locked target.
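The two launch states can be sketched as a per-tick update: a projectile fired in the aiming state flies straight along the aim direction, while one fired in the locked state steers toward its target's current position. The simple pursuit guidance here is an assumption; the patent only says the projectile "chases" the target.

```python
import math

def step_projectile(pos, aim_dir, target, locked, speed=1.0):
    """Advance the projectile one tick and return its new position.

    pos, aim_dir, target are same-length coordinate tuples; `locked`
    says whether the projectile was launched in the locked state.
    """
    if locked and target is not None:
        # Locked state: home in on the target's current position.
        d = tuple(t - p for p, t in zip(pos, target))
    else:
        # Aiming state: keep flying along the original aim direction.
        d = aim_dir
    n = math.sqrt(sum(c * c for c in d)) or 1.0
    return tuple(p + speed * c / n for p, c in zip(pos, d))
```

Calling this every tick with the target's updated position reproduces the pursuit behavior; with `locked=False` the target argument is ignored entirely.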
Therefore, at the stage when the second virtual prop is aiming at the virtual vehicle to lock onto it, the master virtual character can use the first virtual prop to make the virtual vehicle stealthy, or emit electronic jamming, so that the second virtual prop cannot lock onto the virtual vehicle. Illustratively, the stealth of the virtual vehicle includes at least one of visual stealth and radar-wave stealth. Visual stealth may be implemented by changing the paint scheme of the three-dimensional virtual model of the virtual vehicle; radar-wave stealth makes the radar of the second virtual prop unable to perceive the virtual vehicle, and may be implemented by reducing the probability that the second virtual prop identifies the target object. Electronic jamming may interfere with the radar of the second virtual prop, reducing the probability that the second virtual prop locks onto the virtual vehicle.
After the second virtual prop has been launched in the locked state and is tracking the virtual vehicle by guidance, the master virtual character can use the first virtual prop to make the virtual vehicle stealthy, emit electronic jamming, or release heat source bombs, chaff, or a decoy, so that the second virtual prop cannot hit the virtual vehicle. Illustratively, when the second virtual prop is infrared-guided, the heat source bombs create multiple heat sources in the air that interfere with the second virtual prop's pursuit of the virtual vehicle. When the second virtual prop is radar-guided, the chaff creates multiple reflection sources in the air that interfere with the second virtual prop's pursuit of the virtual vehicle. For example, the decoy may simulate the radar and infrared signals of the virtual vehicle, so that releasing the decoy causes the second virtual prop to chase the decoy instead, thereby protecting the virtual vehicle. For example, the first virtual prop may also launch other virtual props to intercept the second virtual prop, so that the second virtual prop cannot hit the virtual vehicle.
For example, if the second virtual prop hits the virtual vehicle, it may produce a negative effect on the virtual vehicle, where the negative effect includes at least one of: reducing the state value of the virtual vehicle, reducing the durability of the virtual vehicle, destroying certain parts of the virtual vehicle so that it loses certain functions, reducing the moving speed of the virtual vehicle, increasing the fuel consumption of the virtual vehicle, stopping the virtual vehicle in place so that it cannot move, and destroying the virtual vehicle.
For example, the defense effect of the first virtual prop is to protect the virtual vehicle from the negative effect of the second virtual prop. For example, the virtual vehicle is protected so that the second virtual prop does not lower its state value or destroy it.
For example, as shown in fig. 5, a virtual environment picture is displayed, where the virtual environment picture includes a virtual vehicle (the helicopter 302); the virtual vehicle is equipped with a first virtual prop, and the user can control the virtual vehicle to use the first virtual prop by triggering the use control 308 on the user interface.
Step 502, in response to a use instruction for the first virtual prop, controlling the virtual vehicle to use the first virtual prop.
Illustratively, the use instruction of the first virtual prop is an instruction generated according to a use operation of the user. The use instruction is used for controlling the virtual vehicle to use the first virtual prop. For example, the user operation may be triggering a UI control (the use control) on the user interface, or an operation input through another input device, for example, a mouse click, an operation on a touch pad, an operation on a keyboard, a voice operation, or a motion operation through a virtual-reality or augmented-reality device.
Illustratively, the client controls the virtual vehicle to use the first virtual prop according to the use instruction, so as to generate a defense effect and protect the virtual vehicle.
For example, as shown in fig. 5, the user triggers the use control 308 to control the virtual vehicle to use the first virtual prop, and as shown in fig. 8, the virtual vehicle (helicopter 302) uses the first virtual prop to generate bulletproof smoke 311 around the virtual vehicle.
Step 503, in response to the second virtual prop attacking the virtual vehicle, controlling the first virtual prop to defend against the second virtual prop.
For example, after the virtual vehicle uses the first virtual prop to produce the defense effect, if the second virtual prop attacks the virtual vehicle, the first virtual prop defends against the second virtual prop, so that the second virtual prop cannot produce a negative effect on the virtual vehicle, or the negative effect of the second virtual prop on the virtual vehicle is reduced.
For example, the first virtual prop may defend against the second virtual prop in any of the ways described above. In this embodiment, only the case where the first virtual prop generates a protective layer and the protective layer defends against the second virtual prop is taken as an example.
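The hit resolution in this step can be sketched as follows: if the protective layer from the first virtual prop is active, the hit is absorbed and the vehicle takes no negative effect; otherwise the state value is reduced. The field and function names are illustrative assumptions, and for simplicity the shield here is not consumed by a hit (the text says the smoke lasts for a period of time rather than for one hit).

```python
def resolve_hit(vehicle, damage):
    """Apply a missile hit to a vehicle dict; returns 'absorbed' or 'damaged'."""
    if vehicle.get("shield_active"):
        # The protective layer absorbs the attack; no negative effect.
        return "absorbed"
    # No protection: the negative effect lowers the state value.
    vehicle["state_value"] = max(0, vehicle["state_value"] - damage)
    return "damaged"
```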
In summary, in the method provided by this embodiment, bulletproof smoke (the first virtual prop) is provided on the airplane; when the airplane is hit by a missile (the second virtual prop) after using the bulletproof smoke, the smoke defends against the missile and protects the airplane from the hit. When a user piloting the airplane finds that it is locked or chased by a missile, the user does not need to perform complex flight maneuvers to shake off the missile; the user only needs to control the airplane to use bulletproof smoke, and when the missile hits the airplane, the smoke provides defense, absorbs the missile's damage, and protects the airplane. This simplifies the operation of evading a pursuing missile, improves the escape probability of an airplane locked by a missile, and improves the efficiency of human-computer interaction when evading missiles.
For example, in an optional embodiment, the first virtual prop is a virtual prop that can generate a protective layer around the virtual vehicle; the first virtual prop is effective for a period of time after use, and each virtual vehicle can use the first virtual prop only a limited number of times. In another optional embodiment, the second virtual prop is a virtual prop with a pursuit function, and when the virtual vehicle is locked, prompt information that the virtual vehicle is locked is displayed on the user interface, so that the user controls the virtual vehicle to defend itself using the first virtual prop.
Fig. 12 is a flowchart of a method for controlling a virtual vehicle according to an exemplary embodiment of the present application. Taking as an example that the execution subject of the method is the client running on the terminal shown in fig. 1, where the client is a client supporting a virtual environment, based on the exemplary embodiment shown in fig. 10, steps 601 to 602 are added before step 502, step 502 further includes step 5021, and step 503 further includes step 5031.
Step 601, in response to a launching prop locking the virtual vehicle, displaying prompt information that the virtual vehicle is locked.
Illustratively, the second virtual prop is a virtual prop launched using a launching prop. That is, the second virtual prop and the launching prop are paired props, and the second virtual prop is the projectile of the launching prop. For example, the launching prop is a rocket launcher and the second virtual prop is a rocket or a shell. The launching prop locks a target and launches the second virtual prop as follows: aim at the target, lock the target, and launch the second virtual prop. For example, another virtual character is equipped with a launching prop; after the second virtual prop is loaded, the launching prop is used for aiming, and the client automatically identifies targets within the aiming range and locks one of them. After the launching prop locks the target, the user triggers the shooting control to launch the second virtual prop, which then locks onto the target and chases it.
The launching prop may be a virtual prop used by another virtual character, or a virtual prop automatically controlled by the server or the client. The other virtual character is the virtual character that uses the launching prop to launch a second virtual prop to attack the virtual vehicle. Illustratively, the other virtual character may be a virtual character controlled by another client or a virtual character controlled by the server. For example, when the launching prop is a virtual prop automatically controlled by the server or the client, another virtual character can place the launching prop at a certain position in the virtual environment, and the launching prop automatically searches for targets entering its attack range and locks a target to launch the second virtual prop.
Illustratively, if the second virtual prop is launched after the launching prop locks the virtual vehicle, the second virtual prop chases the virtual vehicle; if the second virtual prop is launched while the launching prop is merely aiming at the virtual vehicle, the second virtual prop flies along the aiming direction and cannot chase the virtual vehicle.
For example, the client determines that the launching prop has locked the virtual vehicle according to a locking instruction sent by the server, where the locking instruction is generated after the duration for which the launching prop aims at the virtual vehicle reaches a time threshold.
For example, when the launching prop is a virtual prop controlled by another virtual character, the other virtual character locks the virtual vehicle with the launching prop as follows: aim at the virtual vehicle and keep it within the aiming range until the duration reaches a time threshold, at which point the launching prop locks the virtual vehicle. Illustratively, the other client corresponding to the other virtual character reports that character's current aiming state to the server in real time according to the user's operations; the server records how long each target stays within the aiming range, and when the duration meets the time threshold, it determines that the launching prop has locked the target. When several targets meet the time threshold at the same time, the server also compares the area each target occupies within the aiming range and determines the target with the largest occupied area as the locked target. For example, the locking process is dynamic: after the launching prop has locked one target, if another target occupying a larger area of the aiming range meets the time threshold, the launching prop will lock the new target instead.
For example, the server may determine the locked target within the aiming range according to at least one of: the time each target stays within the aiming range, the area it occupies within the aiming range, the type of the target, and the distance between the target and the launching prop.
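The server-side lock selection described above can be sketched as follows: a target becomes eligible once its dwell time within the aiming range reaches the time threshold, and ties between eligible targets are broken by the largest occupied area. This is one reading of the text, with illustrative field names.

```python
def pick_locked_target(candidates, time_threshold):
    """Select the locked target, or None if nothing is lockable yet.

    candidates: list of dicts with keys 'id', 'dwell_time' (seconds the
    target has stayed within the aiming range) and 'area' (share of the
    aiming range the target occupies).
    """
    eligible = [c for c in candidates if c["dwell_time"] >= time_threshold]
    if not eligible:
        return None  # no lock: a projectile would fly along the aim direction
    # Tie-break among eligible targets by largest occupied area.
    return max(eligible, key=lambda c: c["area"])["id"]
```

Re-running this selection each tick also reproduces the dynamic behavior described above: when a larger target later meets the threshold, the lock switches to it.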
When the other virtual character controls the launching prop to launch a second virtual prop, if the launching prop currently has a locked target, the second virtual prop chases the locked target; if the launching prop currently has no locked target, the second virtual prop flies along the aiming direction.
For example, as shown in fig. 6, a user interface corresponding to another virtual character is provided, on which an aiming box 403 is displayed; the range within the aiming box 403 is the aiming range of the launching prop 402. When the other virtual character aims at the helicopter and the time the helicopter stays within the aiming box 403 reaches the time threshold, the launching prop 402 locks the helicopter, which is circled with a locking box 404 to inform the user controlling the other virtual character that the target currently locked by the launching prop is the helicopter. If the user then triggers the shooting control 405 to shoot a second virtual prop, the second virtual prop chases the locked helicopter.
For example, when the launching prop is fixed at a certain position in the virtual environment and automatically attacks targets entering its attack range, the server may further determine that the launching prop locks the virtual vehicle in response to the duration for which the virtual vehicle stays within the attack range of the launching prop meeting a time threshold.
For example, when the virtual vehicle is locked, the server may send a locking instruction to the client, where the locking instruction is used to notify the client that the virtual vehicle currently driven by the master virtual character is locked, so that the client displays prompt information in the user interface indicating that the virtual vehicle is locked. The prompt information is used to prompt the user that the current virtual vehicle is locked by a virtual prop with a chasing function and may be attacked, so that the user can prepare a defense in advance.
Illustratively, the prompt message may be at least one of text message, icon, special effect, animation displayed on the user interface, voice broadcast message played by a speaker, and prompt tone. For example, as shown in fig. 7, when helicopter 302 is locked, a "locked" prompt 310 is displayed on the user interface.
For example, the step of determining that the virtual vehicle is locked may be performed by the client according to control information of the other virtual characters sent by the server; alternatively, the server may determine that the virtual vehicle is locked, send a locking instruction to the client when the virtual vehicle is locked, and the client then determines that the virtual vehicle is locked according to the locking instruction.
In an optional embodiment, after the launching prop locks the virtual vehicle and launches the second virtual prop, the client may further display a distance between the second virtual prop and the virtual vehicle, or display a time that the second virtual prop is expected to reach the virtual vehicle.
For example, after the launching prop locks the virtual vehicle and launches the second virtual prop, the client may further control the second virtual prop to periodically emit a ray in the direction of the virtual vehicle; when the ray collides with the three-dimensional virtual model of the virtual vehicle, the length of the ray is obtained, and that length is the distance between the second virtual prop and the virtual vehicle. For example, the client may further calculate, according to the flight speed of the second virtual prop, the time at which the second virtual prop will reach the virtual vehicle, and display the distance or the time on the user interface, so as to prompt the user how soon the virtual vehicle is expected to be attacked by the second virtual prop and allow the user to defend using the first virtual prop.
For example, after the launching prop locks the virtual vehicle and launches the second virtual prop, the client may further periodically obtain the positions of the second virtual prop and the virtual vehicle, calculate the distance between them, and display the distance on the user interface, so that the user can defend against the second virtual prop using the first virtual prop in advance.
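Both distance-reporting approaches above reduce to computing a distance and, given the flight speed, an estimated arrival time. A minimal sketch (function and parameter names are assumptions):

```python
import math

def distance_and_eta(missile_pos, vehicle_pos, missile_speed):
    """Distance between the second virtual prop (e.g., a missile) and the
    virtual vehicle, plus the estimated time to reach it at constant speed."""
    dist = math.dist(missile_pos, vehicle_pos)  # Euclidean distance in 3D
    eta = dist / missile_speed if missile_speed > 0 else float("inf")
    return dist, eta
```

Whether the distance comes from a ray length or from polling both positions, the same value feeds the on-screen countdown.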
Step 602, in response to the launching prop locking the virtual vehicle, displaying the use control in a target display mode.
Illustratively, a use control corresponding to the first virtual prop is further displayed over the virtual environment picture in the user interface. If the use instruction of the first virtual prop is generated according to a trigger operation received by the use control, the client may also display the use control distinctively after determining that the virtual vehicle is locked, so as to prompt the user to use the first virtual prop to defend against a possible attack.
For example, the target display mode may be at least one of: highlighting the use control; changing the style, color, size, or position of the use control; or controlling the use control to flash. The target display mode is used to make the use control prominent, so that the user can trigger the use control in time to use the first virtual prop.
For example, as shown in fig. 7, when the virtual vehicle is locked, the client displays a "locked" prompt 310 on the user interface while highlighting the use control 308.
In an optional embodiment, in response to the launching prop locking the virtual vehicle, the client may further obtain the number of first virtual props currently available to the virtual vehicle, and when that number is 0, the client may display a parachute jumping control in the target display mode. For example, the client highlights the parachute jumping control, so as to prompt the user that the current virtual vehicle cannot use the first virtual prop for defense and that the attack of the second virtual prop should be avoided by parachute jumping.
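The decision of which control to display distinctively, as described above, might be sketched as follows (the control names and return convention are hypothetical):

```python
def control_to_highlight(is_locked, available_uses):
    """When the vehicle is locked, highlight the use control if the first
    virtual prop can still be used; otherwise highlight the parachute
    jumping control. Returns None when no highlighting is needed."""
    if not is_locked:
        return None
    return "use_control" if available_uses > 0 else "parachute_control"
```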
Step 5021, in response to a use instruction of the first virtual prop, controlling the virtual vehicle to use the first virtual prop within a target range, where the target range includes a range determined according to the position of the virtual vehicle.
For example, the client may display the special usage effect of the first virtual item within the target range around the virtual vehicle. Illustratively, the target range is a range around the virtual vehicle.
For example, taking as an example that the first virtual prop provides a protective layer for the virtual vehicle after use, the virtual vehicle may use the first virtual prop within the target range and display a use special effect in the target range, where the special effect is used to prompt the user that the current virtual vehicle is protected by the first virtual prop.
For example, the use effect may be at least one of a protective film effect, a smoke effect, a streamer effect, an icon, a text, an animation, and a change in paint color of the virtual vehicle.
The target range is a range determined according to the position of the virtual vehicle. For example, the target range may be a fixed range determined according to the position of the virtual vehicle when using the first virtual prop, or may be a movable range determined according to the real-time moving position of the virtual vehicle. For example, the target range is a three-dimensional space range in the virtual environment, and the shape of the target range may be arbitrary, for example, the target range may be any one of a spherical range, a cubic range, a rectangular parallelepiped range, a conical range, and an irregular range. Illustratively, the target range is a defense range in which the first virtual prop has a defense effect.
For example, the target range may be a spherical range centered on the virtual carrier, and the protective film special effect may be displayed on the edge of the spherical range.
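For a spherical target range centered on the virtual vehicle, the membership test is a simple distance check; a minimal sketch, assuming 3D positions represented as tuples:

```python
import math

def in_spherical_range(vehicle_pos, point, radius):
    """Membership test for a spherical target range centered on the vehicle.
    A movable range simply re-evaluates this with the vehicle's live position;
    a fixed range keeps the position captured at the moment of use."""
    return math.dist(vehicle_pos, point) <= radius
```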
For another example, the virtual vehicle may generate smoke around itself by using the first virtual prop, where the range occupied by the smoke is the target range; that is, the target range includes the range of the smoke sprayed by the virtual vehicle and the range covered by the smoke as it spreads. As shown in fig. 8, helicopter 302 uses a first virtual prop to display bulletproof smoke 311 around helicopter 302. After helicopter 302 has used the first virtual prop, the virtual characters on the ground can see that helicopter 302 is surrounded by bulletproof smoke 311, as shown in fig. 13.
For example, the virtual vehicle may use the first virtual prop only a limited number of times, i.e., the virtual vehicle has a corresponding number of times the first virtual prop is available for use. The available number of times is the number of times the current virtual vehicle can still use the first virtual prop. In response to the virtual vehicle using the first virtual prop, the available number of times of the virtual vehicle is reduced. For example, if the available number of times of the current virtual vehicle is 3, the available number of times is reduced by one each time the first virtual prop is used, and when the available number of times reaches 0, the virtual vehicle can no longer use the first virtual prop.
For example, if only three first virtual props can be installed on the virtual vehicle, the number of first virtual props is reduced by 1 each time the virtual vehicle uses one; after 3 uses, the virtual vehicle no longer has any first virtual props.
For example, in an alternative implementation, the virtual character may further increase the number of times the virtual vehicle can use the first virtual prop by replenishing the virtual vehicle with first virtual props.
Illustratively, use of the first virtual prop also has a cooling time, i.e., the virtual vehicle cannot use the first virtual prop repeatedly within a short time. After the virtual vehicle uses the first virtual prop, the cooling time of the first virtual prop counts down, and after the cooling time ends, the virtual vehicle can use the first virtual prop again.
For example, as shown in fig. 7, use control 308 displays that the number of times helicopter 302 can use the first virtual prop is 3. After helicopter 302 uses the first virtual prop, as shown in fig. 8, the available number on use control 308 becomes 2 and use control 308 enters an un-triggerable state; a use control 308 in the un-triggerable state cannot receive the user's trigger operation, that is, the user cannot control the virtual vehicle to use the first virtual prop.
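The available-times and cooling-time bookkeeping described above can be sketched together as follows (the class name, default values, and the game-clock convention are assumptions, not from the patent):

```python
class FirstPropState:
    """Tracks the available uses and cooling time of the first virtual prop."""

    def __init__(self, uses=3, cooldown=10.0):
        self.uses = uses          # available number of times
        self.cooldown = cooldown  # cooling time in seconds
        self.ready_at = 0.0       # game-clock time when usable again

    def try_use(self, now):
        """Attempt to use the prop at game-clock time `now`; returns False
        while the control is in the un-triggerable state."""
        if self.uses <= 0 or now < self.ready_at:
            return False
        self.uses -= 1
        self.ready_at = now + self.cooldown
        return True
```

The client can drive the control's displayed count and triggerable/un-triggerable state directly from `uses` and `ready_at`.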
Step 5031, in response to the collision of the second virtual item with the collision box, controlling the first virtual item to defend the second virtual item.
Illustratively, a collision box is arranged in the target range, and the collision box is used to detect whether the second virtual prop enters the defense range of the first virtual prop, so as to determine whether the first virtual prop can defend the second virtual prop.
Illustratively, a collision box is arranged within the target range, and the collision box is used to detect a collision between the second virtual prop and the target range. In response to the three-dimensional virtual model of the second virtual prop colliding with the collision box, the client determines that the second virtual prop has entered the defense range of the first virtual prop, and controls the first virtual prop to defend against the second virtual prop.
The collision box is a three-dimensional box arranged over the target range. Illustratively, the collision box is invisible, i.e., the user cannot see the collision box in the virtual environment picture. Illustratively, the size and shape of the collision box are determined according to the size and shape of the target range. For example, the collision box has the same size and shape as the target range; alternatively, the collision box is slightly smaller than the target range, or slightly larger so that it wraps the target range. For example, the number of collision boxes may be arbitrary: the client may wrap the entire target range with one collision box or fill the target range with a plurality of collision boxes. For example, since the target range has a variety of alternative implementations, the collision boxes over the target range may fully wrap the virtual vehicle or only partially wrap it.
Illustratively, to simplify calculation, the collision box is generally a regular shape, for example, a cube, a cuboid, a sphere, a pyramid, or a cylinder.
For example, as shown in fig. 14, when the target range is a rectangular parallelepiped range centered on the virtual vehicle, a rectangular parallelepiped collision box 312 is arranged outside helicopter 302, and the collision box 312 wraps the entire helicopter 302.
As another example, as shown in fig. 15, when the target range is an irregular range determined according to the position of the virtual vehicle, a plurality of collision boxes 312 are arranged outside helicopter 302, and the collision boxes 312 partially wrap helicopter 302.
For example, the collision box can detect collisions in the virtual environment. When a three-dimensional virtual model collides with a surface of the collision box, the collision box generates collision information, which includes at least one of: information of the three-dimensional virtual model, the collision point, and the collision time. The information of the virtual model includes at least one of: the type, size, and material of the three-dimensional virtual model. Illustratively, the second virtual prop has a three-dimensional virtual model in the virtual environment; when the virtual model of the second virtual prop collides with the collision box, the collision box acquires collision information, and the client determines according to the collision information that the second virtual prop has entered the target range.
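An axis-aligned collision box with the collision-information behavior described above might look like this minimal sketch (class, field, and dictionary key names are assumptions):

```python
class CollisionBox:
    """Axis-aligned collision box over the target range; reports collision
    information when a model's position enters the box."""

    def __init__(self, center, half_extents):
        self.center = center
        self.half = half_extents

    def contains(self, point):
        """True if `point` lies inside the box on every axis."""
        return all(abs(p - c) <= h
                   for p, c, h in zip(point, self.center, self.half))

    def collide(self, model_info, point, time):
        """Return collision information (model info, collision point,
        collision time) if the point is inside the box, else None."""
        if not self.contains(point):
            return None
        return {"model": model_info, "point": point, "time": time}
```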
For example, the second virtual prop is a virtual prop that flies toward a target after use, for example, a virtual prop that is used by being launched, thrown, or ejected. After being launched, the second virtual prop periodically emits detection rays in its flight direction, and the detection rays are used to detect a collision in advance.
In response to a detection ray of the second virtual prop colliding with the collision box, the first virtual prop is controlled to defend against the second virtual prop.
The client may further control the second virtual prop to periodically emit a detection ray after launch, where the detection ray is emitted along the flight direction of the second virtual prop, that is, directly in front of the second virtual prop. Illustratively, the emission distance of the detection ray may be arbitrary. Illustratively, the emission distance of the detection ray is short, so that the detection ray can rapidly and frequently probe objects the second virtual prop may collide with ahead. For example, as shown in fig. 16, detection rays 314 are periodically emitted from the front end of the second virtual prop 313, and the detection rays are used to detect obstacles in front of the second virtual prop, so as to determine the target the second virtual prop will hit.
Illustratively, when a detection ray collides with the collision box, collision information is generated, and the client determines according to the collision information that the second virtual prop has entered the defense range of the first virtual prop, thereby controlling the first virtual prop to defend against the second virtual prop.
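A short detection ray against an axis-aligned collision box is commonly tested with the slab method; the sketch below uses that algorithm as an assumption, since the patent does not specify how the ray-box intersection is computed:

```python
def ray_hits_box(origin, direction, max_dist, box_min, box_max):
    """Slab test: does a detection ray cast from `origin` along `direction`
    (component form), limited to `max_dist`, hit the axis-aligned box
    [box_min, box_max]? `max_dist` models the short emission distance."""
    t_near, t_far = 0.0, max_dist
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-9:            # ray parallel to this slab pair
            if o < lo or o > hi:
                return False         # outside the slab, can never enter
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        if t1 > t2:
            t1, t2 = t2, t1
        t_near, t_far = max(t_near, t1), min(t_far, t2)
        if t_near > t_far:
            return False
    return True
```

Because only a ray (not the prop's full three-dimensional model) is tested, each periodic check is a handful of comparisons, which matches the stated goal of reducing the client's computing load.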
For example, the protection provided to the virtual vehicle by the first virtual prop is time-limited: the first virtual prop can protect the virtual vehicle from the attack of the second virtual prop only within the effective duration after use. As shown in fig. 17, step 502 further includes step 701, and step 503 further includes step 5032, step 801, and step 802.
Step 701, timing the use duration of the first virtual item.
Illustratively, when the virtual vehicle uses the first virtual prop, the client starts timing the use duration of the first virtual prop, and determines according to the use duration whether the first virtual prop is currently effective. The use duration is the length of time the first virtual prop has been in effect; the effective duration is the longest time the first virtual prop can protect the virtual vehicle. When the use duration of the first virtual prop reaches the effective duration, the first virtual prop is destroyed and no longer protects the virtual vehicle.
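The use-duration check above amounts to one comparison; a sketch (function and parameter names are assumptions):

```python
def is_prop_active(use_start, now, effective_duration):
    """The first virtual prop protects the vehicle only while its use
    duration (now - use_start) is below the effective duration."""
    return (now - use_start) < effective_duration
```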
Step 5032, in response to the second virtual prop attacking the virtual vehicle and the use duration of the first virtual prop being less than the effective duration, controlling the first virtual prop to defend against the second virtual prop.
For example, the first virtual item may defend against the second virtual item only if the second virtual item attacks the virtual vehicle while the first virtual item is in an active state.
Step 801, in response to the use duration of the first virtual prop reaching the effective duration, destroying the first virtual prop.
For example, the first virtual item may disappear from the virtual environment after the usage duration reaches the effective duration, and no longer provides a defense effect for the virtual vehicle. For example, the first virtual prop can continuously protect the virtual vehicle from being attacked by the second virtual prop within 1 minute after use, and after 1 minute, the defense effect of the first virtual prop disappears, and at this time, the second virtual prop hitting the virtual vehicle can generate a negative effect on the virtual vehicle.
Illustratively, in response to the use duration of the first virtual prop reaching the effective duration, the display of the use special effect of the first virtual prop is cancelled.
Step 802, in response to the number of the second virtual items defended by the first virtual item reaching the number threshold, destroying the first virtual item.
For example, the destruction of the first virtual prop may be determined according to the effective duration of the first virtual prop, or according to the number of second virtual props the first virtual prop has defended against. When the number of second virtual props defended against by the first virtual prop reaches a number threshold, the first virtual prop is destroyed. For example, the number threshold is 1, i.e., the first virtual prop can defend the virtual vehicle against only one attack of the second virtual prop.
For example, the destruction of the first virtual prop may further be determined according to the damage value of the second virtual prop: each time the first virtual prop defends against a second virtual prop, the defense value of the first virtual prop is reduced by the damage value of the second virtual prop, and when the defense value of the first virtual prop falls below a threshold, the first virtual prop is destroyed.
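The two destruction conditions just described, a count threshold on defended attacks and a damage-depleted defense value, can be tracked together; a sketch (class and field names are assumptions, not from the patent):

```python
class ShieldState:
    """Destruction bookkeeping for the first virtual prop: destroyed when
    it has defended against enough second virtual props, or when damage
    exhausts its defense value."""

    def __init__(self, count_threshold, defense_value):
        self.defended = 0
        self.count_threshold = count_threshold
        self.defense_value = defense_value

    def defend(self, damage):
        """Absorb one second-virtual-prop attack; return True if the
        prop is destroyed afterwards."""
        self.defended += 1
        self.defense_value -= damage
        return (self.defended >= self.count_threshold
                or self.defense_value <= 0)
```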
In summary, in the method provided in this embodiment, after the virtual vehicle uses the first virtual item, the use special effect is displayed around the virtual vehicle, so as to inform the user that the virtual vehicle is currently protected by the first virtual item, and the user does not need to perform other operations of avoiding the second virtual item.
In the method provided by this embodiment, the collision box is set in the target range, and the collision box is used to detect whether the second virtual prop enters the defense range of the first virtual prop, so as to control the first virtual prop to defend the second virtual prop. And the collision box is used for collision detection, so that the information such as the time when the second virtual prop enters the defense range and the collision point can be accurately acquired, and the first virtual prop is accurately controlled to defend the second virtual prop.
In the method provided by this embodiment, detection rays are periodically emitted after the second virtual prop is launched, and the distance between the second virtual prop and the collision box is probed using the detection rays; when a detection ray collides with the collision box, it is determined that the second virtual prop has entered the defense range of the first virtual prop, and the first virtual prop is controlled to defend against the second virtual prop. Using detection rays instead of the three-dimensional virtual model of the second virtual prop simplifies the client's logical operations during collision detection and reduces the client's computing load.
According to the method provided by this embodiment, when the virtual vehicle is locked by the launching prop, prompt information that the virtual vehicle is locked is displayed on the user interface, informing the user that the virtual vehicle is locked and may be attacked; the user can thus decide in advance whether to defend using the first virtual prop or to escape by parachuting. This gives the user reaction time and reduces the difficulty of escaping a missile's pursuit.
According to the method provided by the embodiment, when the virtual carrier is locked by the launched prop, the use control corresponding to the first virtual prop is displayed in a distinguishing manner, so that the user is prompted to use the first virtual prop to avoid pursuing of the second virtual prop, the escape operation of the user is facilitated, the difficulty of the escape operation of the user is reduced, and the human-computer interaction efficiency is improved.
According to the method provided by the embodiment, when the virtual carrier is locked by the launching prop, whether the virtual carrier can use the first virtual prop is detected, when the virtual carrier cannot use the first virtual prop, the parachute jumping control is displayed in a distinguishing manner, the user is prompted to avoid pursuing of the second virtual prop through parachute jumping, escape operation of the user is facilitated, difficulty of escape operation of the user is reduced, and human-computer interaction efficiency is improved.
In the method provided by this embodiment, an effective duration is set for the first virtual prop, and the use duration of the first virtual prop is timed after use, so that protection is provided to the virtual vehicle only while the first virtual prop is effective. When the use duration of the first virtual prop reaches the effective duration, or when the number of second virtual props defended against by the first virtual prop reaches the number threshold, the first virtual prop is destroyed and no longer protects the virtual vehicle, preventing excessive defensive power of the virtual vehicle from upsetting the match balance.
The method provided by this embodiment limits the defensive capability of the virtual vehicle by limiting the number of times the virtual vehicle can use the first virtual prop, avoiding an impact on the match balance due to excessive defensive capability of the virtual vehicle.
The following gives an exemplary embodiment in which the virtual vehicle control method provided by the present application is applied in a first-person shooter game.
Fig. 18 is a flowchart of a method for controlling a virtual vehicle according to an exemplary embodiment of the present application. The execution subject of the method is exemplified by a client running on the terminal shown in fig. 1, the client being a client supporting a virtual environment. The method comprises the following steps.
Step 901, controlling the virtual character to search for a fighter.
For example, various virtual vehicles are provided for virtual characters in the virtual environment, among which the fighter is equipped with bulletproof smoke to defend against missile attacks.
Step 902, determine whether the virtual character is near a fighter. If so, go to step 903; otherwise, go to step 901.
In step 903, a button for driving the fighter is displayed on the user interface.
Step 904, determining whether the user clicks the button; if so, go to step 905; otherwise, go to step 903.
And step 905, controlling the virtual character to enter the fighter and displaying the UI control corresponding to the fighter in driving.
Step 906, judging whether the fighter is locked by the missile, if so, performing step 907, otherwise, performing step 905.
Step 907, displaying the warning on the user interface and highlighting the bulletproof smoke control.
Illustratively, a prompt message that the fighter is locked is displayed on the user interface, and a use control corresponding to the bulletproof smoke is highlighted.
Step 908, determining whether the user clicks the bulletproof smoke control; if so, go to step 909; otherwise, go to step 907.
Step 909, controlling the fighter to use the bulletproof smoke and generate smoke around the fighter.
Step 910, determining whether the fighter is attacked by a missile; if so, go to step 911; otherwise, go to step 909.
Step 911, determining that the missile attack is invalidated.
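The flow of steps 901 to 911 can be sketched as a small state machine over a scripted event sequence (the event and action names are assumptions for illustration, not from the patent):

```python
def fighter_defense_flow(events):
    """Walk the flow of fig. 18 over a list of events, returning the
    client actions taken at each transition (simplified simulation)."""
    state = "searching"
    log = []
    for ev in events:
        if state == "searching" and ev == "near_fighter":
            log.append("show_drive_button"); state = "can_board"
        elif state == "can_board" and ev == "click_drive":
            log.append("enter_fighter"); state = "driving"
        elif state == "driving" and ev == "missile_lock":
            log.append("warn_and_highlight_smoke"); state = "locked"
        elif state == "locked" and ev == "click_smoke":
            log.append("release_smoke"); state = "protected"
        elif state == "protected" and ev == "missile_hit":
            log.append("attack_invalidated")
    return log
```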
In summary, in the method provided by this embodiment, bulletproof smoke is provided on the fighter. When the fighter is locked, the user is prompted that the fighter is currently locked and the use control corresponding to the bulletproof smoke is highlighted, so that the user uses the bulletproof smoke to generate smoke around the fighter; when the fighter is then hit by a missile, the missile attack is invalidated. This protects the fighter from missile attacks, simplifies the user's operations for controlling the fighter to evade missiles, and improves human-computer interaction efficiency.
The above embodiments describe the above method based on the application scenario of the game, and the following describes the above method by way of example in the application scenario of military simulation.
The simulation technology is a model technology which reflects system behaviors or processes by simulating real world experiments by using software and hardware.
The military simulation program is a program specially constructed for military application by using a simulation technology, and is used for carrying out quantitative analysis on sea, land, air and other operational elements, weapon equipment performance, operational actions and the like, further accurately simulating a battlefield environment, presenting a battlefield situation and realizing the evaluation of an operational system and the assistance of decision making.
In one example, soldiers establish a virtual battlefield at a terminal where military simulation programs are located and fight in a team. The soldier controls the virtual character in the virtual battlefield environment to perform at least one operation of standing, squatting, sitting, lying on the back, lying on the stomach, lying on the side, walking, running, climbing, driving, shooting, throwing, attacking, injuring, reconnaissance, close combat and other actions in the virtual battlefield environment. The battlefield virtual environment comprises: at least one natural form of flat ground, mountains, plateaus, basins, deserts, rivers, lakes, oceans and vegetation, and site forms of buildings, vehicles, ruins, training fields and the like. The virtual roles include: virtual characters, virtual animals, cartoon characters and the like, wherein each virtual character has the shape and the volume of the virtual character in the three-dimensional virtual environment and occupies a part of the space in the three-dimensional virtual environment.
Based on the above, in one example, soldier A controls virtual character a to drive a fighter moving in the virtual environment. When the fighter is locked by a guided weapon, a prompt that the fighter is locked is displayed on the user interface, and the use control corresponding to the bulletproof smoke is highlighted, so that soldier A triggers the use control to make the fighter use the bulletproof smoke; when a missile attacks the fighter, the bulletproof smoke protects the fighter from the missile attack.
In summary, in the embodiment, the control method of the virtual vehicle is applied to a military simulation program, and is used for simulating a situation that a soldier avoids a guided weapon in a flight process, so that the soldier can be better trained.
The following are apparatus embodiments of the present application; for details not described in the apparatus embodiments, reference may be made to the method embodiments above.
Fig. 19 is a block diagram of a control apparatus of a virtual vehicle according to an exemplary embodiment of the present application. The device comprises:
a display module 1001, configured to display a virtual environment screen, where the virtual environment screen includes: a virtual vehicle equipped with a first virtual prop for creating a defensive effect against an attack of a second virtual prop for attacking the virtual vehicle;
a control module 1002, configured to control, in response to a usage instruction of a first virtual item, a virtual vehicle to use the first virtual item;
the control module 1002 is further configured to control the first virtual prop to defend the second virtual prop in response to the second virtual prop attacking the virtual vehicle.
In an optional embodiment, the control module 1002 is further configured to, in response to the instruction for using the first virtual prop, control the virtual vehicle to use the first virtual prop within a target range, where the target range includes a range determined according to the position of the virtual vehicle.
In an alternative embodiment, a crash box is provided within the target area; the device further comprises:
a detecting module 1003, configured to detect a collision generated by the second virtual item and the collision box;
the control module 1002 is further configured to control the first virtual prop to defend the second virtual prop in response to the second virtual prop colliding with the collision box.
In an optional embodiment, the second virtual prop periodically emits a detection ray to the flight direction after emitting;
the detecting module 1003 is further configured to detect a collision between the detection ray of the second virtual item and the collision box;
the control module 1002 is further configured to control the first virtual prop to defend the second virtual prop in response to a collision of the detection ray of the second virtual prop with the collision box.
In an alternative embodiment, the second virtual item is a virtual item launched using a launch item;
the display module 1001 is further configured to, in response to the launching prop locking the virtual vehicle, display a prompt that the virtual vehicle is locked.
In an optional embodiment, the apparatus further comprises:
a receiving module 1010, configured to receive a locking instruction sent by a server;
a determining module 1004, configured to determine, according to the locking instruction sent by the server, that the launched prop locks the virtual vehicle, where the locking instruction is generated after an aiming duration for the launched prop to aim at the virtual vehicle reaches a time threshold.
In an alternative embodiment, the second virtual item is a virtual item launched using a launch item; a use control corresponding to the first virtual prop is also displayed on the virtual environment picture, and a use instruction of the first virtual prop is generated according to a trigger operation received by the use control;
the display module 1001 is further configured to, in response to the launching prop locking the virtual vehicle, display the use control in the target display mode.
In an optional embodiment, the first virtual item corresponds to an effective time length; the device further comprises:
a timing module 1005, configured to time a duration of use of the first virtual item;
the control module 1002 is further configured to, in response to the second virtual item attacking the virtual vehicle, and the usage duration of the first virtual item is less than the valid duration, control the first virtual item to defend the second virtual item.
In an optional embodiment, the first virtual prop corresponds to an effective duration; the device further comprises:
a destroying module 1006, configured to destroy the first virtual prop in response to the usage duration of the first virtual prop reaching the effective duration;
or, alternatively,
the destroying module 1006 is further configured to destroy the first virtual prop in response to the number of second virtual props defended against by the first virtual prop reaching a quantity threshold.
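The two destruction conditions above, together with the effective-duration check from the preceding embodiment, can be combined into one sketch. This Python class is an illustrative assumption; names such as `ShieldProp` and `count_threshold` are not from the patent.

```python
# Illustrative sketch: the defensive prop is destroyed when its usage duration
# reaches the effective duration, or when its defended count reaches a threshold.

class ShieldProp:
    def __init__(self, effective_duration: float, count_threshold: int):
        self.effective_duration = effective_duration
        self.count_threshold = count_threshold
        self.usage_duration = 0.0
        self.defended = 0
        self.destroyed = False

    def tick(self, dt: float):
        """Advance the usage timer; destroy the prop once time runs out."""
        self.usage_duration += dt
        if self.usage_duration >= self.effective_duration:
            self.destroyed = True

    def defend(self) -> bool:
        """Defend against one incoming prop; returns False once the shield is gone."""
        if self.destroyed or self.usage_duration >= self.effective_duration:
            return False                  # past the effective duration: no defense
        self.defended += 1
        if self.defended >= self.count_threshold:
            self.destroyed = True         # defended quota exhausted
        return True
```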
In an alternative embodiment, the virtual vehicle corresponds to an available number of uses of the first virtual prop; the device further comprises:
a counting module 1007, configured to reduce the available number of uses in response to the virtual vehicle using the first virtual prop.
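The counting module's behavior reduces to a small per-vehicle use budget. A minimal Python sketch, with all names assumed for illustration:

```python
# Illustrative sketch: each use of the first virtual prop consumes one of the
# vehicle's remaining uses; the prop can no longer be used at zero.

class VehicleShieldBudget:
    def __init__(self, available_times: int):
        self.available_times = available_times

    def try_use(self) -> bool:
        """Consume one use if any remain; returns whether the use succeeded."""
        if self.available_times <= 0:
            return False
        self.available_times -= 1
        return True
```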
It should be noted that the division into the above functional modules is merely illustrative for the control device for a virtual vehicle provided in the above embodiment; in practical applications, these functions may be assigned to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the control device of the virtual vehicle and the control method of the virtual vehicle provided in the above embodiments belong to the same concept; their specific implementation processes are described in the method embodiments and are not repeated here.
Fig. 20 is a block diagram illustrating a structure of a terminal 2000 according to an exemplary embodiment of the present application. The terminal 2000 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. Terminal 2000 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
In general, terminal 2000 includes: a processor 2001 and a memory 2002.
The processor 2001 may include one or more processing cores, such as a 4-core or 8-core processor. The processor 2001 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 2001 may also include a main processor and a coprocessor: the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 2001 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 2001 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 2002 may include one or more computer-readable storage media, which may be non-transitory. The memory 2002 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 2002 is used to store at least one instruction for execution by the processor 2001 to implement the control method of the virtual vehicle provided by the method embodiments in the present application.
In some embodiments, terminal 2000 may further optionally include: a peripheral interface 2003 and at least one peripheral. The processor 2001, memory 2002 and peripheral interface 2003 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 2003 through a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 2004, a touch display 2005, a camera 2006, an audio circuit 2007, a positioning assembly 2008, and a power supply 2009.
The peripheral interface 2003 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 2001 and the memory 2002. In some embodiments, the processor 2001, memory 2002 and peripheral interface 2003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 2001, the memory 2002, and the peripheral interface 2003 may be implemented on separate chips or circuit boards, which are not limited in this embodiment.
The radio frequency circuit 2004 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 2004 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 2004 converts an electric signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electric signal. Optionally, the radio frequency circuit 2004 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 2004 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the World Wide Web, metropolitan area networks, intranets, the generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 2004 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 2005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 2005 is a touch display screen, the display screen 2005 also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 2001 as a control signal for processing. In this case, the display screen 2005 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 2005, provided on the front panel of terminal 2000; in other embodiments, there may be at least two display screens 2005, respectively disposed on different surfaces of the terminal 2000 or in a folded design; in still other embodiments, the display screen 2005 may be a flexible display disposed on a curved surface or a folded surface of terminal 2000. The display screen 2005 may even be arranged in a non-rectangular irregular figure, i.e. a shaped screen. The display screen 2005 can be made of a material such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
Camera assembly 2006 is used to capture images or video. Optionally, camera assembly 2006 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 2006 may also include a flash. The flash can be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
The audio circuitry 2007 may include a microphone and a speaker. The microphone is used for collecting sound waves of the user and the environment, converting the sound waves into electric signals, and inputting them to the processor 2001 for processing, or to the radio frequency circuit 2004 to realize voice communication. For stereo sound collection or noise reduction, a plurality of microphones may be provided at different positions of the terminal 2000. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electric signals from the processor 2001 or the radio frequency circuit 2004 into sound waves. The speaker can be a conventional membrane speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert an electric signal into sound waves audible to humans, or convert an electric signal into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuitry 2007 may also include a headphone jack.
The positioning component 2008 is configured to locate the current geographic location of the terminal 2000 to implement navigation or LBS (Location Based Service). The positioning component 2008 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 2009 is used to power the various components in terminal 2000. The power supply 2009 may use alternating current or direct current, and may include a disposable battery or a rechargeable battery. When the power supply 2009 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. A wired rechargeable battery is charged through a wired line, and a wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also support fast-charge technology.
In some embodiments, terminal 2000 also includes one or more sensors 2010. The one or more sensors 2010 include, but are not limited to: acceleration sensor 2011, gyro sensor 2012, pressure sensor 2013, fingerprint sensor 2014, optical sensor 2015, and proximity sensor 2016.
The acceleration sensor 2011 can detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 2000. For example, the acceleration sensor 2011 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 2001 may control the touch display screen 2005 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 2011. The acceleration sensor 2011 may also be used for acquisition of motion data of a game or a user.
The gyroscope sensor 2012 can detect the body direction and the rotation angle of the terminal 2000, and the gyroscope sensor 2012 and the acceleration sensor 2011 can cooperate to acquire the 3D motion of the user on the terminal 2000. The processor 2001 may implement the following functions according to the data collected by the gyro sensor 2012: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 2013 may be disposed on the side bezel of terminal 2000 and/or underlying touch screen display 2005. When the pressure sensor 2013 is disposed on the side frame of the terminal 2000, the holding signal of the user to the terminal 2000 can be detected, and the processor 2001 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 2013. When the pressure sensor 2013 is disposed at a lower layer of the touch display screen 2005, the processor 2001 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 2005. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 2014 is used for collecting the user's fingerprint, and the processor 2001 (or the fingerprint sensor 2014 itself) identifies the user's identity according to the collected fingerprint. Upon identifying the user's identity as trusted, the processor 2001 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 2014 may be disposed on the front, back, or side of the terminal 2000. When a physical key or vendor logo is provided on the terminal 2000, the fingerprint sensor 2014 may be integrated with the physical key or vendor logo.
The optical sensor 2015 is used to collect ambient light intensity. In one embodiment, the processor 2001 may control the display brightness of the touch display 2005 according to the ambient light intensity collected by the optical sensor 2015. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 2005 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 2005 is turned down. In another embodiment, the processor 2001 may also dynamically adjust the shooting parameters of the camera assembly 2006 according to the ambient light intensity collected by the optical sensor 2015.
The proximity sensor 2016, also known as a distance sensor, is typically disposed on the front panel of the terminal 2000. The proximity sensor 2016 is used to collect the distance between the user and the front surface of the terminal 2000. In one embodiment, when the proximity sensor 2016 detects that the distance between the user and the front surface of the terminal 2000 is gradually decreasing, the processor 2001 controls the touch display 2005 to switch from the bright screen state to the dark screen state; when the proximity sensor 2016 detects that the distance between the user and the front surface of the terminal 2000 is gradually increasing, the processor 2001 controls the touch display 2005 to switch from the dark screen state to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 20 is not intended to be limiting of terminal 2000 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
The present application further provides a computer device comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a set of codes, or a set of instructions, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by the processor to implement the control method of the virtual vehicle provided by any of the above exemplary embodiments.
The present application further provides a computer-readable storage medium, which stores at least one instruction, at least one program, a set of codes, or a set of instructions, where the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by a processor to implement the method for controlling a virtual vehicle provided in any of the above exemplary embodiments.
The present application also provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the control method of the virtual vehicle provided in the above-mentioned optional implementation manner.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. A method for controlling a virtual vehicle, the method comprising:
displaying a virtual environment screen, the virtual environment screen comprising: a virtual vehicle equipped with a first virtual prop, wherein the first virtual prop is used for creating a defensive effect against an attack of a second virtual prop, and the second virtual prop is used for attacking the virtual vehicle;
in response to a use instruction for the first virtual prop, controlling the virtual vehicle to use the first virtual prop; and
in response to the second virtual prop attacking the virtual vehicle, controlling the first virtual prop to defend against the second virtual prop.
2. The method of claim 1, wherein the controlling the virtual vehicle to use the first virtual prop in response to the use instruction for the first virtual prop comprises:
in response to the use instruction for the first virtual prop, controlling the virtual vehicle to use the first virtual prop within a target range, wherein the target range comprises a range determined according to the position of the virtual vehicle.
3. The method of claim 2, wherein a collision box is disposed within the target range; and the controlling the first virtual prop to defend against the second virtual prop in response to the second virtual prop attacking the virtual vehicle comprises:
in response to the second virtual prop colliding with the collision box, controlling the first virtual prop to defend against the second virtual prop.
4. The method of claim 3, wherein the second virtual prop periodically emits a detection ray in its flight direction after being launched; and
the controlling the first virtual prop to defend against the second virtual prop in response to the second virtual prop colliding with the collision box comprises:
in response to the detection ray of the second virtual prop colliding with the collision box, controlling the first virtual prop to defend against the second virtual prop.
5. The method of any one of claims 1 to 4, wherein the second virtual prop is a virtual prop launched using a launching prop, and the method further comprises:
in response to the launching prop locking onto the virtual vehicle, displaying prompt information that the virtual vehicle is locked.
6. The method of claim 5, further comprising:
determining, according to a locking instruction sent by a server, that the launching prop has locked onto the virtual vehicle, wherein the locking instruction is generated after the duration for which the launching prop aims at the virtual vehicle reaches a time threshold.
7. The method of any one of claims 1 to 4, wherein the second virtual prop is a virtual prop launched using a launching prop; a use control corresponding to the first virtual prop is also displayed on the virtual environment screen, and the use instruction for the first virtual prop is generated according to a trigger operation received by the use control; and the method further comprises:
in response to the launching prop locking onto the virtual vehicle, displaying the use control in a target display mode.
8. The method according to any one of claims 1 to 4, wherein the first virtual prop corresponds to an effective duration, and the method further comprises:
timing the usage duration of the first virtual prop;
wherein the controlling the first virtual prop to defend against the second virtual prop in response to the second virtual prop attacking the virtual vehicle comprises:
in response to the second virtual prop attacking the virtual vehicle while the usage duration of the first virtual prop is less than the effective duration, controlling the first virtual prop to defend against the second virtual prop.
9. The method according to any one of claims 1 to 4, wherein the first virtual prop corresponds to an effective duration, and the method further comprises:
in response to the usage duration of the first virtual prop reaching the effective duration, destroying the first virtual prop;
or, alternatively,
in response to the number of second virtual props defended against by the first virtual prop reaching a quantity threshold, destroying the first virtual prop.
10. The method of any one of claims 1 to 4, wherein the virtual vehicle corresponds to an available number of uses of the first virtual prop, and the method further comprises:
reducing the available number of uses in response to the virtual vehicle using the first virtual prop.
11. An apparatus for controlling a virtual vehicle, the apparatus comprising:
a display module, configured to display a virtual environment screen, the virtual environment screen comprising: a virtual vehicle equipped with a first virtual prop, wherein the first virtual prop is used for creating a defensive effect against an attack of a second virtual prop, and the second virtual prop is used for attacking the virtual vehicle;
a control module, configured to control, in response to a use instruction for the first virtual prop, the virtual vehicle to use the first virtual prop;
the control module being further configured to control, in response to the second virtual prop attacking the virtual vehicle, the first virtual prop to defend against the second virtual prop.
12. The apparatus of claim 11, wherein the control module is further configured to control, in response to the use instruction for the first virtual prop, the virtual vehicle to use the first virtual prop within a target range, the target range comprising a range determined according to the position of the virtual vehicle.
13. The apparatus of claim 12, wherein a collision box is disposed within the target range;
the control module being further configured to control, in response to the second virtual prop colliding with the collision box, the first virtual prop to defend against the second virtual prop.
14. A computer device, comprising: a processor and a memory, the memory storing at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement the method for controlling a virtual vehicle according to any one of claims 1 to 10.
15. A computer-readable storage medium, wherein at least one instruction, at least one program, a set of codes, or a set of instructions is stored in the storage medium, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by a processor to implement the method for controlling a virtual vehicle according to any one of claims 1 to 10.
CN202010627561.XA 2020-07-02 2020-07-02 Virtual vehicle control method, device, equipment and medium Pending CN111659116A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010627561.XA CN111659116A (en) 2020-07-02 2020-07-02 Virtual vehicle control method, device, equipment and medium


Publications (1)

Publication Number Publication Date
CN111659116A true CN111659116A (en) 2020-09-15

Family

ID=72390757

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010627561.XA Pending CN111659116A (en) 2020-07-02 2020-07-02 Virtual vehicle control method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN111659116A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105378785A (en) * 2013-06-11 2016-03-02 娱美德Io有限公司 Method and apparatus for automatically targeting target objects in a computer game
CN110448891A (en) * 2019-08-08 2019-11-15 腾讯科技(深圳)有限公司 Control the method, apparatus and storage medium of virtual objects operation remote dummy stage property
CN111111171A (en) * 2019-12-17 2020-05-08 腾讯科技(深圳)有限公司 Operation control method, operation control device, storage medium, and electronic device
CN111265873A (en) * 2020-01-16 2020-06-12 腾讯科技(深圳)有限公司 Using method, device, equipment and storage medium of virtual prop
CN111298441A (en) * 2020-01-21 2020-06-19 腾讯科技(深圳)有限公司 Using method, device, equipment and storage medium of virtual prop


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
"How to add a protective shield to the fighter in Unity3D《太空战机》" *
WX917763: "What is the function of the F key in《鹰击长空》" *
云天2411: "How do planes in Battlefield maneuver to evade missiles" *
优酷视频: "Roundup: dangerous missile-dodging flying in movies; skydiving looks really dangerous", pages 11 *
吃人的仙人球: "A forgotten game:《鹰击长空》1", pages 00 - 15 *
逗游网: "Tips and tricks for the levels and missions of《鹰击长空》" *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112107861A (en) * 2020-09-18 2020-12-22 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic equipment
CN112107861B (en) * 2020-09-18 2023-03-24 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic equipment
CN112546627A (en) * 2020-12-22 2021-03-26 网易(杭州)网络有限公司 Route guiding method, device, storage medium and computer equipment
CN112546627B (en) * 2020-12-22 2024-04-09 网易(杭州)网络有限公司 Route guiding method, device, storage medium and computer equipment
CN112870701A (en) * 2021-03-16 2021-06-01 网易(杭州)网络有限公司 Control method and device of virtual role
CN112870701B (en) * 2021-03-16 2024-02-23 网易(杭州)网络有限公司 Virtual character control method and device
CN113633987A (en) * 2021-08-18 2021-11-12 腾讯科技(深圳)有限公司 Object control method and device, storage medium and electronic equipment
CN113633987B (en) * 2021-08-18 2024-02-09 腾讯科技(深圳)有限公司 Object control method and device, storage medium and electronic equipment
CN113769385A (en) * 2021-09-17 2021-12-10 腾讯科技(深圳)有限公司 Virtual object transfer method and related device
CN113769385B (en) * 2021-09-17 2023-07-14 腾讯科技(深圳)有限公司 Virtual object transfer method and related device

Similar Documents

Publication Publication Date Title
CN110448891B (en) Method, device and storage medium for controlling virtual object to operate remote virtual prop
CN110917619B (en) Interactive property control method, device, terminal and storage medium
CN110721468B (en) Interactive property control method, device, terminal and storage medium
CN111408133B (en) Interactive property display method, device, terminal and storage medium
CN111659116A (en) Virtual vehicle control method, device, equipment and medium
WO2021143260A1 (en) Method and apparatus for using virtual props, computer device and storage medium
CN111589150B (en) Control method and device of virtual prop, electronic equipment and storage medium
CN110538459A (en) Method, apparatus, device and medium for throwing virtual explosives in virtual environment
CN112076467B (en) Method, device, terminal and medium for controlling virtual object to use virtual prop
CN111589149B (en) Using method, device, equipment and storage medium of virtual prop
CN110507990B (en) Interaction method, device, terminal and storage medium based on virtual aircraft
US20220161138A1 (en) Method and apparatus for using virtual prop, device, and storage medium
US11944904B2 (en) Data synchronization method and apparatus, terminal, server, and storage medium
CN110876849B (en) Virtual vehicle control method, device, equipment and storage medium
CN113041622B (en) Method, terminal and storage medium for throwing virtual throwing object in virtual environment
CN111330274B (en) Virtual object control method, device, equipment and storage medium
CN112138384A (en) Using method, device, terminal and storage medium of virtual throwing prop
CN112057857A (en) Interactive property processing method, device, terminal and storage medium
CN111760284A (en) Virtual item control method, device, equipment and storage medium
CN112933601A (en) Virtual throwing object operation method, device, equipment and medium
CN112316421A (en) Equipment method, device, terminal and storage medium of virtual prop
CN112402964B (en) Using method, device, equipment and storage medium of virtual prop
CN111589137B (en) Control method, device, equipment and medium of virtual role
CN112044073B (en) Using method, device, equipment and medium of virtual prop
CN110960849B (en) Interactive property control method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40028967

Country of ref document: HK