CN113117333B - Control method, device, terminal and storage medium of virtual flight vehicle - Google Patents

Control method, device, terminal and storage medium of virtual flight vehicle

Info

Publication number
CN113117333B
CN113117333B (application CN202110530441.2A)
Authority
CN
China
Prior art keywords
virtual
flight vehicle
flight
angle picture
controlling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110530441.2A
Other languages
Chinese (zh)
Other versions
CN113117333A (en)
Inventor
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202110530441.2A
Publication of CN113117333A
Application granted
Publication of CN113117333B

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a control method, an apparatus, a terminal and a storage medium for a virtual flight vehicle, and belongs to the field of computer technology. The method comprises the following steps: displaying a view angle picture of a first virtual object; in response to a trigger operation on a virtual flight vehicle, switching the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display, the virtual flight vehicle being loaded with an interactive prop; controlling the virtual flight vehicle to fly according to a control operation on the virtual flight vehicle; and in response to an interaction trigger operation on the virtual flight vehicle, controlling the interactive prop to interact. According to this technical solution, the view angle of the first virtual object is switched to the view angle of the virtual flight vehicle, so that the user can control the virtual flight vehicle to fly based on that view angle. The interactive prop loaded on the virtual flight vehicle can therefore move freely, allowing it to interact effectively with virtual objects and significantly improving human-computer interaction efficiency.

Description

Control method, device, terminal and storage medium of virtual flight vehicle
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and apparatus for controlling a virtual flight vehicle, a terminal, and a storage medium.
Background
A shooting game is a common type of game on a terminal. The terminal can display a virtual scene of the shooting game, in which a first virtual object is displayed. The first virtual object, which is the virtual object controlled by the user account logged in on the terminal, can interact with virtual objects other than itself based on an interactive prop.
Some interactive props currently provided by shooting games are placement props, that is, interactive props that cannot be moved once placed in the virtual scene. Because placement props are difficult to acquire, a placement prop put in a poor position cannot interact effectively with virtual objects, so the prop is wasted and human-computer interaction efficiency is low.
Disclosure of Invention
The embodiments of the present application provide a control method, an apparatus, a terminal and a storage medium for a virtual flight vehicle, which can enable an interactive prop to move freely in a virtual scene along with the virtual flight vehicle, so that the interactive prop can interact effectively with virtual objects, significantly improving human-computer interaction efficiency. The technical solution is as follows:
In one aspect, a method for controlling a virtual flight vehicle is provided, the method comprising:
displaying a view angle picture of the first virtual object;
in response to a trigger operation on a virtual flight vehicle, switching the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display, the virtual flight vehicle being loaded with an interactive prop;
controlling the virtual flight vehicle to fly according to a control operation on the virtual flight vehicle;
and in response to an interaction trigger operation on the virtual flight vehicle, controlling the interactive prop to interact.
In another aspect, there is provided a control device for a virtual flight vehicle, the device comprising:
the display module is used for displaying the visual angle picture of the first virtual object;
the display module is further configured to switch, in response to a trigger operation on the virtual flight vehicle, the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display, the virtual flight vehicle being loaded with the interactive prop;
the control module is used for controlling the virtual flight vehicle to fly according to the control operation of the virtual flight vehicle;
the control module is further configured to control the interactive prop to interact in response to an interaction trigger operation on the virtual flight vehicle.
In some embodiments, the display module is configured to display, in response to a trigger operation on the virtual flight vehicle, a picture of the first virtual object using a calling prop in the view angle picture of the first virtual object, the calling prop being used to call the virtual flight vehicle; and when the first virtual object finishes using the calling prop, switch the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display.
In some embodiments, the display module is configured to display a switching animation on the view angle picture of the first virtual object in response to a trigger operation on the virtual flight vehicle; and when the switching animation finishes playing, switch the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display.
In some embodiments, the display module is configured to display, in response to a trigger operation on the virtual flight vehicle, the virtual flight vehicle flying from the position of the first virtual object to an airborne position in the view angle picture of the first virtual object; and switch the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display.
In some embodiments, the display module is further configured to display an object marker on the view angle picture of the virtual flight vehicle in response to any second virtual object entering a search range of the virtual flight vehicle, where the second virtual object is a virtual object hostile to the first virtual object, and the object marker is used to mark that second virtual object.
In some embodiments, the display module is further configured to obtain a first position and a second position, where the first position is the position of the virtual flight vehicle in the virtual scene and the second position is the position of the second virtual object in the virtual scene; determine a third position according to the first position and the second position, where the third position is the position of the second virtual object in the view angle picture of the virtual flight vehicle; and display the object marker at the third position.
In some embodiments, the display module is further configured to determine a target plane according to the first position, where the target plane is the plane corresponding to the view angle picture of the virtual flight vehicle; drop a perpendicular line from the second position to the target plane; and determine the intersection point of the perpendicular line and the target plane as the third position.
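The perpendicular-projection step described above (determining the third position as the foot of the perpendicular dropped from the second position onto the target plane) can be sketched as follows. This is a minimal illustration under assumed conventions, not the patent's actual implementation: the function name is hypothetical, and the target plane is assumed to be given by a point on it (the first position) and a unit normal vector.

```python
def project_onto_view_plane(first_pos, view_normal, second_pos):
    """Return the foot of the perpendicular dropped from second_pos
    onto the plane that passes through first_pos with unit normal
    view_normal. All arguments are (x, y, z) tuples."""
    # Vector from a point on the plane to the second virtual object.
    d = [s - f for s, f in zip(second_pos, first_pos)]
    # Signed distance from the plane, measured along the normal.
    dist = sum(di * ni for di, ni in zip(d, view_normal))
    # Subtract the normal component to land exactly on the plane.
    return tuple(s - dist * n for s, n in zip(second_pos, view_normal))
```

For example, with the plane z = 0 (normal (0, 0, 1)) and an object at (1, 2, 5), the foot of the perpendicular is (1, 2, 0), which would then be mapped to screen coordinates for drawing the object marker.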
In some embodiments, the control module is configured to perform at least one of:
controlling the virtual flight vehicle to fly horizontally in the virtual scene;
controlling the virtual flight vehicle to fly vertically in the virtual scene;
and controlling the virtual flight vehicle to switch the flight direction in the virtual scene.
In some embodiments, a horizontal flight control is displayed on the view angle picture of the virtual flight vehicle;
the control module is configured to determine a horizontal flight direction in response to a trigger operation on the horizontal flight control, and control the virtual flight vehicle to fly at an accelerated speed along the horizontal flight direction.
In some embodiments, a first vertical flight control and a second vertical flight control are displayed on the view angle picture of the virtual flight vehicle;
the control module is configured to control the virtual flight vehicle to fly vertically upwards in the virtual scene in response to a trigger operation on the first vertical flight control, and to control the virtual flight vehicle to fly vertically downwards in the virtual scene in response to a trigger operation on the second vertical flight control.
In some embodiments, a steering region is displayed on the view angle picture of the virtual flight vehicle;
the control module is configured to control the virtual flight vehicle to steer left in the virtual scene in response to a leftward slide operation in the steering region, and to control the virtual flight vehicle to steer right in the virtual scene in response to a rightward slide operation in the steering region.
In some embodiments, an aiming mark and a shooting control are displayed on the view angle picture of the virtual flight vehicle, and the interaction trigger operation on the virtual flight vehicle is a trigger operation on the shooting control;
the control module is configured to control the interactive prop to shoot in the direction indicated by the aiming mark in response to a trigger operation on the shooting control.
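Taken together, the horizontal, vertical, steering, and shooting controls described above amount to a small set of commands on the vehicle state. The sketch below illustrates that mapping under assumed conventions (y up, yaw measured in the horizontal plane, left slide producing a negative yaw delta); the class and method names are illustrative assumptions, not the patent's implementation.

```python
import math

class VirtualFlightVehicle:
    """Toy model of the vehicle state driven by the UI controls above."""

    def __init__(self):
        self.x = self.y = self.z = 0.0   # position in the virtual scene
        self.yaw = 0.0                   # heading in radians
        self.shots = []                  # headings the prop has fired along

    def fly_horizontal(self, distance):
        # Horizontal flight control: move along the current heading.
        self.x += distance * math.cos(self.yaw)
        self.z += distance * math.sin(self.yaw)

    def fly_vertical(self, distance):
        # Vertical flight controls: climb (positive) or descend (negative).
        self.y += distance

    def steer(self, delta_yaw):
        # Steering region: left slide gives a negative delta, right slide positive.
        self.yaw += delta_yaw

    def fire(self):
        # Shooting control: fire the loaded prop along the aiming direction,
        # modeled here as the current heading.
        self.shots.append(self.yaw)
```

A real implementation would apply these commands per frame with acceleration and collision handling; the sketch only shows how each UI operation maps to a state change.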
In some embodiments, the display module is further configured to display a destruction animation on the view angle picture of the virtual flight vehicle if the virtual attribute value of the virtual flight vehicle meets a destruction condition, destroy the virtual flight vehicle, and switch back to the view angle picture of the first virtual object.
In another aspect, a terminal is provided that includes a processor and a memory for storing at least one segment of a computer program that is loaded and executed by the processor to implement operations performed in a method of controlling a virtual flight vehicle in an embodiment of the present application.
In another aspect, a computer readable storage medium having stored therein at least one segment of a computer program loaded and executed by a processor to implement operations performed in a method of controlling a virtual flight vehicle in an embodiment of the present application is provided.
In another aspect, a computer program product is provided that includes computer program code stored in a computer readable storage medium. The processor of the terminal reads the computer program code from the computer readable storage medium, and the processor executes the computer program code, so that the terminal performs the control method of the virtual flying carrier provided in various optional implementations of the above aspects.
The beneficial effects that technical scheme that this application embodiment provided brought are:
in the embodiments of the present application, while the user controls the first virtual object to interact, if the user triggers the virtual flight vehicle, the view angle of the first virtual object is switched to the view angle of the virtual flight vehicle. The user can then control the virtual flight vehicle to fly based on the vehicle's view angle, so the interactive prop loaded on the virtual flight vehicle is not limited by a placement position. This achieves the effect of the interactive prop moving freely in the virtual scene along with the virtual flight vehicle, enables the interactive prop to interact effectively with virtual objects, and significantly improves human-computer interaction efficiency.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is an environmental schematic diagram of an implementation of a control method of a virtual flight vehicle according to an embodiment of the present application;
FIG. 2 is a flow chart of a method of controlling a virtual flight vehicle according to an embodiment of the present application;
FIG. 3 is a flow chart of a method of controlling a virtual flight vehicle according to an embodiment of the present application;
fig. 4 is a schematic view of a perspective view of a first virtual object according to an embodiment of the present application;
fig. 5 is a schematic view of a view angle screen of a virtual flight vehicle according to an embodiment of the present application;
fig. 6 is a schematic diagram of a perspective view of another first virtual object according to an embodiment of the present application;
fig. 7 is a schematic view of a view angle screen of a virtual flight vehicle according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a display object marker provided according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a determination of a third location provided in accordance with an embodiment of the present application;
FIG. 10 is a perspective view of another virtual flight vehicle according to an embodiment of the present application;
FIG. 11 is a schematic view of a visual angle view of another virtual flight vehicle according to an embodiment of the present application;
FIG. 12 is a schematic illustration of a rigid body assembly provided in accordance with an embodiment of the present application;
FIG. 13 is a flow chart of another method of controlling a virtual flight vehicle, provided in accordance with an embodiment of the present application;
FIG. 14 is a block diagram of a control device for a virtual flight vehicle, provided in accordance with an embodiment of the present application;
fig. 15 is a block diagram of a terminal structure according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The terms "first," "second," and the like in this application are used to distinguish between identical or similar items having substantially the same function and purpose. It should be understood that there is no logical or chronological dependency among "first," "second," and "nth," and that these terms do not limit the number of items or the order of execution.
The term "at least one" in this application means one or more, and the meaning of "a plurality of" means two or more.
Hereinafter, terms related to the present application are explained.
Virtual scene: a scene displayed (or provided) by an application program while running on a terminal. The virtual scene may be a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiments of the present application. For example, a virtual scene may include sky, land, sea, and so on; the land may include environmental elements such as deserts and cities, and the user can control a virtual object to move in the virtual scene.
Shooting Game (STG): a shooting game is a kind of action game and has obvious action-game characteristics. Optionally, shooting games include, but are not limited to, first-person shooting games, third-person shooting games, top-down shooting games, head-up shooting games, platform shooting games, scrolling shooting games, light-gun shooting games, keyboard-and-mouse shooting games, shooting-range games, tactical shooting games, and the like; the embodiments of the present application do not specifically limit the type of the shooting game.
Virtual object: refers to a movable object in the virtual scene. The movable object may be a virtual character, a virtual animal, a cartoon character, and the like, such as a character, animal, plant, oil drum, wall, or stone displayed in the virtual scene. The virtual object may be an avatar used to represent a user in the virtual scene. A virtual scene may include a plurality of virtual objects, each of which has its own shape and volume in the virtual scene and occupies a portion of the space in the virtual scene. Optionally, when the virtual scene is a three-dimensional virtual scene, the virtual object is a three-dimensional model, that is, a three-dimensional character constructed based on three-dimensional human skeleton technology, and the same virtual object can exhibit different external appearances by wearing different skins. In some embodiments, the virtual object may also be implemented using a 2.5-dimensional or two-dimensional model, which is not limited by the embodiments of the present application. Alternatively, the virtual object may be a player character controlled through operations on the client, or a non-player character (NPC) set in the virtual scene for interaction.
The view angle screen is a screen for displaying a virtual scene and a virtual object through a first-person view angle. The view angle picture of the first virtual object is used to represent a scene picture obtained by observing the virtual scene through the view angle of the first virtual object. The view angle picture of the virtual flight vehicle is used for representing a scene picture obtained by shooting a virtual scene through a view angle of the virtual flight vehicle, such as a virtual camera mounted on the virtual flight vehicle.
Virtual flight vehicle: refers to an aircraft prop in the virtual scene. The flight prop may be a virtual airplane, a virtual glider, a virtual helicopter, a virtual unmanned aerial vehicle, or the like. The virtual flight vehicle can be fitted with virtual weapons such as a virtual machine gun, a virtual missile, a virtual torch, and the like.
The following describes an implementation environment according to the present application.
Fig. 1 is an implementation environment schematic diagram of a control method of a virtual flight vehicle according to an embodiment of the present application. Referring to fig. 1, the implementation environment includes: a first terminal 120, a server 140, and a second terminal 160.
The first terminal 120 and the second terminal 160 are directly or indirectly communicatively connected to the server 140 via a wireless network or a wired network.
The first terminal 120 installs and runs an application supporting a virtual scene. Optionally, the application includes any one of a First-Person Shooter (FPS) game, a third-person shooting game, a Multiplayer Online Battle Arena (MOBA) game, a virtual reality application, a three-dimensional map program, a military simulation program, or a multiplayer gunfight survival game. In some embodiments, the first terminal 120 is a terminal used by a first user; when the first terminal 120 runs the application, it loads the virtual scene in the application and displays the view angle picture of the first virtual object. The first user uses the first terminal 120 to operate the first virtual object located in the virtual scene to perform activities including, but not limited to, at least one of body posture adjustment, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, throwing, and countering.
Server 140 includes at least one of a server, a plurality of servers, a cloud computing platform, or a virtualization center. The server 140 is used to provide background services for applications supporting virtual scenarios. Optionally, the server 140 takes on primary computing work, and the first terminal 120 and the second terminal 160 take on secondary computing work; alternatively, the server 140 performs a secondary computing job, and the first terminal 120 and the second terminal 160 perform a primary computing job; alternatively, the server 140, the first terminal 120 and the second terminal 160 perform cooperative computing by using a distributed computing architecture.
Optionally, the server 140 is a stand-alone physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (Content Delivery Network, CDN), big data, and artificial intelligence platforms.
The applications installed on the second terminal 160 and the first terminal 120 are the same, or the applications installed on the two terminals are the same type of application of different operating system platforms. The first virtual object controlled by the first terminal 120 and the virtual object controlled by the second terminal 160 are in the same virtual scene, and at this time, the two virtual objects may interact in the virtual scene.
Optionally, the virtual object controlled by the second terminal 160 and the first virtual object controlled by the first terminal 120 are teammate relationships, such as belong to the same camp, the same team, have a friend relationship, or have temporary communication rights. For example, the second terminal 160 is a terminal used by a second user, and the second user uses the second terminal 160 to operate a friend virtual object located in the first virtual scene to perform an activity, where the friend virtual object is a virtual object belonging to the same team as the controlled virtual object.
Optionally, the virtual object controlled by the second terminal 160 and the first virtual object controlled by the first terminal 120 are in a hostile relationship, for example belonging to different camps. Virtual objects in a hostile relationship may interact by shooting at each other on land, for example by firing virtual props at each other. For example, the second terminal 160 is a terminal used by a third user, and the third user uses the second terminal 160 to operate a hostile virtual object located in the first virtual scene to perform activities, where the hostile virtual object is a virtual object belonging to a different team from the controlled virtual object.
Alternatively, the first terminal 120 refers broadly to one of a plurality of terminals, and the second terminal 160 refers broadly to one of a plurality of terminals, the present embodiment being exemplified by the first terminal 120 and the second terminal 160. The device types of the first terminal 120 and the second terminal 160 are the same or different, and include: at least one of a smart phone, a tablet computer, a smart speaker, a smart watch, a laptop portable computer, and a desktop computer, but is not limited thereto. For example, the first terminal 120 and the second terminal 160 are smart phones, or other handheld portable gaming devices.
Those skilled in the art will recognize that the number of terminals may be greater or lesser. For example, the number of the terminals may be one, or the number of the terminals may be tens or hundreds, or more. The number of terminals and the device type are not limited in the embodiment of the present application.
Fig. 2 is a flowchart of a control method of a virtual flight vehicle according to an embodiment of the present application, and as shown in fig. 2, in the embodiment of the present application, an example of execution by a terminal is described. The control method of the virtual flying carrier comprises the following steps:
201. the terminal displays a view angle picture of the first virtual object.
In this embodiment of the present application, the first virtual object is a virtual object controlled by a user account logged in by the terminal, and the view angle picture of the first virtual object is a scene picture obtained by observing a virtual scene with the view angle of the first virtual object.
202. In response to a trigger operation on the virtual flight vehicle, the terminal switches the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display; the virtual flight vehicle is loaded with an interactive prop.
In this embodiment of the present application, the user switches the currently displayed view angle picture by triggering the virtual flight vehicle; correspondingly, if the terminal detects the trigger operation on the virtual flight vehicle, the terminal switches the displayed view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle. Optionally, the virtual flight vehicle is a virtual unmanned aerial vehicle, a virtual airplane, a virtual helicopter, or another virtual aircraft. The virtual flight vehicle is loaded with an interactive prop, which is used to interact with virtual objects, virtual props and virtual buildings, where interaction means influencing their attribute values, such as reducing a virtual object's virtual life value. Optionally, the interactive prop is a virtual weapon such as a virtual machine gun, a virtual missile, or a virtual torch.
203. And the terminal controls the virtual flight vehicle to fly according to the control operation of the virtual flight vehicle.
In this embodiment of the present application, the user may control the virtual flight vehicle to fly in the virtual scene, and accordingly, if the terminal detects a control operation on the virtual flight vehicle, the terminal controls the virtual flight vehicle to fly, such as horizontal flight, vertical flight, and steering flight, according to the control operation.
204. In response to an interaction trigger operation on the virtual flight vehicle, the terminal controls the interactive prop to interact.
In this embodiment of the present application, the user triggers the interactive prop loaded on the virtual flight vehicle through an interaction trigger operation. Correspondingly, if the terminal detects the interaction trigger operation on the virtual flight vehicle, the terminal controls the interactive prop to interact according to that operation, for example controlling a virtual machine gun to fire, controlling a virtual missile to launch, or controlling a virtual torch to fire.
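Steps 201 to 204 can be summarized as a small view-switching dispatcher: control and interaction operations only take effect while the vehicle's view angle picture is displayed, and the view falls back to the first virtual object when the vehicle is destroyed. The class, event, and state names below are illustrative assumptions, not the patent's implementation.

```python
class TerminalController:
    """Minimal sketch of the view switching and dispatch in steps 201-204."""

    def __init__(self):
        self.view = "first_virtual_object"   # step 201: initial view

    def on_vehicle_trigger(self):
        # Step 202: switch to the vehicle's view angle picture.
        self.view = "virtual_flight_vehicle"

    def on_control_operation(self, vehicle, op, amount):
        # Step 203: flight control only applies while the vehicle view is active.
        if self.view == "virtual_flight_vehicle":
            getattr(vehicle, op)(amount)

    def on_interaction_trigger(self, vehicle):
        # Step 204: fire the interactive prop loaded on the vehicle.
        if self.view == "virtual_flight_vehicle":
            vehicle.fire()

    def on_vehicle_destroyed(self):
        # Destruction condition: switch back to the first virtual object's view.
        self.view = "first_virtual_object"
```

The `vehicle` argument stands for any object exposing flight and fire methods; keeping the dispatcher separate from the vehicle model mirrors the display-module/control-module split described above.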
According to the scheme disclosed in the embodiments of the present application, while the user controls the first virtual object to interact, if the user triggers the virtual flight vehicle, the view angle of the first virtual object is switched to the view angle of the virtual flight vehicle. The user can then control the virtual flight vehicle to fly based on the vehicle's view angle, so the interactive prop loaded on the virtual flight vehicle is not limited by a placement position. This achieves the effect of the interactive prop moving freely in the virtual scene along with the virtual flight vehicle, enables the interactive prop to interact effectively with virtual objects, and significantly improves human-computer interaction efficiency.
Fig. 3 is a flowchart of a control method of a virtual flight vehicle according to an embodiment of the present application. As shown in fig. 3, the embodiment of the present application describes the method with a terminal as the execution body. The control method of the virtual flight vehicle comprises the following steps:
301. The terminal displays a view angle picture of the first virtual object.
In this embodiment of the present application, a user account is logged in to a terminal, where the first virtual object is a virtual object controlled by the user account, and a view angle picture of the first virtual object is a scene picture obtained by observing a virtual scene with a view angle of the first virtual object.
Optionally, the view angle picture of the first virtual object is a scene picture acquired based on the first person view angle or a scene picture acquired based on the third person view angle.
In some embodiments, when the view angle picture of the first virtual object is a scene picture acquired based on the first-person view angle, the virtual camera of the virtual scene is located at the head of the first virtual object. Accordingly, the terminal acquires the view angle picture of the first virtual object based on the scene picture shot by the virtual camera; at this time, the view angle picture of the first virtual object can display part of the body of the first virtual object, such as the arms, chest, abdomen, and legs.
In some embodiments, when the view angle picture of the first virtual object is a scene picture acquired based on the third-person view angle, the virtual camera of the virtual scene is located above and behind the first virtual object. Accordingly, the terminal acquires the view angle picture of the first virtual object based on the scene picture shot by the virtual camera; at this time, the view angle picture of the first virtual object can display the whole body or most of the body of the first virtual object, such as the head, back, and waist.
For example, referring to fig. 4, fig. 4 is a schematic diagram of a view angle picture of a first virtual object according to an embodiment of the present application. As shown in fig. 4, the view angle picture of the first virtual object is illustrated as a scene picture acquired based on the first-person view angle.
302. In response to a triggering operation on the virtual flight vehicle, the terminal switches the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display, where the virtual flight vehicle is loaded with an interactive prop.
In the embodiment of the present application, if the terminal detects a triggering operation on the virtual flight vehicle, the terminal switches the currently displayed view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle, so that the user can view the virtual scene and the virtual objects in it from the view angle of the virtual flight vehicle. The virtual flight vehicle is loaded with an interactive prop, and the user can control the virtual flight vehicle to interact with virtual objects, virtual props, virtual buildings, and the like in the virtual scene based on the interactive prop.
For example, referring to fig. 5, fig. 5 is a schematic view of a view angle frame of a virtual flight vehicle according to an embodiment of the present application. As shown in fig. 5, the virtual flight vehicle is loaded with a virtual machine gun.
In some embodiments, a trigger control of the virtual flight vehicle is displayed on the view angle picture of the first virtual object. If the virtual flight vehicle meets the trigger condition, the trigger control is displayed in a triggerable state, such as highlighted; if the virtual flight vehicle does not meet the trigger condition or has already been triggered, the trigger control is displayed in an untriggerable state. Correspondingly, if the terminal detects a triggering operation on the trigger control while the trigger control is in the triggerable state, the terminal switches to the view angle picture of the virtual flight vehicle.
For example, referring to fig. 6, fig. 6 is a schematic diagram of a view angle picture of another first virtual object according to an embodiment of the present application, taking a scene picture obtained based on a first person view angle as an example. As shown in fig. 6, a trigger control 601 of the virtual flight vehicle is displayed in the view angle picture of the first virtual object, and the trigger control 601 is in a triggerable state. If the user clicks the trigger control, the terminal will switch the currently displayed view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle, where the view angle picture of the virtual flight vehicle is shown in fig. 5.
In some embodiments, when detecting a triggering operation on the virtual flight vehicle, the terminal first displays the first virtual object calling the virtual flight vehicle, and then switches the displayed view angle picture. Correspondingly, in response to the triggering operation on the virtual flight vehicle, the terminal displays, in the view angle picture of the first virtual object, a picture of the first virtual object using a calling prop, where the calling prop is used for calling the virtual flight vehicle. If the first virtual object finishes using the calling prop, the terminal switches the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display. By displaying the picture of the first virtual object using the calling prop, the way the virtual flight vehicle is used is closer to reality and the switching of the view angle picture is less abrupt, so the user can stay immersed in the game, improving the user's game experience.
In some embodiments, when the triggering operation on the virtual flight vehicle is detected, the terminal first displays a view-angle switching animation, and then switches the displayed view angle picture. Correspondingly, in response to the triggering operation on the virtual flight vehicle, the terminal displays the switching animation on the view angle picture of the first virtual object. When the switching animation finishes playing, the terminal switches the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display. The switching animation may be a transition from a blurred image to a clear image, or may display a connection progress bar, which is not limited in the embodiment of the present application. By displaying the switching animation, the terminal provides a buffer when switching the view angle picture of the first virtual object to that of the virtual flight vehicle, which can to some extent avoid the visual impact caused by the switch.
In some embodiments, when the triggering operation on the virtual flight vehicle is detected, the terminal first displays the virtual flight vehicle flying into the virtual scene, and then switches the displayed view angle picture. Correspondingly, in response to the triggering operation on the virtual flight vehicle, the terminal displays, in the view angle picture of the first virtual object, the virtual flight vehicle flying from the position of the first virtual object to a position in the air, or displays the virtual flight vehicle approaching from far to near in the air. The terminal then switches the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display. By displaying the virtual flight vehicle flying into the virtual scene, the user can determine its position in the virtual scene, so that after the view angle picture is switched, the user can control the virtual flight vehicle to fly, or to interact with virtual objects, based on that position.
303. The terminal controls the virtual flight vehicle to fly according to the control operation on the virtual flight vehicle.
In this embodiment of the present application, if the terminal detects a control operation on the virtual flight vehicle, the terminal may control the virtual flight vehicle to fly in the virtual scene according to the control operation. Optionally, the terminal controls the virtual flight vehicle to fly horizontally in the virtual scene; or the terminal controls the virtual flight vehicle to fly vertically in the virtual scene; or the terminal controls the virtual flight vehicle to switch the flight direction in the virtual scene.
In some embodiments, the view angle picture of the virtual flight vehicle displays a horizontal flight control. Correspondingly, in response to a triggering operation on the horizontal flight control, the terminal determines a horizontal flight direction and then controls the virtual flight vehicle to accelerate in that horizontal flight direction. The horizontal flight directions include four directions: horizontally forward, horizontally backward, horizontally to the left, and horizontally to the right. By displaying the horizontal flight control, the user can control the virtual flight vehicle to fly in the expected horizontal direction, improving the control efficiency of the virtual flight vehicle. Optionally, the flight speed of the virtual flight vehicle in the horizontal direction has a maximum value; when the virtual flight vehicle accelerates to the maximum flight speed, its flight speed no longer increases and the vehicle keeps flying at a constant speed. Optionally, the view angle picture of the virtual flight vehicle also displays the current flight speed of the virtual flight vehicle. By setting a maximum flight speed, the virtual flight vehicle cannot accelerate without limit, so its flight behavior is more realistic, improving the user experience.
For example, referring to fig. 7, fig. 7 is a schematic view of a view angle picture of a virtual flight vehicle according to an embodiment of the present application. As shown in fig. 7, the view angle picture of the virtual flight vehicle displays a horizontal flight control 701, which can be operated in four directions: forward, backward, left, and right. When the user pushes the horizontal flight control forward, the virtual flight vehicle flies forward, and its speed accelerates from zero until the maximum flight speed is reached. If the user releases the control, the virtual flight vehicle decelerates until the flight speed is zero. Here, 0.0KM/H indicates that the current flight speed of the virtual flight vehicle is 0. Similarly, when the user pulls the horizontal flight control backward, the virtual flight vehicle flies backward; when the user pushes it to the left, the vehicle flies to the left; when the user pushes it to the right, the vehicle flies to the right. These cases are not repeated here.
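The accelerate-then-clamp behavior described above can be sketched as follows. This is a minimal illustrative sketch: the constants MAX_SPEED, ACCEL, and DECEL and the function name are assumptions for illustration, not values from the patent.

```python
# Minimal sketch of the capped horizontal acceleration described above.
# MAX_SPEED, ACCEL, and DECEL are illustrative assumptions.

MAX_SPEED = 20.0  # maximum horizontal flight speed, scene units per second
ACCEL = 5.0       # acceleration while the horizontal flight control is pushed
DECEL = 5.0       # deceleration after the control is released

def update_speed(speed, control_pushed, dt):
    """Accelerate toward MAX_SPEED while pushed; decay to zero when released."""
    if control_pushed:
        return min(speed + ACCEL * dt, MAX_SPEED)
    return max(speed - DECEL * dt, 0.0)
```

Called once per frame with the frame time `dt`, this reproduces the behavior around fig. 7: the displayed speed ramps up from 0.0 KM/H, holds at the cap, and decays back to zero after the user releases the control.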
In some embodiments, the perspective view of the virtual flight vehicle displays a first vertical flight control and a second vertical flight control. Correspondingly, responding to the triggering operation of the first vertical flight control, and controlling the virtual flight vehicle to fly vertically upwards in the virtual scene by the terminal. Or, in response to the triggering operation of the second vertical flight control, the terminal controls the virtual flight vehicle to fly vertically downwards in the virtual scene. Through displaying the two vertical flight controls, a user can control the virtual flight vehicle to fly towards the expected vertical direction through the two vertical flight controls, and the control efficiency of the virtual flight vehicle is improved. Optionally, the flight speed of the virtual flight vehicle in the vertical direction also has a maximum value, which is not described herein.
For example, with continued reference to fig. 7, the view angle picture of the virtual flight vehicle displays a first vertical flight control 702 and a second vertical flight control 703. The virtual flight vehicle flies vertically upward when the user presses the first vertical flight control 702, and flies vertically downward when the user presses the second vertical flight control 703. The speed of the virtual flight vehicle accelerates from zero until the maximum flight speed is reached. If the user releases the control, the virtual flight vehicle decelerates until the flight speed is zero.
In some embodiments, the view angle picture of the virtual flight vehicle displays a steering region. Correspondingly, in response to a leftward slide operation in the steering region, the terminal controls the virtual flight vehicle to steer to the left in the virtual scene; or, in response to a rightward slide operation in the steering region, the terminal controls the virtual flight vehicle to steer to the right in the virtual scene. The steering region is the right region, the left region, or the entire region of the view angle picture of the virtual flight vehicle. By displaying the steering region, the user can control the virtual flight vehicle to steer in the expected direction through slide operations, improving the control efficiency of the virtual flight vehicle. Optionally, when the virtual flight vehicle steers, the terminal also displays the virtual flight vehicle swinging toward the turn direction, so its flight behavior is more realistic, improving the user experience.
For example, with continued reference to fig. 7, the right region of the view angle picture of the virtual flight vehicle is taken as the steering region. If the user slides leftward in the steering region, the terminal controls the virtual flight vehicle to steer to the left and displays the virtual flight vehicle swinging to the left; if the user slides rightward, the terminal controls the virtual flight vehicle to steer to the right and displays it swinging to the right.
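The slide-to-steer behavior can be sketched as mapping the horizontal slide distance to a heading change plus a visual swing (roll) in the turn direction. The gains YAW_PER_PIXEL and MAX_ROLL and the 0.5 roll factor are illustrative assumptions, not values from the patent.

```python
# Sketch: a leftward slide (negative dx) turns and swings the vehicle left,
# a rightward slide (positive dx) turns and swings it right.
# YAW_PER_PIXEL and MAX_ROLL are illustrative assumptions.

YAW_PER_PIXEL = 0.2  # degrees of heading change per pixel slid
MAX_ROLL = 30.0      # maximum visual swing (roll) while turning, degrees

def steer(heading_deg, slide_dx_pixels):
    """Return the new heading and the visual roll for a slide of dx pixels."""
    heading_deg = (heading_deg + YAW_PER_PIXEL * slide_dx_pixels) % 360.0
    roll_deg = max(-MAX_ROLL, min(MAX_ROLL, 0.5 * slide_dx_pixels))
    return heading_deg, roll_deg
```

The clamped roll is purely cosmetic; only the heading feeds back into the flight direction.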
304. In response to any second virtual object entering the search range of the virtual flight vehicle, the terminal displays an object mark on the view angle picture of the virtual flight vehicle, where the second virtual object is a virtual object hostile to the first virtual object, and the object mark is used for marking that second virtual object.
In this embodiment of the present application, the virtual flying carrier has a search range, and if a virtual object hostile to the first virtual object enters the search range, the terminal marks the virtual object, and displays the object mark on a view angle screen of the virtual flying carrier. The search range is an area within a certain range in front of the virtual flight vehicle, and the range size is not limited in the embodiment of the application. By displaying the object mark, a user can quickly find the hostile virtual object based on the object mark, so that the efficiency of interaction with the virtual object is improved.
For example, referring to fig. 8, fig. 8 is a schematic diagram of displaying an object mark according to an embodiment of the present application. As shown in fig. 8, the object mark is a circle, and the view angle picture of the virtual flight vehicle displays a second virtual object marked by the circle.
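The search range, "an area within a certain range in front of the virtual flight vehicle," can be sketched as a distance-plus-facing-angle test. The radius, the half-angle, and the use of 2D ground-plane (x, z) coordinates are all illustrative assumptions.

```python
import math

# Sketch: a second virtual object is inside the search range if it lies
# within SEARCH_RADIUS of the vehicle and within HALF_ANGLE degrees of
# the vehicle's forward direction. Both constants are illustrative.

SEARCH_RADIUS = 50.0
HALF_ANGLE = 45.0

def in_search_range(vehicle_pos, forward, enemy_pos):
    """vehicle_pos, enemy_pos: (x, z) points; forward: unit (x, z) vector."""
    dx = enemy_pos[0] - vehicle_pos[0]
    dz = enemy_pos[1] - vehicle_pos[1]
    dist = math.hypot(dx, dz)
    if dist == 0.0 or dist > SEARCH_RADIUS:
        return False
    # angle between the forward vector and the direction to the enemy
    cos_angle = (forward[0] * dx + forward[1] * dz) / dist
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle <= HALF_ANGLE
```

Each second virtual object that passes this test would then be marked on the view angle picture as described in step 304.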
The virtual scene in the embodiment of the present application is a three-dimensional virtual scene, while the view angle picture of the virtual flight vehicle is a two-dimensional picture. The terminal displays the object mark on the view angle picture of the virtual flight vehicle as follows: the terminal acquires a first position and a second position, where the first position is the position of the virtual flight vehicle in the virtual scene, and the second position is the position of any second virtual object in the virtual scene. Then, the terminal determines a third position according to the first position and the second position, where the third position is the position of the second virtual object in the view angle picture of the virtual flight vehicle. Finally, the terminal displays the object mark at the third position. By determining the third position and displaying the object mark there, the object mark can be displayed in the view angle picture of the virtual flight vehicle without changing the virtual objects in the virtual scene, effectively reducing the development workload.
In some embodiments, the terminal determines the third position from the first position and the second position as follows: the terminal determines a target plane according to the first position, where the target plane is the plane corresponding to the view angle picture of the virtual flight vehicle. Then, the terminal drops a perpendicular from the second position to the target plane and determines the intersection point of the perpendicular and the target plane as the third position. The plane corresponding to the view angle picture of the virtual flight vehicle is the plane where the virtual camera shooting that view angle picture is located, and the first position is at the center of the view angle picture of the virtual flight vehicle.
For example, referring to fig. 9, fig. 9 is a schematic diagram illustrating determination of a third position according to an embodiment of the present application. As shown in fig. 9, A represents the first position, i.e., the position of the virtual flight vehicle in the virtual scene; the box represents the view angle picture of the virtual flight vehicle; B represents the second position, i.e., the position of any second virtual object in the virtual scene; BP represents the perpendicular dropped from the second position to the target plane; and P represents the third position. That is, the user sees the second virtual object on the view angle picture of the virtual flight vehicle, and the object mark is displayed at point P.
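The construction in fig. 9 — dropping a perpendicular from B onto the target plane and taking its foot as P — is the standard point-to-plane projection. A minimal sketch, assuming the target plane is described by the camera point A and a unit normal n (the function and parameter names are illustrative):

```python
# Sketch of determining the third position P in fig. 9: the foot of the
# perpendicular from the enemy position B onto the plane through point A
# with unit normal n (the plane of the vehicle's view angle picture).

def project_onto_view_plane(a, n, b):
    """a: point on the plane, n: unit normal of the plane, b: point to
    project; all are (x, y, z) tuples. Returns P, the foot of the
    perpendicular from b to the plane."""
    d = sum((b[i] - a[i]) * n[i] for i in range(3))  # signed distance to plane
    return tuple(b[i] - d * n[i] for i in range(3))  # P = B - d * n
```

For instance, for a plane through the origin with normal (0, 0, 1), the point (3, 4, 5) projects to (3, 4, 0); the in-plane coordinates of P then give where the object mark is drawn on the 2D picture.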
305. In response to the interaction triggering operation on the virtual flight vehicle, the terminal controls the interactive prop to interact.
In the embodiment of the present application, if the terminal detects an interaction triggering operation on the virtual flight vehicle, the terminal controls the interactive prop to interact according to the operation, for example, controlling a virtual machine gun to fire, a virtual missile to launch, or a virtual torch to spray fire.
In some embodiments, the view angle picture of the virtual flight vehicle displays an aiming mark and a shooting control, and the interaction triggering operation on the virtual flight vehicle is a triggering operation on the shooting control. Correspondingly, in response to the triggering operation on the shooting control, the terminal controls the interactive prop to shoot in the direction indicated by the aiming mark.
For example, referring to fig. 10, fig. 10 is a view angle picture of another virtual flight vehicle according to an embodiment of the present application. As shown in fig. 10, the view angle picture of the virtual flight vehicle displays an aiming mark 1001 and a shooting control 1002, and the interactive prop is a virtual machine gun with unlimited ammunition. If the user presses the shooting control 1002, the terminal controls the virtual machine gun to shoot in the direction indicated by the aiming mark 1001; while shooting, the terminal displays muzzle flame at the muzzle of the virtual machine gun and a hit effect at the hit position.
It should be noted that the virtual flight vehicle can also be destroyed. Correspondingly, if a virtual attribute value of the virtual flight vehicle satisfies the destruction condition, the terminal displays a destruction animation on the view angle picture of the virtual flight vehicle, then destroys the virtual flight vehicle and switches to the view angle picture of the first virtual object. The destruction condition is, for example, that the existence duration reaches a preset duration or that the virtual life value reaches a preset value, which is not limited in the embodiment of the present application. Making the virtual flight vehicle destructible allows virtual objects and the virtual flight vehicle to interact with each other, increasing the fun of the game.
For example, referring to fig. 11, fig. 11 is a schematic view of a view angle picture of another virtual flight vehicle according to an embodiment of the present application. As shown in fig. 11, the destruction animation blurs the view angle picture of the virtual flight vehicle, indicating that the connection between the first virtual object and the virtual flight vehicle is broken; the terminal then destroys the virtual flight vehicle and switches to the view angle picture of the first virtual object.
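The two destruction conditions named above (existence duration reaching a preset duration, or virtual life value reaching a preset value) can be sketched as a simple per-frame check; the thresholds MAX_LIFETIME and MIN_LIFE are illustrative assumptions.

```python
# Sketch of the destruction check described above.
# MAX_LIFETIME and MIN_LIFE are illustrative assumptions.

MAX_LIFETIME = 60.0  # preset existence duration, in seconds
MIN_LIFE = 0         # preset virtual life value

def should_destroy(elapsed_seconds, virtual_life):
    """True if either destruction condition is satisfied."""
    return elapsed_seconds >= MAX_LIFETIME or virtual_life <= MIN_LIFE
```

When the check returns True, the terminal would play the destruction animation and switch back to the first virtual object's view angle picture.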
In some embodiments, if the first virtual object is attacked while the view angle picture of the virtual flight vehicle is displayed, the terminal may also switch back to the view angle picture of the first virtual object, which is consistent with the setting that the first virtual object remotely controls the virtual flight vehicle.
The virtual flight vehicle is able to fly because a rigid body (Rigidbody) component is added to it. The principle of flying the virtual flight vehicle is briefly described below. In physics, a rigid body is an ideal model: an object whose shape and size remain unchanged under external force and whose internal parts keep constant relative positions (no deformation). In a physics engine, the rigid body is a very important component; common physical properties such as mass, friction, and collision parameters can be added to an object through the rigid body component. Through these properties, the virtual behaviors of the object in the three-dimensional virtual scene can be simulated, and once the rigid body component is added, the object responds to all physical effects in the physics engine. Therefore, during game design, the embodiment of the present application adds a rigid body component to the virtual flight vehicle. In Unity 3D (a game engine), the rigid body gives the game object physical properties, so that the object receives thrust and torque under the control of the physics system, achieving motion effects similar to those in the real world.
For example, referring to fig. 12, fig. 12 is a schematic diagram of a rigid body component provided according to an embodiment of the present application. As shown in fig. 12, a rigid body component is added to the virtual flight vehicle during game design, and during gameplay the program logic assigns the calculated direction and force directly to the rigid body according to the detected control operation, thereby realizing the flight of the virtual flight vehicle.
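How "assigning the calculated direction and force to the rigid body" produces flight can be sketched with a minimal explicit-Euler rigid body. This is an illustrative stand-in for a physics-engine rigid body such as Unity's, not Unity code; the class name, mass, and time step are assumptions.

```python
# Minimal explicit-Euler stand-in for a physics-engine rigid body: each
# step, the applied force changes velocity (a = F / m), and velocity in
# turn changes position.

class RigidBody:
    def __init__(self, mass):
        self.mass = mass
        self.velocity = [0.0, 0.0, 0.0]
        self.position = [0.0, 0.0, 0.0]

    def add_force(self, force, dt):
        """Apply a force vector (x, y, z) for one time step of length dt."""
        for i in range(3):
            self.velocity[i] += force[i] / self.mass * dt
            self.position[i] += self.velocity[i] * dt
```

Pushing the horizontal flight control forward would translate into a forward force vector passed to `add_force` every frame, which is the motion model the patent attributes to the rigid body component.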
It should be noted that, to make the control method of the virtual flight vehicle described in the above steps 301 to 305 easier to understand, referring to fig. 13, fig. 13 is a flowchart of another control method of a virtual flight vehicle according to an embodiment of the present application. As shown in fig. 13, the virtual flight vehicle is an aircraft loaded with an air machine gun, and the steps are as follows:

1301. Activate the trigger control of the aircraft.

1302. Judge whether the trigger control is triggered; if yes, execute step 1303.

1303. Call the aircraft and display the view angle picture of the aircraft.

1304. Judge whether the aircraft is controlled to fly horizontally; if yes, execute step 1305; if not, execute step 1303.

1305. Control the aircraft to fly horizontally.

1306. Judge whether the aircraft is controlled to steer; if yes, execute step 1307; if not, execute step 1305.

1307. Control the aircraft to steer in flight.

1308. Judge whether the aircraft is controlled to fly vertically; if yes, execute step 1309; if not, return to the previous step.

1309. Control the aircraft to fly vertically.

1310. Judge whether the shooting control is triggered; if yes, execute step 1311; if not, return to the previous step.

1311. Control the air machine gun to fire bullets.

1312. Judge whether the shooting control is released; if yes, execute step 1313; if not, return to the previous step.

1313. Stop firing bullets.
According to the scheme provided in the embodiments of the present application, while the user controls the first virtual object to interact, if the user triggers the virtual flight vehicle, the view angle of the first virtual object is switched to the view angle of the virtual flight vehicle. The user can then control the virtual flight vehicle to fly from that view angle, so the interactive prop loaded on the virtual flight vehicle is not limited by a fixed placement position and moves freely in the virtual scene along with the virtual flight vehicle. As a result, the interactive prop can interact with virtual objects effectively, and the man-machine interaction efficiency is remarkably improved.
Fig. 14 is a block diagram of a control device for a virtual flight vehicle according to an embodiment of the present application. The apparatus is used for executing the steps in the control method of the virtual flying carrier, referring to fig. 14, the apparatus includes: a display module 1401 and a control module 1402.
A display module 1401 for displaying a perspective screen of a first virtual object;
the display module 1401 is further configured to switch, in response to a triggering operation on the virtual flight vehicle, a view angle picture of the first virtual object to a view angle picture of the virtual flight vehicle for display, where the virtual flight vehicle is loaded with an interactive prop;
A control module 1402, configured to control the virtual flight vehicle to fly according to a control operation of the virtual flight vehicle;
the control module 1402 is further configured to control the interaction prop to interact in response to an interaction triggering operation on the virtual flying carrier.
In some embodiments, the display module 1401 is configured to display, in response to a trigger operation on the virtual flight vehicle, a screen of the first virtual object using a calling prop for calling the virtual flight vehicle in a view screen of the first virtual object; and if the first virtual object finishes the use of the calling prop, switching the view angle picture of the first virtual object into the view angle picture of the virtual flight carrier for display.
In some embodiments, the display module 1401 is configured to display a switching animation on the view screen of the first virtual object in response to a trigger operation on the virtual flight vehicle; and if the switching animation is played, switching the view angle picture of the first virtual object into the view angle picture of the virtual flight carrier for display.
In some embodiments, the display module 1401 is configured to display, in response to a triggering operation on the virtual flight vehicle, the virtual flight vehicle flying from the position of the first virtual object to a position in the air in the view angle picture of the first virtual object; and switch the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display.
In some embodiments, the display module 1401 is further configured to display an object marker on the perspective view of the virtual flight vehicle in response to any second virtual object entering the search range of the virtual flight vehicle, where the second virtual object is a virtual object hostile to the first virtual object, and the object marker is used to mark any second virtual object.
In some embodiments, the display module 1401 is further configured to acquire a first position and a second position, where the first position is the position of the virtual flight vehicle in the virtual scene, and the second position is the position of any second virtual object in the virtual scene; determine a third position according to the first position and the second position, where the third position is the position of the second virtual object in the view angle picture of the virtual flight vehicle; and display the object marker at the third position.
In some embodiments, the display module 1401 is further configured to determine a target plane according to the first position, where the target plane is a plane corresponding to a view angle picture of the virtual flight vehicle; and (3) making a perpendicular line from the second position to the target plane, and determining an intersection point of the perpendicular line and the target plane as the third position.
In some embodiments, the control module 1402 is configured to perform at least one of:
controlling the virtual flight vehicle to fly horizontally in the virtual scene;
controlling the virtual flight vehicle to fly vertically in the virtual scene;
and controlling the virtual flight vehicle to switch the flight direction in the virtual scene.
In some embodiments, the visual angle view of the virtual flight vehicle displays a horizontal flight control;
the control module 1402 is configured to determine a horizontal flight direction in response to a trigger operation of the horizontal flight control; and controlling the virtual flight vehicle to accelerate the flight along the horizontal flight direction.
In some embodiments, the perspective view of the virtual flight vehicle displays a first vertical flight control and a second vertical flight control;
the control module 1402 is configured to control the virtual flight vehicle to fly vertically upwards in the virtual scene in response to a triggering operation on the first vertical flight control; and responding to the triggering operation of the second vertical flight control, and controlling the virtual flight vehicle to fly vertically downwards in the virtual scene.
In some embodiments, the view of the virtual flight vehicle displays a steering region;
The control module 1402 is configured to control the virtual flight vehicle to fly in a left direction in the virtual scene in response to a left sliding operation in the steering area; and controlling the virtual flight vehicle to steer to the right in the virtual scene in response to the right-hand slide operation in the steering area.
In some embodiments, the visual angle picture of the virtual flight vehicle displays an aiming mark and a shooting control, and the interactive triggering operation of the virtual flight vehicle is the triggering operation of the shooting control;
the control module 1402 is configured to control the interactive prop to shoot in the direction indicated by the aiming mark in response to the triggering operation on the shooting control.
In some embodiments, the display module 1401 is further configured to display a destruction animation on the visual angle screen of the virtual flight vehicle if the virtual attribute value of the virtual flight vehicle satisfies the destruction condition; destroying the virtual flying carrier and switching to the view angle picture of the first virtual object.
In the embodiments of the present application, while the user controls the first virtual object to interact, if the user triggers the virtual flight vehicle, the view angle of the first virtual object is switched to the view angle of the virtual flight vehicle. The user can then control the virtual flight vehicle to fly based on the vehicle's own view angle, so the interactive prop mounted on the virtual flight vehicle is no longer restricted by a fixed placement position. This achieves the effect that the interactive prop moves freely through the virtual scene together with the virtual flight vehicle, enables the interactive prop to interact effectively with virtual objects, and significantly improves human-computer interaction efficiency.
It should be noted that the control device for a virtual flight vehicle provided in the above embodiment is illustrated only by the division of the above functional modules when controlling the virtual flight vehicle. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to perform all or part of the functions described above. In addition, the control device of the virtual flight vehicle provided in the above embodiment belongs to the same concept as the embodiments of the control method of the virtual flight vehicle; refer to the method embodiments for its detailed implementation process, which is not repeated here.
Fig. 15 is a structural block diagram of a terminal according to an embodiment of the present application. The terminal 1500 may be a portable mobile terminal such as a smart phone, a tablet computer, a Moving Picture Experts Group Audio Layer III (MP3) player, a Moving Picture Experts Group Audio Layer IV (MP4) player, a notebook computer, or a desktop computer. The terminal 1500 may also be referred to as a user device, a portable terminal, a laptop terminal, a desktop terminal, or the like.
In general, the terminal 1500 includes: a processor 1501 and a memory 1502.
The processor 1501 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1501 may be implemented in at least one hardware form of a digital signal processor (DSP), a field-programmable gate array (FPGA), and a programmable logic array (PLA). The processor 1501 may also include a main processor and a coprocessor. The main processor is a processor for processing data in an awake state, also called a central processing unit (CPU); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1501 may be integrated with a graphics processing unit (GPU), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 1501 may also include an artificial intelligence (AI) processor for processing computing operations related to machine learning.
The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1502 is used to store at least one computer program, which is executed by the processor 1501 to implement the control method of a virtual flight vehicle provided by the method embodiments of the present application.
In some embodiments, the terminal 1500 may further optionally include: a peripheral interface 1503 and at least one peripheral device. The processor 1501, memory 1502 and peripheral interface 1503 may be connected by a bus or signal lines. The individual peripheral devices may be connected to the peripheral device interface 1503 via a bus, signal lines, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1504, a display screen 1505, a camera assembly 1506, audio circuitry 1507, a positioning assembly 1508, and a power supply 1509.
The peripheral interface 1503 may be used to connect at least one input/output (I/O) related peripheral device to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, the memory 1502, and the peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1504 is configured to receive and transmit radio frequency (RF) signals, also known as electromagnetic signals. The radio frequency circuit 1504 communicates with a communication network and other communication devices via electromagnetic signals, converting an electrical signal into an electromagnetic signal for transmission, or converting a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 1504 may communicate with other terminals via at least one wireless communication protocol, including but not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of successive generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or wireless fidelity (WiFi) networks. In some embodiments, the radio frequency circuit 1504 may also include circuitry related to near field communication (NFC), which is not limited in this application.
The display screen 1505 is used to display a user interface (UI). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, it can also collect touch signals at or above its surface. The touch signal may be input to the processor 1501 as a control signal for processing. The display screen 1505 may then also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1505, disposed on the front panel of the terminal 1500; in other embodiments, there may be at least two display screens 1505, respectively disposed on different surfaces of the terminal 1500 or in a folded design; in still other embodiments, the display screen 1505 may be a flexible display disposed on a curved or folded surface of the terminal 1500. The display screen 1505 may even be arranged in an irregular, non-rectangular shape, i.e., a shaped screen. The display screen 1505 may be made of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or other materials.
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so as to realize a background blurring function by fusing the main camera and the depth-of-field camera, and panoramic shooting and virtual reality (VR) shooting functions, or other fusion shooting functions, by fusing the main camera and the wide-angle camera. In some embodiments, the camera assembly 1506 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
The audio circuitry 1507 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and the environment, converting the sound waves into electric signals, inputting the electric signals to the processor 1501 for processing, or inputting the electric signals to the radio frequency circuit 1504 for voice communication. For purposes of stereo acquisition or noise reduction, a plurality of microphones may be respectively disposed at different portions of the terminal 1500. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, the audio circuit 1507 may also include a headphone jack.
The positioning component 1508 is for positioning a current geographic location of the terminal 1500 to enable navigation or location-based services (Location Based Service, LBS). The positioning component 1508 can be a global positioning system (Global Positioning System, GPS), beidou system, or galileo system based positioning component.
The power supply 1509 is used to power the various components in the terminal 1500. The power supply 1509 may be an alternating current, a direct current, a disposable battery, or a rechargeable battery. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyro sensor 1512, pressure sensor 1513, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 may detect the magnitudes of accelerations on three coordinate axes of the coordinate system established with the terminal 1500. For example, the acceleration sensor 1511 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1501 may control the display screen 1505 to display the user interface in a landscape view or a portrait view based on the gravitational acceleration signal acquired by the acceleration sensor 1511. The acceleration sensor 1511 may also be used for the acquisition of motion data of a game or user.
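The landscape/portrait decision from the gravity signal can be sketched by comparing the gravity components along the terminal's short and long axes. The axis naming is an assumption; the application says only that the UI is switched based on the gravitational acceleration signal:

```python
def ui_orientation(gravity_x, gravity_y):
    """Choose landscape vs portrait display from gravity components along
    the terminal's x (short edge) and y (long edge) axes (an assumed,
    simplified version of the described behavior)."""
    return "landscape" if abs(gravity_x) > abs(gravity_y) else "portrait"
```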
The gyro sensor 1512 may detect a body direction and a rotation angle of the terminal 1500, and the gyro sensor 1512 may collect 3D motion of the terminal 1500 by a user in cooperation with the acceleration sensor 1511. The processor 1501, based on the data collected by the gyro sensor 1512, may implement the following functions: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
The pressure sensor 1513 may be disposed on a side frame of the terminal 1500 and/or under the display 1505. When the pressure sensor 1513 is disposed on the side frame of the terminal 1500, a grip signal of the user on the terminal 1500 may be detected, and the processor 1501 performs left-right hand recognition or quick operation according to the grip signal collected by the pressure sensor 1513. When the pressure sensor 1513 is disposed at the lower layer of the display screen 1505, the processor 1501 realizes control of the operability control on the UI interface according to the pressure operation of the user on the display screen 1505. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The optical sensor 1515 is used to collect the ambient light intensity. In one embodiment, processor 1501 may control the display brightness of display screen 1505 based on the intensity of ambient light collected by optical sensor 1515. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1505 is turned up; when the ambient light intensity is low, the display luminance of the display screen 1505 is turned down. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.
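A sketch of the described brightness control as a clamped linear ramp from ambient light intensity to display brightness. The application specifies only the monotone turn-up/turn-down behavior; the ramp endpoints and brightness range are assumptions:

```python
def display_brightness(lux, lo=50.0, hi=500.0, min_b=0.1, max_b=1.0):
    """Map ambient light intensity (lux) to display brightness in [min_b, max_b]:
    brightness is turned up when ambient light is high and down when it is low.
    The lo/hi breakpoints are illustrative assumptions."""
    t = (lux - lo) / (hi - lo)
    return min_b + (max_b - min_b) * max(0.0, min(1.0, t))
```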
A proximity sensor 1516, also referred to as a distance sensor, is typically provided on the front panel of the terminal 1500. The proximity sensor 1516 is used to collect the distance between the user and the front of the terminal 1500. In one embodiment, when the proximity sensor 1516 detects a gradual decrease in the distance between the user and the front of the terminal 1500, the processor 1501 controls the display 1505 to switch from the on-screen state to the off-screen state; when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually increases, the processor 1501 controls the display screen 1505 to switch from the off-screen state to the on-screen state.
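The on/off switching described for the proximity sensor can be sketched as a small state machine driven by successive distance samples (a sketch of the stated behavior, not an implementation from the application):

```python
def screen_state(prev_distance, distance, state):
    """Next screen state given two successive proximity readings: switch
    on -> off as the user approaches the front of the terminal, and
    off -> on as the distance grows again."""
    if distance < prev_distance and state == "on":
        return "off"  # user is getting closer: blank the screen
    if distance > prev_distance and state == "off":
        return "on"   # user is moving away: light the screen
    return state
```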
Those skilled in the art will appreciate that the structure shown in fig. 15 is not limiting and that more or fewer components than shown may be included or certain components may be combined or a different arrangement of components may be employed.
The embodiments of the present application also provide a computer-readable storage medium storing at least one computer program, which is loaded and executed by a processor of a terminal to implement the operations performed by the terminal in the control method of the virtual flight vehicle of the above embodiments. For example, the computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
Embodiments of the present application also provide a computer program product comprising computer program code stored in a computer readable storage medium. The processor of the terminal reads the computer program code from the computer readable storage medium, and the processor executes the computer program code, so that the terminal performs the control method of the virtual flying carrier provided in the above various alternative implementations.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing descriptions are merely optional embodiments of the present application and are not intended to limit the present application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (12)

1. A method of controlling a virtual flight vehicle, the method comprising:
displaying a view angle picture of the first virtual object;
displaying, in response to a trigger operation on the virtual flight vehicle, a switching animation on the view angle picture of the first virtual object, wherein the switching animation transitions from a blurred picture to a clear picture;
if the switching animation finishes playing, switching the view angle picture of the first virtual object to the view angle picture of the virtual flight vehicle for display; wherein a rigid body is added to the virtual flight vehicle, the rigid body being used for adding physical attributes to the virtual flight vehicle in a physics engine,
the virtual flight vehicle is loaded with a virtual machine gun with unlimited bullets, and the view angle picture of the virtual flight vehicle displays a horizontal flight control, a steering area, and the current flight speed of the virtual flight vehicle;
determining a horizontal flight direction in response to a triggering operation of the horizontal flight control;
controlling the virtual flight vehicle to accelerate along the horizontal flight direction, controlling the virtual flight vehicle to fly at a constant speed when the virtual flight vehicle accelerates to a maximum flight speed, and controlling the virtual flight vehicle to decelerate until the flight speed is zero if the trigger operation on the horizontal flight control is cancelled;
controlling the virtual flight vehicle to steer to the left in the virtual scene in response to a leftward slide operation in the steering area, and displaying the virtual flight vehicle tilting to the left;
controlling the virtual flight vehicle to steer to the right in the virtual scene in response to a rightward slide operation in the steering area, and displaying the virtual flight vehicle tilting to the right;
responding to the fact that any second virtual object enters the searching range of the virtual flight vehicle, acquiring a first position and a second position, wherein the first position is the position of the virtual flight vehicle in the virtual scene, and the second position is the position of any second virtual object in the virtual scene;
determining a target plane according to the first position, wherein the target plane is a plane corresponding to a visual angle picture of the virtual flight vehicle, and the first position is the center of the visual angle picture; the plane corresponding to the visual angle picture of the virtual flight carrier is a plane where a virtual lens shooting the visual angle picture of the virtual flight carrier is located;
making a perpendicular line from the second position to the target plane, determining an intersection point of the perpendicular line and the target plane as a third position, wherein the third position is a position of the second virtual object in a visual angle picture of the virtual flight vehicle;
Displaying an object mark at the third position, wherein the second virtual object is a virtual object hostile to the first virtual object, and the object mark is used for marking any second virtual object;
controlling the virtual machine gun to shoot in the direction indicated by the aiming mark in response to the interactive triggering operation of the virtual flight vehicle;
when the first virtual object is attacked, switching a view angle picture into a view angle picture of the first virtual object;
when the existence duration of the virtual flight vehicle reaches a preset duration or the virtual life value reaches a preset value, displaying a blurred animation on the visual angle picture of the virtual flight vehicle, destroying the virtual flight vehicle, and switching the visual angle into the visual angle picture of the first virtual object.
2. The method of claim 1, wherein switching the perspective view of the first virtual object to the perspective view of the virtual flight vehicle for display in response to a triggering operation on the virtual flight vehicle comprises:
responding to the triggering operation of the virtual flight vehicle, and displaying a picture of the first virtual object using a calling prop in a view angle picture of the first virtual object, wherein the calling prop is used for calling the virtual flight vehicle;
And if the first virtual object finishes the use of the calling prop, switching the view angle picture of the first virtual object into the view angle picture of the virtual flight carrier for display.
3. The method of claim 1, wherein the controlling the virtual flight vehicle to fly comprises at least one of:
controlling the virtual flight vehicle to fly horizontally in the virtual scene;
controlling the virtual flight vehicle to fly vertically in the virtual scene;
and controlling the virtual flight vehicle to switch the flight direction in the virtual scene.
4. The method of claim 3, wherein the view of the virtual flight vehicle displays a first vertical flight control and a second vertical flight control;
the controlling the virtual flight vehicle to fly vertically in the virtual scene includes:
responding to the triggering operation of the first vertical flight control, and controlling the virtual flight vehicle to fly vertically upwards in the virtual scene;
and responding to the triggering operation of the second vertical flight control, and controlling the virtual flight vehicle to fly vertically downwards in the virtual scene.
5. The method of claim 1, wherein the visual angle picture of the virtual flight vehicle displays an aiming mark and a shooting control, and the interactive triggering operation of the virtual flight vehicle is a triggering operation of the shooting control;
the controlling the virtual machine gun to shoot in the direction indicated by the aiming mark in response to the interactive triggering operation of the virtual flying carrier comprises:
and responding to the triggering operation of the shooting control, and controlling the virtual machine gun to shoot in the direction indicated by the aiming mark.
6. A control device for a virtual flying vehicle, the device comprising:
the display module is used for displaying the visual angle picture of the first virtual object;
the display module is further used for responding to the triggering operation of the virtual flying carrier, displaying a switching animation on the visual angle picture of the first virtual object, wherein the switching animation is from a blurred picture to a clear picture;
the display module is further configured to switch, if the switching animation is played, a view angle picture of the first virtual object to a view angle picture of the virtual flight vehicle for display; the virtual flying carrier is added with a rigid body, the rigid body is used for adding physical attributes to the virtual flying carrier in a physical engine, the virtual flying carrier is loaded with a virtual machine gun with infinite bullets, and a visual angle picture of the virtual flying carrier is displayed with a horizontal flying control, a steering area and the current flying speed of the virtual flying carrier;
the control module is configured to determine a horizontal flight direction in response to a trigger operation on the horizontal flight control; control the virtual flight vehicle to accelerate along the horizontal flight direction, control the virtual flight vehicle to fly at a constant speed when the virtual flight vehicle accelerates to a maximum flight speed, and control the virtual flight vehicle to decelerate until the flight speed is zero if the trigger operation on the horizontal flight control is cancelled; control the virtual flight vehicle to steer to the left in the virtual scene in response to a leftward slide operation in the steering area, and display the virtual flight vehicle tilting to the left; and control the virtual flight vehicle to steer to the right in the virtual scene in response to a rightward slide operation in the steering area, and display the virtual flight vehicle tilting to the right;
the display module is further configured to obtain a first position and a second position in response to any second virtual object entering a search range of the virtual flight vehicle, where the first position is a position of the virtual flight vehicle in the virtual scene, and the second position is a position of any second virtual object in the virtual scene; determining a target plane according to the first position, wherein the target plane is a plane corresponding to a visual angle picture of the virtual flight vehicle, and the first position is the center of the visual angle picture; the plane corresponding to the visual angle picture of the virtual flight carrier is a plane where a virtual lens shooting the visual angle picture of the virtual flight carrier is located; making a perpendicular line from the second position to the target plane, determining an intersection point of the perpendicular line and the target plane as a third position, wherein the third position is a position of the second virtual object in a visual angle picture of the virtual flight vehicle; displaying an object mark at the third position, wherein the second virtual object is a virtual object hostile to the first virtual object, and the object mark is used for marking any second virtual object;
The control module is further used for responding to the interactive triggering operation of the virtual flying carrier and controlling the virtual machine gun to shoot in the direction indicated by the aiming mark; when the first virtual object is attacked, switching a view angle picture into a view angle picture of the first virtual object;
the display module is further configured to display a blurred animation on the view angle picture of the virtual flight vehicle when the existence duration of the virtual flight vehicle reaches a preset duration or the virtual life value reaches a preset value, destroy the virtual flight vehicle, and switch the view angle to the view angle picture of the first virtual object.
7. The apparatus of claim 6, wherein the display module is configured to:
responding to the triggering operation of the virtual flight vehicle, and displaying a picture of the first virtual object using a calling prop in a view angle picture of the first virtual object, wherein the calling prop is used for calling the virtual flight vehicle;
and if the first virtual object finishes the use of the calling prop, switching the view angle picture of the first virtual object into the view angle picture of the virtual flight carrier for display.
8. The apparatus of claim 6, wherein the control module is configured to perform at least one of:
controlling the virtual flight vehicle to fly horizontally in the virtual scene;
controlling the virtual flight vehicle to fly vertically in the virtual scene;
and controlling the virtual flight vehicle to switch the flight direction in the virtual scene.
9. The apparatus of claim 8, wherein the view of the virtual flight vehicle displays a first vertical flight control and a second vertical flight control;
the control module is used for:
responding to the triggering operation of the first vertical flight control, and controlling the virtual flight vehicle to fly vertically upwards in the virtual scene;
and responding to the triggering operation of the second vertical flight control, and controlling the virtual flight vehicle to fly vertically downwards in the virtual scene.
10. The apparatus of claim 6, wherein the visual angle view of the virtual flight vehicle displays an aiming identifier and a shooting control, and wherein the interactive triggering operation of the virtual flight vehicle is a triggering operation of the shooting control;
the control module is used for:
And responding to the triggering operation of the shooting control, and controlling the virtual machine gun to shoot in the direction indicated by the aiming mark.
11. A terminal comprising a processor and a memory for storing at least one computer program loaded by the processor and executing the method of controlling a virtual flight vehicle according to any one of claims 1 to 5.
12. A computer readable storage medium, characterized in that the computer readable storage medium is adapted to store at least one computer program for executing the method of controlling a virtual flight vehicle according to any one of claims 1 to 5.
CN202110530441.2A 2021-05-14 2021-05-14 Control method, device, terminal and storage medium of virtual flight vehicle Active CN113117333B (en)

Publications (2)

Publication Number Publication Date
CN113117333A CN113117333A (en) 2021-07-16
CN113117333B (en) 2023-06-16

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (country code: HK; legal event code: DE; document number: 40048312)
GR01 Patent grant