CN110893277A - Method, device and storage medium for controlling interaction of virtual object and throwing object

Method, device and storage medium for controlling interaction of virtual object and throwing object

Info

Publication number
CN110893277A
CN110893277A (application CN201911190687.9A)
Authority
CN
China
Prior art keywords
animation
throwing
target
virtual
virtual object
Prior art date
Legal status
Granted
Application number
CN201911190687.9A
Other languages
Chinese (zh)
Other versions
CN110893277B (en)
Inventor
林凌云
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Co., Ltd.
Priority to CN201911190687.9A
Publication of CN110893277A
Application granted
Publication of CN110893277B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214: Input arrangements for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145: Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals involving aspects of the displayed game scene
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/837: Shooting of targets

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to a method, apparatus, computer-readable storage medium and computer device for controlling interaction of a virtual object with a projectile, the method comprising: when an interaction instruction is triggered, acquiring a target animation of a virtual object interacting with a throwing object pointed by the interaction instruction; displaying the target animation on an interactive page based on the visual angle direction of the virtual object; when the virtual object displayed in the target animation executes aiming action on a throwing object, determining a target throwing position in a virtual scene displayed on the interactive page; and continuously displaying the target animation based on the virtual object view direction corresponding to the target throwing position. The scheme provided by the application can complete the whole throwing operation based on one interactive instruction.

Description

Method, device and storage medium for controlling interaction of virtual object and throwing object
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, and a storage medium for controlling a virtual object to interact with a throwing object.
Background
With the rapid development of computer technology and the popularization of intelligent terminals, more and more interactive applications can provide virtual scenes and display virtual articles in the virtual scenes, such as network games, applications for military simulation exercises, somatosensory motion applications based on somatosensory equipment and the like. The user may control the virtual object to interact with the virtual item through the interactive application. The throwing object is a common virtual article, and a user can control the virtual object to throw the throwing object after the virtual object is equipped with the throwing object.
However, in a conventional virtual scene, a user needs to trigger multiple interaction instructions, such as equipping, aiming and throwing, to control a virtual object to complete a single throwing operation. The server therefore needs to respond frequently, which wastes resources.
Disclosure of Invention
Based on this, in order to solve the technical problem that a single throwing operation requires a plurality of interaction instructions, it is necessary to provide a method, an apparatus, and a storage medium for controlling interaction between a virtual object and a throwing object.
A method of controlling a virtual object to interact with a projectile, the method comprising:
when an interaction instruction is triggered, acquiring a target animation of a virtual object interacting with a throwing object pointed by the interaction instruction;
displaying the target animation on an interactive page based on the visual angle direction of the virtual object;
when the virtual object displayed in the target animation executes aiming action on a throwing object, determining a target throwing position in a virtual scene displayed on the interactive page;
and continuously displaying the target animation based on the virtual object view direction corresponding to the target throwing position.
An apparatus for controlling a virtual object to interact with a projectile, the apparatus comprising:
the throwing preparation module is used for acquiring a target animation of interaction between a virtual object and a throwing object pointed by an interaction instruction when the interaction instruction is triggered; displaying the target animation on an interactive page based on the visual angle direction of the virtual object;
the position aiming module is used for determining a target throwing position in a virtual scene displayed on the interactive page when the virtual object displayed in the target animation executes aiming action on a throwing object;
and the article throwing module is used for continuously displaying the target animation based on the virtual object view angle direction corresponding to the target throwing position.
A computer readable storage medium having stored thereon computer executable instructions which, when executed by a processor, cause the processor to perform the above-described method of controlling a virtual object to interact with a projectile.
A computer device comprising a memory and a processor, the memory having stored therein computer readable instructions which, when executed by the processor, cause the processor to perform the above-described method of controlling a virtual object to interact with a projectile.
According to the method, the apparatus, the computer-readable storage medium and the computer device for controlling the interaction of the virtual object and the throwing object, when the interaction instruction is triggered, the target animation of the interaction between the virtual object and the corresponding throwing object is obtained automatically. Based on the target animation, the virtual object can perform different interactive actions on the throwing object in sequence; after one interactive action is completed, the next is switched to automatically, without an additional interaction instruction being triggered. When the virtual object performs the aiming action on the throwing object, the visual angle direction of the virtual object in the target animation is adjusted according to the automatically determined target throwing position, so aiming is completed automatically. By continuing to display the animation, the throwing object is thrown to the target throwing position in the virtual scene, again without an additional interaction instruction. The whole throwing operation is thus completed based on one interaction instruction, which simplifies user operation, reduces the server's response frequency, and saves server resources.
Drawings
FIG. 1 is a diagram of an application environment for a method of controlling interaction of a virtual object with a projectile in one embodiment;
FIG. 2 is a schematic flow diagram of a method of controlling interaction of a virtual object with a projectile in one embodiment;
FIG. 3a is a diagram illustrating a virtual object performing a pull interaction with a virtual item, in accordance with an embodiment;
FIG. 3b is a diagram illustrating a virtual object performing targeting interactions on a virtual item, in one embodiment;
FIG. 3c is a schematic diagram of a virtual object performing a throwing interaction with a virtual item in one embodiment;
FIG. 4a is a schematic diagram of a page of an interactive page based on a first-person perspective in one embodiment;
FIG. 4b is a schematic diagram of a page of an interactive page based on a first-person perspective in another embodiment;
FIG. 5a is a schematic diagram of a page of an interactive page based on a third person perspective view in one embodiment;
FIG. 5b is a schematic diagram of a page of an interactive page based on a third person perspective view in another embodiment;
FIG. 6a is a flowchart illustrating switching of interaction states of virtual objects according to an embodiment;
FIG. 6b is a schematic flow chart of the interaction state switching in the fast throwing mode according to an embodiment;
FIG. 6c is a schematic flow chart illustrating the switching of the interaction state in the adaptive throwing mode according to an embodiment;
FIG. 7 is a schematic flow chart of the step of determining a target throw location in a virtual scene in one embodiment;
FIG. 8 is a schematic flow chart of the step of determining the target throwing position in a virtual scene in another embodiment;
FIG. 9 is a page diagram of a configuration page used to obtain interaction mode configuration data, under an embodiment;
FIG. 10 is a schematic diagram of a process for performing a throwing operation in a fast throwing mode for a virtual object in one embodiment;
FIG. 11 is a schematic diagram of a process for performing a throwing operation in an adaptive throwing mode for a virtual object in one embodiment;
FIG. 12 is a schematic diagram of a virtual object interacting with a projectile in one embodiment;
FIG. 13 is a schematic flow chart diagram illustrating a method for controlling the interaction of a virtual object with a projectile in one embodiment;
FIG. 14 is a schematic flow chart diagram illustrating a method for controlling the interaction of a virtual object with a projectile in another embodiment;
FIG. 15 is a schematic flow chart diagram illustrating a method for controlling the interaction of a virtual object with a projectile in yet another embodiment;
FIG. 16 is a block diagram of an apparatus for controlling the interaction of a virtual object with a projectile in one embodiment;
FIG. 17 is a block diagram showing the structure of an apparatus for controlling the interaction of a virtual object with a projectile in another embodiment;
FIG. 18 is a block diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Fig. 1 is a diagram of an application environment for a method of controlling a virtual object to interact with a projectile in one embodiment. Referring to fig. 1, the method is applied to a system for controlling the interaction of a virtual object with a projectile. The system includes a terminal 110 and a server 120, connected through a network. An interactive application runs on the terminal 110. An interactive application is an application that can display an interactive page and control interaction between a virtual object and a throwing object in the virtual scene rendered on that page, such as an online game application, a three-dimensional map application, a military simulation exercise application, a somatosensory application, a VR (Virtual Reality) application, and the like. The online game application may be an FPS (First-Person Shooter) application, a MOBA (Multiplayer Online Battle Arena) application, or the like. The terminal 110 may specifically be a desktop terminal or a mobile terminal, and the mobile terminal may specifically be at least one of a mobile phone, a tablet computer, a notebook computer, and the like. The server 120 may be implemented as a stand-alone server or as a server cluster composed of a plurality of servers.
As shown in fig. 2, in one embodiment, a method of controlling a virtual object to interact with a projectile is provided. The embodiment is mainly illustrated by applying the method to the terminal 110 in fig. 1. Referring to fig. 2, the method for controlling the interaction between the virtual object and the throwing object specifically includes the following steps:
s202, when the interactive instruction is triggered, the target animation of the virtual object interacting with the throwing object pointed by the interactive instruction is obtained.
The interactive instruction is an instruction which is triggered on an interactive page and can trigger the virtual object to interact with the throwing object. The interactive instruction may be an instruction generated based on a trigger operation of the interactive page. The trigger operation may specifically be a touch operation, a cursor operation, a key operation, or a voice operation. The touch operation can be touch click operation, touch press operation or touch slide operation, and the touch operation can be single-point touch operation or multi-point touch operation; the cursor operation can be an operation of controlling a cursor to click or an operation of controlling the cursor to press; the key operation may be a virtual key operation or a physical key operation, etc.
The interactive page is a page presented by the interactive application. The interactive page can provide a virtual scene to the user. The virtual scene is a virtual environment provided by the interactive application when running on the terminal; it can be a simulation of the real world, a semi-simulated, semi-fictional environment, or a purely fictional environment. The virtual scene may be two-dimensional or three-dimensional. For example, the virtual scene may include sky, land, sea, etc., and the land may include environmental elements such as deserts and cities. The virtual scene can also simulate real environments in different weathers, such as sunny days, rainy days, foggy days or nights. The virtual scene may further include a virtual object, and the virtual object may equip a virtual article, throw the equipped virtual article, or discard the equipped virtual article.
A virtual object refers to a movable stereoscopic model in the virtual scene that can represent the user's avatar. The virtual object may be in any form, such as a virtual character, a virtual animal, or an animation character, for example characters and animals displayed in the virtual scene. Optionally, the virtual objects are three-dimensional stereo models created based on a skeletal animation technique in a three-dimensional virtual scene; each virtual object has its own shape and volume in the three-dimensional virtual scene and occupies a part of its space.
Virtual articles refer to items arranged in the virtual scene that can be equipped, thrown, or discarded by a virtual object. For example, a virtual article may be clothing, a helmet, body armor, medical supplies, a cold weapon or a hot weapon, and the like. A projectile (throwing object) is a virtual article that can be thrown or fired in the virtual scene, such as an explosive, a firearm, or a ball-game article. The explosive can be a grenade, a combustion bottle, a bomb, and the like. The firearm article can be a submachine gun, a sniper rifle, a shotgun, etc. The ball-game article can be a shuttlecock, a tennis ball, a table tennis ball, or the like, all of which need to be thrown and aimed when in use.
The virtual object can execute single actions in the virtual scene, such as running, jumping, climbing, lying prone, squatting and turning, and can also perform interactive actions on a virtual article, such as equipping, aiming and throwing. Both the single actions and the interactive actions can be realized based on pre-stored animations. The server prestores animations of the virtual object performing the various single actions, as well as animations of the virtual object performing different interactive actions on different throwing objects. The target animation refers to the animation in which the virtual object executes the different interactive actions on the throwing object pointed to by the interaction instruction.
When a virtual object interacts with different projectiles, the types of actions involved may differ. For example, when the throwing object is a grenade, the virtual object and the throwing object need to sequentially perform four interactive actions (equipping, pulling the line, aiming and throwing) to complete one throw. When the throwing object is a firearm, three interactive actions (equipping, aiming and shooting) are performed in sequence to complete one throw. When the throwing object is a ball-game article, three interactive actions (equipping, aiming and throwing) are performed in sequence to complete one throw.
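The per-projectile action sequences described above can be sketched as a simple lookup table. This is a hypothetical illustration, not code from the patent; the names `ACTION_SEQUENCES` and `interaction_actions` are assumptions:

```python
# Hypothetical sketch: ordered interactive actions per throwing-object type,
# matching the grenade / firearm / ball-game sequences described above.
ACTION_SEQUENCES = {
    "grenade": ["equip", "pull_line", "aim", "throw"],
    "firearm": ["equip", "aim", "shoot"],
    "ball":    ["equip", "aim", "throw"],
}

def interaction_actions(projectile_type):
    """Return the ordered interactive actions for one throw."""
    return ACTION_SEQUENCES[projectile_type]
```

A table like this lets one interaction instruction drive the whole sequence, since the terminal knows every action that follows the first.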
Referring to fig. 3a, fig. 3a shows an image frame from an animation of a virtual object performing the pull-line interaction on a virtual article in one embodiment: the throwing object is located in one hand of the virtual object, and the other hand extends to the position of the throwing object, simulating the pull-line action in a real scene. Referring to fig. 3b, fig. 3b illustrates an image frame from an animation of a virtual object performing the aiming interaction on a virtual article in one embodiment. Referring to fig. 3c, fig. 3c illustrates an image frame from an animation of a virtual object performing the throwing interaction on a virtual article in one embodiment. As shown in figs. 3a, 3b and 3c, the virtual object has different body poses when performing different interactions on the virtual article.
Specifically, the terminal displays the interactive page based on the interactive application and renders, on the interactive page, the virtual scene and the virtual object serving as the user's avatar. The user can control the virtual object to execute different single actions in the virtual scene so as to complete a certain task. During task execution, the user may control the virtual object with the help of a certain virtual article. When a virtual article is needed, the user can trigger an operation on that virtual article based on the interactive page.
When the trigger operation occurs, the terminal acquires the configuration data of the virtual article to which the trigger operation is directed. The configuration data of the virtual article may include an item identification, an item type, an availability status, and the like. When the item type is a throwing-object type, the terminal can control the virtual object to perform a throw with this virtual article. When the availability status is available, the terminal generates an interaction instruction based on the item identification and the item type, and sends the interaction instruction to the server. The interaction instruction carries the item identification of the corresponding projectile. The server queries the target animation of the interaction between the virtual object and the throwing object corresponding to the item identification, and returns the target animation to the terminal.
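The checks in this step can be sketched as follows. This is a hypothetical illustration under the assumption that configuration data is a flat record; the field names and the function `build_interaction_instruction` are not from the patent:

```python
# Hypothetical sketch: the terminal inspects the configuration data of the
# article the trigger operation points to, and only builds an interaction
# instruction for an available throwing-object type.
def build_interaction_instruction(config):
    if config.get("item_type") != "projectile":
        return None  # only throwing articles start a throwing operation
    if not config.get("available", False):
        return None  # unavailable articles are ignored
    # the instruction carries the item identification of the projectile
    return {"item_id": config["item_id"], "item_type": config["item_type"]}
```

The returned record would then be sent to the server, which looks up the target animation by `item_id`.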
Referring to FIG. 4a, FIG. 4a illustrates a page view of triggering an interactive instruction based on an interactive page in one embodiment. As shown in fig. 4a, the interaction page provides a trigger control 402 capable of triggering a virtual object 406 to interact with a projectile 408, when a trigger operation on the trigger control 402 occurs, the terminal generates a corresponding interaction instruction, and pulls a target animation of the virtual object and the projectile corresponding to the trigger control 402 to interact from the server based on the interaction instruction.
In one embodiment, the speed of triggering the interaction instruction in some scenes directly influences task execution, so the triggering speed and convenience of the interaction instruction are subject to high requirements. A corresponding trigger control can therefore be displayed on the interactive page for each throwing object. The trigger controls can be displayed in the edge area of the interactive page.
A throwing object that can be picked up by the virtual object may be one that already belongs to the virtual object, or one that is located near the virtual object and is in a non-attributed state. For example, from the perspective of virtual articles, the terminal may maintain an attribution relationship table recording the virtual object to which each virtual article belongs; from the perspective of virtual objects, the terminal may maintain an article library for each virtual object, regarded as storing the virtual articles belonging to that virtual object. When the terminal opens the article library of the virtual object according to the user's operation, it displays the virtual articles belonging to the virtual object together with a pick-up button for the throwing object; when the user's confirmation operation on the pick-up button is detected, the virtual object is controlled to pick up the throwing object.
The terminal controls the virtual object to move in the virtual scene according to the user's operation and displays a picture of the virtual scene within a preset range. When a throwing object is included in the preset range and is determined, according to the attribution relationship table, to be in a non-attributed state, a pick-up button for the throwing object can be displayed; when the user's confirmation operation on the pick-up button is detected, the virtual object is controlled to pick up the throwing object.
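The two pick-up cases (an article already belonging to the virtual object, or a nearby non-attributed one) can be combined into a single check. A hypothetical sketch; the attribution table is modelled as a dict whose value is the owning object's identifier, or `None` for the non-attributed state:

```python
# Hypothetical sketch: a throwing object can be picked up if it belongs to
# the virtual object (per the attribution relationship table) or if it is
# within the preset range and in a non-attributed state.
def can_pick_up(item_id, attribution_table, virtual_object_id, nearby_item_ids):
    owner = attribution_table.get(item_id)
    if owner == virtual_object_id:
        return True  # already belongs to this virtual object
    return owner is None and item_id in nearby_item_ids
```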
Of course, this embodiment only takes the pick-up of a single throwing object as an example; the article library of the virtual object, or the virtual scene within the preset range, may include one or more throwing objects, and each throwing object is picked up in a similar way, which is not repeated here.
In one embodiment, the trigger control corresponding to a virtual article may be adaptively hidden or shown according to the area of the interactive page the user is operating. For example, when the trigger point of the user's trigger operation on the interactive page is located in the central area of the page, the trigger control corresponding to the virtual article may be hidden, so as to reduce the occlusion of the virtual scene by the control and enlarge the visible range of the virtual scene. The trigger operation may specifically be a touch operation, a cursor operation, or a key operation; the trigger point refers to the position on the interactive page pointed to by the touch operation, the position of the cursor, the position mapped to a key, and so on. When the trigger point is located in the edge area of the interactive page, the trigger control corresponding to the virtual article can be shown for the user to operate.
In one embodiment, the target animation includes an action animation for each interactive action the virtual object performs on the projectile pointed to by the interaction instruction. Different interactive actions have corresponding execution timings and, correspondingly, each action animation has a corresponding presentation timing. Based on a single interaction instruction, the terminal can pull from the server, in one request, the action animations of all the interactive actions the virtual object needs to perform on the corresponding throwing object to complete the throwing operation, thereby reducing the interaction frequency with the server.
In one embodiment, during the throwing operation of the virtual object, the user may at any time trigger an instruction to stop throwing the current throwing object, to throw another throwing object, or the like. In this case, the terminal may instead pull from the server, based on the interaction instruction, only the animation of the first interactive action in the sequence, and pull the animation of the next display time sequence when the display of the current animation is detected to be completed or about to be completed, so as to improve the utilization of animation resources and further save data transmission resources between the terminal and the server.
And S204, displaying the target animation on the interactive page based on the visual angle direction of the virtual object.
Here, the visual angle direction of the virtual object refers to the position and orientation of the virtual object in the virtual scene. As shown in fig. 4a, the terminal may display the virtual object in the first-person perspective, where the displayed virtual scene only includes the hand or arm of the virtual object, or the virtual article held in its hand, so as to simulate the effect of observing the virtual scene along the virtual object's visual angle direction. As shown in fig. 5a, the terminal may also display the virtual object 506 using a third-person perspective. The third-person perspective faces the same direction as the first-person perspective, but displays the virtual object in the virtual scene with its back to the terminal screen, so that the user can see, in the virtual scene, the actions of the virtual object under the user's control, the environment it is located in, and so on. The interactive page provides a trigger control 502 capable of triggering the virtual object 506 to interact with the thrower; when a trigger operation on the trigger control 502 occurs, the terminal generates a corresponding interaction instruction and, based on it, pulls from the server the target animation of the interaction between the virtual object and the throwing object corresponding to the trigger control 502.
At any one moment, the terminal can only display, on the interactive page, a local scene within the whole virtual scene, namely the part of the virtual scene the virtual object can currently observe. As shown in fig. 5b, when the visual angle direction of the virtual object changes, that is, when the position or orientation of the virtual object in the virtual scene changes, the interactive page switches from one local scene to another, simulating the effect of the virtual object turning.
The terminal creates a virtual camera in the virtual scene to simulate the visual angle direction of the virtual object. The virtual camera is a three-dimensional model positioned around the virtual object in the virtual scene. When the virtual object is presented in the first-person perspective, the virtual camera is located near or at the head of the virtual object. When the virtual object is displayed in the third-person perspective, the virtual camera is positioned behind the virtual object, and the virtual scene is observed through the view of the virtual camera. The shooting direction of the virtual camera coincides with the visual angle direction of the virtual object; it is the observation direction when observing the virtual scene from the first-person or third-person perspective of the virtual object.
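The camera placement described above can be sketched with simple vector arithmetic. A hypothetical illustration, not the patent's implementation; the function name and the fixed `back_offset` distance are assumptions:

```python
# Hypothetical sketch: placing the virtual camera so that its shooting
# direction coincides with the virtual object's viewing direction.
def camera_position(object_pos, view_dir, first_person, back_offset=2.0):
    if first_person:
        return object_pos  # at (or near) the head of the virtual object
    # third person: behind the object, opposite to the viewing direction
    return tuple(p - back_offset * d for p, d in zip(object_pos, view_dir))
```

With `view_dir` a unit vector, the third-person camera sits `back_offset` units behind the object while looking the same way the object looks.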
Specifically, the terminal determines the current shooting direction of the virtual camera and displays the target animation based on it; that is, the visual angle direction of the virtual object in the target animation is controlled to match the current shooting direction of the virtual camera, so that the virtual object faces the same visual angle direction at the moment before and the moment after the interaction instruction is triggered, giving a smooth action transition.
Further, the terminal controls the automatic switching between the action animations of different display time sequences based on a timer. The terminal displays the action animation of the first display time sequence on the interactive page, starts a timer, determines the display duration of that animation, and configures the timing parameter of the timer according to the display duration. The timer starts timing from the beginning of the display of the first action animation; when the target time corresponding to the timing parameter is reached, the timer generates a timing instruction that triggers the display of the action animation of the second display time sequence. The terminal displays the second action animation according to the timing instruction, clears the timer, and restarts timing; when the display duration of the second action animation is reached, a timing instruction triggering the action animation of the third display time sequence is generated, and so on, until the action animation of the last display time sequence has been displayed.
For example, when the throwing object indicated by the interactive instruction is a grenade, the target animation comprises an equipping action animation, a pull-line action animation, an aiming action animation and a throwing action animation, whose display durations are 0.5 seconds, 0.5 seconds, 1 second and 0.5 seconds in sequence. The terminal first displays the equipping action animation; 0.5 seconds after it begins, the pull-line action animation is triggered automatically; 0.5 seconds after the pull-line action animation begins, the aiming action animation is triggered automatically; and 1 second after the aiming action animation begins, the throwing action animation is triggered automatically.
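The timer-driven switching above can be sketched as follows, using the grenade durations from the example; the clip names and the `AnimationSequencer` class are hypothetical stand-ins for a real engine's animation scheduler, which would call `tick` from its frame loop.

```python
class AnimationSequencer:
    """Plays a list of (name, duration) clips back to back on one timer."""

    def __init__(self, clips):
        self.clips = clips        # e.g. [("equip", 0.5), ("pull_line", 0.5), ...]
        self.index = 0
        self.elapsed = 0.0
        self.log = []             # records each clip as it starts

    def start(self):
        self.log.append(self.clips[0][0])

    def tick(self, dt):
        """Advance the timer; switch clips when the current duration elapses."""
        if self.index >= len(self.clips):
            return                         # last clip already finished
        self.elapsed += dt
        _, duration = self.clips[self.index]
        if self.elapsed >= duration:
            self.elapsed -= duration       # reset the timer for the next clip
            self.index += 1
            if self.index < len(self.clips):
                self.log.append(self.clips[self.index][0])

grenade = AnimationSequencer(
    [("equip", 0.5), ("pull_line", 0.5), ("aim", 1.0), ("throw", 0.5)])
grenade.start()
for _ in range(50):                        # simulate 50 frames of 60 ms each
    grenade.tick(0.06)
```

After the simulated 3 seconds, `grenade.log` holds the four clips in display order, with each switch triggered by the timer alone rather than by a new interactive instruction.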
In one embodiment, during display of the target animation, the terminal shields other control operations on the virtual object triggered by the user. In other words, while the virtual object is performing interactive actions on the throwing object, if the user triggers another control operation on the virtual object, the terminal does not respond to it. Such control operations may be an operation triggering the virtual object to perform some single action, an operation terminating the current throwing operation, or an operation switching the throwing object with which the virtual object interacts.
In one embodiment, the terminal responds to other control operations on the virtual object triggered by the user during display of the target animation. If an operation triggering the virtual object to perform a certain single action occurs while the target animation is displayed, the terminal acquires the animation of the virtual object performing that single action (denoted the control animation), fuses the control animation with the currently displayed target animation, and displays the fused animation on the interactive page. For example, the virtual object includes an upper-body region and a lower-body region; the terminal splices the upper-body region of the virtual object in each image frame of the target animation with the lower-body region of the virtual object in the corresponding image frame of the control animation, and displays each spliced image frame. If an operation terminating the current throwing operation occurs while the target animation is displayed, the terminal stops displaying the target animation and resumes displaying the image frame corresponding to the virtual object at the moment before the interactive instruction was triggered. If an operation switching the throwing object with which the virtual object interacts occurs while the target animation is displayed, the terminal stops displaying the current target animation, obtains the target animation of the virtual object interacting with the switched throwing object in the manner described above, and displays the newly obtained target animation on the interactive page.
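The upper-body/lower-body frame splicing described above can be sketched as follows, with each image frame modeled as a dict of named region poses; the region and pose names are hypothetical.

```python
def fuse_frames(target_frames, control_frames):
    """Combine the upper body of the target animation with the lower body of
    the control animation, frame by frame, as in the splicing example."""
    fused = []
    for target, control in zip(target_frames, control_frames):
        fused.append({"upper_body": target["upper_body"],
                      "lower_body": control["lower_body"]})
    return fused

# e.g. the throwing animation keeps driving the arms while a crouch
# animation (the control animation) drives the legs:
throw = [{"upper_body": "aim_pose", "lower_body": "stand_pose"}]
crouch = [{"upper_body": "idle_pose", "lower_body": "crouch_pose"}]
fused = fuse_frames(throw, crouch)
# fused[0] == {"upper_body": "aim_pose", "lower_body": "crouch_pose"}
```

In a real engine this per-region blending is usually done with animation layers and body masks rather than literal image-frame stitching; the dict form only illustrates the split.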
S206, when the virtual object displayed in the target animation performs the aiming action on the throwing object, determining a target throwing position in the virtual scene displayed on the interactive page.
When the virtual object performs a throwing operation on the throwing object, the throwing object moves from an initial position to a final position in the virtual scene. The target throwing position is the final position to which the throwing object moves.
Specifically, when the animation of the aiming action is displayed, the terminal tracks the position of the throwing object in the animation, determines its current position in the virtual scene, and takes that position as the initial position of the throwing object. The terminal then determines the target throwing position of the throwing object in the virtual scene according to a preset aiming strategy, and adjusts the virtual camera from its current shooting direction to a direction facing the target throwing position, so that the viewing direction of the virtual object is likewise adjusted toward the target throwing position. The preset aiming strategy may be to determine a random position in the virtual scene currently displayed on the interactive page as the target throwing position.
In one embodiment, the preset aiming strategy may also be to determine the position of the virtual target closest to the controlled virtual object as the target throwing position. In a multi-player online interactive application, the controlled virtual object selects one or more virtual targets in the virtual scene as objects of attack interaction; one attack mode is for the controlled virtual object to throw a virtual article at a virtual target. A virtual target is another virtual object in the virtual scene that is an attack target of the controlled virtual object.
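The nearest-target aiming strategy can be sketched as follows; positions are (x, y, z) tuples and the scene data is hypothetical.

```python
import math

def nearest_target(controlled_pos, targets):
    """Return the position of the virtual target closest to the controlled
    virtual object, used as the target throwing position."""
    return min(targets, key=lambda t: math.dist(controlled_pos, t))

player = (0.0, 0.0, 0.0)
enemies = [(10.0, 0.0, 0.0), (3.0, 4.0, 0.0), (6.0, 8.0, 0.0)]
# (3, 4, 0) is 5 units away, the other two are 10 units away
assert nearest_target(player, enemies) == (3.0, 4.0, 0.0)
```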
In one embodiment, the preset aiming strategy may also take a position determined from the concentration of the target throwing positions of throwing objects within a unit time length before the current time (hereinafter "historical throwing positions") as the target throwing position of the currently controlled throwing object. The unit time length may be a predetermined length of time, such as 1 minute. Specifically, the position where the historical throwing positions are most concentrated may be used as the target throwing position of the currently controlled throwing object, or the position in the virtual scene closest to all the historical throwing positions may be used as the target throwing position.
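The history-based strategy can be sketched under a simplifying assumption: "most concentrated" is taken to mean the historical position with the smallest total distance to all the others (a medoid). The sample positions are hypothetical.

```python
import math

def densest_history_position(history):
    """Return the historical throwing position closest to all the others,
    a simple proxy for the most concentrated area of recent throws."""
    return min(history,
               key=lambda p: sum(math.dist(p, q) for q in history))

# three throws clustered near the origin and one outlier far away:
history = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.5), (10.0, 10.0)]
target = densest_history_position(history)
# target == (0.5, 0.5), the point inside the cluster
```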
In one embodiment, the terminal may take the line segment between the initial position and the target throwing position as the motion trajectory of the throwing object, such as trajectory 410 directed to target throwing position 404 in fig. 4b above, or trajectory 508 directed to target throwing position 504 in fig. 5b above. The terminal displays the motion trajectory in the virtual scene, so that the user knows the trajectory of the throwing object before the throwing operation is performed and can decide whether the throwing direction needs to be adjusted.
In one embodiment, when an obstacle exists on the throwing trajectory formed by the initial position and the target throwing position determined in the above manner, the position of the obstacle is determined as the final target throwing position. The virtual scene includes a plurality of scene elements, such as mountains, trees, stones and buildings. An obstacle is a scene element whose spatial position intersects the throwing trajectory of the currently controlled throwing object. For example, if the position C of a stone scene element is located on the line segment AB formed by the initial position A and the target throwing position B determined in the above manner, position C is determined as the final target throwing position, so that the final throwing trajectory of the currently controlled throwing object is AC.
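The obstacle check can be sketched as follows, using a tolerance-based point-on-segment test; the scene elements and coordinates are hypothetical.

```python
import math

def on_segment(a, b, c, eps=1e-6):
    """True when point c lies on segment ab: the detour through c adds
    (almost) no length compared with going from a straight to b."""
    return abs(math.dist(a, c) + math.dist(c, b) - math.dist(a, b)) < eps

def final_throw_position(a, b, obstacles):
    """Return the nearest obstacle on segment AB, or B when the path is clear,
    as in the example where the stone at C shortens the trajectory to AC."""
    for c in sorted(obstacles, key=lambda c: math.dist(a, c)):
        if on_segment(a, b, c):
            return c
    return b

a, b = (0.0, 0.0, 0.0), (10.0, 0.0, 0.0)
stone = (4.0, 0.0, 0.0)       # on the trajectory
tree = (5.0, 3.0, 0.0)        # off the trajectory
assert final_throw_position(a, b, [tree, stone]) == stone
assert final_throw_position(a, b, [tree]) == b
```

A production engine would instead raycast against collision geometry; the distance test only illustrates the trajectory-intersection rule.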
S208, continuing to display the target animation based on the viewing direction of the virtual object corresponding to the target throwing position.
The shooting direction of the virtual camera is consistent with the viewing direction of the virtual object: when either changes, the other follows. The user may adjust the shooting direction of the camera through a control operation that rotates the virtual object (i.e. adjusts its viewing direction). If the control operation is a sliding operation, the terminal detects it and determines the rotation direction, rotation angle and rotation speed of the virtual camera from the sliding direction, sliding distance and sliding speed of the sliding operation. For example, the sliding direction may correspond to the rotation direction of the virtual camera, the sliding distance may be positively correlated with the rotation angle of the virtual camera, and the sliding speed may be positively correlated with the rotation speed of the virtual camera.
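The slide-to-rotation mapping can be sketched as follows; the scale constants are hypothetical tuning parameters, not values from the embodiment.

```python
ANGLE_PER_PIXEL = 0.25   # degrees of camera rotation per pixel of slide distance
SPEED_SCALE = 0.5        # rotation speed per unit of slide speed

def slide_to_rotation(dx, dy, slide_speed):
    """Convert a slide gesture (dx, dy in pixels) to camera rotation deltas:
    direction from the slide direction, angle from the distance, speed from
    the slide speed, each positively correlated as described above."""
    yaw = dx * ANGLE_PER_PIXEL          # horizontal slide turns the camera
    pitch = -dy * ANGLE_PER_PIXEL       # vertical slide tilts it
    speed = slide_speed * SPEED_SCALE
    return yaw, pitch, speed

yaw, pitch, speed = slide_to_rotation(dx=120, dy=-40, slide_speed=300)
# yaw == 30.0, pitch == 10.0, speed == 150.0
```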
The terminal may also change the viewing direction of the virtual object by automatically adjusting the shooting direction of the virtual camera. The shooting direction of the virtual camera may be expressed as a vector, so adjusting it is essentially adjusting the vector coordinates of the virtual camera in a three-dimensional coordinate system based on the virtual scene. The virtual scene includes a ground surface. The three-dimensional coordinate system based on the virtual scene may take the position of the virtual camera as the origin, one direction parallel to the ground as the X-axis direction, the direction parallel to the ground and perpendicular to the X axis as the Y-axis direction, and the direction perpendicular to the ground as the Z-axis direction. The vector of the virtual camera shooting direction may then be (X, Y, Z). The movement of the virtual camera may be a rotation about the X, Y or Z axis, or a translation along the X, Y or Z axis.
Specifically, after determining the target throwing position of the throwing object, the terminal adjusts the virtual camera from its current shooting direction to a direction toward the target throwing position. The terminal then continues displaying the animation of the aiming action based on the adjusted shooting direction of the virtual camera, so that the virtual object in the animation continues performing the aiming action toward the target throwing position. When the animation of the throwing motion is then displayed, the virtual object performs the throwing motion in the adjusted viewing direction, throwing the throwing object to the target throwing position in the virtual scene and completing one throwing operation.
In one embodiment, when the interactive page displays the animation of the virtual object performing the aiming action on the throwing object, the terminal may monitor the user's control operations on the virtual object and adjust the shooting direction of the virtual camera accordingly. The target throwing position of the throwing object can then be determined from the adjusted shooting direction of the virtual camera, with the adjusted shooting direction taken as the throwing direction of the throwing object.
According to the above method for controlling interaction between a virtual object and a throwing object, after the interactive instruction is triggered, the target animation of the virtual object interacting with the corresponding throwing object is obtained automatically. Based on the target animation, the virtual object performs the different interactive actions on the throwing object in sequence: after one interactive action is completed, the next is switched to automatically, without triggering an additional interactive instruction. When the virtual object performs the aiming action on the throwing object, the viewing direction of the virtual object in the target animation is adjusted according to the automatically determined target throwing position, so aiming is completed automatically; continuing to display the animation then controls the throwing object to be thrown to the target throwing position in the virtual scene, again without an additional interactive instruction. The whole throwing operation is thus completed based on a single interactive instruction, which simplifies user operation, reduces the server response frequency, and saves server resources.
In one embodiment, the method for controlling interaction between a virtual object and a throwing object further comprises: acquiring an initial animation of the virtual object interacting with the throwing object, the initial animation comprising a plurality of action animations with different display time sequences; splicing at least two action animations whose display time sequences are adjacent and whose action types meet a splicing condition, to obtain a state animation of one interactive state; and determining each action animation that does not meet the splicing condition as a state animation on its own, thereby obtaining a target animation comprising a plurality of state animations.
The target animation described in the above embodiments is essentially the initial animation of this embodiment. Like that target animation, the initial animation includes an action animation for each interactive action that the virtual object performs on the throwing object in turn to complete one throwing operation, with different action animations having different display time sequences. In this embodiment, the target animation is obtained by splicing all or some of the action animations with adjacent display time sequences in the initial animation. The target animation of this embodiment comprises a plurality of state animations, at least one of which is spliced from at least two action animations. When a state animation consists of a single action animation, that action animation, not spliced with any other, is determined as the state animation of an independent interactive state.
Each action animation has a corresponding action type tag. As above, the action types of the interactive actions the virtual object performs on the throwing object include at least one of equipping, pulling the line, aiming, throwing, and so on, and the action types involved differ between throwing objects. The splicing condition is a predetermined tag combination of action types that can be spliced, such as (pull line, aim) or (pull line, aim, throw). The tag combination may be determined according to the differences in execution logic between interactive actions; for example, the action type tags of several interactive actions whose execution logic differs by less than a threshold may be grouped into one tag combination. Splicing two animations is performed on the basis of image-frame splicing: specifically, a splicing template may be established from a preset reference frame, and subsequent frames obtained by projection fusion using the template parameters. The splicing may be done with video-editing software tools.
Specifically, the terminal obtains the initial animation of the virtual object interacting with the throwing object, determines, according to the splicing condition, the action animations with adjacent display time sequences that can be spliced, and splices them to obtain a state animation of one interactive state. Each action animation that does not meet the splicing condition and cannot be spliced with others is determined as a state animation on its own, yielding the target animation comprising a plurality of state animations. When the interactive instruction is triggered, the terminal controls automatic switching between state animations of different display time sequences based on the timer.
For example, the initial animation of the virtual object interacting with throwing object M includes an equipping action animation A, a holding action animation B, a pull-line action animation C, an aiming action animation D and a throwing action animation E. If the preset action type tag combination is (equip, hold), A and B can be spliced into a state Animation (AB), while C, D and E, which do not meet the splicing condition, are each used directly as state animations (C), (D) and (E), giving a target animation comprising (AB), (C), (D) and (E). If the preset tag combination is (pull line, aim, throw), C, D and E can be spliced into a state animation (CDE), while A and B are each used directly as state animations (A) and (B), giving a target animation comprising (A), (B) and (CDE). If the preset tag combination is only (pull line, aim), C and D can be spliced into a state animation (CD), and A, B and E, which do not meet the splicing condition, are each determined as state animations (A), (B) and (E), giving a target animation comprising (A), (B), (CD) and (E).
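The tag-combination splicing of this example can be sketched as follows, with each action animation represented as a (action-type tag, name) pair for brevity.

```python
def build_state_animations(actions, tag_combo):
    """Merge adjacent actions whose tags match tag_combo into one state
    animation; every other action becomes a single-clip state animation."""
    tags = [tag for tag, _ in actions]
    states, i, n = [], 0, len(tag_combo)
    while i < len(actions):
        if tuple(tags[i:i + n]) == tuple(tag_combo):
            states.append([name for _, name in actions[i:i + n]])
            i += n
        else:
            states.append([actions[i][1]])
            i += 1
    return states

actions = [("equip", "A"), ("hold", "B"),
           ("pull_line", "C"), ("aim", "D"), ("throw", "E")]
assert build_state_animations(actions, ("pull_line", "aim", "throw")) == \
    [["A"], ["B"], ["C", "D", "E"]]          # target animation (A)(B)(CDE)
assert build_state_animations(actions, ("pull_line", "aim")) == \
    [["A"], ["B"], ["C", "D"], ["E"]]        # target animation (A)(B)(CD)(E)
```

This only models the grouping logic; the actual frame-level fusion of the spliced clips happens in the animation tooling.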
In one embodiment, the step of converting the initial animation into the target animation, i.e. splicing the action animations into state animations, may be performed at the terminal, which sends the converted target animation to the server for storage. The step may also be performed at the server, or at another computer device, which sends the converted target animation to the server for storage.
In this embodiment, the state animations are spliced in advance, so when the interactive instruction is triggered it is responded to directly based on the state animations, which are ready for immediate use; this improves the response efficiency of the interactive instruction. Because several action animations are spliced together in advance, the processing in which the terminal controls switching between different action animations through the timer is reduced, which further improves the response efficiency of the interactive instruction.
In one embodiment, displaying the target animation based on the viewing direction of the virtual object on the interactive page comprises: determining the waiting duration of each state animation according to the display duration and display time sequence of the state animations; starting timing after the interactive instruction is triggered, and determining the current viewing direction of the virtual object; and when the waiting duration corresponding to a state animation is reached, triggering display of that state animation on the interactive page based on the current viewing direction.
The display time sequence of each state animation in the converted target animation is consistent with that of the corresponding action animations in the initial animation. The display duration of a state animation may be determined by summing the display durations of its one or more action animations; for example, the display duration of the state animation (CDE) above is the sum of the display durations of the three action animations C, D and E.
Specifically, the terminal controls automatic switching between state animations of different display time sequences based on a timer. Referring to fig. 6a, which shows a schematic flow diagram of switching between different interaction states of a virtual object and a throwing object in one embodiment: when the throwing object is a grenade, the corresponding target animation includes four state animations (A), (B), (CD) and (E), as shown in fig. 6a. The terminal displays the state animation (A) of the first display time sequence on the interactive page, starts a timer, determines the display duration of state animation (A), and configures the timing parameter of the timer accordingly. The timer starts counting when state animation (A) begins to be displayed; when the target time corresponding to the timing parameter is reached, the timer generates a timing instruction triggering display of state animation (B) of the second display time sequence. The terminal displays state animation (B) according to the timing instruction, clears the timer and starts a new round of timing; when the display duration of state animation (B) is reached, a timing instruction triggering state animation (CD) of the third display time sequence is generated, and so on, until state animation (E) of the last display time sequence has been displayed.
Referring to fig. 6b, which shows a schematic flow diagram of switching between interaction states of a virtual object and a throwing object in another embodiment: as shown in fig. 6b, the target animation includes three state animations (A), (B) and (CDE). The terminal displays state animation (A), state animation (B) and state animation (CDE) in sequence on the interactive page in the manner described above.
In this embodiment, responding to the interactive instruction based on the converted target animation reduces the number of animation switches the terminal must control, lowering its data processing load and improving the response efficiency of the interactive instruction; it also enhances the continuity between the different interactive actions performed by the virtual object, so instruction execution is smoother.
In one embodiment, the target animation includes a ready-state animation and a quick-throw state animation. As shown in fig. 7, step S206 of determining the target throwing position in the virtual scene displayed on the interactive page, when the virtual object displayed in the target animation performs the aiming action on the throwing object, includes:
S702, when display of the ready-state animation ends, judging whether the interactive instruction still persists.
S704, when the interactive instruction no longer persists, displaying the quick-throw state animation.
S706, when the virtual object displayed in the quick-throw state animation performs the aiming action on the throwing object, determining the target throwing position in the virtual scene displayed on the interactive page.
Step S208 of continuing to display the target animation based on the viewing direction of the virtual object corresponding to the target throwing position includes:
S708, continuing to display the quick-throw state animation based on the viewing direction of the virtual object corresponding to the target throwing position.
The ready-state animation and the quick-throw state animation are two state animations with adjacent display time sequences in the target animation, the ready-state animation being displayed first. The ready-state animation does not include content of the virtual object performing the aiming action on the throwing object; the quick-throw state animation does.
For example, in the above example, when the target animation of the virtual object interacting with throwing object M includes the state animations (A), (B) and (CDE), the state animations (A) and (B) may be ready-state animations and the state animation (CDE) a quick-throw state animation. When the target animation includes the state animations (AB), (CD) and (E), the state animation (AB) may be a ready-state animation and the state animations (CD) and (E) quick-throw state animations.
The interactive instruction has a corresponding instruction state, which includes a persistent state and a terminated state, and the user may switch the interactive instruction from the persistent state to the terminated state. How the instruction state is switched may differ with the way the interactive instruction is triggered. For example, when the interactive instruction is triggered through a trigger control for the throwing object provided on the interactive page, the persistent state can be controlled through the state of the trigger control: while the trigger control is held pressed, the corresponding interactive instruction is in the persistent state; when the trigger control is released, it is in the terminated state. When the interactive instruction is triggered by voice, the state can be adjusted through different voice commands: for example, when the user issues a "take the grenade" voice command, the corresponding interactive instruction is in the persistent state; when the user issues a "throw the grenade" voice command, it is in the terminated state.
Specifically, the terminal displays the ready-state animation and then the quick-throw state animation in sequence on the interactive page in the manner described above. Referring to fig. 6c, which shows a schematic flow diagram of switching between interaction states of a virtual object and a throwing object in yet another embodiment: as shown in fig. 6c, the target animation includes ready-state animations (A) and (B), a pull-line state animation (C), an aiming state animation (D), a throwing state animation (E), and a quick-throw state animation (CDE). When the ready-state animations (A) and (B) have been displayed, the terminal judges whether the interactive instruction persists. If the interactive instruction no longer persists, i.e. has been terminated, the terminal continues with the quick-throw state animation (CDE); when the virtual object displayed in the quick-throw state animation (CDE) performs the aiming action on the throwing object, the target throwing position of the throwing object is determined automatically in the virtual scene in the manner described above. The quick-throw state animation (CDE) includes the throwing action animation E, so by continuing to display it based on the viewing direction of the virtual object corresponding to the target throwing position, the virtual object is controlled to throw the throwing object to the target throwing position.
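The branch on the instruction state can be sketched as follows. The terminated-state branch (going straight into the quick-throw state animation) follows the text; the persistent-state branch, playing (C), (D) and (E) as separate states, is an assumption inferred from the structure of figure 6c.

```python
def plan_states(instruction_persists_after_ready):
    """Return the state animations to display, in order, after the two
    ready-state animations (A) and (B) have finished."""
    states = ["(A)", "(B)"]                  # ready-state animations
    if instruction_persists_after_ready:
        # assumed branch: the instruction is still held, so pull-line,
        # aiming and throwing play as separate states
        states += ["(C)", "(D)", "(E)"]
    else:
        # instruction terminated during ready: aim automatically and
        # throw at once via the fused quick-throw state animation
        states += ["(CDE)"]
    return states

assert plan_states(False) == ["(A)", "(B)", "(CDE)"]
assert plan_states(True) == ["(A)", "(B)", "(C)", "(D)", "(E)"]
```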
In this embodiment, the user's demand for throwing speed is judged from the persistent state of the interactive instruction. If the interactive instruction no longer persists while the ready-state animation is displayed, the user expects the throwing object to be thrown quickly; the terminal then completes aiming automatically, which simplifies the user's operations, improves the response efficiency of the interaction, and meets the user's urgent throwing need.
In one embodiment, determining the target throwing position in the virtual scene displayed on the interactive page comprises: determining the position in the virtual scene corresponding to a target coordinate point in the interactive page as the target throwing position.
The target coordinate point is a coordinate point in a two-dimensional coordinate system based on the interactive page. That coordinate system may be a rectangular coordinate system whose origin is one vertex of the rectangular interactive page and whose coordinate axes are sides containing the origin. The target coordinate point may be a point preset in the interactive page; as shown in fig. 4b or fig. 5b above, it may be the center point of the interactive page, though it may also be any other predetermined position in the interactive page, which is not limited here. The target coordinate point may also be determined dynamically: for example, the position with the highest virtual-target density in the currently displayed interactive page may be determined as the target coordinate point.
Specifically, the terminal determines the position coordinates of the target coordinate point in the two-dimensional coordinate system based on the interactive page, converts the point into the three-dimensional coordinate system based on the virtual camera to obtain its position coordinates there, and determines the target throwing position of the throwing object in the virtual scene from those coordinates.
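The 2D-to-3D conversion can be sketched under a simplifying assumption: the target coordinate point is the page center, so it unprojects to the camera's forward ray, and the target throwing position is taken as the point where that ray meets the ground plane z = 0. The camera model (position plus forward vector) is a hypothetical simplification of a full view-projection unproject.

```python
def throw_position_from_screen_center(cam_pos, forward):
    """Intersect the camera's center ray with the ground plane z = 0 and
    return the hit point as the target throwing position."""
    cx, cy, cz = cam_pos
    fx, fy, fz = forward
    if fz >= 0:
        return None                  # ray points up or level: never lands
    t = -cz / fz                     # solve cz + t * fz == 0
    return (cx + t * fx, cy + t * fy, 0.0)

# camera 2 units above the ground, looking forward and 45 degrees down:
pos = throw_position_from_screen_center((0.0, 0.0, 2.0), (1.0, 0.0, -1.0))
assert pos == (2.0, 0.0, 0.0)
```

An off-center target coordinate point would first be turned into a ray via the camera's projection matrix (e.g. a screen-point-to-ray utility in the engine); only the ground intersection step is shown here.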
In this embodiment, the target coordinate point is determined on the interactive page. Compared with directly calculating the target throwing position in the three-dimensional virtual scene, determining a position coordinate in two-dimensional space involves simpler logic, so converting the target coordinate point into the target throwing position in the virtual scene improves the efficiency of determining that position.
In one embodiment, determining the target throwing position in the virtual scene displayed on the interactive page comprises: when a virtual target exists within the throwing distance range corresponding to the throwing object in the virtual scene, acquiring preset priority data; selecting a virtual target within the throwing distance range according to the priority data; and determining the position of the selected virtual target in the virtual scene as the target throwing position of the throwing object.
Different throwing objects have different throwing distance ranges. The throwing distance range may specifically be a circular region in the virtual scene centered on the position of the virtual object, with the target throwing position as a boundary point; its radius is the length of the projection, on the ground of the virtual scene, of the straight line segment connecting the initial position of the throwing object and the target throwing position.
The preset priority data is the condition, set in advance according to actual needs, for preferentially selecting a virtual target. A setting interface may be provided so that the user can set this condition; for example, in a game scene, distance priority or target attribute value priority may be set through the provided setting interface. The priority data may specify selecting virtual targets from near to far by distance, or from low to high by attribute value, and so on.
In one embodiment, the terminal acquires the priority data the user enters through the setting interface and may store it locally in association with the identifier of the controlled virtual object. The terminal may also upload the identifier of the controlled virtual object and the corresponding priority data to the server for storage, so that when the priority data cannot be acquired locally, the priority data corresponding to the identifier of the controlled virtual object can be acquired from the server.
Specifically, when the target animation shows the virtual object performing the aiming action on the throwing object, the terminal acquires the position data of each virtual target and of the controlled virtual object in the virtual scene, and calculates the distance between each virtual target and the controlled virtual object according to the position data. If the distance between a virtual target and the controlled virtual object is less than or equal to the throwing distance of the throwing object, the virtual target is within the throwing distance range corresponding to the throwing object; otherwise, the virtual target is outside the throwing distance range corresponding to the throwing object.
For example, if the throwing distance corresponding to the grenade is 5 meters and the throwing distance corresponding to the combustion bottle is 7 meters, then when the obtained throwing object is the grenade, the terminal obtains the position data of the controlled virtual object and of each virtual target in the scene, calculates the distance between each virtual target and the controlled virtual object according to the position data, and detects whether the virtual scene contains a virtual target whose distance from the controlled virtual object is within 5 meters. When the obtained throwing object is the combustion bottle, the terminal detects whether the virtual scene contains a virtual target within 7 meters of the controlled virtual object.
Further, if the preset priority data specifies selecting virtual targets from near to far by distance, the closest virtual target is selected first; in one embodiment, if more than two virtual targets are equally closest, one is selected at random from among them, or the one with the lowest attribute value is selected. If the preset priority data specifies selecting virtual targets from low to high by attribute value, the virtual target with the lowest attribute value is selected first. In one embodiment, if more than two virtual targets share the lowest attribute value, one is selected at random from among them, or the closest one is selected.
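The priority selection with tie-breaking described above can be sketched as follows (Python, with hypothetical dict keys `'distance'` and `'hp'` standing in for the distance and attribute value; the deterministic tie-break is chosen here instead of the random option):

```python
def select_virtual_target(targets, priority):
    """Select one target from those already filtered into the throwing range.

    `targets` is a list of dicts with keys 'distance' and 'hp' (the
    attribute value); `priority` is 'distance' or 'attribute'. Ties on the
    primary key fall back to the secondary key, mirroring the tie-breaking
    rules described above.
    """
    if not targets:
        return None  # no target in range: fall back to another strategy
    if priority == 'distance':
        key = lambda t: (t['distance'], t['hp'])   # nearest first; lowest hp breaks ties
    else:
        key = lambda t: (t['hp'], t['distance'])   # lowest attribute first; nearest breaks ties
    return min(targets, key=key)
```

Returning `None` when no target is in range corresponds to handing control to one of the fallback position strategies described in the next paragraph.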
In one embodiment, when no virtual target exists within the throwing distance range corresponding to the obtained throwing object, a position in the virtual scene is determined as the target throwing position of the throwing object according to another preset way of determining the target throwing position. For example, the position corresponding to the current viewing angle direction of the controlled virtual object is determined as the target throwing position, or a default position in the virtual scene is determined as the target throwing position.
In a traditional mode, when a controlled virtual object selects one or more virtual targets in a virtual scene as objects for attack interaction, the controlled virtual object, after acquiring a control instruction, throws the throwing object by default at the virtual target closest to it. If the selected virtual target is not the one the controlled virtual object intends to interact with, the position of the controlled virtual object must be adjusted through additional control instructions, which generates much unnecessary data processing and wastes resources.
In this embodiment, a virtual target is selected within the throwing distance range according to the priority data. Compared with the traditional method of defaulting to the virtual target closest to the controlled virtual object, a more reasonable virtual target can be selected within the throwing distance range according to the preset priority data for interaction, thereby reducing resource waste.
In one embodiment, the target animation includes a ready-state animation, an aiming action animation, and a throwing action animation. As shown in fig. 8, in step S206, when the virtual object in the target animation performs the aiming action on the throwing object, determining the target throwing position in the virtual scene displayed on the interactive page includes:
S802, when the display of the ready-state animation is finished, judging whether the interactive instruction still persists.
S804, when the interactive instruction still persists, displaying the aiming action animation.
S806, monitoring control events on the virtual object during the display of the aiming action animation.
S808, determining the target throwing position according to the viewing angle direction of the virtual object in the control event, and returning to the step of judging whether the interactive instruction still persists.
In step S208, continuing to display the target animation based on the viewing angle direction of the virtual object corresponding to the target throwing position includes:
S810, when the interactive instruction no longer persists, displaying the throwing action animation based on the viewing angle direction of the virtual object corresponding to the target throwing position.
As above, a single action animation may also serve as a state animation, so the aiming action animation in the target animation may be referred to as the aiming state animation, and the throwing action animation as the throwing state animation. It can be understood that the aiming action animation in the target animation adopted in this embodiment is necessarily an independent action animation, not spliced with other action animations. The ready-state animation comprises the state animations whose presentation timing in the target animation is adjacent to, and precedes, that of the aiming action animation. The ready-state animation may be one or more separate action animations, or a splice of multiple action animations.
For example, in the above example, when the target animation of the virtual object interacting with the throwing object M includes the state animations (A), (B), (C), and (D), then (A) and (B) may be ready-state animations, (C) the aiming state animation, and (D) the throwing state animation. When the target animation of the virtual object interacting with the throwing object M includes the state animations (AB), (C), and (D), (AB) may be the ready-state animation.
A control event refers to an act of control operation on the virtual object triggered by the user on the interactive page. As above, during the presentation of the target animation, that is, while the virtual object interacts with the throwing object, the user may trigger other control operations on the virtual object. A control operation may specifically be an operation triggering the virtual object to perform a single action such as running, jumping, or turning, an operation terminating the current throwing operation, or an operation switching the throwing object with which the virtual object interacts. Operations that trigger the virtual object to perform a single action such as running, jumping, or turning may change the viewing angle direction of the virtual object in the virtual scene, and belong to direction adjustment events.
Specifically, the terminal displays the ready-state animation, the aiming state animation, and the throwing state animation in turn on the interactive page in the above manner. As shown in fig. 6C, when the ready-state animations (A), (B), and (C) have been displayed, the terminal judges whether the interactive instruction still persists. If it does, the terminal continues to display the aiming action animation (D). While the aiming action animation is being displayed, the user may perform a direction adjustment operation whenever the throwing direction of the throwing object needs adjusting. During this display, the terminal continuously monitors whether the interactive instruction still persists and whether a control event on the virtual object occurs in the virtual scene. When a direction adjustment event is monitored, the terminal adjusts the shooting direction of the virtual camera according to the direction adjustment event, and the adjusted shooting direction of the virtual camera is determined as the throwing direction of the throwing object.
In one embodiment, the step of determining the target throwing position according to the viewing angle direction of the virtual object in the control event comprises: the terminal displays a viewing angle adjustment control on the interactive page, and the user can perform a direction adjustment operation through this control to rotate the shooting direction of the virtual camera clockwise or counterclockwise, thereby adjusting the throwing direction of the throwing object. The direction adjustment operation may be a pressing operation: the user presses the viewing angle adjustment control of the terminal, and the terminal determines the adjustment angle of the shooting direction based on the specific position of the press within the control, the pressure of the press, and its duration.
In one embodiment, the step of determining the target throwing position according to the viewing angle direction of the virtual object in the control event comprises: the direction adjustment operation may also be a rotation of the terminal; when the terminal detects the rotation operation, it determines the adjustment angle of the shooting direction of the virtual camera according to the rotation direction, rotation angle, and rotation speed of the operation. Alternatively, the direction adjustment operation may be a sliding operation performed while the user presses the trigger control corresponding to the throwing object; when the terminal detects the sliding operation, it determines the adjustment angle of the shooting direction of the virtual camera according to the sliding direction and sliding distance of the operation. It can be understood that the direction adjustment operation may also be a key operation, a toggle operation on a joystick device, or other operations.
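The mapping from a sliding operation to a camera adjustment angle can be sketched as follows (Python; the linear mapping and the `degrees_per_pixel` sensitivity constant are assumptions for illustration, not values from the embodiment):

```python
def slide_to_yaw_angle(start, end, degrees_per_pixel=0.5):
    """Map a sliding operation to a yaw adjustment of the virtual camera.

    `start` and `end` are (x, y) screen coordinates of the slide. Here only
    the horizontal component contributes to yaw; a positive result rotates
    the shooting direction clockwise (sign convention assumed).
    """
    dx = end[0] - start[0]          # sliding distance along the screen x-axis
    return dx * degrees_per_pixel   # sliding direction gives the sign
```

A real implementation would typically also map the vertical component to pitch and clamp the result to the camera's allowed rotation range.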
Further, before the interactive instruction terminates, if multiple direction adjustment events occur, the shooting direction of the virtual camera is readjusted multiple times, and during this period, if the aiming action animation finishes displaying, it is shown again in a loop. When termination of the interactive instruction is monitored, the terminal ends the display of the aiming action animation on the interactive page and displays the throwing action animation based on the last adjusted shooting direction of the virtual camera, so that the virtual object is controlled to throw the throwing object to the target throwing position set by the user.
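The flow of steps S802 to S810, including the looping of the aiming action animation while the instruction persists, can be sketched as follows (Python, with hypothetical callables standing in for the terminal's internals):

```python
def run_throw_states(instruction_active, latest_view_direction, show):
    """Drive the ready -> aiming -> throwing state animations.

    `instruction_active()` reports whether the interactive instruction still
    persists; `latest_view_direction()` returns the viewing angle direction
    from the most recent control event; `show(name, direction)` renders one
    state animation. All three are hypothetical stand-ins.
    """
    show('ready', latest_view_direction())      # display the ready-state animation
    while instruction_active():                 # S802/S804: instruction still persists
        # S806/S808: loop the aiming animation, re-reading the view
        # direction adjusted by any monitored direction adjustment event
        show('aiming', latest_view_direction())
    # S810: instruction ended -> throw toward the last determined position
    show('throwing', latest_view_direction())
```

Each pass through the loop corresponds to one repetition of the aiming action animation described above.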
In this embodiment, the user's requirement on throwing efficiency can be judged from the persistence of the interactive instruction. When the interactive instruction still persists after the ready-state animation has been displayed, this indicates that the user's requirement is not for throwing speed but for throwing accuracy; at this time, the terminal supports the user in controlling the virtual object's assisted aiming, meeting the user's need for accurate throwing.
In one embodiment, the method for controlling the virtual object to interact with the throwing object further comprises: determining the interaction mode corresponding to the interactive instruction, the interaction modes including a fast throwing mode. Determining the target throwing position in the virtual scene presented on the interactive page comprises: in the fast throwing mode, determining the target throwing position based on the current position of the virtual object in the virtual scene.
The embodiments of the application support the virtual object interacting with the throwing object in different interaction modes, based on the user's differing requirements on the throwing speed, throwing accuracy, and the like of the virtual object throwing the object. Different interaction modes meet different throwing needs of the user. In an embodiment of the application, the interactive application provides at least one of a fast throwing mode, an accurate throwing mode, and an adaptive throwing mode.
In the fast throwing mode, the user completes the whole throwing operation by triggering only one interactive instruction. In the accurate throwing mode, after the user triggers the interactive instruction, the user is supported in controlling the aiming action of the virtual object, ensuring that the throwing object is thrown to the position the user intends. In the adaptive throwing mode, the user selects between the fast throwing mode and the accurate throwing mode by controlling how long the interactive instruction persists. For example, if the interactive instruction still persists a preset duration after it is triggered, the subsequent throwing operation continues in the accurate throwing mode; if the interactive instruction has terminated by then, the subsequent throwing operation continues in the fast throwing mode. The preset duration may be a length of time less than or equal to the display duration of the ready-state animation.
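The duration-based selection in the adaptive throwing mode can be sketched as follows (Python; function and mode names are illustrative):

```python
def pick_throw_mode(instruction_duration, preset_duration):
    """Adaptive throwing mode: choose fast vs. accurate throwing from how
    long the interactive instruction persisted after being triggered.

    `preset_duration` should not exceed the display duration of the
    ready-state animation, per the constraint stated above.
    """
    if instruction_duration > preset_duration:
        return 'accurate'   # instruction still held -> user wants to aim
    return 'fast'           # instruction released early -> quick throw
```

This check runs at (or before) the end of the ready-state animation, as described in the adaptive-mode embodiment below.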
Specifically, when the interactive instruction is triggered, the terminal acquires interaction mode configuration data. The interaction mode configuration data includes the selected interaction mode, the applicable scope of the interaction mode, and the like. The applicable scope of an interaction mode is the range of throwing objects to which the configured interaction mode applies, such as all throwing objects, one or more specific types of throwing objects, or one or more specific throwing objects. The terminal stores the identifier of the virtual object and the interaction mode configuration data correspondingly on the server.
In one embodiment, the terminal acquires the interaction mode configuration data input by the user through a configuration page. Referring to FIG. 9, FIG. 9 is a page diagram of a configuration page used to obtain interaction mode configuration data in one embodiment. As shown in fig. 9, the configuration page provides a variety of interaction mode options. The user can specify the interaction mode corresponding to each throwing object by selecting the throwing object and then selecting an interaction mode option.
Further, the terminal determines, according to the interaction mode configuration data, the interaction mode applicable to the throwing object pointed to by the current interactive instruction. Referring to fig. 10, fig. 10 shows a schematic flow diagram of a virtual object interacting with a throwing object in the fast throwing mode in one embodiment. As shown in fig. 10, if the interaction mode is the fast throwing mode, the terminal judges whether the throwing object is available according to the configuration data of the throwing object pointed to by the interactive instruction. Taking the selected throwing object as a grenade as an example, when the throwing object is available, the terminal obtains the target animation of the interaction between the virtual object and the throwing object pointed to by the interactive instruction, and displays the target animation on the interactive page based on the viewing angle direction of the virtual object. In the target animation, the virtual object performs, in order, a preparation action, a pin-pulling action, an aiming action, and a throwing action on the throwing object. When the virtual object shown in the target animation performs the aiming action on the throwing object, the terminal automatically determines the target throwing position in the virtual scene displayed on the interactive page, and continues displaying the target animation based on the viewing angle direction of the virtual object corresponding to the target throwing position. See the descriptions of steps S202 to S208 and fig. 6b, which are not repeated here.
In one embodiment, when the target animation comprises a ready-state animation and a quick-throw state animation, the terminal displays the ready-state animation and the quick-throw state animation in sequence on the interactive page. When the ready-state animation finishes displaying, the terminal continues to display the quick-throw state animation, and automatically determines the target throwing position of the throwing object in the virtual scene when the virtual object in the quick-throw state animation performs the aiming action on the throwing object. The terminal continues displaying the quick-throw state animation based on the viewing angle direction of the virtual object corresponding to the target throwing position, so that the virtual object is controlled to throw the throwing object to that position. Responding to the interactive instruction with a state animation spliced from multiple action animations better meets the user's need for quick throwing.
The target throwing position of the throwing object in the virtual scene may be determined automatically in various ways: determining a random position in the virtual scene currently displayed on the interactive page as the target throwing position; determining the position of the virtual target closest to the controlled virtual object as the target throwing position; determining the position where the historical throwing positions of all throwing objects are most concentrated as the target throwing position of the currently controlled throwing object; determining the position in the virtual scene corresponding to a target coordinate point in the interactive page as the target throwing position; determining the position of a virtual target selected within the throwing distance range according to the priority data as the target throwing position; and the like.
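A few of the automatic strategies just listed can be sketched with a simple dispatcher (Python; the `scene` dict and all its keys are hypothetical containers for whatever data each strategy actually needs):

```python
from collections import Counter

def auto_target_position(strategy, scene):
    """Dispatch among automatic target-position strategies.

    `scene` bundles: 'targets' (dicts with 'distance' and 'pos'),
    'history' (past throw positions), 'view_hit_pos' (position hit by the
    current viewing direction), and 'default_pos'. Only some of the listed
    strategies are sketched; the rest follow the same shape.
    """
    if strategy == 'nearest':
        return min(scene['targets'], key=lambda t: t['distance'])['pos']
    if strategy == 'history_hotspot':
        # position where historical throwing positions are most concentrated
        return Counter(scene['history']).most_common(1)[0][0]
    if strategy == 'view_direction':
        return scene['view_hit_pos']
    return scene['default_pos']   # fallback: default position in the scene
```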
In this embodiment, the user is supported in selecting different interaction modes. In the fast throwing mode, the whole throwing process is completed automatically by the state machine after the interactive instruction is triggered only once, and the target throwing position of the throwing object is determined automatically in the aiming stage of the throwing process, which greatly simplifies user operation and improves the response efficiency of the interactive instruction.
In one embodiment, the method for controlling the virtual object to interact with the throwing object further comprises: determining the interaction mode corresponding to the interactive instruction, the interaction modes including an adaptive throwing mode. When the virtual object in the target animation performs the aiming action on the throwing object, the step of determining the target throwing position in the virtual scene displayed on the interactive page comprises: in the adaptive throwing mode, when the virtual object shown in the target animation is in the aiming state with the throwing object, judging whether the interactive instruction still persists; if it persists, determining the target throwing position based on the monitored control events on the virtual object; if it no longer persists, determining the target throwing position based on the current position of the virtual object in the virtual scene.
Specifically, when the interactive instruction is triggered, the terminal determines, according to the interaction mode configuration data, the interaction mode applicable to the throwing object pointed to by the current interactive instruction. Referring to fig. 11, fig. 11 shows a schematic flow diagram of a virtual object interacting with a throwing object in the adaptive throwing mode in one embodiment. As shown in fig. 11, when the interaction mode is the adaptive throwing mode, the terminal judges whether the throwing object is available according to the configuration data of the throwing object pointed to by the interactive instruction. Taking the selected throwing object as a grenade as an example, when the throwing object is available, the terminal obtains the ready-state animation of the interaction between the virtual object and the throwing object pointed to by the interactive instruction. When the display of the ready-state animation is finished, the terminal judges whether the interactive instruction still persists.
Further, if the interactive instruction no longer persists when the ready-state animation finishes displaying, the throwing object is triggered into the fast throwing mode. In the fast throwing mode, the terminal obtains the quick-throw state animation, displays it on the interactive page, and completes the throwing operation based on it; see the description of steps S702 to S708 and fig. 6c, which is not repeated here. If the interactive instruction still persists when the ready-state animation finishes displaying, the throwing object is triggered into the accurate throwing mode. In the accurate throwing mode, the terminal acquires the aiming action animation and the throwing action animation, displays the aiming action animation on the interactive page, adjusts the shooting direction of the virtual camera based on the monitored control events on the virtual object, and displays the throwing action animation based on the adjusted shooting direction to complete the throwing operation; see the description of steps S802 to S808 and fig. 6c, which is not repeated here.
In one embodiment, the terminal may judge whether the interactive instruction persists before the ready-state animation finishes displaying; that is, the moment for judging whether the throwing object should enter the accurate throwing mode may be earlier than the end of the ready-state animation display. The specific moment can be set freely as needed, as long as it is earlier than the start of the aiming action animation display.
In one embodiment, when the interaction mode corresponding to a throwing object is the accurate throwing mode, the terminal does not need to judge the persistence of the interactive instruction, and continues to display the aiming action animation once the ready-state animation has been displayed. While the aiming action animation is displayed, the terminal continuously judges whether a control event on the virtual object occurs in the virtual scene. When a direction adjustment event is monitored, the terminal adjusts the shooting direction of the virtual camera according to the direction adjustment event, and the adjusted shooting direction of the virtual camera is determined as the throwing direction of the throwing object. If no new direction adjustment event occurs within a preset duration after the last direction adjustment event ended, the terminal ends the display of the aiming action animation on the interactive page and displays the throwing action animation based on the last adjusted shooting direction of the virtual camera, so that the virtual object is controlled to throw the throwing object to the target throwing position set by the user. If a new direction adjustment event does occur within that preset duration, the terminal readjusts the shooting direction of the virtual camera according to the new event, and if the aiming action animation finishes displaying during this period, it is shown again in a loop.
In this embodiment, the user's requirement on throwing efficiency can be judged from the persistence of the interactive instruction. For throwing objects with a higher requirement on throwing speed, the terminal completes aiming automatically, simplifying the operations the user must participate in; for throwing objects with a high requirement on throwing accuracy, the terminal supports the user in controlling the virtual object's assisted aiming, meeting the user's diversified throwing needs. More importantly, every interaction mode is completed based on only one interactive instruction, which saves server resources.
In one embodiment, the method for controlling the virtual object to interact with the throwing object further comprises: when a control event on the virtual object is monitored during the display of the target animation, generating a notification message of the control event; and sending the notification message to the corresponding message subscription interface, so that the message subscription interface responds to the control event.
In an embodiment of the present application, the interactive application may be an object-oriented application designed on the observer pattern. The observer pattern, also known as the publish/subscribe pattern, is a software design pattern. In the observer pattern, a target (observed) object manages all the observer objects that depend on it and actively issues notifications when its own state changes, usually by calling methods provided by the observers. The observer pattern cleanly decouples the observer objects from the observed object. For example, the user interface may be an observer object and the business data the observed object: the user interface observes changes in the business data and, upon finding a change, displays the changed business data.
In the observer pattern, an observer object subscribes in advance, with the observed object, to the information it needs to know; when that information changes, the observed object actively notifies every observer object that subscribed to the message. In an embodiment of the present application, the interactive application includes a plurality of module systems, such as a scene system, a virtual object system, a throwing object system, a User Interface (UI) system, and a sound effect system. Each module system is provided with a corresponding message subscription interface, which subscribes to messages and submits the acquired information to its module system.
Specifically, during the display of the target animation, the user may trigger a control operation on the virtual object at any time. The terminal listens for the control event corresponding to the control operation. When a control event is monitored, the terminal generates a notification message for it; the notification message of a control event is information containing the event identifier of the control event. The terminal sends the notification message of the control event, through the message subscription interfaces, to the module systems subscribed to that control event.
For example, when the trigger control corresponding to a certain throwing object on the interactive page is triggered, the throwing object system in the interactive application needs to know about the event. Since triggering one control may affect the display of other controls, the UI system also needs to know about the control's trigger event; and since pressing a control may need to play a sound, the sound effect system needs to know about it as well. Therefore, the throwing object system, the UI system, and the sound effect system each subscribe in advance to the control event corresponding to the trigger control.
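This subscription and notification mechanism can be sketched as a minimal publish/subscribe bus (Python; class and event names are illustrative, not from the embodiment):

```python
class EventBus:
    """Minimal observer/publish-subscribe sketch: module systems register
    their message subscription interfaces for an event identifier in
    advance, and are notified when the event fires, with no ad-hoc routing
    decision at notification time."""

    def __init__(self):
        self._subscribers = {}   # event_id -> list of subscription interfaces

    def subscribe(self, event_id, interface):
        self._subscribers.setdefault(event_id, []).append(interface)

    def publish(self, event_id, message):
        # Only pre-registered subscribers receive the notification message.
        for interface in self._subscribers.get(event_id, []):
            interface(message)
```

In the grenade-button example above, the throwing object system, UI system, and sound effect system would each call `subscribe` for the same control event once, and a single `publish` notifies all three.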
A conventional way of notifying control events requires the terminal to decide on the fly which module systems each control event should be sent to, which lowers the response efficiency of control events, strongly couples the business logic with the data logic, and increases the maintenance cost of the interactive application. The observer pattern of this embodiment defines clear boundaries between the different module systems of the interactive application, improving its maintainability and reusability.
In this embodiment, when a control event is monitored, the observer-pattern-based interactive application only needs to notify the interfaces that subscribed to the event in advance, without any ad-hoc computation. This improves the efficiency of event notification, reduces the coupling between the data logic and business logic of the interactive application, and keeps the interactive application's code maintainable.
In one embodiment, generating the notification message of the control event comprises: judging whether the control event is executable; generating the notification message of the control event when it is executable; and masking the control event when it is not executable.
Specifically, if the user triggers a control operation on the virtual object during the display of the target animation, the terminal first judges, from the business logic, whether the control event corresponding to the control operation conflicts with the interactive action the virtual object is currently executing, that is, whether the control event is allowed. For example, if during the display of the target animation the user triggers a control event that makes the virtual object perform a single action such as running, jumping, or turning, the terminal judges the control event not executable when it determines it is currently in the fast throwing mode, and executable when it determines it is currently in the accurate throwing mode. Referring to fig. 12, fig. 12 shows a directed graph of the different interactive actions a virtual object performs on a throwing object in one embodiment. The terminal may judge whether a control event is executable based on this directed graph.
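The directed-graph check can be sketched as follows (Python; the transition table is a hypothetical stand-in, since fig. 12's actual edges are not reproduced in the text):

```python
# Hypothetical directed graph of allowed transitions between interactive
# actions: an event is executable only if there is an edge from the
# current action to it.
ALLOWED = {
    'ready': {'aim', 'throw', 'cancel'},
    'aim':   {'throw', 'cancel', 'run', 'jump', 'turn'},
    'throw': set(),   # e.g. fast throw in progress: nothing may interrupt
}

def is_executable(current_action, control_event):
    """Screen a control event before notification, per the directed graph."""
    return control_event in ALLOWED.get(current_action, set())
```

Only events passing this screen generate notification messages; the rest are masked without touching the event bus.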
Further, when the control event is judged executable, the terminal generates its notification message and sends it, in the manner described above, to the module systems subscribed to the control event. When the control event is judged not executable, the terminal directly masks the control event, that is, does not respond to the control instruction corresponding to it.
In a conventional way of notifying control events, the terminal notifies a control event immediately after monitoring it, and each module system receiving the notification message judges separately whether the event is executable. In this way, once a control event turns out not to be executable, many notification messages have been transmitted in vain, wasting data transmission resources, producing more memory garbage, and increasing garbage collection pressure.
In this embodiment, events are screened at notification time, and only events that pass screening are notified to the corresponding message subscription interfaces, which greatly reduces the frequency of event triggering and reduces frequent memory garbage collection.
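The screening described above can be sketched as an event bus that consults a transition table before creating any notification message, so subscribed modules never receive events that would be masked anyway. This is a minimal illustrative sketch, not the patent's implementation: the transition table, event names, and `EventBus` class are all assumptions.

```python
# Hypothetical transition table standing in for the directed graph of
# fig. 12: (current interactive action, requested event) -> allowed?
ALLOWED = {
    ("aiming", "run"): True,        # accurate-throw mode: movement allowed
    ("aiming", "jump"): True,
    ("quick_throw", "run"): False,  # fast-throw mode: movement masked
    ("quick_throw", "jump"): False,
}

class EventBus:
    """Screens control events before notifying subscribed modules."""

    def __init__(self):
        self.subscribers = {}  # event name -> list of callbacks

    def subscribe(self, event, callback):
        self.subscribers.setdefault(event, []).append(callback)

    def dispatch(self, current_action, event):
        # Mask the event up front: no notification message is generated,
        # so no memory garbage is produced for non-executable events.
        if not ALLOWED.get((current_action, event), False):
            return False
        for callback in self.subscribers.get(event, []):
            callback(event)
        return True

bus = EventBus()
received = []
bus.subscribe("run", received.append)

bus.dispatch("quick_throw", "run")  # masked: nothing delivered
bus.dispatch("aiming", "run")       # executable: subscribers notified
print(received)                     # ['run']
```

Because the check happens once, centrally, each subscribed module no longer needs its own executability test.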
In a specific embodiment, as shown in fig. 13, the method for controlling the interaction of the virtual object with the throwing object provided by the present application comprises the following steps:
S1302, acquiring an initial animation of interaction between a virtual object and a throwing object; the initial animation includes a plurality of motion animations with different presentation timings.
S1304, splicing at least two action animations with adjacent display time sequences and action types of virtual objects in the action animations meeting splicing conditions to obtain a state animation in an interactive state.
And S1306, respectively determining each action animation which does not meet the splicing condition as one state animation, and obtaining a target animation comprising a plurality of state animations.
S1308, when the interaction instruction is triggered, the target animation of the virtual object interacting with the throwing object pointed by the interaction instruction is obtained.
S1310, determining the waiting time of each state animation according to the display time length and the display time sequence of the state animation.
S1312, starting timing from the triggering of the interactive instruction, and determining a current view direction of the virtual object.
And S1314, when the waiting time corresponding to one state animation is reached, triggering the corresponding state animation to be displayed on the interactive page based on the current view direction.
S1316, when the virtual object displayed in the target animation executes the aiming action on the throwing object, the target throwing position is determined in the virtual scene displayed on the interactive page.
And S1318, continuing displaying the target animation based on the virtual object view angle direction corresponding to the target throwing position.
According to the above method for controlling the interaction between the virtual object and the throwing object, after the interaction instruction is triggered, the target animation in which the virtual object interacts with the corresponding throwing object is obtained automatically. Based on the target animation, the virtual object can be shown performing different interactive actions on the throwing object in sequence; that is, after one interactive action is executed, the next interactive action is switched to automatically, without an additional interaction instruction being triggered. When the virtual object performs the aiming action on the throwing object, the view direction of the virtual object in the target animation is adjusted according to the automatically determined target throwing position, so aiming is completed automatically, and by continuing to display the animation the throwing object can be controlled to be thrown to the target throwing position in the virtual scene, again without an additional interaction instruction being triggered. The whole throwing operation is thus completed on the basis of a single interaction instruction, which simplifies user operation, reduces the server response frequency, and saves server resources.
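Steps S1310 to S1312 can be illustrated with a small scheduling sketch: each state animation's waiting time is the cumulative display duration of the state animations before it, so a single timer started when the interaction instruction is triggered fires each state in turn. The animation names and durations below are illustrative assumptions, not values from the patent.

```python
def waiting_times(state_animations):
    """state_animations: list of (name, display_duration_seconds) in
    presentation order. Returns {name: waiting time measured from the
    moment the interaction instruction is triggered}."""
    schedule, elapsed = {}, 0.0
    for name, duration in state_animations:
        schedule[name] = elapsed      # fires once all earlier states finish
        elapsed += duration
    return schedule

# Hypothetical target animation: ready -> aim -> throw.
target_animation = [("ready", 0.5), ("aim", 0.8), ("throw", 0.4)]
print(waiting_times(target_animation))
# {'ready': 0.0, 'aim': 0.5, 'throw': 1.3}
```

With such a schedule, each state animation is triggered on the interactive page as soon as its waiting time elapses, using the view direction current at that moment.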
In another embodiment, as shown in fig. 14, the method for controlling the interaction of the virtual object with the throwing object provided by the present application comprises the following steps:
S1402, when the interactive instruction is triggered, obtaining a ready-state animation of the virtual object interacting with the throwing object pointed to by the interactive instruction.
S1404, displaying the ready-state animation on the interactive page based on the visual angle direction of the virtual object.
S1406, when the display of the ready-state animation is finished, judging whether the interactive instruction is still continuous.
And S1408, when the interactive instruction is not continued any more after the display of the ready-state animation is finished, acquiring the quick-throw state animation and displaying the quick-throw state animation.
S1410, when the virtual object displayed in the fast-cast state animation performs the aiming action on the throwing object, determining the position in the virtual scene corresponding to the target coordinate point in the interactive page as the target throwing position.
And S1412, continuously displaying the quick-throw state animation based on the visual angle direction of the virtual object corresponding to the target throwing position.
S1414, when the display of the ready-state animation is finished and the interactive instruction still continues, acquiring the aiming action animation and the throwing action animation, and displaying the aiming action animation.
S1416, monitoring control events for the virtual object in the process of displaying the aiming action animation.
S1418, determining a target throwing position according to the visual angle direction of the virtual object in the control event, and returning to the step of judging whether the interactive instruction is still continuous.
And S1420, when the interaction instruction is not continued any more, showing the throwing motion animation based on the virtual object view direction corresponding to the target throwing position.
According to the above method for controlling the interaction between the virtual object and the throwing object, the user's requirement on throwing efficiency can be judged from whether the interaction instruction persists. When the interaction instruction no longer continues once the ready-state animation has finished, this indicates that the user expects to throw the throwing object quickly; the terminal then completes aiming automatically, which simplifies the user's operations, improves the response efficiency of the interaction, and satisfies urgent throwing requirements. When the interaction instruction still continues after the ready-state animation has finished, this indicates that the user's requirement on throwing speed is not high, but the requirement on throwing accuracy is; the terminal then supports the user in controlling the virtual object with assisted aiming, satisfying the user's requirement for accurate throwing.
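The branch in steps S1406 to S1414 behaves like a small state machine keyed on whether the interaction instruction persists (for example, whether a throw button is still held). The sketch below is an assumption about structure rather than the patent's code; the state names and method names are illustrative.

```python
class ThrowStateMachine:
    """Tracks the interaction states described around fig. 14."""

    def __init__(self):
        self.state = "idle"

    def trigger_interaction(self):
        self.state = "ready"              # show ready-state animation

    def on_ready_finished(self, instruction_still_held):
        if instruction_still_held:
            self.state = "aiming"         # precise throw: user assists aiming
        else:
            self.state = "quick_throw"    # fast throw: terminal auto-aims

    def on_instruction_released(self):
        if self.state == "aiming":
            self.state = "throwing"       # show throwing action animation

# Fast path: the instruction ends before the ready animation finishes.
fast = ThrowStateMachine()
fast.trigger_interaction()
fast.on_ready_finished(instruction_still_held=False)
print(fast.state)  # quick_throw

# Precise path: the instruction persists, then ends after aiming.
precise = ThrowStateMachine()
precise.trigger_interaction()
precise.on_ready_finished(instruction_still_held=True)
precise.on_instruction_released()
print(precise.state)  # throwing
```

Both paths originate from the same single interaction instruction; only its duration selects which branch runs.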
In another specific embodiment, as shown in fig. 15, the method for controlling the interaction of the virtual object with the throwing object provided by the present application comprises the following steps:
S1502, when the interactive instruction is triggered, determining an interaction mode corresponding to the interactive instruction; the interaction mode includes a fast throwing mode and an adaptive throwing mode.
And S1504, when the interaction mode is the quick throwing mode, acquiring a target animation of the virtual object interacting with the throwing object pointed by the interaction instruction.
S1506, displaying the target animation based on the visual angle direction of the virtual object on the interactive page.
S1508, when the virtual object displayed in the target animation performs the aiming action on the throwing object, determining the position in the virtual scene corresponding to the target coordinate point in the interactive page as the target throwing position.
S1510, continuing to display the target animation on the interactive page based on the visual angle direction of the virtual object.
S1512, when the interaction mode is the adaptive throwing mode, acquiring a ready-state animation of the virtual object interacting with the throwing object pointed to by the interaction instruction.
S1514, displaying the ready-state animation on the interactive page based on the visual angle direction of the virtual object.
S1516, when the display of the ready-state animation is finished, it is determined whether the interactive instruction continues.
S1518, when the interactive instruction is not continued after the display of the ready-state animation is finished, the fast-cast state animation is obtained, and the fast-cast state animation is displayed.
S1520, when the virtual object displayed in the fast-cast state animation performs the aiming action on the throwing object, determining the position in the virtual scene corresponding to the target coordinate point in the interactive page as the target throwing position.
And S1522, continuously displaying the quick-throw state animation based on the virtual object view direction corresponding to the target throwing position.
S1524, when the display of the ready state animation is finished and the interactive instruction is still continuous, the aiming action animation and the throwing action animation are obtained, and the aiming action animation is displayed.
S1526, monitoring a control event for the virtual object in the process of displaying the aiming action animation.
S1528, determining a target throwing position according to the visual angle direction of the virtual object in the control event, and returning to the step of judging whether the interaction instruction is still continuous.
And S1530, when the interactive instruction is not continued any more, showing the throwing motion animation based on the virtual object view direction corresponding to the target throwing position.
The above method for controlling the interaction between the virtual object and the throwing object supports the user in selecting different interaction modes. In the fast throwing mode, the user only needs to trigger the interaction instruction once; the state machine then automatically triggers and completes the whole throwing process, and in the aiming step the target throwing position at which the throwing object is released is determined automatically. User operation is thereby greatly simplified, and the response efficiency of interaction instructions is improved.
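Steps S1508 and S1520 resolve the target throwing position from a fixed target coordinate point on the interactive page (for example, a crosshair at screen centre). A much-simplified 2-D sketch of that idea follows, assuming a camera position, a yaw angle, and a fixed throw distance; all of these parameters are illustrative assumptions rather than the patent's method.

```python
import math

def target_throw_position(camera_pos, yaw_radians, throw_distance):
    """Cast from the camera along the current view direction and take the
    point at the throw distance as the target throwing position."""
    dx = math.cos(yaw_radians) * throw_distance
    dy = math.sin(yaw_radians) * throw_distance
    return (camera_pos[0] + dx, camera_pos[1] + dy)

# Facing along +x from the origin: the target lands 10 units ahead.
print(target_throw_position((0.0, 0.0), 0.0, 10.0))  # (10.0, 0.0)
```

A real engine would cast a 3-D ray against scene geometry instead; the point is only that the screen-centre coordinate plus the current view direction suffices to derive a throw target with no extra user input.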
Fig. 2, 7, 8, 13, 14 and 15 are schematic flowcharts illustrating a method for controlling a virtual object to interact with a projectile in various embodiments. It should be understood that although the individual steps in the flowcharts of fig. 2, 7, 8, 13, 14 and 15 are shown in the order indicated by the arrows, the steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the order of performing these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in fig. 2, 7, 8, 13, 14 and 15 may include multiple sub-steps or multiple stages, which are not necessarily performed at the same moment but may be performed at different moments; likewise, the sub-steps or stages are not necessarily performed sequentially, but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 16, there is provided an apparatus 1600 for controlling a virtual object to interact with a projectile, comprising: a throw preparation module 1602, a position aiming module 1604, and an item throwing module 1606, wherein:
a throwing preparation module 1602, configured to obtain a target animation in which a virtual object interacts with a throwing object pointed by an interaction instruction when the interaction instruction is triggered; and displaying the target animation on the interactive page based on the visual angle direction of the virtual object.
And the position aiming module 1604 is used for determining a target throwing position in a virtual scene displayed on the interactive page when the virtual object displayed in the target animation performs the aiming action on the throwing object.
And the article throwing module 1606 is configured to continue to display the target animation based on the virtual object view direction corresponding to the target throwing position.
In one embodiment, as shown in fig. 17, the apparatus 1600 for controlling the virtual object to interact with the projectile further includes an animation splicing module 1608 for obtaining an initial animation of the virtual object interacting with the projectile; the initial animation comprises a plurality of action animations with different presentation time sequences; splicing at least two action animations with adjacent display time sequences and action types of virtual objects in the action animations meeting splicing conditions to obtain a state animation in an interactive state; and respectively determining each action animation which does not meet the splicing condition as a state animation to obtain a target animation comprising a plurality of state animations.
In one embodiment, the throwing preparation module 1602 is further configured to determine a waiting duration for each status animation according to the presentation duration and the presentation timing of the status animation; starting timing after an interactive instruction is triggered, and determining the current visual angle direction of the virtual object; and when the waiting time corresponding to one state animation is reached, triggering the display of the corresponding state animation based on the current view angle direction on the interactive page.
In one embodiment, the target animation includes a ready state animation and a quick cast state animation; the position aiming module 1604 is further configured to determine whether the interactive instruction continues when the preparation state animation display is completed; when the interactive instruction is not continued any more, displaying the quick-throw state animation; when the virtual object displayed in the animation in the fast casting state performs the aiming action on the casting, determining a target casting position in a virtual scene displayed on the interactive page; the item throwing module 1606 is further configured to continue to display the quick-throw status animation based on the virtual object perspective direction corresponding to the target throwing position.
In one embodiment, the position aiming module 1604 is further configured to determine the position in the virtual scene corresponding to the target coordinate point in the interactive page as the target throwing position.
In one embodiment, the position aiming module 1604 is further configured to obtain preset priority data when a virtual target exists in the virtual scene within the throwing distance range corresponding to the virtual object; select a virtual target within the throwing distance range according to the priority data; and determine the position of the selected virtual target in the virtual scene as the target throwing position of the throwing object.
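The priority-based selection can be sketched as follows: among the virtual targets inside the throwing distance range, pick the one with the best preset priority, breaking ties by distance. The data shapes, priority values, and the lower-is-better convention are all illustrative assumptions, not the patent's data model.

```python
def select_target(thrower_pos, targets, throw_range, priority):
    """targets: {name: (x, y)}; priority: {name: int}, lower = higher.
    Returns (name, position) of the chosen virtual target, or None when
    nothing is in range (fall back to the target coordinate point)."""
    def dist(p):
        return ((p[0] - thrower_pos[0]) ** 2 +
                (p[1] - thrower_pos[1]) ** 2) ** 0.5

    in_range = [(priority.get(name, 99), dist(pos), name, pos)
                for name, pos in targets.items() if dist(pos) <= throw_range]
    if not in_range:
        return None
    _, _, name, pos = min(in_range)   # best priority, then nearest
    return name, pos

pick = select_target(
    thrower_pos=(0, 0),
    targets={"enemy_a": (3, 4), "enemy_b": (6, 8), "crate": (1, 1)},
    throw_range=10,
    priority={"enemy_a": 1, "enemy_b": 1, "crate": 5},
)
print(pick)  # ('enemy_a', (3, 4))
```

Here `enemy_a` and `enemy_b` share the best priority, so the nearer one wins; when no target is in range the caller falls back to the fixed target coordinate point described above.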
In one embodiment, the target animation includes a readiness animation, a targeting action animation, and a throwing action animation; the position aiming module 1604 is further configured to determine whether the interactive instruction continues when the preparation state animation display is completed; when the interactive instruction is still continuous, displaying the aiming action animation; monitoring a control event for the virtual object in the process of displaying the aiming action animation; determining a target throwing position according to the visual angle direction of the virtual object in the control event, and returning to the step of judging whether the interaction instruction is still continuous; continuously displaying the target animation based on the virtual object view direction corresponding to the target throwing position comprises the following steps: and when the interaction instruction is not continued any more, showing the throwing motion animation based on the virtual object view angle direction corresponding to the target throwing position.
In one embodiment, the apparatus 1600 for controlling the virtual object to interact with the projectile further includes a mode determining module 1610, configured to determine an interaction mode corresponding to the interaction instruction, the interaction mode including a fast throwing mode; the position aiming module 1604 is further configured to determine, in the fast throwing mode, the target throwing position based on the current position of the virtual object in the virtual scene.
In one embodiment, the mode determining module 1610 is further configured to determine an interaction mode corresponding to the interaction instruction, the interaction mode including an adaptive throwing mode; the position aiming module 1604 is further configured to, in the adaptive throwing mode, determine whether the interaction instruction still continues when the virtual object and the throwing object in the target animation are in the aiming state; if so, determine the target throwing position based on the monitored control event for the virtual object; and if not, determine the target throwing position based on the current position of the virtual object in the virtual scene.
In one embodiment, the apparatus 1600 for controlling the virtual object to interact with a projectile further includes an event notification module 1612, configured to generate a notification message of a control event when the control event for the virtual object is monitored in the process of displaying the target animation, and to send the notification message to the corresponding message subscription interface, so that the message subscription interface responds to the control event.
In one embodiment, the event notification module 1612 is further configured to determine whether a control event is executable, generate a notification message for the control event when the control event is executable; when the control event is not executable, the control event is masked.
According to the above apparatus for controlling the interaction between the virtual object and the throwing object, after the interaction instruction is triggered, the target animation in which the virtual object interacts with the corresponding throwing object is obtained automatically. Based on the target animation, the virtual object can be shown performing different interactive actions on the throwing object in sequence; that is, after one interactive action is executed, the next interactive action is switched to automatically, without an additional interaction instruction being triggered. When the virtual object performs the aiming action on the throwing object, the view direction of the virtual object in the target animation is adjusted according to the automatically determined target throwing position, so aiming is completed automatically, and by continuing to display the animation the throwing object can be controlled to be thrown to the target throwing position in the virtual scene, again without an additional interaction instruction being triggered. The whole throwing operation is thus completed on the basis of a single interaction instruction, which simplifies user operation, reduces the server response frequency, and saves server resources.
FIG. 18 is a diagram illustrating an internal structure of a computer device in one embodiment. The computer device may specifically be the terminal 110 in fig. 1. As shown in fig. 18, the computer apparatus includes a processor, a memory, a network interface, an input device, and a display screen connected through a system bus. Wherein the memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the computer device stores an operating system and may also store a computer program that, when executed by the processor, causes the processor to implement a method of controlling a virtual object to interact with a projectile. The internal memory may also store a computer program that, when executed by the processor, causes the processor to execute a method of controlling the interaction of the virtual object with the projectile. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 18 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, the apparatus for controlling the interaction of a virtual object with a projectile provided herein may be implemented in the form of a computer program that is executable on a computer device such as that shown in fig. 18. The memory of the computer device may store various program modules constituting the means for controlling the interaction of the virtual object with the projectile, such as the throw preparation module, the position targeting module and the item throwing module shown in fig. 16. The computer program of each program module makes the processor execute the steps of the method for controlling the virtual object to interact with the throwing object of each embodiment of the application described in the specification.
For example, the computer apparatus shown in fig. 18 may execute steps S202 and S204 by a throwing preparation module in the device for controlling the virtual object to interact with a throwing object as shown in fig. 16. The computer device may perform step S206 through the location targeting module. The computer device may perform step S208 by the item throwing module.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the above-described steps of controlling a virtual object to interact with a projectile. Here, the steps of the method for controlling the virtual object to interact with the throwing object may be the steps of the method for controlling the virtual object to interact with the throwing object of the above embodiments.
In one embodiment, a computer readable storage medium is provided, storing a computer program that, when executed by a processor, causes the processor to perform the above-described steps of controlling a virtual object to interact with a projectile. Here, the steps of the method for controlling the virtual object to interact with the throwing object may be the steps of the method for controlling the virtual object to interact with the throwing object of the above embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing relevant hardware. The program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above examples express only several embodiments of the present application, and their description is specific and detailed, but should not therefore be construed as limiting the scope of the present application. It should be noted that those skilled in the art can make several variations and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (15)

1. A method of controlling a virtual object to interact with a projectile, the method comprising:
when an interaction instruction is triggered, acquiring a target animation of a virtual object interacting with a throwing object pointed by the interaction instruction;
displaying the target animation on an interactive page based on the visual angle direction of the virtual object;
when the virtual object displayed in the target animation executes aiming action on a throwing object, determining a target throwing position in a virtual scene displayed on the interactive page;
and continuously displaying the target animation based on the virtual object view direction corresponding to the target throwing position.
2. The method of claim 1, further comprising:
acquiring initial animation of interaction between a virtual object and a throwing object; the initial animation comprises a plurality of motion animations with different presentation timings;
splicing at least two action animations with adjacent display time sequences and action types of virtual objects in the action animations meeting splicing conditions to obtain a state animation in an interactive state;
and respectively determining each action animation which does not meet the splicing condition as a state animation to obtain a target animation comprising a plurality of state animations.
3. The method of claim 2, wherein said presenting the target animation based on the perspective direction of the virtual object at the interactive page comprises:
determining the waiting time of each state animation according to the display time and the display time sequence of the state animation;
starting timing after an interactive instruction is triggered, and determining the current visual angle direction of the virtual object;
and when the waiting time corresponding to one state animation is reached, triggering the corresponding state animation to be displayed on the interactive page based on the current view angle direction.
4. The method of claim 1, wherein the goal animation comprises a ready state animation and a quick cast state animation; when the virtual object displayed in the target animation performs the aiming action on the throwing object, the determining the target throwing position in the virtual scene displayed on the interactive page comprises the following steps:
when the preparation state animation display is finished, judging whether the interactive instruction is still continuous;
when the interactive instruction is not continued any more, displaying the quick-throw state animation;
when the virtual object displayed in the fast casting state animation executes aiming action on the casting, determining a target casting position in a virtual scene displayed on an interactive page;
the continuously displaying the target animation based on the virtual object view direction corresponding to the target throwing position comprises the following steps:
and continuously displaying the quick-throw state animation based on the visual angle direction of the virtual object corresponding to the target throwing position.
5. The method of claim 1 or 4, wherein said determining a target throw position in a virtual scene presented by said interactive page comprises:
and determining the position of the target coordinate point in the interactive page corresponding to the virtual scene as a target throwing position.
6. The method of claim 1 or 4, wherein said determining a target throw position in a virtual scene presented by said interactive page comprises:
when a virtual target exists in the virtual scene within a throwing distance range corresponding to the virtual object, acquiring preset priority data;
selecting a virtual target within the throwing distance range according to the priority data;
and determining the position of the selected virtual target in the virtual scene as the target throwing position of the throwing object.
7. The method of claim 1, wherein the target animation comprises a readiness animation, a targeting action animation, and a throwing action animation; when the virtual object displayed in the target animation performs the aiming action on the throwing object, the determining the target throwing position in the virtual scene displayed on the interactive page comprises the following steps:
when the preparation state animation display is finished, judging whether the interactive instruction is still continuous;
when the interaction instruction continues, displaying the aiming action animation;
monitoring a control event to the virtual object in the process of displaying the aiming action animation;
determining a target throwing position according to the visual angle direction of the virtual object in the control event, and returning to the step of judging whether the interaction instruction is still continuous;
the continuously displaying the target animation based on the virtual object view direction corresponding to the target throwing position comprises the following steps:
and when the interaction instruction is not continued any more, displaying the throwing motion animation based on the virtual object view direction corresponding to the target throwing position.
8. The method of claim 1, further comprising:
determining an interaction mode corresponding to the interaction instruction; the interaction mode comprises a fast throw mode;
the determining a target throwing position in the virtual scene displayed by the interactive page comprises:
in the fast throwing mode, a target throwing position is determined based on a position of a virtual object currently in a virtual scene.
9. The method of claim 1, further comprising:
determining an interaction mode corresponding to the interaction instruction; the interaction mode comprises an adaptive throwing mode;
when the virtual object and the throwing object displayed in the target animation are in the aiming state, the step of determining the target throwing position in the virtual scene displayed on the interactive page comprises the following steps:
in the self-adaptive throwing mode, when the virtual object and the throwing object in the target animation are displayed in the aiming state, judging whether the interaction instruction is still continuous;
if the interaction instruction still continues, determining a target throwing position based on the monitored control event for the virtual object;
if the interaction instruction no longer continues, determining the target throwing position based on the current position of the virtual object in the virtual scene.
10. The method according to any one of claims 1 to 9, further comprising:
when a control event of the virtual object is monitored while the target animation is displayed, generating a notification message for the control event; and
sending the notification message to a corresponding message subscription interface, so that the message subscription interface responds to the control event.
11. The method of claim 10, wherein the generating a notification message for the control event comprises:
determining whether the control event is executable;
generating the notification message for the control event when the control event is executable; and
masking the control event when the control event is not executable.
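The notify-or-mask behaviour of claims 10 and 11 maps onto a conventional publish/subscribe pattern. The sketch below is an assumption about one way to realize it; the class and method names are invented for illustration.

```python
from typing import Callable, Dict, List


class ControlEventBus:
    """Executable control events become notification messages delivered to
    subscribed message interfaces; non-executable events are masked
    (silently dropped, no message generated)."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = {}

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        # Register a message subscription interface for one event type.
        self._subscribers.setdefault(event_type, []).append(handler)

    def publish(self, event_type: str, payload: dict, executable: bool) -> bool:
        if not executable:
            return False          # claim 11: mask the non-executable event
        message = {"type": event_type, **payload}   # the notification message
        for handler in self._subscribers.get(event_type, []):
            handler(message)      # claim 10: subscriber responds to the event
        return True
```

Returning a boolean from `publish` makes the masking decision observable to the caller, which is convenient for testing but not required by the scheme.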
12. An apparatus for controlling interaction between a virtual object and a throwing object, the apparatus comprising:
a throwing preparation module, configured to acquire, when an interaction instruction is triggered, a target animation of interaction between a virtual object and a throwing object pointed to by the interaction instruction, and display the target animation on an interactive page based on a viewing angle direction of the virtual object;
a position aiming module, configured to determine a target throwing position in a virtual scene displayed on the interactive page when the virtual object displayed in the target animation performs an aiming action on the throwing object; and
an article throwing module, configured to continue to display the target animation based on the viewing angle direction of the virtual object corresponding to the target throwing position.
13. The apparatus of claim 12, further comprising an animation stitching module configured to: acquire an initial animation of the virtual object interacting with the throwing object, the initial animation comprising a plurality of motion animations with different display timings; splice at least two motion animations that are adjacent in display timing and whose virtual-object action types satisfy a splicing condition, to obtain a state animation of an interactive state; and determine each motion animation that does not satisfy the splicing condition as a separate state animation, to obtain a target animation comprising a plurality of state animations.
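The stitching rule of claim 13 (merge timing-adjacent motion animations whose action types are compatible, leave the rest as standalone state animations) can be sketched as a single pass over the timing-sorted list. The names and the predicate interface are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class MotionAnimation:
    name: str
    order: int  # display timing within the initial animation


def splice_animations(motions: List[MotionAnimation],
                      can_splice: Callable[[MotionAnimation, MotionAnimation], bool]
                      ) -> List[List[MotionAnimation]]:
    """Group motion animations into state animations: each motion either
    extends the previous state animation (when it is adjacent in display
    timing and the splicing condition on action types holds) or starts a
    new state animation of its own."""
    motions = sorted(motions, key=lambda m: m.order)
    states: List[List[MotionAnimation]] = []
    for motion in motions:
        if states and can_splice(states[-1][-1], motion):
            states[-1].append(motion)   # extend the current state animation
        else:
            states.append([motion])     # start a new state animation
    return states
```

In a real engine the splicing condition would compare action types and blend compatibility; here it is injected as a predicate so the grouping logic stays testable on its own.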
14. A computer-readable storage medium, storing a computer program which, when executed by a processor, causes the processor to carry out the steps of the method according to any one of claims 1 to 11.
15. A computer device comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the method according to any one of claims 1 to 11.
CN201911190687.9A 2019-11-28 2019-11-28 Method, device and storage medium for controlling interaction of virtual object and throwing object Active CN110893277B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911190687.9A CN110893277B (en) 2019-11-28 2019-11-28 Method, device and storage medium for controlling interaction of virtual object and throwing object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911190687.9A CN110893277B (en) 2019-11-28 2019-11-28 Method, device and storage medium for controlling interaction of virtual object and throwing object

Publications (2)

Publication Number Publication Date
CN110893277A true CN110893277A (en) 2020-03-20
CN110893277B CN110893277B (en) 2021-05-28

Family

ID=69788292

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911190687.9A Active CN110893277B (en) 2019-11-28 2019-11-28 Method, device and storage medium for controlling interaction of virtual object and throwing object

Country Status (1)

Country Link
CN (1) CN110893277B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1348829A (en) * 2001-11-07 2002-05-15 陈少元 Multi-target multi-shooter simulated laser shooting system
JP2006288528A (en) * 2005-04-07 2006-10-26 Namco Bandai Games Inc Game system, server system, game device, program and information storage medium
CN108404406A (en) * 2018-03-22 2018-08-17 腾讯科技(深圳)有限公司 Display methods, device, equipment and the readable medium of ballistic trajectory in virtual environment
CN108619720A (en) * 2018-04-11 2018-10-09 腾讯科技(深圳)有限公司 Playing method and device, storage medium, the electronic device of animation
CN108815851A (en) * 2018-06-05 2018-11-16 腾讯科技(深圳)有限公司 Interface display method, equipment and storage medium when being shot in virtual environment
CN109200582A (en) * 2018-08-02 2019-01-15 腾讯科技(深圳)有限公司 The method, apparatus and storage medium that control virtual objects are interacted with ammunition
CN110427111A (en) * 2019-08-01 2019-11-08 腾讯科技(深圳)有限公司 The operating method of virtual item, device, equipment and storage medium in virtual environment

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112044071A (en) * 2020-09-04 2020-12-08 腾讯科技(深圳)有限公司 Virtual article control method, device, terminal and storage medium
CN112044071B (en) * 2020-09-04 2021-10-15 腾讯科技(深圳)有限公司 Virtual article control method, device, terminal and storage medium
US11904241B2 (en) 2020-09-04 2024-02-20 Tencent Technology (Shenzhen) Company Limited Virtual item control method and apparatus, terminal, and storage medium
EP4119210A4 (en) * 2020-11-19 2023-11-01 Tencent Technology (Shenzhen) Company Limited State switching method and apparatus in virtual scene, device, medium, and program product
CN112749666A (en) * 2021-01-15 2021-05-04 百果园技术(新加坡)有限公司 Training and motion recognition method of motion recognition model and related device

Also Published As

Publication number Publication date
CN110893277B (en) 2021-05-28

Similar Documents

Publication Publication Date Title
CN110893277B (en) Method, device and storage medium for controlling interaction of virtual object and throwing object
WO2022151946A1 (en) Virtual character control method and apparatus, and electronic device, computer-readable storage medium and computer program product
CN111803933B (en) Prop control method in game, terminal, electronic device and readable storage medium
WO2022252911A1 (en) Method and apparatus for controlling called object in virtual scene, and device, storage medium and program product
CN110075522B (en) Control method, device and terminal of virtual weapon in shooting game
WO2021244209A1 (en) Virtual object control method and apparatus, and terminal and storage medium
CN110947176B (en) Virtual object control method, bullet number recording method, device, and medium
CN113398601B (en) Information transmission method, information transmission device, computer-readable medium, and apparatus
WO2022227958A1 (en) Virtual carrier display method and apparatus, device, and storage medium
CN113181649B (en) Control method, device, equipment and storage medium for calling object in virtual scene
WO2023020122A1 (en) Virtual skill control method and apparatus, device, storage medium, and program product
CN113521759B (en) Information processing method, device, terminal and storage medium
JP2023523157A (en) Virtual environment screen display method, device and computer program
WO2021143290A1 (en) Method and apparatus for displaying virtual prop, storage medium and electronic device
CN113713383A (en) Throwing prop control method and device, computer equipment and storage medium
CN113769394B (en) Prop control method, device, equipment and storage medium in virtual scene
CN113694515B (en) Interface display method, device, terminal and storage medium
WO2022068573A1 (en) Operation prompt method and device, terminal, and storage medium
CN113813599B (en) Virtual character control method and device, storage medium and electronic equipment
WO2023024078A1 (en) Virtual object control method and apparatus, electronic device, and storage medium
CN114159785A (en) Virtual item discarding method and device, electronic equipment and storage medium
CN113426110A (en) Virtual character interaction method and device, computer equipment and storage medium
WO2020243953A1 (en) Control method for remote control movable platform, device and computer-readable storage medium
WO2023231557A9 (en) Interaction method for virtual objects, apparatus for virtual objects, and device, storage medium and program product
CN116549972A (en) Virtual resource processing method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40022341
Country of ref document: HK
GR01 Patent grant