CN116966564A - Method and device for playing back action trace, storage medium and computer equipment - Google Patents

Method and device for playing back action trace, storage medium and computer equipment Download PDF

Info

Publication number
CN116966564A
Authority
CN
China
Prior art keywords
attack
trace
target
action
shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311016958.5A
Other languages
Chinese (zh)
Inventor
许展昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202311016958.5A priority Critical patent/CN116966564A/en
Publication of CN116966564A publication Critical patent/CN116966564A/en
Pending legal-status Critical Current

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 — Controlling the output signals based on the game progress
    • A63F13/53 — Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 — Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5378 — Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps
    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 — Controlling the output signals based on the game progress
    • A63F13/53 — Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533 — Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 — Special adaptations for executing a specific game genre or game mode
    • A63F13/837 — Shooting of targets
    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 — Details of the user interface
    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076 — Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose a method, an apparatus, a storage medium, and a computer device for playing back action traces. The method includes: controlling a first virtual weapon to perform a target attack behavior in response to an attack operation directed at an attack object; and recording attack record information of the target attack behavior, where the attack record information is used to record the action trace of the target attack behavior. Because the action trace is recorded when the virtual weapon performs the attack, a user can review the attack record information to learn whether the target was hit and which parts of the target were hit.

Description

Method and device for playing back action trace, storage medium and computer equipment
Technical Field
The present application relates to the field of computers, and in particular, to a method and apparatus for playing back an action trace, a computer readable storage medium, and a computer device.
Background
In recent years, with the development and popularization of computer device technology, more and more applications provide three-dimensional virtual environments, such as virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter (FPS) games, and role-playing games (RPG).
In the prior art, when a user plays an FPS game, it is difficult for the user to clearly perceive the trajectories of the action traces produced by different types of firearms. When facing enemies, shots may be fired at various angles and in postures such as crouching or jumping; these complex usage factors combine so that the user cannot see the action trace of a bullet when shooting.
In the course of research and practice on the prior art, the inventors of the present application found that, in the prior art, the user cannot perceive the action trace of a bullet when shooting.
Disclosure of Invention
The embodiments of the present application provide a method and an apparatus for playing back action traces, which enable a user to learn the action trace of a bullet when shooting.
In order to solve the technical problems, the embodiment of the application provides the following technical scheme:
a method for playback of action tracks, providing a graphical user interface by a terminal device, wherein the content displayed by the graphical user interface comprises at least part of a virtual scene, and the virtual scene comprises a first virtual weapon, the method comprising:
controlling the first virtual weapon to perform a target attack behavior in response to an attack operation directed at an attack object; and
recording attack record information of the target attack behavior, wherein the attack record information is used to record the action trace of the target attack behavior.
An action trace playback apparatus, wherein a graphical user interface is provided through a terminal device, the content displayed by the graphical user interface comprises at least part of a virtual scene, and the virtual scene comprises a first virtual weapon, the apparatus comprising:
a control module, configured to control the first virtual weapon to perform a target attack behavior in response to an attack operation directed at an attack object; and
a recording module, configured to record attack record information of the target attack behavior, wherein the attack record information is used to record the action trace of the target attack behavior.
A computer-readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the action trace playback method described above.
A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the action trace playback method described above when executing the program.
According to the embodiments of the present application, the first virtual weapon is controlled to perform a target attack behavior in response to an attack operation directed at an attack object, and attack record information of the target attack behavior is recorded, wherein the attack record information is used to record the action trace of the target attack behavior. Because the action trace is recorded when the virtual weapon performs the attack, a user can review the attack record information to learn whether the target was hit and which parts of the target were hit.
Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and a person skilled in the art may derive other drawings from these drawings without creative effort.
Fig. 1a is a system schematic diagram of an action trace playback method according to an embodiment of the present application.
Fig. 1b is a schematic flow chart of a first method for playing back an action trace according to an embodiment of the present application.
Fig. 1c is a first schematic diagram of a graphical user interface according to an embodiment of the present application.
Fig. 1d is a second schematic diagram of a graphical user interface according to an embodiment of the present application.
Fig. 1e is a third schematic diagram of a graphical user interface according to an embodiment of the present application.
Fig. 1f is a fourth schematic diagram of a graphical user interface according to an embodiment of the present application.
Fig. 1g is a fifth schematic diagram of a graphical user interface according to an embodiment of the present application.
Fig. 1h is a sixth schematic diagram of a graphical user interface according to an embodiment of the present application.
Fig. 1i is a seventh schematic diagram of a graphical user interface according to an embodiment of the present application.
Fig. 1j is an eighth schematic diagram of a graphical user interface according to an embodiment of the present application.
Fig. 2 is a schematic structural diagram of an action trace playback apparatus according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to fall within the scope of the application.
The embodiments of the present application provide a method and an apparatus for playing back action traces, a storage medium, and a computer device. Specifically, the method for playing back action traces according to the embodiments of the present application may be executed by a computer device, where the computer device may be a terminal, a server, or the like. The terminal may be a device such as a smartphone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), or a personal digital assistant (PDA), and may further run a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, big data, and artificial intelligence platforms.
For example, when the action trace playback method runs on the terminal, the terminal device stores a game application and presents part of a game scene through its display component. The terminal device interacts with the user through a graphical user interface; for example, it downloads, installs, and runs the game application. The terminal device may present the graphical user interface to the user in various ways, for example, by rendering it on a display screen or by holographic projection. For example, the terminal device may include a touch display screen and a processor, where the touch display screen is used to present the graphical user interface, which includes game screens, and to receive operation instructions generated by the user acting on the graphical user interface, and the processor is used to run the game, generate the graphical user interface, respond to the operation instructions, and control the display of the graphical user interface on the touch display screen.
For example, when the action trace playback method runs on a server, the game may be a cloud game. Cloud gaming refers to a game mode based on cloud computing. In the cloud gaming mode of operation, the body that runs the game application is separated from the body that presents the game screen: the storage and execution of the action trace playback method are completed on a cloud game server, while the game screen is presented by a cloud game client. The cloud game client is mainly used to receive and send game data and present game screens; for example, it may be a display device with data transmission capability near the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, while the terminal device that performs the action trace playback is the cloud game server in the cloud. When playing, the user operates the cloud game client to send operation instructions to the cloud game server; the cloud game server runs the game according to the operation instructions, encodes and compresses data such as game screens, and returns the data to the cloud game client over the network; finally, the cloud game client decodes the data and outputs the game screens.
Referring to fig. 1a, fig. 1a is a schematic system diagram of a playback method of action traces according to an embodiment of the application. The system may include at least one computer device 1000, at least one server 2000, at least one database 3000, and a network 4000. The computer device 1000 held by the user may be connected to servers of different games through the network 4000. Computer device 1000 is any device having computing hardware capable of supporting and executing software products corresponding to a game. In addition, the computer device 1000 has one or more multi-touch sensitive screens for sensing and obtaining input of a user through touch or slide operations performed at multiple points of the one or more touch sensitive display screens. In addition, when the system includes a plurality of computer devices 1000, a plurality of servers 2000, and a plurality of networks 4000, different computer devices 1000 may be connected to each other through different networks 4000, through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc. In addition, the different computer devices 1000 may be connected to other terminals or to a server or the like using their own bluetooth network or hotspot network. For example, multiple users may be online through different computer devices 1000 so as to be connected through an appropriate network and synchronized with each other to support multi-user gaming. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to game environments may be continuously stored in the databases 3000 when different users play multi-user games online.
The embodiments of the present application provide a method for playing back action traces, which may be executed by a terminal or a server. The embodiments are described using the example in which the action trace playback method is executed by a terminal. The terminal includes a display component and a processor, where the display component is used to present a graphical user interface and receive operation instructions generated by the user acting on the display component. When the user operates the graphical user interface through the display component, the graphical user interface may control local content of the terminal in response to the received operation instructions, and may also control content of a peer server in response to them. For example, the operation instructions generated by the user for the graphical user interface include an instruction to launch the game application, and the processor is configured to launch the game application after receiving that instruction. Further, the processor is configured to render and draw a graphical user interface associated with the game on a touch display screen, which is a multi-touch-sensitive screen capable of sensing touch or slide operations performed simultaneously at a plurality of points on the screen. The user performs touch operations on the graphical user interface with a finger, and when the graphical user interface detects a touch operation, it controls different virtual objects in the game's graphical user interface to perform actions corresponding to that operation. For example, the game may be any of a casual game, an action game, a role-playing game, a strategy game, a sports game, an educational game, a first-person shooter (FPS) game, and the like.
Wherein the game may comprise a virtual scene of the game drawn on a graphical user interface. Further, one or more virtual objects, such as virtual characters, controlled by a user (or users) may be included in the virtual scene of the game. In addition, one or more obstacles, such as rails, ravines, walls, etc., may also be included in the virtual scene of the game to limit movement of the virtual object, e.g., to limit movement of the one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, scores, character health status, energy, etc., to provide assistance to the user, provide virtual services, increase scores related to the user's performance, etc. In addition, the graphical user interface may also present one or more indicators to provide indication information to the user. For example, a game may include a user-controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, one or more other virtual objects are controlled by other users of the game. For example, one or more other virtual objects may be computer controlled, such as a robot using an Artificial Intelligence (AI) algorithm, implementing a human-machine engagement mode. For example, virtual objects possess various skills or capabilities that a game user uses to achieve a goal. For example, the virtual object may possess one or more weapons, props, tools, etc. that may be used to eliminate other objects from the game. Such skills or capabilities may be activated by a user of the game using one of a plurality of preset touch operations with the touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of the user.
It should be noted that the system schematic diagram of the action trace playback system shown in fig. 1a is only an example. The action trace playback system and scenario described in the embodiments of the present application are intended to describe the technical solutions of these embodiments more clearly and do not limit them; a person skilled in the art will know that, as the action trace playback system evolves and new service scenarios appear, the technical solutions provided by the embodiments of the present application are equally applicable to similar technical problems.
In the present embodiment, the description is given from the viewpoint of an action trace playback apparatus, which may be integrated in a computer device that includes a storage unit, is fitted with a microprocessor, and has computing capability.
Referring to fig. 1b, fig. 1b is a schematic flow chart of a method for playing back action traces according to an embodiment of the application. In the method, a graphical user interface is provided through a terminal device, the content displayed by the graphical user interface includes at least part of a virtual scene, and the virtual scene includes a first virtual weapon. The method comprises the following steps:
in step 101, in response to an attack operation for an attack object, a target attack behavior with the first virtual weapon is controlled.
The three-dimensional virtual scene is provided when the application program runs on the terminal, and can be a simulation scene of a real world, a semi-simulation and semi-fictitious scene or a pure fictitious scene. The scene picture displayed on the graphical user interface is a scene picture presented when the virtual object observes a three-dimensional virtual scene. The user controls the virtual object in the game scene through the terminal, the virtual object can observe the three-dimensional virtual scene through the camera model, taking FPS game as an example, when the virtual object is in the first person view angle, the camera model is positioned at the head or neck of the target virtual object, and only the arm part of the virtual character can be displayed in the graphical user interface; when in the third person viewing angle, the camera model is behind the target virtual object, and only the upper body portion of the virtual character can be displayed in the graphical user interface. The graphic user interface is a scene picture presented by observing the three-dimensional virtual scene through the camera model at a certain view angle.
Specifically, referring to fig. 1c, fig. 1c is a first schematic diagram of a graphical user interface according to an embodiment of the present application. The graphical user interface is presented by a screen of the computer device 1000 and includes a target virtual object 10 manipulated by the user, the target virtual object 10 being equipped with a first virtual weapon; a shooting aim identifier 20 for assisting the aiming of the virtual weapon; a cursor control 30 for prompting the user with the current direction information of the target virtual object 10; a movement control 40 for controlling the movement of the target virtual object 10 in the three-dimensional virtual environment; a target control 50 and a map control 60, each for prompting the user with the position of the target virtual object 10 in the three-dimensional virtual environment; a firing control 70 for controlling the target virtual object 10 to fire in the three-dimensional virtual environment; a skill control 80 for controlling the skill release of the target virtual object 10 in the three-dimensional virtual environment; and so on. An indication control 31 is further disposed in the cursor control 30 for indicating the direction of the target virtual object 10 within the cursor control 30. It will be appreciated that the graphical user interface may include not only the above identifiers and controls but also other functional controls or identifiers, depending on the specific game content, which is not limited herein.
The attack operation is triggered when the user selects an attack object through the shooting aim identifier 20 and taps the firing control 70. Different types of virtual weapons correspond to different attack behaviors. For example, when the virtual weapon is of a firearm type (e.g., a rifle or a sniper rifle), its target attack behavior attacks the object with a virtual bullet; when the virtual weapon is of a throwing type (e.g., a dart or a javelin), its target attack behavior attacks the object with a virtual prop.
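The weapon-type-to-attack-behavior mapping described above can be sketched as a simple dispatch table. This is a minimal illustration only; all names (`WEAPON_ATTACKS`, `target_attack_behavior`, the weapon-type strings) are assumptions for this sketch, not identifiers from the patent.

```python
# Sketch of dispatching the target attack behavior by virtual weapon type,
# as described above. All names are illustrative, not from the patent.
WEAPON_ATTACKS = {
    "rifle": "virtual_bullet",         # firearm type: attack with a virtual bullet
    "sniper_rifle": "virtual_bullet",
    "dart": "virtual_prop",            # throwing type: attack with a virtual prop
    "javelin": "virtual_prop",
}

def target_attack_behavior(weapon_type: str) -> str:
    """Return the kind of projectile used by the weapon's target attack."""
    if weapon_type not in WEAPON_ATTACKS:
        raise ValueError(f"unknown weapon type: {weapon_type}")
    return WEAPON_ATTACKS[weapon_type]
```

A real implementation would dispatch to projectile-spawning logic rather than return a string, but the table structure is the point.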
In some embodiments, the steps of controlling the first virtual weapon to perform the target attack behavior and recording attack record information of the target attack behavior include:
(1) When the attack object of the target attack behavior is a virtual character, changing the collision volume of the virtual character from an original collision volume to a target collision volume, wherein the target collision volume is larger than the original collision volume;
(2) recording the action traces acting on the original collision volume during the duration of the target attack behavior, to obtain an action trace video.
When the attack object of the target attack behavior is a virtual character, the embodiments of the present application provide two recording modes for action traces. The first mode records only the action traces acting on the original collision volume corresponding to the virtual character, to obtain the action trace video. The second mode additionally records the action traces that do not act on the original collision volume corresponding to the virtual character, on top of recording those that do.
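Step (1) above, enlarging the virtual character's collision volume for the duration of the attack, can be sketched as follows. This is an illustrative sketch only: the 1.5x scale factor and all class and method names are assumptions, not values or identifiers from the patent.

```python
from dataclasses import dataclass

# Sketch of step (1): swapping the virtual character's collision volume from
# the original volume to a larger target volume while an attack is under way.
# The 1.5x factor and all names are illustrative assumptions.

@dataclass
class CollisionVolume:
    width: float
    height: float
    depth: float

    def scaled(self, factor: float) -> "CollisionVolume":
        return CollisionVolume(self.width * factor,
                               self.height * factor,
                               self.depth * factor)

class VirtualCharacter:
    def __init__(self, volume: CollisionVolume):
        self.original_volume = volume
        self.collision_volume = volume

    def begin_target_attack(self, factor: float = 1.5):
        # Enlarge to the target collision volume so that near misses, which
        # would not hit the original volume, are still detected and recorded.
        self.collision_volume = self.original_volume.scaled(factor)

    def end_target_attack(self):
        # Restore the original collision volume after the attack ends.
        self.collision_volume = self.original_volume
```

In an engine, the enlarged volume would exist only for hit detection during recording; gameplay damage would still be judged against the original volume.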
In some embodiments, the method further comprises:
(1) when it is detected that another collision volume is attacked during the duration of the target attack behavior, determining the current video frame as a target video frame, and establishing a correspondence between the target video frame and the attack position at which the other collision volume is attacked, wherein the other collision volumes are the parts of the target collision volume other than the original collision volume;
(2) after the step of recording the action traces acting on the original collision volume during the duration of the target attack behavior to obtain an action trace video, the method further comprises:
(3) generating, in the action trace video, an action trace at the attack position in the target video frame according to the correspondence between the target video frame and the attack position.
In the second recording mode, if an attack does not hit the collision volume of the virtual character, no action trace can be recorded on it; therefore, when the attack object of the target attack behavior is a virtual character, the collision volume of the virtual character is changed from the original collision volume to the larger target collision volume. During recording, when it is detected that the target attack behavior hits a collision volume other than the original collision volume corresponding to the virtual character, the video frame currently being recorded is determined as a target video frame, and a correspondence is established between the target video frame and the attack position at which the other collision volume is hit. After the action traces acting on the original collision volume are recorded normally during the duration of the target attack behavior to obtain the action trace video, each target video frame is located in the action trace video and an action trace is generated at its corresponding attack position.
For example, suppose 10 video frames are recorded, and hits on collision volumes other than the original collision volume are detected while the 3rd and 6th frames are being recorded: the attack position in the 3rd frame is at the upper left of the virtual character's head, and the attack position in the 6th frame is at the right side of the virtual character's leg. After the action trace video is recorded normally, an action trace is generated at the recorded upper-left head position in the 3rd frame of the action trace video, and another action trace is generated at the recorded right-leg position in the 6th frame.
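The mechanism above, noting target video frames during recording and stamping action traces into them afterwards, can be sketched as follows. This is a minimal sketch under assumed names (`TraceRecorder`, `on_other_volume_hit`, the position strings); frames are modeled as lists of trace marks rather than image data.

```python
from dataclasses import dataclass, field

# Sketch of the second recording mode and the worked example above: during
# recording, hits on the enlarged volume (outside the original collision
# volume) mark the current frame as a target video frame together with the
# attack position; after recording, a trace is generated in each target
# frame at its position. All names are illustrative, not from the patent.

@dataclass
class TraceRecorder:
    frames: list = field(default_factory=list)       # each frame: list of trace marks
    target_hits: dict = field(default_factory=dict)  # 1-based frame number -> position

    def record_frame(self):
        self.frames.append([])

    def on_other_volume_hit(self, position):
        # Remember the current (most recently recorded) frame and position.
        self.target_hits[len(self.frames)] = position

    def finalize(self):
        # After normal recording, generate an action trace in each target
        # video frame at its recorded attack position.
        for frame_no, position in self.target_hits.items():
            self.frames[frame_no - 1].append(("trace", position))
        return self.frames

# Reproduce the example: 10 frames, with hits noted at frames 3 and 6.
rec = TraceRecorder()
for frame_no in range(1, 11):
    rec.record_frame()
    if frame_no == 3:
        rec.on_other_volume_hit("head_upper_left")
    elif frame_no == 6:
        rec.on_other_volume_hit("leg_right")
video = rec.finalize()
```

After `finalize()`, frames 3 and 6 carry the generated traces and all other frames are unchanged, mirroring the example in the text.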
In step 102, attack record information of the target attack behavior is recorded, where the attack record information is used to record an action trace of the target attack behavior.
The attack record information is used to record the action traces of the target attack behavior, which include traces acting on the original collision volume of the virtual object and may also include traces acting on the target collision volume of the virtual object.
Specifically, the recording mode may be video recording: when the target attack behavior is detected, the virtual lens in the game is controlled to move to the vicinity of the attack object and to record the attack object and the surrounding virtual scene.
In some embodiments, after the step of recording attack record information of the target attack behavior, the attack record information is used for recording an action trace of the target attack behavior, the method further includes:
and displaying an action trace playback control in the graphical user interface, and playing back the action trace video through the action trace playback control.
Fig. 1d is a second schematic diagram of a graphical user interface according to an embodiment of the present application. As shown in fig. 1d, after the action trace video caused by the attack behavior is recorded, a specified video frame may be selected from the plurality of video frames constituting the action trace video and displayed in the action trace playback control 90 in the graphical user interface. By video frame sequence number, the specified video frame may be the video frame corresponding to the middle sequence number or the maximum sequence number among the plurality of video frames constituting the action trace video; by video duration, the specified video frame may be the video frame corresponding to the middle time point of the whole action trace video, or the video frame corresponding to the maximum time point, etc., which is not limited herein.
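The two selection strategies described above (by sequence number or by time point) can be sketched as follows; the function names and the middle/maximum modes are illustrative assumptions, not part of the claimed method.

```python
def pick_by_sequence(frames, mode="middle"):
    """frames: recorded frames ordered by sequence number.
    Return the middle-numbered frame, or the maximum-numbered frame."""
    return frames[len(frames) // 2] if mode == "middle" else frames[-1]

def pick_by_time(duration_s, mode="middle"):
    """Return the time point (seconds) whose frame is displayed."""
    return duration_s / 2 if mode == "middle" else duration_s

frames = [f"frame{i}" for i in range(1, 11)]
pick_by_sequence(frames)         # middle sequence number
pick_by_sequence(frames, "max")  # maximum sequence number
```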
Specifically, as shown in fig. 1d, a plurality of action traces 100 exist in the specified video frame; some action traces 100 are distributed on the virtual trunk of the attack object, and some are distributed on the virtual scene outside the attack object.
In some embodiments, before the step of recording attack record information of the target attack behavior, the method further includes:
(1) If the operation interval duration between the current shooting operation and the last shooting operation reaches a preset operation duration; and/or
(2) If the first virtual weapon is a replaced virtual weapon; and/or
(3) If the attack object is a replaced virtual object, the step of recording attack record information of the target attack behavior is executed.
In order to avoid unnecessary recording, such as recording triggered when the weapon is fired by an accidental touch, the conditions that trigger recording of the attack record information of the target attack behavior may be limited. Recording may be triggered when it is detected that the operation interval duration between the current shooting operation and the last shooting operation reaches a preset operation duration (for example, the interval reaches 3 seconds); and/or the first virtual weapon is a replaced virtual weapon (for example, an AK rifle was originally used and is subsequently replaced with an M4 rifle for the attack); and/or the attack object is a replaced virtual object (for example, virtual character A was originally attacked and virtual character B is attacked subsequently).
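The trigger conditions above amount to a simple predicate; a minimal sketch, assuming the interval, weapon-replacement, and object-replacement signals are already available to the caller:

```python
def should_record(interval_s, weapon_replaced, target_replaced, min_interval_s=3.0):
    """Decide whether to record attack record information.
    Per the 'and/or' in the description, any one condition suffices."""
    return interval_s >= min_interval_s or weapon_replaced or target_replaced
```

For example, a quick double-tap 0.5 s after the last shot with the same weapon and target would not trigger recording, while switching weapons would.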
In some embodiments, the step of recording attack record information of the target attack behavior includes:
triggering recording of an action trace video of the first virtual weapon acting on the attack object when the target attack behavior is detected to hit the attack object; or
When the target attack behavior is detected to start attack, triggering and recording the action trace video of the first virtual weapon on the attack object.
The start time of the recording may also be triggered in different ways. Recording may start when the target attack behavior is detected to hit the attack object (for example, an AK rifle shoots 30 rounds in total, the 5th round hits the attack object, and recording starts from the 5th round); or recording of the action trace video of the first virtual weapon acting on the attack object may be triggered when the target attack behavior is detected to start (for example, an AK rifle shoots 30 rounds in total, and recording starts from the 1st round).
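The two start-trigger modes can be sketched as a small helper; the per-round hit flags and the mode names are illustrative assumptions:

```python
def recording_start_round(hit_flags, mode):
    """hit_flags: per-round booleans, True when that round hits the attack object.
    mode 'on_hit': recording starts at the first hitting round;
    mode 'on_attack_start': recording starts from the 1st round."""
    if mode == "on_attack_start":
        return 1
    for round_no, hit in enumerate(hit_flags, start=1):
        if hit:
            return round_no
    return None  # no round hit: recording never triggers in this mode

rounds = [False] * 4 + [True] + [False] * 25  # 30 rounds, the 5th hits
```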
In some embodiments, the active trace playback control further includes an active trace comparison sub-control, and after the step of controlling target attack using the first virtual weapon in response to an attack operation against an attack object, further includes:
Recording attack behavior operation data during attack behavior;
after the step of displaying the specified video frame in the active trace video in the active trace playback control, further comprising:
(1) Responsive to a control operation for the action trace comparison sub-control, displaying a virtual weapon selection interface within the graphical user interface, the virtual weapon selection interface including a number of candidate virtual weapons;
(2) Selecting a target virtual weapon from the candidate virtual weapons in response to a control operation for the candidate virtual weapons;
(3) And determining a first predicted action trace of the target virtual weapon based on the attack behavior operation data, the attack behavior parameters of the target virtual weapon and the specified video frame, and displaying the first predicted action trace in the displayed specified video frame.
Fig. 1e shows a third schematic diagram of a graphical user interface provided by an embodiment of the present application, and fig. 1f shows a fourth schematic diagram of a graphical user interface provided by an embodiment of the present application. In order to show the user the action traces formed by different virtual weapons under the same attack behavior operation data, an action trace comparison sub-control 91 may be set on the action trace playback control, and the attack behavior operation data of the user may be recorded during each attack behavior. The attack behavior operation data may specifically include operation data such as the movement distance and speed of the mouse when the user controls the attack behavior, and the sliding distance and speed when the user slides the screen.
Specifically, in response to a control operation by the user on the action trace comparison sub-control 91, a virtual weapon selection interface 911 is displayed within the graphical user interface, the virtual weapon selection interface 911 including a number of candidate virtual weapons, such as virtual weapon 1, virtual weapon 2, virtual weapon 3, and virtual weapon 4 in FIG. 1 e.
A target virtual weapon is selected from the candidate virtual weapons in response to a control operation for the candidate virtual weapons. For example, in fig. 1f, the user selects virtual weapon 3 from virtual weapon 1, virtual weapon 2, virtual weapon 3 and virtual weapon 4 as the target virtual weapon by a click operation; a first predicted action trace of the target virtual weapon is determined by combining the recorded attack behavior operation data, the attack behavior parameters preset for the target virtual weapon and the specified video frame, and the first predicted action trace is displayed in the displayed specified video frame. The first predicted action trace is, for example, the predicted trace formed by the plurality of action traces 110 of the target virtual weapon in fig. 1f. In order to distinguish the action traces caused by the first virtual weapon and by the target virtual weapon, different action trace styles may be displayed in the specified video frame; for example, in fig. 1f the action traces 100 caused by the first virtual weapon are circular and the action traces 110 caused by the target virtual weapon are rectangular, or the traces may be distinguished by color, which is not limited herein.
In some embodiments, the step of determining the first predicted trace of action of the target virtual weapon based on the attack behavior operation data, the attack behavior parameters of the target virtual weapon, and the specified video frame comprises:
(1.1) acquiring an initial video frame of the effect trace video;
(1.2) determining an attack duration from the initial video frame to the specified video frame;
and (1.3) controlling the target virtual weapon to perform a simulated attack behavior according to the attack behavior operation data and the attack behavior parameters of the target virtual weapon, recording the simulated attack behavior action trace until the simulated attack behavior duration of the simulated attack behavior reaches the attack behavior duration, and determining the recorded simulated attack behavior action trace as the first predicted action trace of the target virtual weapon.
Wherein, based on the attack behavior operation data, the attack behavior parameters of the target virtual weapon and the specified video frame, the manner of determining the first predicted action trace of the target virtual weapon may be: determining an attack behavior duration consumed from the beginning of recording the action trace track of the first virtual weapon to the designated video frame; and controlling the target virtual weapon to simulate the attack according to the attack operation data and the attack parameters of the target virtual weapon, recording the simulated attack action trace until the simulated attack action time of the simulated attack reaches the attack action time, and determining the recorded simulated attack action trace as a first predicted action trace of the target virtual weapon.
For example, if the time point at which recording of the action trace of the first virtual weapon starts is 11 minutes 10 seconds, and the time point of the specified video frame is 11 minutes 15 seconds, a total of 5 seconds is recorded. According to the attack behavior operation data (for example, the mouse moves 5 cm to the left or the sliding screen moves 2 cm to the right when the user controls the lens) and the attack behavior parameters of the target virtual weapon (for example, the aiming lens of the target virtual weapon moves 4 cm downward within 3 seconds), the target virtual weapon is controlled to perform a simulated attack behavior, the simulated attack behavior action trace is recorded until the simulated attack behavior duration reaches the attack behavior duration (5 seconds), and the recorded simulated attack behavior action trace is determined as the first predicted action trace of the target virtual weapon.
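A minimal sketch of the duration-based simulation above. The linear drift model combining mouse movement and the weapon's aiming-lens drift is an invented simplification for illustration; a real game engine would apply its own ballistics and recoil model per simulation step.

```python
def predict_trace_by_duration(op_data, weapon_params, duration_s, step_s=1.0):
    """Simulate the target weapon under the recorded operation data until the
    simulated attack has run for the same duration as the recording."""
    trace, t = [], 0.0
    while t < duration_s:
        # combine player camera input with the weapon's own drift (linear model)
        dx = op_data["mouse_dx_per_s"] * t + weapon_params["drift_dx_per_s"] * t
        dy = op_data["mouse_dy_per_s"] * t + weapon_params["drift_dy_per_s"] * t
        trace.append((dx, dy))
        t += step_s
    return trace

trace = predict_trace_by_duration(
    {"mouse_dx_per_s": -1.0, "mouse_dy_per_s": 0.0},     # mouse 5 cm left over 5 s
    {"drift_dx_per_s": 0.0, "drift_dy_per_s": -4.0 / 3}, # scope drifts 4 cm down in 3 s
    duration_s=5.0,
)
```

With a 1-second step and a 5-second attack duration, the simulation yields five sample points, one per step.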
In some embodiments, the step of determining the first predicted trace of action of the target virtual weapon based on the attack behavior operation data, the attack behavior parameters of the target virtual weapon, and the specified video frame comprises:
(1.1) determining a specified ordering number of the specified video frame among a plurality of video frames included in the effect trace video;
And (1.2) controlling the target virtual weapon to simulate attack according to the attack operation data and the attack parameters of the target virtual weapon, recording simulated attack action traces until the sequence number of the last frame in the recorded simulated attack action trace video is the same as the designated sequence number, and determining the recorded simulated attack action trace as a first predicted action trace of the target virtual weapon.
The method for determining the first predicted action trace of the target virtual weapon based on the attack behavior operation data, the attack behavior parameters of the target virtual weapon and the specified video frame may further be: determining a designated sequence number of the designated video frame in a plurality of video frames included in the effect trace video; and controlling the target virtual weapon to simulate the attack according to the attack operation data and the attack parameters of the target virtual weapon, recording simulated attack action traces until the sequence number of the last frame in the recorded simulated attack action trace video is the same as the designated sequence number, and determining the recorded simulated attack action trace as a first predicted action trace of the target virtual weapon.
For example, the action trace video includes 10 video frames and the specified video frame is the 5th frame. According to the attack behavior operation data (for example, the mouse moves 5 cm to the left or the sliding screen moves 2 cm to the right when the user controls the lens) and the attack behavior parameters of the target virtual weapon (for example, the aiming lens of the target virtual weapon moves 4 cm downward within 3 seconds), the target virtual weapon is controlled to perform a simulated attack behavior and the simulated attack behavior action trace is recorded, the sequence number of the last frame of the recorded simulated attack behavior action trace video being checked continuously, until that sequence number is the same as the specified sequence number (the 5th frame); the recorded simulated attack behavior action trace is then determined as the first predicted action trace of the target virtual weapon.
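The frame-count variant differs from the duration-based one only in its stopping condition. A sketch under the same invented linear model, stopping when the last simulated frame's sequence number equals the specified one:

```python
def predict_trace_by_frame_count(op_data, weapon_params, specified_no, fps=1.0):
    """Simulate until the last recorded simulated frame has the specified
    sequence number (hypothetical names; fps maps frame numbers to time)."""
    trace = []
    frame_no = 0
    while frame_no < specified_no:
        frame_no += 1
        t = frame_no / fps
        trace.append((frame_no,
                      op_data["mouse_dx_per_s"] * t,
                      weapon_params["drift_dy_per_s"] * t))
    return trace  # last entry's sequence number equals the specified number
```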
In some embodiments, the active trace playback control further includes a play sub-control, after the step of displaying the specified video frame in the active trace video within the active trace playback control, further comprising:
(1) Responding to the control operation for the play sub-control, and determining an action trace video to which the appointed video frame belongs;
(2) And playing the action trace video in the action trace playback control.
Fig. 1g is a fifth schematic diagram of a graphical user interface according to an embodiment of the present application. The action trace playback control 90 also includes a play sub-control 92, which enables playback viewing by playing the action trace video when the user wishes to review the action trace formed by the first virtual weapon each time a virtual bullet is fired. Therefore, in response to a control operation of the user for the play sub-control 92, the action trace video to which the specified video frame belongs is determined, and the action trace video is played in the action trace playback control.
In some embodiments, the active trace playback control further includes an attack behavior record sub-control, after the step of displaying the specified video frame in the active trace video within the active trace playback control, further comprising:
(1) In response to a control operation for the attack behavior record sub-control, displaying a number of history specified video frames in the graphical user interface, together with the recording time point of the action trace video corresponding to each history specified video frame;
(2) Selecting a target history specified video frame from the history specified video frames in response to a control operation for a history specified video frame;
(3) Displaying the target history specified video frame in the action trace playback control.
Fig. 1h is a sixth schematic diagram of a graphical user interface according to an embodiment of the present application. In one game match, the user may operate the first virtual weapon of the target virtual character to attack a plurality of attack objects, and thus an attack behavior record sub-control 93 may be set in the action trace playback control 90. In response to a control operation for the attack behavior record sub-control 93, a number of history specified video frames (for example, the two history specified video frames in fig. 1h) and the recording time point of the action trace video corresponding to each history specified video frame (for example, the recording time points 11.23 and 11.40 below the two history specified video frames in fig. 1h) are displayed in the graphical user interface.
Specifically, a target history designation video frame may be selected from the history designation video frames in response to a control operation by a user with respect to the history designation video frame (e.g., a click selection with respect to the history designation video frame), and the target history designation video frame may be displayed within the active trace playback control 90.
In some embodiments, after the step of displaying the target history designation video frame within the effect trace playback control, further comprising:
and responding to the control operation for the play sub-control, and playing the target action trace video to which the target history appointed video frame belongs in the action trace playback control.
In some embodiments, the effect trace playback control further includes an attack behavior control parameter sub-control, and after the step of controlling the attack behavior of the attack object with the first virtual weapon, further includes:
recording attack behavior operation data during attack behavior;
after the step of displaying the specified video frame in the active trace video in the active trace playback control, further comprising:
(1) Responding to the control operation of the attack behavior control parameter sub-control, and displaying a plurality of adjustment sub-controls in a graphical user interface, wherein each adjustment sub-control corresponds to an attack behavior control parameter type;
(2) Determining attack behavior control parameters corresponding to each attack behavior control parameter type in response to control operation aiming at the adjustment child control, and obtaining an attack behavior control parameter set;
(3) And determining a second predicted action trace predicted according to the attack behavior control parameter set based on the attack behavior operation data, the attack behavior control parameter set and the specified video frame, and displaying the second predicted action trace in the displayed specified video frame.
Fig. 1i is a seventh schematic diagram of a graphical user interface according to an embodiment of the present application. In order to let the user perceive the action traces caused by different attack behavior control parameters (parameters affecting attack behavior control, such as lens sensitivity) under the same attack behavior operation data, the attack behavior operation data of the user may be recorded during each attack behavior, and an attack behavior control parameter sub-control 94 may be set in the action trace playback control 90.
Specifically, in response to the control operation for the attack behavior control parameter sub-control, displaying a plurality of adjustment sub-controls (for example, a quasi-center sensitivity adjustment sub-control and a lens sensitivity adjustment sub-control in fig. 1 i) in a graphical user interface, wherein each adjustment sub-control corresponds to an attack behavior control parameter type (for example, a quasi-center sensitivity adjustment sub-control corresponds to a quasi-center sensitivity type in fig. 1i, and a lens sensitivity adjustment sub-control corresponds to a lens sensitivity type); determining an attack behavior control parameter corresponding to each attack behavior control parameter type in response to the control operation aiming at the adjustment child control, and obtaining an attack behavior control parameter set (composed of the attack behavior control parameters adjusted under a plurality of attack behavior control parameter types); and determining a second predicted action trace predicted according to the attack behavior control parameter set based on the attack behavior operation data, the attack behavior control parameter set and the specified video frame, and displaying the second predicted action trace in the displayed specified video frame.
The second predicted action trace is, for example, the predicted trace formed by the action traces 120 of the first virtual weapon under the attack behavior control parameter set in fig. 1i. In order to distinguish the action traces caused by the first virtual weapon under the original control parameter set from those under the adjusted attack behavior control parameter set, different action trace styles may be displayed in the specified video frame; for example, in fig. 1i the action traces 100 caused by the first virtual weapon under the original control parameter set are circular and the action traces 120 under the adjusted attack behavior control parameter set are triangular, or the traces may be distinguished by color, which is not limited herein.
In some embodiments, the step of determining a second predicted action trace predicted according to the set of attack behavior control parameters based on the attack behavior operation data, the set of attack behavior control parameters, and the specified video frame includes:
(1.1) acquiring an initial video frame of the effect trace video;
(1.2) determining an attack duration from the initial video frame to the specified video frame;
And (1.3) controlling the first virtual weapon to simulate the attack according to the attack operation data and the attack control parameter set, recording the simulated attack action trace until the simulated attack action duration of the simulated attack reaches the attack duration, and determining the recorded simulated attack action trace as a second predicted action trace.
The method for determining the second predicted action trace predicted according to the attack behavior control parameter set based on the attack behavior operation data, the attack behavior control parameter set and the specified video frame may be as follows: determining an attack behavior duration from the initial video frame to the designated video frame; and controlling the first virtual weapon to simulate the attack according to the attack operation data and the attack control parameter set, recording the simulated attack action trace, and determining the recorded simulated attack action trace as a second predicted action trace until the simulated attack action duration of the simulated attack reaches the attack action duration.
For example, if the time point at which recording of the action trace of the first virtual weapon starts is 11 minutes 10 seconds, and the time point of the specified video frame is 11 minutes 20 seconds, a total of 10 seconds is recorded. The first virtual weapon is controlled to perform a simulated attack behavior according to the attack behavior operation data and the attack behavior control parameter set, the simulated attack behavior action trace is recorded until the simulated attack behavior duration reaches the attack behavior duration (10 seconds), and the recorded simulated attack behavior action trace is determined as the second predicted action trace.
In some embodiments, the step of determining the second predicted action trace predicted according to the attack behavior control parameter set based on the attack behavior operation data, the attack behavior control parameter set and the specified video frame includes:
(1.1) determining a specified ordering number of the specified video frame among a plurality of video frames included in the effect trace video;
and (1.2) controlling the first virtual weapon to simulate attack according to the attack operation data and the attack control parameter set, recording simulated attack action traces until the sequence number of the last frame in the recorded simulated attack action trace video is the same as the designated sequence number, and determining the recorded simulated attack action trace as a second predicted action trace.
The method for determining the second predicted action trace predicted according to the attack behavior control parameter set based on the attack behavior operation data, the attack behavior control parameter set and the specified video frame may further be: determining a designated sequence number of the designated video frame in a plurality of video frames included in the effect trace video; and controlling the first virtual weapon to simulate the attack according to the attack operation data and the attack control parameter set, recording simulated attack action traces until the sequence number of the last frame in the recorded simulated attack action trace video is the same as the designated sequence number, and determining the recorded simulated attack action trace as a second predicted action trace.
For example, the action trace video includes 10 video frames and the specified video frame is the 8th frame. The first virtual weapon is controlled to perform a simulated attack behavior according to the attack behavior operation data and the attack behavior control parameter set, and the simulated attack behavior action trace is recorded until the sequence number of the last frame in the recorded simulated attack behavior action trace video is the same as the specified sequence number (the 8th frame); the recorded simulated attack behavior action trace is then determined as the second predicted action trace.
In some embodiments, before the step of displaying a number of adjustment sub-controls within the graphical user interface, further comprising:
(1) Simulating the attack behavior on the attack object based on the attack behavior operation data and candidate attack behavior control parameter sets formed by attack behavior control parameters corresponding to different attack behavior control parameter types, and calculating, for each candidate attack behavior control parameter set, the action trace hit rate on the attack object and the damage value caused;
(2) Screening a target attack behavior control parameter set from candidate attack behavior control parameter sets based on the action trace hit rate and the damage value;
(3) Determining a target attack behavior control parameter corresponding to each attack behavior control parameter type in the target attack behavior control parameter set;
the step of displaying a plurality of adjustment sub-controls within the graphical user interface includes:
and displaying a plurality of adjustment sub-controls in the graphical user interface, and displaying corresponding target attack behavior control parameters on each adjustment sub-control in a first display mode based on attack behavior control parameter types corresponding to each adjustment sub-control.
In order to enable the user to quickly determine suitable attack behavior control parameters, the attack behavior on the attack object may be simulated based on the attack behavior operation data and candidate attack behavior control parameter sets formed by attack behavior control parameters corresponding to different attack behavior control parameter types, and the action trace hit rate on the attack object and the damage value caused are calculated for each candidate attack behavior control parameter set; a target attack behavior control parameter set is screened from the candidate attack behavior control parameter sets based on the action trace hit rate and the damage value; the target attack behavior control parameter corresponding to each attack behavior control parameter type in the target attack behavior control parameter set is determined; and the target attack behavior control parameter corresponding to each control parameter type is displayed in a first display mode in the adjustment sub-control corresponding to that control parameter type.
Specifically, after the action trace hit rate and the damage value simulated under each candidate attack behavior control parameter set are obtained, the target attack behavior control parameter set may be screened out according to a specified screening mode. The specified screening mode may be to preferentially select the candidate attack behavior control parameter set with the largest damage value as the target attack behavior control parameter set. When a plurality of candidate attack behavior control parameter sets share the largest damage value, the candidate attack behavior control parameter set with the largest hit rate among them may be determined as the target attack behavior control parameter set.
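The screening mode above (largest damage value first, hit rate as tie-breaker) maps directly onto lexicographic tuple comparison. A minimal sketch; the parameter names in the sample candidates are illustrative, loosely echoing the 80%/30% recommendations in fig. 1j.

```python
def screen_target_parameter_set(candidates):
    """candidates: list of dicts with keys 'params', 'damage', 'hit_rate'.
    Prefer the largest damage value; break damage ties with the largest hit rate."""
    return max(candidates, key=lambda c: (c["damage"], c["hit_rate"]))["params"]

candidates = [
    {"params": {"crosshair": 60, "lens": 50}, "damage": 120, "hit_rate": 0.40},
    {"params": {"crosshair": 80, "lens": 30}, "damage": 150, "hit_rate": 0.55},
    {"params": {"crosshair": 70, "lens": 40}, "damage": 150, "hit_rate": 0.50},
]
best = screen_target_parameter_set(candidates)  # two sets tie at damage 150
```

Here the two candidates with damage 150 are separated by hit rate, so the 80%/30% set is selected.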
Fig. 1j is an eighth schematic diagram of a graphical user interface according to an embodiment of the present application. After the target attack behavior control parameter set is determined, the target attack behavior control parameter corresponding to each control parameter type may be displayed in the first display mode in the adjustment sub-control corresponding to that control parameter type. For example, in fig. 1j, the quasi-center sensitivity is recommended to be adjusted to 80% and the lens sensitivity is recommended to be adjusted to 30%.
From the foregoing, it can be seen that, in the embodiment of the present application, a graphical user interface is displayed, where the graphical user interface includes at least a part of a virtual scene and a target virtual object located in the virtual scene, the target virtual object being configured with a first virtual weapon; in response to an attack operation for an attack object, the first virtual weapon is controlled to perform an attack behavior on the attack object, and an action trace video of the first virtual weapon acting on the attack object is recorded; an action trace playback control is displayed in the graphical user interface, and a specified video frame in the action trace video is displayed in the action trace playback control. In this way, the specified video frame of the recorded action trace video is displayed in the action trace playback control in the graphical user interface, so that the user can learn the action trace formed during the attack behavior by viewing the specified video frame.
In order to facilitate better implementation of the action trace playback method provided by the embodiment of the present application, the embodiment of the present application also provides an apparatus based on the action trace playback method. The meanings of the terms are the same as those in the above action trace playback method; for specific implementation details, reference may be made to the description in the method embodiment.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an action trace playback apparatus according to an embodiment of the present application. The action trace playback apparatus provides a graphical user interface through a terminal device; the content displayed on the graphical user interface includes at least a part of a virtual scene, the virtual scene includes a first virtual weapon, and the apparatus may include a control module 301, a recording module 302, and so on.
A control module 301, configured to control, in response to an attack operation for an attack object, to perform a target attack behavior by using the first virtual weapon;
the recording module 302 is configured to record attack record information of the target attack behavior, where the attack record information is used to record an action trace of the target attack behavior.
In some embodiments, the recording module 302 includes:
a changing sub-module, configured to change a collision volume of a virtual character from an original collision volume to a target collision volume when an attack object of the target attack behavior is the virtual character, where the target collision volume is greater than the original collision volume;
And the recording sub-module is used for recording the action trace on the original collision volume in the continuous process of the target attack behavior to obtain an action trace video.
In some embodiments, the recording module 302 further comprises:
an establishing sub-module, configured to determine, when it is detected that another collision volume is attacked during the duration of the target attack behavior, the current video frame as a target video frame, and establish a correspondence between the target video frame and the attack position at which the other collision volume is attacked, where the other collision volume is a collision volume in the target collision volume other than the original collision volume;
the apparatus further comprises:
and a generation module, configured to generate an action trace at the attack position in the target video frame according to the correspondence between the target video frame and the attack position.
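A minimal sketch of the correspondence described above, assuming a simple per-frame hit log; the names `hit_log`, `on_other_volume_hit`, and `generate_traces` are illustrative and not taken from the embodiment:

```python
# Illustrative bookkeeping: when an enlarged ("other") collision volume is hit,
# store the current frame index and the hit position; traces are stamped into
# those frames afterwards.
hit_log = {}  # target frame index -> list of attack positions on other volumes

def on_other_volume_hit(frame_index, position):
    hit_log.setdefault(frame_index, []).append(position)

def generate_traces(video_frames):
    # Stamp an action trace into each target frame at its recorded positions.
    for frame_index, positions in hit_log.items():
        for pos in positions:
            video_frames[frame_index].append(("trace", pos))
    return video_frames

on_other_volume_hit(3, (0.4, 1.2))
on_other_volume_hit(3, (0.5, 1.1))
frames = generate_traces([[] for _ in range(5)])
```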
In some embodiments, the apparatus further comprises:
and the display module is used for displaying an action trace playback control in the graphical user interface, and playing back the action trace video through the action trace playback control.
In some embodiments, the action trace playback control further includes an action trace comparison sub-control, and the apparatus further includes:
a second recording module, configured to record attack behavior operation data during the attack behavior;
a third display module, configured to display a virtual weapon selection interface in the graphical user interface in response to a control operation for the action trace comparison sub-control, where the virtual weapon selection interface includes a plurality of candidate virtual weapons;
a selection module, configured to select a target virtual weapon from the candidate virtual weapons in response to a control operation for a candidate virtual weapon;
and a fourth display module, configured to determine a first predicted action trace of the target virtual weapon based on the attack behavior operation data, the attack behavior parameters of the target virtual weapon, and the specified video frame, and display the first predicted action trace in the displayed specified video frame.
In some embodiments, the fourth display module includes:
the first acquisition sub-module is used for acquiring an initial video frame of the action trace video;
a first determining sub-module, configured to determine an attack behavior duration from the initial video frame to the specified video frame;
and a first recording sub-module, configured to control the target virtual weapon to perform a simulated attack behavior according to the attack behavior operation data and the attack behavior parameters of the target virtual weapon, record the simulated attack action trace until the duration of the simulated attack behavior reaches the attack behavior duration, and determine the recorded simulated attack action trace as the first predicted action trace of the target virtual weapon.
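The duration-bounded simulation above might be sketched as follows; the linear recoil model, the fixed shot interval, and all names are assumptions for illustration only:

```python
# Hypothetical sketch: replay the recorded attack operations with the target
# weapon's parameters until the simulated attack has lasted as long as the span
# from the initial frame to the specified frame.
def predict_trace(operation_data, weapon_recoil_per_shot, attack_duration,
                  shot_interval=0.1):
    trace, t, aim_y = [], 0.0, 0.0
    for op in operation_data:
        if t >= attack_duration:
            break  # stop once the simulated duration matches the recording
        aim_y += weapon_recoil_per_shot  # weapon parameter shifts aim each shot
        trace.append((op["aim_x"], aim_y))
        t += shot_interval
    return trace

ops = [{"aim_x": 0.0}, {"aim_x": 0.1}, {"aim_x": 0.2}, {"aim_x": 0.3}]
predicted = predict_trace(ops, weapon_recoil_per_shot=0.05, attack_duration=0.25)
```

Replaying the same recorded operations with a different weapon's parameters is what makes the predicted trace directly comparable with the recorded one.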
In some embodiments, the fourth display module includes:
a second determining sub-module, configured to determine a specified sequence number of the specified video frame among the plurality of video frames included in the action trace video;
and a second recording sub-module, configured to control the target virtual weapon to perform a simulated attack behavior according to the attack behavior operation data and the attack behavior parameters of the target virtual weapon, record the simulated attack action trace until the sequence number of the last frame in the recorded simulated attack action trace video is the same as the specified sequence number, and determine the recorded simulated attack action trace as the first predicted action trace of the target virtual weapon.
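A sketch of this frame-number variant, where the simulation stops once the number of recorded simulated frames reaches the specified sequence number; the one-shot-per-frame simplification and all names are assumptions:

```python
# Illustrative variant: instead of an elapsed duration, stop when the recorded
# simulated trace reaches the specified frame's ordering number.
def predict_trace_by_frame(operation_data, recoil_per_shot, specified_frame_index):
    trace, aim_y = [], 0.0
    for op in operation_data:
        if len(trace) == specified_frame_index:
            break  # last recorded frame now matches the specified sequence number
        aim_y += recoil_per_shot
        trace.append((op["aim_x"], aim_y))
    return trace

ops = [{"aim_x": x / 10} for x in range(6)]
predicted = predict_trace_by_frame(ops, recoil_per_shot=0.05,
                                   specified_frame_index=4)
```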
In some embodiments, the action trace playback control further includes a play sub-control, and the apparatus further includes:
a first determining module, configured to determine, in response to a control operation for the play sub-control, the action trace video to which the specified video frame belongs;
and the first playing module is used for playing the action trace video in the action trace playback control.
In some embodiments, the action trace playback control further includes an attack behavior record sub-control, and the apparatus further includes:
a fifth display module, configured to display, in response to a control operation for the attack behavior record sub-control, a plurality of historical specified video frames in the graphical user interface, together with the recording time point of the action trace video corresponding to each historical specified video frame;
a selecting module, configured to select a target historical specified video frame from the historical specified video frames in response to a control operation for a historical specified video frame;
and a sixth display module, configured to display the target historical specified video frame in the action trace playback control.
In some embodiments, the apparatus further comprises:
and a second playing module, configured to play, in response to a control operation for the play sub-control, the target action trace video to which the target historical specified video frame belongs in the action trace playback control.
In some embodiments, the action trace playback control further includes an attack behavior control parameter sub-control, and the apparatus further includes:
the third recording module is used for recording attack behavior operation data in attack behavior;
the seventh display module is used for responding to the control operation of the attack behavior control parameter sub-control, and displaying a plurality of adjustment sub-controls in the graphical user interface, wherein each adjustment sub-control corresponds to an attack behavior control parameter type;
The second determining module is used for determining the attack behavior control parameters corresponding to each attack behavior control parameter type in response to the control operation aiming at the adjustment child control, so as to obtain an attack behavior control parameter set;
and an eighth display module, configured to determine a second predicted action trace predicted according to the attack behavior control parameter set based on the attack behavior operation data, the attack behavior control parameter set, and the specified video frame, and display the second predicted action trace in the displayed specified video frame.
In some embodiments, the eighth display module includes:
the second acquisition sub-module is used for acquiring an initial video frame of the action trace video;
a third determining sub-module, configured to determine an attack behavior duration from the initial video frame to the specified video frame;
and a third recording sub-module, configured to control the first virtual weapon to perform a simulated attack behavior according to the attack behavior operation data and the attack behavior control parameter set, record the simulated attack action trace until the duration of the simulated attack behavior reaches the attack behavior duration, and determine the recorded simulated attack action trace as the second predicted action trace.
In some embodiments, the eighth display module includes:
a fourth determining sub-module, configured to determine a specified sequence number of the specified video frame among the plurality of video frames included in the action trace video;
and a fourth recording sub-module, configured to control the first virtual weapon to perform a simulated attack behavior according to the attack behavior operation data and the attack behavior control parameter set, record the simulated attack action trace until the sequence number of the last frame in the recorded simulated attack action trace video is the same as the specified sequence number, and determine the recorded simulated attack action trace as the second predicted action trace.
In some embodiments, the apparatus further comprises:
a calculation module, configured to simulate the attack behavior on the attack object based on the attack behavior operation data and candidate attack behavior control parameter sets formed by attack behavior control parameters corresponding to different attack behavior control parameter types, and calculate, for each candidate attack behavior control parameter set, the action trace hit rate on the attack object and the damage value caused;
a screening module, configured to screen out a target attack behavior control parameter set from the candidate attack behavior control parameter sets based on the action trace hit rates and the damage values;
A third determining module, configured to determine a target attack behavior control parameter corresponding to each attack behavior control parameter type in the target attack behavior control parameter set;
the seventh display module includes:
the display sub-module is used for displaying a plurality of adjustment sub-controls in the graphical user interface, and displaying corresponding target attack behavior control parameters on each adjustment sub-control in a first display mode based on attack behavior control parameter types corresponding to each adjustment sub-control.
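The screening step above could be sketched as follows, assuming a scoring rule that combines hit rate and damage; the product metric and the `fake_simulate` stand-in are assumptions made for this sketch, not the embodiment's actual scoring:

```python
# Hypothetical sketch: score each candidate control parameter set by simulated
# hit rate and damage, and recommend the best-scoring one.
def screen_parameter_sets(candidates, simulate):
    # simulate(params) -> (hit_rate, damage); returns the best candidate.
    def score(params):
        hit_rate, damage = simulate(params)
        return hit_rate * damage  # illustrative combined metric
    return max(candidates, key=score)

candidates = [
    {"sensitivity": 1.0, "recoil_comp": 0.2},
    {"sensitivity": 1.5, "recoil_comp": 0.4},
]

def fake_simulate(params):
    # Stand-in for replaying the recorded operations under these parameters.
    hit_rate = 0.5 + 0.2 * params["recoil_comp"]
    damage = 30 * hit_rate
    return hit_rate, damage

best = screen_parameter_sets(candidates, fake_simulate)
```

The selected parameter set would then be pre-filled on the adjustment sub-controls in the first display mode, as the display sub-module above describes.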
In some embodiments, the apparatus further comprises:
an execution module, configured to determine whether the operation interval duration between the current shooting operation and the previous shooting operation reaches a preset operation duration; and/or whether the first virtual weapon is a replaced virtual weapon; and/or whether the attack object is a replaced virtual object; and if so, execute the step of recording attack record information of the target attack behavior.
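The three re-recording conditions can be combined into a single check, for example as below; the function and field names and the threshold semantics are illustrative assumptions:

```python
# Illustrative check: a fresh action-trace recording is started when the pause
# since the last shooting operation exceeds a threshold, or the weapon or the
# attacked object has changed.
def should_start_new_record(now, last_shot_time, interval_threshold,
                            weapon_id, last_weapon_id,
                            target_id, last_target_id):
    return (
        now - last_shot_time >= interval_threshold
        or weapon_id != last_weapon_id
        or target_id != last_target_id
    )

# Long pause since the previous shooting operation -> start a new recording.
a = should_start_new_record(10.0, 2.0, 5.0, "rifle", "rifle", "npc1", "npc1")
# Same weapon and target, short pause -> keep appending to the current record.
b = should_start_new_record(10.0, 9.5, 5.0, "rifle", "rifle", "npc1", "npc1")
```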
In some embodiments, the recording module 302 includes:
a triggering sub-module, configured to trigger recording of the action trace video of the first virtual weapon acting on the attack object when it is detected that the target attack behavior hits the attack object; or
to trigger recording of the action trace video of the first virtual weapon acting on the attack object when it is detected that the target attack behavior starts.
As can be seen from the foregoing, in the embodiment of the present application, the control module 301 controls the first virtual weapon to perform a target attack behavior in response to an attack operation for an attack object; the recording module 302 records attack record information of the target attack behavior, where the attack record information is used to record the action trace of the target attack behavior. In this way, the action trace of the attack behavior is recorded while the virtual weapon performs the attack behavior, so that the user can learn, by viewing the attack record information, whether the target was hit during the attack behavior and which parts of the target were hit.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Correspondingly, the embodiment of the present application further provides a computer device, which may be a terminal or a server, where the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC, Personal Computer), or a personal digital assistant (Personal Digital Assistant, PDA). As shown in fig. 3, fig. 3 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 1000 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored on the memory 402 and executable on the processor. The processor 401 is electrically connected to the memory 402. It will be appreciated by those skilled in the art that the computer device structure shown in the figure does not limit the computer device, and the computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
The processor 401 is a control center of the computer device 1000, connects various parts of the entire computer device 1000 using various interfaces and lines, and performs various functions of the computer device 1000 and processes data by running or loading software programs and/or modules stored in the memory 402, and calling data stored in the memory 402, thereby performing overall monitoring of the computer device 1000.
In the embodiment of the present application, the processor 401 in the computer device 1000 loads instructions corresponding to the processes of one or more application programs into the memory 402, and the processor 401 executes the application programs stored in the memory 402, so as to implement the following functions:
controlling the first virtual weapon to conduct target attack behaviors in response to attack operation aiming at an attack object; and recording attack record information of the target attack behaviors, wherein the attack record information is used for recording the action trace of the target attack behaviors.
In some embodiments, the step of recording attack record information of the target attack behavior includes:
when the attack object of the target attack behavior is a virtual character, changing the collision volume of the virtual character from an original collision volume to a target collision volume, where the target collision volume is larger than the original collision volume;
and recording the action trace on the original collision volume during the duration of the target attack behavior to obtain an action trace video.
In some embodiments, the method further comprises:
when it is detected that another collision volume is attacked during the duration of the target attack behavior, determining the current video frame as a target video frame, and establishing a correspondence between the target video frame and the attack position at which the other collision volume is attacked, where the other collision volume is a collision volume in the target collision volume other than the original collision volume;
after the step of recording the action trace on the original collision volume during the duration of the target attack behavior to obtain an action trace video, the method further includes:
generating an action trace in the action trace video at the attack position in the target video frame according to the correspondence between the target video frame and the attack position.
In some embodiments, after the step of recording attack record information of the target attack behavior, where the attack record information is used for recording the action trace of the target attack behavior, the method further includes:
and displaying an action trace playback control in the graphical user interface, and playing back the action trace video through the action trace playback control.
In some embodiments, the action trace playback control further includes an action trace comparison sub-control, and after the step of controlling the first virtual weapon to perform the attack behavior on the attack object, the method further includes:
recording attack behavior operation data during attack behavior;
after the step of displaying the specified video frame of the action trace video in the action trace playback control, the method further includes:
in response to a control operation for the action trace comparison sub-control, displaying a virtual weapon selection interface within the graphical user interface, where the virtual weapon selection interface includes a plurality of candidate virtual weapons;
selecting a target virtual weapon from the candidate virtual weapons in response to a control operation for the candidate virtual weapons;
and determining a first predicted action trace of the target virtual weapon based on the attack behavior operation data, the attack behavior parameters of the target virtual weapon and the specified video frame, and displaying the first predicted action trace in the displayed specified video frame.
In some embodiments, the step of determining the first predicted trace of action of the target virtual weapon based on the attack behavior operation data, the attack behavior parameters of the target virtual weapon, and the specified video frame comprises:
Acquiring an initial video frame of the action trace video;
determining an attack behavior duration from the initial video frame to the designated video frame;
and controlling the target virtual weapon to perform a simulated attack behavior according to the attack behavior operation data and the attack behavior parameters of the target virtual weapon, recording the simulated attack action trace until the duration of the simulated attack behavior reaches the attack behavior duration, and determining the recorded simulated attack action trace as the first predicted action trace of the target virtual weapon.
In some embodiments, the step of determining the first predicted trace of action of the target virtual weapon based on the attack behavior operation data, the attack behavior parameters of the target virtual weapon, and the specified video frame comprises:
determining a specified sequence number of the specified video frame among the plurality of video frames included in the action trace video;
and controlling the target virtual weapon to perform a simulated attack behavior according to the attack behavior operation data and the attack behavior parameters of the target virtual weapon, recording the simulated attack action trace until the sequence number of the last frame in the recorded simulated attack action trace video is the same as the specified sequence number, and determining the recorded simulated attack action trace as the first predicted action trace of the target virtual weapon.
In some embodiments, the action trace playback control further includes a play sub-control, and after the step of displaying the specified video frame of the action trace video within the action trace playback control, the method further includes:
in response to a control operation for the play sub-control, determining the action trace video to which the specified video frame belongs;
and playing the action trace video in the action trace playback control.
In some embodiments, the action trace playback control further includes an attack behavior record sub-control, and after the step of displaying the specified video frame of the action trace video within the action trace playback control, the method further includes:
in response to a control operation for the attack behavior record sub-control, displaying a plurality of historical specified video frames in the graphical user interface, together with the recording time point of the action trace video corresponding to each historical specified video frame;
selecting a target historical specified video frame from the historical specified video frames in response to a control operation for a historical specified video frame;
and displaying the target historical specified video frame in the action trace playback control.
In some embodiments, after the step of displaying the target historical specified video frame within the action trace playback control, the method further includes:
in response to a control operation for the play sub-control, playing, in the action trace playback control, the target action trace video to which the target historical specified video frame belongs.
In some embodiments, the action trace playback control further includes an attack behavior control parameter sub-control, and after the step of controlling the first virtual weapon to perform the attack behavior on the attack object, the method further includes:
recording attack behavior operation data during attack behavior;
after the step of displaying the specified video frame of the action trace video in the action trace playback control, the method further includes:
responding to the control operation of the attack behavior control parameter sub-control, and displaying a plurality of adjustment sub-controls in a graphical user interface, wherein each adjustment sub-control corresponds to an attack behavior control parameter type;
determining attack behavior control parameters corresponding to each attack behavior control parameter type in response to control operation aiming at the adjustment child control, and obtaining an attack behavior control parameter set;
And determining a second predicted action trace predicted according to the attack behavior control parameter set based on the attack behavior operation data, the attack behavior control parameter set and the specified video frame, and displaying the second predicted action trace in the displayed specified video frame.
In some embodiments, the step of determining a second predicted action trace predicted according to the set of attack behavior control parameters based on the attack behavior operation data, the set of attack behavior control parameters, and the specified video frame includes:
acquiring an initial video frame of the action trace video;
determining an attack behavior duration from the initial video frame to the designated video frame;
and controlling the first virtual weapon to perform a simulated attack behavior according to the attack behavior operation data and the attack behavior control parameter set, recording the simulated attack action trace until the duration of the simulated attack behavior reaches the attack behavior duration, and determining the recorded simulated attack action trace as the second predicted action trace.
In some embodiments, the step of determining a second predicted trace of action predicted by the set of attack behavioural control parameters based on the attack behavioural operation data, the set of attack behavioural control parameters and the specified video frame comprises:
determining a specified sequence number of the specified video frame among the plurality of video frames included in the action trace video;
and controlling the first virtual weapon to perform a simulated attack behavior according to the attack behavior operation data and the attack behavior control parameter set, recording the simulated attack action trace until the sequence number of the last frame in the recorded simulated attack action trace video is the same as the specified sequence number, and determining the recorded simulated attack action trace as the second predicted action trace.
In some embodiments, before the step of displaying a number of adjustment sub-controls within the graphical user interface, further comprising:
simulating the attack on the attack object based on the attack operation data and the candidate attack behavior control parameter groups formed by attack behavior control parameters corresponding to different attack behavior control parameter types, and calculating the hit rate of the action trace of each candidate attack behavior control parameter group on the attack object and the damage value caused by the attack behavior control parameter groups;
screening a target attack behavior control parameter set from candidate attack behavior control parameter sets based on the action trace hit rate and the damage value;
Determining a target attack behavior control parameter corresponding to each attack behavior control parameter type in the target attack behavior control parameter set;
the step of displaying a plurality of adjustment sub-controls within the graphical user interface includes:
and displaying a plurality of adjustment sub-controls in the graphical user interface, and displaying corresponding target attack behavior control parameters on each adjustment sub-control in a first display mode based on attack behavior control parameter types corresponding to each adjustment sub-control.
In some embodiments, before the step of recording attack record information of the target attack behaviour, the method further comprises:
determining whether the operation interval duration between the current shooting operation and the previous shooting operation reaches a preset operation duration; and/or whether the first virtual weapon is a replaced virtual weapon; and/or whether the attack object is a replaced virtual object; and if so, executing the step of recording attack record information of the target attack behavior.
In some embodiments, the step of recording attack record information of the target attack behavior includes:
triggering recording of the action trace video of the first virtual weapon acting on the attack object when it is detected that the target attack behavior hits the attack object; or
triggering recording of the action trace video of the first virtual weapon acting on the attack object when it is detected that the target attack behavior starts.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Optionally, as shown in fig. 3, the computer device 1000 further includes: a touch display 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to the touch display 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407, respectively. Those skilled in the art will appreciate that the computer device structure shown in fig. 3 does not limit the computer device, and the computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
The touch display 403 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a liquid crystal display (LCD, Liquid Crystal Display), an organic light-emitting diode (OLED, Organic Light-Emitting Diode), or the like. The touch panel may be used to collect touch operations from the user on or near it (such as operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions, which cause corresponding programs to be executed. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the touch point coordinates to the processor 401, and can receive and execute commands sent by the processor 401. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, the operation is passed to the processor 401 to determine the type of touch event, and the processor 401 then provides a corresponding visual output on the display panel according to the type of touch event.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display 403 to realize the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display 403 may also implement an input function as part of the input unit 406.
In an embodiment of the present application, the processor 401 executes the game application program to generate a graphical user interface on the touch display screen 403, where the virtual scene on the graphical user interface includes at least one skill control area, and the skill control area includes at least one skill control. The touch display 403 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuitry 404 may be used to transceive radio frequency signals to establish wireless communications with a network device or other computer device via wireless communications.
The audio circuit 405 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. On one hand, the audio circuit 405 may transmit the electrical signal obtained by converting received audio data to the speaker, where it is converted into a sound signal for output; on the other hand, the microphone converts collected sound signals into electrical signals, which are received by the audio circuit 405 and converted into audio data; the audio data is processed by the processor 401 and then sent, for example via the radio frequency circuit 404, to another computer device, or output to the memory 402 for further processing. The audio circuit 405 may also include an earphone jack to provide communication between a peripheral earphone and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to power the various components of the computer device 1000. Alternatively, the power supply 407 may be logically connected to the processor 401 through a power management system, so as to implement functions of managing charging, discharging, and power consumption management through the power management system. The power supply 407 may also include one or more of any of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown in fig. 3, the computer device 1000 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described herein.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment controls the first virtual weapon to perform a target attack behavior in response to an attack operation for an attack object, and records attack record information of the target attack behavior, where the attack record information is used to record the action trace of the target attack behavior. In this way, the action trace of the attack behavior is recorded while the virtual weapon performs the attack behavior, so that the user can learn, by viewing the attack record information, whether the target was hit during the attack behavior and which parts of the target were hit.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, an embodiment of the present application provides a computer-readable storage medium having stored therein a plurality of computer programs that can be loaded by a processor to perform the steps in any of the action trace playback methods provided by the embodiments of the present application. For example, the computer program may perform the following steps:
controlling the first virtual weapon to conduct target attack behaviors in response to attack operation aiming at an attack object; and recording attack record information of the target attack behaviors, wherein the attack record information is used for recording the action trace of the target attack behaviors.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
The storage medium may include: read-only memory (ROM), random access memory (RAM), a magnetic disk, an optical disc, and the like.
Because the computer program stored in the storage medium can execute the steps in any action trace playback method provided by the embodiments of the present application, it can achieve the beneficial effects of any such method, which are detailed in the previous embodiments and are not repeated herein.
The methods, apparatuses, storage media, and computer devices for action trace playback provided by the embodiments of the present application have been described above in detail. Specific examples are used herein to illustrate the principles and implementations of the present application, and the description of the foregoing embodiments is only intended to aid understanding of the methods and core ideas of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and application scope in light of the ideas of the present application; in summary, the contents of this specification should not be construed as limiting the present application.

Claims (19)

1. A method of action trace playback, wherein a graphical user interface is provided by a terminal device, the graphical user interface displaying content comprising at least a portion of a virtual scene including a first virtual weapon, the method comprising:
controlling the first virtual weapon to perform a target attack behavior in response to an attack operation for an attack object;
and recording attack record information of the target attack behavior, wherein the attack record information is used to record the action trace of the target attack behavior.
2. The action trace playback method according to claim 1, wherein the step of recording attack record information of the target attack behavior comprises:
when the attack object of the target attack behavior is a virtual character, changing the collision volume of the virtual character from an original collision volume to a target collision volume, wherein the target collision volume is larger than the original collision volume;
and recording the action trace on the original collision volume during the duration of the target attack behavior, to obtain an action trace video.
3. The action trace playback method according to claim 2, wherein the method further comprises:
when it is detected, during the duration of the target attack behavior, that another collision volume is attacked, determining the current video frame as a target video frame, and establishing a correspondence between the target video frame and the attack position at which the other collision volume is attacked, wherein the other collision volume is a collision volume within the target collision volume other than the original collision volume;
after the step of recording the action trace on the original collision volume during the duration of the target attack behavior to obtain an action trace video, the method further comprises:
and generating, in the action trace video, an action trace at the attack position in the target video frame according to the correspondence between the target video frame and the attack position.
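The idea in claims 2–3 can be sketched as follows (the geometry, a simple radial collision test, and all names are hypothetical assumptions for illustration): the character's collision volume is enlarged during recording, hits landing outside the original volume but inside the enlarged one are tagged with their video frame and position, and traces are drawn at those positions afterwards:

```python
def make_target_volume(original_radius, scale=1.5):
    """Enlarge the original collision volume; the target volume must be larger."""
    assert scale > 1.0
    return original_radius * scale

def classify_hit(distance, original_radius, target_radius):
    """Return which volume a shot at `distance` from the center hits."""
    if distance <= original_radius:
        return "original"  # trace recorded directly in the action trace video
    if distance <= target_radius:
        return "other"     # near miss: tag current frame + attack position
    return "miss"

# During the attack, near misses are keyed by frame index for later drawing.
original_r = 1.0
target_r = make_target_volume(original_r)
frame_to_position = {}
shots = [(10, 0.8), (11, 1.2), (12, 2.0)]  # (frame index, distance from center)
for frame, dist in shots:
    if classify_hit(dist, original_r, target_r) == "other":
        frame_to_position[frame] = dist
print(frame_to_position)  # frames where a near-miss trace must be generated
```

The correspondence `frame_to_position` plays the role of the claimed mapping between target video frames and attack positions.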
4. The action trace playback method according to claim 2, wherein after the step of recording attack record information of the target attack behavior, the method further comprises:
displaying an action trace playback control in the graphical user interface, and playing back the action trace video through the action trace playback control.
5. The action trace playback method according to claim 4, wherein the action trace playback control further comprises an action trace comparison sub-control, and after the step of controlling the first virtual weapon to perform the target attack behavior in response to the attack operation for the attack object, the method further comprises:
recording shooting operation data during shooting;
after the step of displaying the action trace playback control in the graphical user interface, the method further comprises:
in response to a control operation for the action trace comparison sub-control, displaying a virtual weapon selection interface within the graphical user interface, the virtual weapon selection interface comprising a plurality of candidate virtual weapons;
selecting a target virtual weapon from the candidate virtual weapons in response to a control operation for the candidate virtual weapons;
and determining a first predicted action trace of the target virtual weapon based on the shooting operation data, shooting parameters of the target virtual weapon, and a specified video frame, and displaying the first predicted action trace in the displayed specified video frame.
6. The action trace playback method according to claim 5, wherein the step of determining a first predicted action trace of the target virtual weapon based on the shooting operation data, the shooting parameters of the target virtual weapon, and the specified video frame comprises:
acquiring an initial video frame of the action trace video;
determining a shooting duration from the initial video frame to the specified video frame;
and controlling the target virtual weapon to perform simulated shooting according to the shooting operation data and the shooting parameters of the target virtual weapon, recording simulated shooting action traces until the simulated shooting duration reaches the shooting duration, and determining the recorded simulated shooting action traces as the first predicted action trace of the target virtual weapon.
7. The action trace playback method according to claim 5, wherein the step of determining a first predicted action trace of the target virtual weapon based on the shooting operation data, the shooting parameters of the target virtual weapon, and the specified video frame comprises:
determining a designated sequence number of the specified video frame among a plurality of video frames included in the action trace video;
and controlling the target virtual weapon to perform simulated shooting according to the shooting operation data and the shooting parameters of the target virtual weapon, recording simulated shooting action traces until the sequence number of the last frame in the recorded simulated shooting action trace video is the same as the designated sequence number, and determining the recorded simulated shooting action traces as the first predicted action trace of the target virtual weapon.
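Claims 6 and 7 describe two stopping rules for the simulated re-shoot: stop when the simulated shooting duration reaches the recorded duration, or stop when the simulated video reaches the designated frame sequence number. Under the hypothetical assumption of a fixed frame rate, the two rules coincide, as this sketch (all names are assumptions) shows:

```python
FPS = 30  # assumed fixed frame rate of the action trace video

def simulate_traces(op_data, n_frames):
    """Replay recorded shooting operations, one simulated trace per frame,
    stopping when the simulated video reaches n_frames frames (claim 7),
    i.e. after n_frames / FPS seconds of simulated shooting (claim 6)."""
    traces = []
    for frame in range(n_frames):
        # Hypothetical: apply the recorded shooting operation for this frame.
        op = op_data[frame % len(op_data)]
        traces.append((frame, op))
    return traces

# Stop by designated sequence number...
by_frames = simulate_traces(["fire", "fire", "pause"], n_frames=6)
# ...or equivalently by shooting duration at the assumed frame rate:
duration_s = 0.2
by_time = simulate_traces(["fire", "fire", "pause"],
                          n_frames=int(duration_s * FPS))
print(len(by_frames), len(by_time))  # both simulations stop after 6 frames
```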
8. The action trace playback method according to claim 4, wherein the action trace playback control further comprises a play sub-control, and after the step of displaying a specified video frame of the action trace video within the action trace playback control, the method further comprises:
in response to a control operation for the play sub-control, determining the action trace video to which the specified video frame belongs;
and playing the action trace video in the action trace playback control.
9. The action trace playback method according to claim 8, wherein the action trace playback control further comprises a shooting record sub-control, and after the step of displaying a specified video frame of the action trace video within the action trace playback control, the method further comprises:
in response to a control operation for the shooting record sub-control, displaying, in the graphical user interface, a plurality of history-specified video frames and the recording time points of the action trace videos corresponding to the respective history-specified video frames;
selecting a target history-specified video frame from the history-specified video frames in response to a control operation for the history-specified video frames;
and displaying the target history-specified video frame in the action trace playback control.
10. The action trace playback method according to claim 9, further comprising, after the step of displaying the target history-specified video frame within the action trace playback control:
in response to a control operation for the play sub-control, playing, in the action trace playback control, the target action trace video to which the target history-specified video frame belongs.
11. The action trace playback method according to claim 4, wherein the action trace playback control further comprises a shooting control parameter sub-control, and after the step of controlling the first virtual weapon to shoot at the attack object, the method further comprises:
recording shooting operation data during shooting;
after the step of displaying a specified video frame of the action trace video in the action trace playback control, the method further comprises:
in response to a control operation for the shooting control parameter sub-control, displaying a plurality of adjustment sub-controls in the graphical user interface, wherein each adjustment sub-control corresponds to one shooting control parameter type;
determining, in response to control operations for the adjustment sub-controls, a shooting control parameter corresponding to each shooting control parameter type, to obtain a shooting control parameter set;
and determining, based on the shooting operation data, the shooting control parameter set, and the specified video frame, a second predicted action trace predicted according to the shooting control parameter set, and displaying the second predicted action trace in the displayed specified video frame.
12. The action trace playback method according to claim 11, wherein the step of determining, based on the shooting operation data, the shooting control parameter set, and the specified video frame, a second predicted action trace predicted according to the shooting control parameter set comprises:
acquiring an initial video frame of the action trace video;
determining a shooting duration from the initial video frame to the specified video frame;
and controlling the first virtual weapon to perform simulated shooting according to the shooting operation data and the shooting control parameter set, recording simulated shooting action traces until the simulated shooting duration reaches the shooting duration, and determining the recorded simulated shooting action traces as the second predicted action trace.
13. The action trace playback method according to claim 11, wherein the step of determining, based on the shooting operation data, the shooting control parameter set, and the specified video frame, a second predicted action trace predicted according to the shooting control parameter set comprises:
determining a designated sequence number of the specified video frame among a plurality of video frames included in the action trace video;
and controlling the first virtual weapon to perform simulated shooting according to the shooting operation data and the shooting control parameter set, recording simulated shooting action traces until the sequence number of the last frame in the recorded simulated shooting action trace video is the same as the designated sequence number, and determining the recorded simulated shooting action traces as the second predicted action trace.
14. The action trace playback method according to claim 11, further comprising, before the step of displaying a plurality of adjustment sub-controls within the graphical user interface:
simulating shooting at the attack object based on the shooting operation data and candidate shooting control parameter sets formed by shooting control parameters corresponding to different shooting control parameter types, and calculating, for each candidate shooting control parameter set, the action trace hit rate on the attack object and the damage value caused thereby;
screening a target shooting control parameter set from the candidate shooting control parameter sets based on the action trace hit rates and the damage values;
and determining a target shooting control parameter corresponding to each shooting control parameter type in the target shooting control parameter set;
wherein the step of displaying a plurality of adjustment sub-controls within the graphical user interface comprises:
displaying a plurality of adjustment sub-controls in the graphical user interface, and displaying, on each adjustment sub-control in a first display mode, the corresponding target shooting control parameter based on the shooting control parameter type corresponding to the adjustment sub-control.
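The screening step in claim 14 amounts to a search over candidate parameter sets, scoring each by its simulated action trace hit rate and damage value. A minimal sketch, in which the parameter names and the scoring weights are purely hypothetical assumptions:

```python
def score(candidate):
    """Combine trace hit rate and damage; the 0.7/0.3 weights are assumed."""
    return 0.7 * candidate["hit_rate"] + 0.3 * candidate["damage"] / 100.0

def select_target_params(candidates):
    """Screen the candidate shooting control parameter set with the best score."""
    return max(candidates, key=score)

# Hypothetical candidate shooting control parameter sets with their
# simulated hit rates and damage values.
candidates = [
    {"sensitivity": 0.4, "recoil_comp": 0.2, "hit_rate": 0.55, "damage": 80},
    {"sensitivity": 0.6, "recoil_comp": 0.5, "hit_rate": 0.70, "damage": 75},
    {"sensitivity": 0.8, "recoil_comp": 0.9, "hit_rate": 0.60, "damage": 90},
]
best = select_target_params(candidates)
print(best["sensitivity"])  # -> 0.6: the screened target parameter set
```

The selected set's per-type parameters would then be shown highlighted on the corresponding adjustment sub-controls.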
15. The action trace playback method according to claim 1, further comprising, before the step of recording attack record information of the target attack behavior:
if the operation interval between the current shooting operation and the last shooting operation reaches a preset operation duration; and/or
if the first virtual weapon is a virtual weapon obtained after replacement; and/or
if the attack object is a virtual object obtained after replacement, executing the step of recording attack record information of the target attack behavior.
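The triggering conditions of claim 15 reduce to a simple predicate: a new record starts when enough time has passed since the last shot, the weapon was replaced, or the attack object was replaced (the "and/or" means any one condition suffices). A sketch with hypothetical names:

```python
def should_start_new_record(interval_s, weapon_changed, target_changed,
                            preset_interval_s=5.0):
    """True if a new attack record should be started (any condition suffices).

    The 5-second preset operation duration is an assumed example value.
    """
    return (interval_s >= preset_interval_s) or weapon_changed or target_changed

print(should_start_new_record(6.0, False, False))  # interval reached
print(should_start_new_record(1.0, True, False))   # weapon replaced
print(should_start_new_record(1.0, False, False))  # keep current record
```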
16. The action trace playback method according to claim 1, wherein the step of recording attack record information of the target attack behavior comprises:
when it is detected that the target attack behavior hits the attack object, triggering recording of an action trace video of the first virtual weapon acting on the attack object; or
when it is detected that the target attack behavior starts to attack, triggering recording of an action trace video of the first virtual weapon acting on the attack object.
17. An action trace playback apparatus, wherein a graphical user interface is provided by a terminal device, the graphical user interface displaying content comprising at least a portion of a virtual scene including a first virtual weapon, the apparatus comprising:
a control module, configured to control the first virtual weapon to perform a target attack behavior in response to an attack operation for an attack object;
and a recording module, configured to record attack record information of the target attack behavior, the attack record information being used to record the action trace of the target attack behavior.
18. A computer-readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps in the action trace playback method of any one of claims 1 to 16.
19. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the action trace playback method of any one of claims 1 to 16.
CN202311016958.5A 2023-08-11 2023-08-11 Method and device for playing back action trace, storage medium and computer equipment Pending CN116966564A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311016958.5A CN116966564A (en) 2023-08-11 2023-08-11 Method and device for playing back action trace, storage medium and computer equipment


Publications (1)

Publication Number Publication Date
CN116966564A true CN116966564A (en) 2023-10-31

Family

ID=88483111



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination