CN111475029B - Operation method, device, equipment and storage medium of virtual prop - Google Patents

Operation method, device, equipment and storage medium of virtual prop

Info

Publication number
CN111475029B
CN111475029B (application CN202010299027.0A)
Authority
CN
China
Prior art keywords
virtual
throwing
state
virtual object
prop
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010299027.0A
Other languages
Chinese (zh)
Other versions
CN111475029A (en)
Inventor
姚丽 (Yao Li)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202010299027.0A
Publication of CN111475029A
Application granted
Publication of CN111475029B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082 Virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses an operation method, device, equipment and storage medium for a virtual prop, relating to the field of virtual environments. The method comprises the following steps: displaying a virtual environment interface comprising a virtual object in a ground-lying state, the virtual object holding a virtual prop in a pre-throwing state; controlling the virtual prop to cancel the pre-throwing state in response to a movement control operation on the virtual object; and, in response to the virtual object ending the movement, restoring the held virtual prop to the pre-throwing state. Because the pre-throwing state of the virtual prop is cancelled automatically in response to the movement control operation and restored after the movement ends, the virtual object no longer requires additional interface operations in order to move while pre-throwing a virtual prop in the ground-lying state. This solves the problem of low human-computer interaction efficiency, improves human-computer interaction efficiency, and improves the operation efficiency of the virtual prop.

Description

Operation method, device, equipment and storage medium of virtual prop
Technical Field
The embodiment of the application relates to the field of virtual environments, in particular to an operation method, an operation device, operation equipment and a storage medium of a virtual item.
Background
In applications that include a virtual environment, it is often necessary to perform activities in the virtual environment by controlling virtual objects in it, such as walking, driving, swimming, fighting, picking up objects, and using virtual props, where the virtual props include prop types such as virtual firearms, virtual throwing props, virtual magic wands, and the like.
In the related art, when a virtual object in the ground-lying state throws a prop, the virtual object cannot move its body position while the virtual prop is in the pre-throwing state. The user can only cancel the pre-throwing state first, move the position, and then trigger the pre-throw again after the position has changed.
As a result, moving the body position while the virtual object is throwing a virtual prop is a cumbersome process, the human-computer interaction efficiency is low, and the control efficiency over the virtual prop is low.
Disclosure of Invention
The embodiments of the application provide an operation method, device, equipment and storage medium for a virtual prop, which can improve human-computer interaction efficiency when the body position is moved while a virtual object throws a virtual prop. The technical scheme is as follows:
in one aspect, an operation method of a virtual item is provided, where the method includes:
displaying a virtual environment interface, wherein the virtual environment interface comprises a virtual object in a ground lying state, and the virtual object holds the virtual prop in a pre-throwing state;
controlling the virtual prop to cancel the pre-throwing state in response to a movement control operation on the virtual object;
controlling the virtual object to move in the virtual environment according to the movement control operation;
in response to the virtual object ending the movement, restoring the virtual object holding the virtual prop to the pre-throwing state.
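The claimed cancel-and-restore behaviour can be pictured as a small state machine. The sketch below is purely illustrative; the class, method, and state names are assumptions and are not part of the patented implementation:

```python
from enum import Enum, auto

class PropState(Enum):
    IDLE = auto()        # prop held, no throw in progress
    PRE_THROW = auto()   # pre-throwing state: throw path shown and adjustable

class ThrowController:
    """Illustrative sketch: tracks the pre-throwing state of a throwable
    prop held by a prone virtual object, cancelling it automatically on a
    movement control operation and restoring it when the movement ends."""

    def __init__(self):
        self.state = PropState.PRE_THROW  # prop starts in the pre-throwing state
        self.moving = False

    def on_move_start(self):
        # Movement control operation received: auto-cancel the pre-throw.
        if self.state is PropState.PRE_THROW:
            self.state = PropState.IDLE
        self.moving = True

    def on_move_end(self):
        # Movement finished: restore the pre-throwing state automatically.
        self.moving = False
        self.state = PropState.PRE_THROW

ctrl = ThrowController()
ctrl.on_move_start()
print(ctrl.state.name)  # IDLE: pre-throw cancelled while moving
ctrl.on_move_end()
print(ctrl.state.name)  # PRE_THROW: restored after the movement ends
```

The key point of the method is that both transitions are driven by the movement control operation alone, with no extra interface operation from the player.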
In another aspect, an apparatus for operating a virtual prop is provided, the apparatus including:
the display module is used for displaying a virtual environment interface, wherein the virtual environment interface comprises a virtual object in a ground lying state, and the virtual object holds the virtual prop in a pre-throwing state;
the control module is used for controlling the virtual prop to cancel the pre-throwing state in response to a movement control operation on the virtual object;
the control module is further used for controlling the virtual object to move in the virtual environment according to the movement control operation;
and the restoring module is used for restoring the virtual object holding the virtual prop to the pre-throwing state in response to the virtual object ending the movement.
In another aspect, a computer device is provided, which includes a processor and a memory, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored in the memory, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by the processor to implement the operation method of the virtual prop as provided in the embodiments of the present application.
In another aspect, a computer-readable storage medium is provided, in which at least one instruction, at least one program, a set of codes, or a set of instructions is stored, which is loaded and executed by a processor to implement the operation method of the virtual prop as provided in the embodiments of the present application.
In another aspect, a computer program product is provided, which when run on a computer causes the computer to execute the operation method of the virtual prop as provided in the embodiments of the present application.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
When the virtual object is in the ground-lying state in the virtual environment and the held virtual prop is in the pre-throwing state, the pre-throwing state of the virtual prop is cancelled automatically in response to the movement control operation and restored after the movement ends. Thus, while a virtual prop is being pre-thrown in the ground-lying state, the problem of low human-computer interaction efficiency caused by requiring additional interface operations to move the virtual object is avoided, improving both human-computer interaction efficiency and the operation efficiency of the virtual prop.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present application; those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a schematic illustration of a throwing process for a virtual grenade provided in an exemplary embodiment of the present application;
FIG. 2 is a schematic view of a user interface of a method for operating a virtual item provided in an exemplary embodiment of the present application;
fig. 3 is a block diagram of a terminal according to an exemplary embodiment of the present application;
FIG. 4 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 5 is a flowchart of a method for operating a virtual prop according to an exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of a process of controlling a virtual object to move in the pre-throwing state in the related art;
FIG. 7 is a flowchart of a method of operating a virtual prop according to another exemplary embodiment of the present application;
FIG. 8 is a timing diagram of a pre-cast process of a virtual prop provided based on the embodiment shown in FIG. 7;
FIG. 9 is a flowchart of a method of operating a virtual prop according to another exemplary embodiment of the present application;
FIG. 10 is a schematic diagram of an interface for the rapid throw of virtual props provided based on the embodiment shown in FIG. 9;
FIG. 11 is a schematic illustration of collision detection provided based on the embodiment shown in FIG. 9;
fig. 12 is a flowchart of a method for operating a virtual prop according to another exemplary embodiment of the present application;
fig. 13 is a block diagram of a structure of a virtual item operating device according to an exemplary embodiment of the present application;
fig. 14 is a block diagram of a virtual item operating device according to another exemplary embodiment of the present application;
fig. 15 is a block diagram of a terminal according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application are briefly described:
virtual environment: is a virtual environment that is displayed (or provided) when an application is run on the terminal. The virtual environment may be a simulation environment of a real world, a semi-simulation semi-fictional environment, or a pure fictional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application. The following embodiments are illustrated with the virtual environment being a three-dimensional virtual environment.
Virtual object: refers to a movable object in a virtual environment. The movable object can be a virtual character, a virtual animal, an animation character, etc., such as: characters, animals, plants, oil drums, walls, stones, etc. displayed in a three-dimensional virtual environment. Optionally, the virtual object is a three-dimensional volumetric model created based on animated skeletal techniques. Each virtual object has its own shape and volume in the three-dimensional virtual environment, occupying a portion of the space in the three-dimensional virtual environment.
Thrown duration: the length of time from the virtual object obtaining a virtual prop to the virtual prop being thrown into the virtual environment. Optionally, the thrown duration is divided into two stages: a pre-throwing stage and a throwing stage. The pre-throwing stage is the stage in which the virtual object holds the virtual prop and adjusts the throwing path parameters of the virtual prop; the throwing stage is the stage in which, after the virtual object throws the virtual prop along a throwing direction, the virtual prop flies to the corresponding position and triggers a target function. The thrown duration can be timed from the start of the pre-throwing stage or from the start of the throwing stage. The embodiments of the present application take timing from the start of the pre-throwing stage as an example. As shown in fig. 1, a virtual object 110 is included in a virtual environment interface 100 (fig. 1 shows a first-person perspective, so only a hand of the virtual object 110 is shown). The virtual weapon currently held by the virtual object 110 is a virtual grenade 120, which the virtual object 110 picked up in the virtual environment, and the virtual environment interface 100 further includes a throw control 130.
When a pressing operation on the throw control 130 is received, the pre-throwing stage starts and a countdown 140 to the explosion of the virtual grenade 120 begins (for example, a 5-second countdown), the total countdown duration being the target duration for the explosion of the virtual grenade 120. A throwing direction 121 of the virtual grenade 120 is displayed in the virtual environment interface 100, and the user can adjust the throwing direction 121 in the virtual environment through a dragging operation on the throw control 130. When the user releases the pressing operation on the throw control 130, the throwing stage starts and the virtual grenade 120 is thrown along the throwing direction 121; when the countdown ends (that is, the thrown duration reaches the target duration), the virtual grenade 120 produces an explosion effect.
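The fuse timing just described (countdown starting at the press, explosion when the thrown duration reaches the target duration) can be sketched as follows. The class and 5-second fuse are illustrative assumptions mirroring the example above, not a prescribed implementation:

```python
class GrenadeThrow:
    """Illustrative sketch of the fuse timing: the countdown starts when
    the pre-throwing stage begins (the press on the throw control), and
    the grenade explodes once the thrown duration reaches the target
    duration, whether or not it is still in flight."""

    FUSE_SECONDS = 5.0  # target duration of the explosion countdown

    def __init__(self):
        self.press_time = None
        self.released = False

    def press(self, now):
        # Pre-throwing stage begins; the fuse starts burning immediately.
        self.press_time = now

    def release(self, now):
        # Throwing stage begins; the grenade flies along the chosen direction.
        self.released = True

    def remaining(self, now):
        # Fuse time left; the full fuse if the press has not happened yet.
        if self.press_time is None:
            return self.FUSE_SECONDS
        return max(0.0, self.FUSE_SECONDS - (now - self.press_time))

    def exploded(self, now):
        return self.press_time is not None and self.remaining(now) == 0.0

g = GrenadeThrow()
g.press(now=0.0)         # hold: aiming, fuse already counting down
g.release(now=2.0)       # throw after aiming for 2 seconds
print(g.remaining(3.0))  # 2.0 seconds of fuse left while in flight
print(g.exploded(5.0))   # True: thrown duration reached the target duration
```

Note that aiming time eats into the fuse, which is why the countdown display during the pre-throwing stage matters to the player.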
Optionally, when the virtual object is in the ground-lying state in the virtual environment and the held virtual grenade is in the pre-throwing state, the virtual object cannot move its body position in the virtual environment; that is, the throwing direction of the virtual grenade, and optionally the throwing path, are limited. The embodiments of the present application therefore provide a method for adjusting the body position when the virtual object is in the ground-lying state and holds a pre-thrown virtual prop.
It should be noted that, in the foregoing embodiment, the virtual grenade is taken as an example for explanation; the virtual prop may also be implemented as another prop that triggers a target function a target duration after being thrown, such as a viscous grenade or a smoke bomb.
The method provided in the present application may be applied to a virtual reality application program, a three-dimensional map program, a military simulation program, a First-Person Shooting game (FPS), a Third-Person Shooting game (TPS), a Multiplayer Online Battle Arena game (MOBA), and the like. The following embodiments are exemplified by applications in games.
A game based on a virtual environment often consists of maps of one or more game worlds. The virtual environment in the game simulates scenes of the real world, and the user can control a virtual object in the game to walk, run, jump, shoot, fight, drive, switch virtual weapons, attack other virtual objects with a virtual weapon, and so on in the virtual environment. The interactivity is strong, and multiple users can form teams online for competitive games. When the user controls the virtual object to attack a target virtual object with a virtual weapon, the user selects a suitable virtual weapon according to the position of the target virtual object or the user's operation habits. The virtual weapon includes at least one of a mechanical weapon, a melee weapon, and a throwing weapon: mechanical weapons include rifles, sniper rifles, pistols, shotguns, and the like; melee weapons include at least one of daggers, knives, axes, swords, sticks, and pots (such as pans); and throwing weapons include common grenades, viscous grenades, flash bombs, smoke bombs, and the like.
An embodiment of the present application provides an operation method for a virtual prop. Fig. 2 shows a schematic view of the user interface of the operation method provided in the embodiment of the present application, taking the virtual prop being a virtual grenade as an example, as shown in fig. 2:
A virtual object 210 is displayed in the virtual environment interface 200. The virtual object 210 is in the ground-lying state in the virtual environment and holds a virtual grenade 220 in the pre-throwing state. When a movement control operation is received on the movement control 230, the virtual grenade 220 is controlled to cancel the pre-throwing state and the virtual object 210 is controlled to move in the ground-lying state; when the movement ends, the virtual grenade 220 is restored to the pre-throwing state.
The terminal in the present application may be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on. The terminal has installed and runs an application program supporting a virtual environment, such as an application program supporting a three-dimensional virtual environment. The application program may be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, and an MOBA game. Optionally, the application may be a stand-alone application, such as a stand-alone 3D game program, or a network online application.
Fig. 3 shows a block diagram of an electronic device according to an exemplary embodiment of the present application. The electronic device 300 includes: an operating system 320 and application programs 322.
Operating system 320 is the base software that provides applications 322 with secure access to computer hardware.
Application 322 is an application that supports a virtual environment. Optionally, application 322 is an application that supports a three-dimensional virtual environment. The application 322 may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, an MOBA game, and a multi-player gunfight type live game. The application 322 may be a stand-alone application, such as a stand-alone 3D game program.
Fig. 4 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 400 includes: a first device 420, a server 440, and a second device 460.
The first device 420 is installed and operated with an application program supporting a virtual environment. The application program can be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, an MOBA game and a multi-player gunfight living game. The first device 420 is a device used by a first user who uses the first device 420 to control a first virtual object located in a virtual environment for activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated persona or an animated persona.
The first device 420 is connected to the server 440 through a wireless network or a wired network.
The server 440 includes at least one of one server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 440 is used to provide background services for applications that support a three-dimensional virtual environment. Optionally, the server 440 undertakes primary computing work while the first device 420 and the second device 460 undertake secondary computing work; alternatively, the server 440 undertakes secondary computing work while the first device 420 and the second device 460 undertake primary computing work; alternatively, the server 440, the first device 420, and the second device 460 perform cooperative computing using a distributed computing architecture.
The second device 460 is installed and operated with an application program supporting a virtual environment. The application program can be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, an FPS game, an MOBA game and a multi-player gun battle type survival game. The second device 460 is a device used by a second user who uses the second device 460 to control a second virtual object located in the virtual environment to perform activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated persona or an animated persona.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. Optionally, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights. Alternatively, the first virtual character and the second virtual character may belong to different teams, different organizations, or two mutually hostile groups.
Alternatively, the applications installed on the first device 420 and the second device 460 are the same, or the applications installed on the two devices are the same type of application for different control system platforms. The first device 420 may generally refer to one of a plurality of devices, and the second device 460 may generally refer to one of a plurality of devices, and this embodiment is illustrated by the first device 420 and the second device 460. The device types of the first device 420 and the second device 460 are the same or different, and include: at least one of a game console, a desktop computer, a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, and a laptop portable computer. The following embodiments are illustrated where the device is a desktop computer.
Those skilled in the art will appreciate that the number of devices described above may be greater or fewer. For example, the number of the devices may be only one, or several tens or hundreds, or more. The number and the type of the devices are not limited in the embodiments of the present application.
With reference to the above introduction of terms and the description of the implementation environment, the operation method of a virtual prop provided in an embodiment of the present application is described below, taking the application of the method in a terminal as an example. As shown in fig. 5, the method includes:
step 501, displaying a virtual environment interface, wherein the virtual environment interface comprises a virtual object in a ground lying state, and the virtual object holds a virtual prop in a pre-throwing state.
The virtual environment interface includes a picture in which the virtual environment is observed, where the picture may be observed from the first-person perspective of the virtual object or from a third-person perspective of the virtual object.
Optionally, the above virtual prop is implemented as a throwing-type prop, that is, a virtual prop that triggers a target function by being thrown in the virtual environment. Illustratively, the virtual prop may be implemented as a throwing-type prop such as a virtual grenade, a viscous grenade, a virtual smoke bomb, or a virtual flash bomb. The virtual grenade and the viscous grenade are implemented as props that trigger an explosion effect when the thrown duration reaches the function trigger duration, where the surface of the viscous grenade is adhesive and sticks to the first virtual object it touches; the virtual smoke bomb is a prop that triggers a smoke diffusion effect when the thrown duration reaches the function trigger duration; and the virtual flash bomb is implemented as a prop that triggers a flashing effect when it is thrown and touches a virtual object.
Optionally, the pre-throwing state indicates a state in which the parameters of the throwing path of the virtual prop can be adjusted, that is, the parameters used to determine the throwing path, such as the throwing start point, throwing direction, and initial speed of the virtual prop, can be adjusted through interface controls. Illustratively, a throw control for the virtual prop is displayed in the virtual environment interface. By performing a long-press operation on the throw control, the virtual prop is controlled to be in the pre-throwing state, and when the long-press operation ends, the virtual prop is thrown. During the long-press operation, the throwing direction can be adjusted by dragging the throw control, where the throwing direction includes both the throwing direction in the horizontal direction, which corresponds to the facing of the virtual object in the virtual environment, and the throwing direction in the gravity direction, which corresponds to the throwing height of the virtual prop.
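How the pre-throw parameters named above (start point, direction, initial speed) determine a throwing path can be illustrated with a standard projectile model. The kinematics below are a textbook ballistic sketch under ideal gravity, not taken from the patent, and all names are assumptions:

```python
import math

def throw_path(start, yaw_deg, pitch_deg, speed, g=9.8, dt=0.5, steps=4):
    """Sample points of a simple ballistic arc from the pre-throw
    parameters: throwing start point, horizontal direction (yaw, matching
    the virtual object's facing), vertical direction (pitch, controlling
    throw height), and initial speed."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    vx = speed * math.cos(pitch) * math.cos(yaw)   # horizontal component
    vy = speed * math.cos(pitch) * math.sin(yaw)
    vz = speed * math.sin(pitch)                   # gravity-direction component
    x0, y0, z0 = start
    pts = []
    for i in range(1, steps + 1):
        t = i * dt
        pts.append((x0 + vx * t,
                    y0 + vy * t,
                    z0 + vz * t - 0.5 * g * t * t))  # constant gravity
    return pts

# Throw from 1 m above the ground, facing along +x, at 45 degrees, 10 m/s.
path = throw_path((0.0, 0.0, 1.0), yaw_deg=0.0, pitch_deg=45.0, speed=10.0)
print(path[0])  # first sampled point of the throw arc
```

Dragging the throw control in the interface would then map to changing `yaw_deg` and `pitch_deg` and resampling the displayed arc each frame.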
Optionally, the virtual prop is a prop that triggers a target function when the thrown duration reaches a target duration. Illustratively, the virtual prop is implemented as a virtual grenade: when the virtual grenade has been thrown for the target duration (for example, 5 seconds), an explosion function is triggered, and virtual objects within a preset range of the explosion position of the virtual grenade are damaged by the explosion, where the thrown duration is timed from when the virtual grenade enters the pre-throwing state.
Step 502, in response to the movement control operation on the virtual object, controlling the virtual prop to cancel the pre-throwing state.
Optionally, the virtual environment interface further includes a movement control, and the movement of the virtual object in the virtual environment is controlled by performing a movement control operation on the movement control. Because the virtual object is currently in the ground-lying state and the held virtual prop is in the pre-throwing state, when a movement control operation on the virtual object is received, the virtual prop is first controlled to cancel the pre-throwing state.
Optionally, the holding of the virtual prop by the virtual object is retained; that is, after the pre-throwing state is cancelled, the virtual prop may continue to be held by the virtual object. Alternatively, the holding of the virtual prop by the virtual object may be cancelled directly.
Optionally, after receiving the movement control operation on the virtual object, the terminal sends a control signal to the server, where the control signal indicates that the virtual object has received the movement control operation. The server feeds back a pre-throwing cancellation notice to the terminal according to the control signal, and the terminal cancels the pre-throwing state of the virtual prop according to the notice. That is, when the terminal receives the movement control operation on the virtual object, it automatically controls the virtual prop to cancel the pre-throwing state, without requiring the player to cancel the pre-throwing state through an operation on the virtual environment interface.
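The terminal-server exchange described in this paragraph can be sketched as follows; the message names and classes are hypothetical, chosen only to illustrate the flow:

```python
class Server:
    """Receives the control signal and feeds back a cancellation notice."""
    def handle(self, signal):
        if signal == "MOVE_CONTROL":
            return "CANCEL_PRE_THROW"
        return None

class Terminal:
    """Cancels the pre-throwing state automatically on the server's
    notice, without any extra player operation on the interface."""
    def __init__(self, server):
        self.server = server
        self.pre_throw = True

    def on_move_control(self):
        # Send the control signal and act on the feedback.
        notice = self.server.handle("MOVE_CONTROL")
        if notice == "CANCEL_PRE_THROW":
            self.pre_throw = False
```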
Step 503, controlling the virtual object to move in the virtual environment according to the movement control operation.
Optionally, the virtual object is controlled to perform a corresponding crawling movement according to the operation mode of the movement control operation on the virtual environment interface. Illustratively, a movement control is displayed in the virtual environment interface; when a leftward dragging operation on the movement control is received, the virtual object is controlled to crawl leftward in the virtual environment, and when an upward dragging operation on the movement control is received, the virtual object is controlled to crawl forward in the virtual environment.
Optionally, when the dragging length of a dragging operation received on the movement control reaches a first length but does not reach a second length, the virtual object is controlled to move according to the dragging operation, and when the dragging operation ends, the virtual object is controlled to stop moving. When the dragging length of the dragging operation reaches the second length, the virtual object is controlled to move continuously in the virtual environment according to the dragging operation, and when the dragging operation ends, the virtual object continues to move in the original moving direction until a movement cancellation operation is received on the movement control, whereupon the virtual object is controlled to stop moving.
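The two drag-length thresholds above can be sketched as a small classifier; the function and mode names are illustrative, not from the original:

```python
def movement_mode(drag_length, first_length, second_length):
    """Classify a drag on the movement control by its length:
    - below first_length: the object does not move;
    - at least first_length but below second_length: the object moves
      while dragging and stops when the drag ends;
    - at least second_length: the object keeps moving in the original
      direction after the drag ends, until movement is cancelled."""
    if drag_length < first_length:
        return "idle"
    if drag_length < second_length:
        return "follow_drag"
    return "continuous"
```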
Step 504, in response to the virtual object ending its movement, restoring the virtual prop held by the virtual object to the pre-throwing state.
Optionally, when the virtual object finishes moving, the terminal determines that the moving speed of the virtual object is 0, that is, determines that the virtual object has finished moving, and automatically restores the virtual prop held by the virtual object to the pre-throwing state.
Optionally, when the virtual prop held by the virtual object is restored to the pre-throwing state, a first throwing path parameter of the virtual prop before the movement control operation was received is first determined, the first throwing path parameter is adjusted based on the movement control operation to obtain a second throwing path parameter, and the virtual prop held by the virtual object is restored to the pre-throwing state with the second throwing path parameter. Optionally, the second throwing path parameter differs from the first throwing path parameter in the throwing starting point and/or the throwing direction.
Optionally, the first throwing path parameter includes a first throwing starting point, and the first throwing starting point is adjusted based on the movement control operation to obtain a second throwing starting point in the second throwing path parameter, where the first throwing starting point and the second throwing starting point correspond to positions of the virtual object in the virtual environment. Illustratively, the first throwing starting point is the position coordinate of the virtual object in the virtual environment before moving; after the movement control operation controls the virtual object to move along a corresponding movement path, the movement coordinates corresponding to the movement path are superimposed on the position coordinate of the first throwing starting point to obtain the position coordinate of the second throwing starting point, that is, the position from which the virtual object throws the virtual prop. Optionally, the movement control operation further includes a direction control operation, and the second throwing starting point is further adjusted according to the direction control operation, since the hand of the virtual object moves correspondingly as the direction is adjusted.
Optionally, the first throwing path parameter includes a first throwing angle, and the first throwing angle is adjusted based on the rotation angle of the movement control operation to obtain a second throwing angle in the second throwing path parameter, where the first throwing angle and the second throwing angle correspond to the facing direction of the virtual object in the virtual environment.
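The parameter adjustment described above (superimposing the movement offset on the first throwing starting point, and the rotation angle on the first throwing angle) can be sketched as follows; the function name and tuple layout are illustrative:

```python
def adjust_throw_params(first_start, first_angle, move_offset, rotation):
    """Derive the second throwing path parameters from the first ones,
    rather than recomputing the full path from the environment."""
    # Superimpose the movement coordinates on the first throwing start.
    second_start = tuple(a + b for a, b in zip(first_start, move_offset))
    # Apply the rotation of the movement control operation to the angle.
    second_angle = (first_angle + rotation) % 360.0
    return second_start, second_angle
```

This matches the embodiment's point that only an offset is applied to the existing parameters, which reduces the terminal's data processing compared with recomputing the throwing path from the post-movement position and environment parameters.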
In summary, according to the operation method of the virtual prop provided by this embodiment, when the virtual object is in the prone state in the virtual environment and the virtual prop it holds is in the pre-throwing state, the pre-throwing state of the virtual prop is automatically cancelled in response to the movement control operation, and after the movement ends, the pre-throwing state of the virtual prop is restored. Thus, in the process of pre-throwing the virtual prop in the prone state, the problem of low human-computer interaction efficiency caused by moving the virtual object through additional interface operations is avoided, the human-computer interaction efficiency is improved, and the operation efficiency of the virtual prop is improved.
According to the method provided by this embodiment, the first throwing path parameter is adjusted through the movement control operation, and the resulting second throwing path parameter is used as the throwing path parameter for the virtual object to throw the virtual prop after moving. This avoids the large computational load that would result from recomputing the throwing path parameters of the virtual prop from the post-movement position of the virtual object and the environment parameters, reduces the data processing load of the terminal, and improves the operation efficiency of the virtual prop.
Schematically, fig. 6 shows the process of controlling a virtual object to move in the pre-throwing state in the related art. As shown in fig. 6, a virtual object 610 is displayed in a virtual environment interface 600; according to a prop throwing operation, the virtual object 610 holds a virtual prop 620 in the pre-throwing state, and the virtual object 610 is currently in the prone state in the virtual environment. Optionally, the virtual environment interface 600 further includes a throw cancellation control 630 and a movement control 640. When the virtual object 610 needs to move in the virtual environment, a throw cancellation operation on the throw cancellation control 630 is needed first to cancel the pre-throwing state of the virtual prop; after the pre-throwing state is cancelled, a movement control operation on the movement control 640 is received and the virtual object 610 is controlled to move in the virtual environment; and after the movement is completed, the prop throwing operation must be received again to control the virtual prop to be in the pre-throwing state. The human-computer interaction in this process is cumbersome, and the control efficiency of the virtual prop is low.
In an optional embodiment, the virtual prop further corresponds to a function trigger duration, where the function trigger duration indicates the maximum duration from when the virtual prop enters the pre-throwing state to when the target function of the virtual prop is triggered. Fig. 7 is a flowchart of an operation method of a virtual prop provided in another exemplary embodiment of the present application, described by taking application of the method in a terminal as an example. As shown in fig. 7, the method includes:
step 701, displaying a virtual environment interface, wherein the virtual environment interface comprises a virtual object in a ground lying state, and the virtual object holds a virtual prop in a pre-throwing state.
Optionally, the virtual prop is implemented as a throw-type prop. Optionally, the pre-throwing state is used to indicate a state in which parameters of a throwing path of the virtual prop are to be adjusted, that is, parameters used for determining the throwing path, such as a throwing starting point, a throwing direction, an initial speed, and the like of the virtual prop, may also be adjusted by controlling the interface control.
Step 702, in response to that the duration of the pre-throwing state of the virtual item does not reach the function trigger duration, and receiving a movement control operation on the virtual object, controlling the virtual item to cancel the pre-throwing state.
Optionally, the virtual environment interface further includes a movement control, and the movement of the virtual object in the virtual environment is controlled by performing a movement control operation on the movement control. Because the virtual object is currently in the prone state and holds the virtual prop in the pre-throwing state, when a movement control operation on the virtual object is received, the virtual prop is first controlled to cancel the pre-throwing state.
Optionally, in this embodiment of the present application, the thrown duration of the virtual prop is described as being implemented as the duration of the pre-throwing state; that is, timing starts from when the virtual prop enters the pre-throwing state, and when the timing reaches the function trigger duration, the target function of the virtual prop is triggered.
Step 703, controlling the virtual object to move in the virtual environment according to the movement control operation.
Optionally, the virtual object is controlled to perform corresponding creeping movement according to an operation mode of the movement control operation on the virtual environment interface. Illustratively, a movement control is displayed in the virtual environment interface, and when a leftward dragging operation on the movement control is received, the virtual object is controlled to move leftward in the virtual environment; when an up-drag operation on the mobile control is received, the virtual object is controlled to move forward in the virtual environment.
Step 704, in response to the virtual object ending its movement, restoring the virtual prop held by the virtual object to the pre-throwing state.
Optionally, when the virtual object finishes moving, the terminal determines that the moving speed of the virtual object is 0, that is, determines that the virtual object has finished moving, and automatically restores the virtual prop held by the virtual object to the pre-throwing state.
Optionally, the duration of the pre-throwing state of the virtual prop at the moment the movement control operation is received is determined, where the duration of the pre-throwing state is timed from when the virtual prop enters the pre-throwing state, and the timing mode may be a countdown mode or a count-up mode.
Step 705, determining the duration of the pre-throwing state of the virtual prop before the movement control operation was received, and continuing timing from that duration.
Optionally, the timing of the duration of the pre-throwing state is resumed from the value it had when the movement control operation was received. Illustratively, when the movement control operation is received, the duration of the pre-throwing state of the virtual prop is 2 seconds; the timing of the duration is suspended when the movement control operation is received, and after the movement ends, the duration of the pre-throwing state continues to be timed from 2 seconds.
Schematically, referring to fig. 8, a virtual object 810 is displayed in a virtual environment interface 800. The virtual object 810 is in the prone state in the virtual environment, and the virtual prop 820 it holds is in the pre-throwing state, with the current duration of the pre-throwing state being 3 seconds. When a movement control operation is received, the pre-throwing state is cancelled and the virtual object 810 is controlled to move in the virtual environment; when the movement ends, the pre-throwing state is restored and timing continues from 3 seconds.
Step 706, clearing the duration of the pre-throwing state and restarting the timing.
Optionally, the duration of the pre-throwing state is cleared and the timing is restarted. Illustratively, when the movement control operation is received, the duration of the pre-throwing state of the virtual prop is 2 seconds; the timing of the duration is suspended when the movement control operation is received, and after the movement ends, the duration of the pre-throwing state is restarted from 0 seconds.
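The two timing modes of steps 705 and 706 (continuing from the interrupted duration, or clearing and restarting) can be sketched as one timer with a mode switch; the class and method names are illustrative, not from the original:

```python
class PreThrowTimer:
    """Times the pre-throwing state across a movement interruption.
    mode "continue": resume from the paused duration (e.g. 2 s -> 2 s);
    mode "reset": clear to zero and restart after the movement ends."""
    def __init__(self, mode="continue"):
        self.mode = mode
        self.duration = 0.0
        self.paused = False

    def tick(self, dt):
        if not self.paused:
            self.duration += dt

    def on_move_start(self):
        self.paused = True       # movement suspends the timing

    def on_move_end(self):
        if self.mode == "reset":
            self.duration = 0.0  # pre-throw anew from 0 seconds
        self.paused = False
```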
Step 707, in response to the duration of the pre-throwing state reaching the function trigger duration, triggering the target function of the virtual prop.
In summary, according to the operation method of the virtual prop provided by this embodiment, when the virtual object is in the prone state in the virtual environment and the virtual prop it holds is in the pre-throwing state, the pre-throwing state of the virtual prop is automatically cancelled in response to the movement control operation, and after the movement ends, the pre-throwing state of the virtual prop is restored. Thus, in the process of pre-throwing the virtual prop in the prone state, the problem of low human-computer interaction efficiency caused by moving the virtual object through additional interface operations is avoided, the human-computer interaction efficiency is improved, and the operation efficiency of the virtual prop is improved.
According to the method provided by this embodiment, when the pre-throwing state is restored, timing continues from the duration of the pre-throwing state before the movement control operation was received, so that the current pre-throwing state resumes the interrupted pre-throwing state, improving the realism of prop operation.
According to the method provided by this embodiment, when the pre-throwing state is restored, the duration of the pre-throwing state is cleared and timing is restarted, representing that the virtual prop is pre-thrown anew after being withdrawn from the current pre-throwing state, improving the realism of prop operation.
In an alternative embodiment, the pre-throwing state is realized by a long-press operation on a throwing control. Fig. 9 is a flowchart of an operation method of a virtual prop according to another exemplary embodiment of the present application, described by taking application of the method to a terminal as an example. As shown in fig. 9, the method includes:
step 901, receiving a trigger operation on the ground-climbing state control.
Optionally, the virtual environment interface includes a ground-bending state control, and the ground-bending state control is used to control the virtual object to be in a ground-bending state in the virtual environment.
Step 902, controlling the virtual object to be in a ground-bending state in the virtual environment, wherein the virtual object holds the virtual prop, and the virtual prop is a throwing-type prop.
Optionally, the virtual object is controlled to be in the ground-bending state in the virtual environment according to the triggering operation of the ground-bending state control. Optionally, the virtual item held by the virtual object is a throwing type item, in the virtual environment, a throwing mode of the throwing type item includes a fast throwing mode and a pre-throwing mode, where the fast throwing mode refers to that the virtual item is thrown according to a facing direction of the current virtual object and a preset speed after the throwing control is clicked, and for an illustrative purpose, referring to fig. 10, a virtual object 1030 holding a virtual firearm 1010 is displayed in a virtual environment interface 1000, and when the throwing control 1020 is triggered, the virtual object 1030 directly throws the virtual item 1040 according to the facing direction; the throwing in the pre-throwing state refers to that a virtual object firstly controls a virtual prop to be in the pre-throwing state through a throwing control, and after the throwing path parameter of the virtual prop is adjusted, the throwing of the virtual prop is triggered through throwing triggering operation.
Step 903, receiving a long press operation on the throwing control.
Optionally, the virtual prop is controlled to be in a pre-casting state through long-time pressing operation on the casting control, so that casting path parameters of the virtual prop can be adjusted in the pre-casting state.
It should be noted that steps 901 to 902 and steps 903 to 904 may be executed in either order: steps 901 and 902 may be executed first, followed by steps 903 and 904, or steps 903 and 904 may be executed first, followed by steps 901 and 902, which is not limited in this embodiment of the present application.
Step 904, controlling the virtual object to pre-throw the virtual prop.
Step 905, responding to the movement control operation of the virtual object, and controlling the virtual prop to cancel the pre-throwing state.
Optionally, the virtual environment interface includes a movement joystick, and when the player pushes the movement joystick, the system cancels the pre-throwing state. When the movement joystick receives a trigger operation, the current prone state of the virtual object and the pre-throwing state of the virtual prop are recorded; when the moving speed of the virtual object is greater than 0, the pre-throwing state is automatically cancelled, the virtual object moves in the prone state, and a flag indicating that the pre-throwing state was automatically cancelled is recorded.
Step 906, controlling the virtual object to move in the virtual environment according to the movement control operation.
Step 907, in response to the virtual object ending its movement, restoring the virtual prop held by the virtual object to the pre-throwing state.
Optionally, when the movement ends, in response to the flag indicating that the pre-throwing state was automatically cancelled, the pre-throwing state of the virtual prop is restored and the record is cleared.
Optionally, in response to the movement ending and the long press operation on the throwing control not ending, restoring the pre-throwing state of the virtual prop.
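The auto-cancellation flag of step 905 and the conditional restoration of steps 907 to 908 can be sketched as follows; the class and method names are illustrative, not from the original:

```python
class PreThrowController:
    """Cancels the pre-throwing state automatically while the prone
    object moves, records a flag, and restores the state when movement
    ends only if the long press on the throwing control is still held."""
    def __init__(self):
        self.pre_throw = True
        self.auto_canceled = False   # flag recorded by the system

    def on_speed_change(self, speed, long_press_held):
        if speed > 0 and self.pre_throw:
            self.pre_throw = False   # moving: cancel automatically
            self.auto_canceled = True
        elif speed == 0 and self.auto_canceled and long_press_held:
            self.pre_throw = True    # restore the pre-throwing state
            self.auto_canceled = False  # clear the record
```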
And 908, controlling the virtual object to throw the virtual item in response to the end of the long press operation.
Optionally, during the throwing of the virtual prop, collision detection rays are cast at preset time intervals along the flight trajectory. Each virtual object in the virtual environment also corresponds to a collision detection box, and when a ray detects a collision detection box, the virtual prop has reached its end point and falls to the ground, triggering the target function. Schematically, as shown in fig. 11, a virtual prop 1110 in the process of being thrown is displayed in the virtual environment interface 1100, and collision detection is performed with collision detection rays 1120 along the flight path of the virtual prop 1110.
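The interval-based collision checks along the flight trajectory can be sketched as follows; a point-in-box test at each sampled position stands in for the ray test, and all names are illustrative:

```python
def detect_hit(trajectory, hitboxes):
    """Walk the sampled flight trajectory and report the index of the
    first collision detection box that a sample falls inside.
    Each hitbox is an axis-aligned box given as (min_xyz, max_xyz)."""
    for point in trajectory:
        for index, (lo, hi) in enumerate(hitboxes):
            if all(l <= c <= h for c, l, h in zip(point, lo, hi)):
                return index  # the prop reached a box: trigger function
    return None               # no hit along the sampled path
```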
In summary, according to the operation method of the virtual prop provided by this embodiment, when the virtual object is in the prone state in the virtual environment and the virtual prop it holds is in the pre-throwing state, the pre-throwing state of the virtual prop is automatically cancelled in response to the movement control operation, and after the movement ends, the pre-throwing state of the virtual prop is restored. Thus, in the process of pre-throwing the virtual prop in the prone state, the problem of low human-computer interaction efficiency caused by moving the virtual object through additional interface operations is avoided, the human-computer interaction efficiency is improved, and the operation efficiency of the virtual prop is improved.
According to the method provided by this embodiment, the virtual prop is controlled to be in the pre-throwing state through a long-press operation on the throwing control; the pre-throwing state is automatically cancelled when the virtual object needs to move, and is restored when the movement ends while the long-press operation has not ended. The cancellation and restoration of the pre-throwing state are thereby controlled, improving the human-computer interaction efficiency and the operation efficiency of the virtual prop.
Fig. 12 is an overall flowchart of an operation method of a virtual prop according to another exemplary embodiment of the present application, described by taking as an example that the method is applied to a terminal and the virtual prop is a throwing object. As shown in fig. 12, the method includes:
In step 1201, the player switches to the throwing object.
Optionally, the throwing object is a virtual weapon that has been acquired and equipped by the virtual object. Optionally, the virtual prop held by the virtual object is controlled to be thrown through a handheld control corresponding to the throwing object.
Step 1202, determining whether the player has triggered lying prone.
Optionally, the virtual environment interface includes state controls, among which a prone control corresponds to the prone state, and the virtual object is controlled to be in the prone state by triggering the prone control.
Step 1203, when the player triggers lying prone, the prone state is entered.
In step 1204, it is determined whether the player has pressed the fire key.
Optionally, the firing key is used to control the virtual object to apply the virtual item, and in this embodiment, the firing key is used to control the virtual object to pre-throw and throw the above-mentioned throwing object.
In step 1205, when the player presses the fire key, the player enters a pre-throw state.
When the player presses and holds the firing key, the throwing object enters the pre-throwing state. In the pre-throwing state, a parabola corresponding to the throwing object is displayed in the virtual environment interface; the parabola is the flight trajectory of the throwing object, and the end point of the parabola is the landing point of the throwing object.
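The displayed parabola (flight trajectory ending at the landing point) can be sketched by stepping a simple projectile motion; the function name, step size, and gravity constant are illustrative assumptions:

```python
def parabola_points(start, vx, vy, g=9.8, dt=0.1, ground_y=0.0):
    """Sample the throw parabola shown in the pre-throwing state.
    The last sample approximates the landing point of the object."""
    x, y = start
    points = [(x, y)]
    while y > ground_y:
        x += vx * dt      # constant horizontal speed
        vy -= g * dt      # gravity reduces the vertical speed
        y += vy * dt
        points.append((x, y))
    return points
```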
In step 1206, it is determined whether the virtual object is moving.
Optionally, a movement rocker is included in the virtual environment interface, and when a trigger operation on the movement rocker is received, it is determined that the virtual object needs to be moved.
In step 1207, when the virtual object moves, the pre-cast state is canceled.
Optionally, when the moving speed of the virtual object is greater than 0, cancellation of the pre-throwing state is executed automatically; after the pre-throwing state is cancelled, the virtual object moves in the prone state, and the system records a flag indicating that the pre-throwing state was cancelled by the system.
In step 1208, it is determined whether the virtual object stops moving.
In step 1209, when the virtual object stops moving, the pre-cast state is resumed.
Optionally, the throwing object is returned to the pre-throwing state according to the flag indicating that the pre-throwing state was cancelled.
And step 1210, judging whether the long-time pressing operation of the firing key is finished.
In step 1211, the object to be thrown is thrown when the long-press operation of the fire key is completed.
In summary, according to the operation method of the virtual prop provided by this embodiment, when the virtual object is in the prone state in the virtual environment and the virtual prop it holds is in the pre-throwing state, the pre-throwing state of the virtual prop is automatically cancelled in response to the movement control operation, and after the movement ends, the pre-throwing state of the virtual prop is restored. Thus, in the process of pre-throwing the virtual prop in the prone state, the problem of low human-computer interaction efficiency caused by moving the virtual object through additional interface operations is avoided, the human-computer interaction efficiency is improved, and the operation efficiency of the virtual prop is improved.
Fig. 13 is a schematic structural diagram of an operation device of a virtual prop according to an exemplary embodiment of the present application, and as shown in fig. 13, the device includes:
a display module 1310, configured to display a virtual environment interface, where the virtual environment interface includes a virtual object in a ground-bending state, and the virtual object holds the virtual item in a pre-cast state;
a control module 1320, configured to control the virtual prop to cancel the pre-cast state in response to a movement control operation on the virtual object;
the control module 1320 is further configured to control the virtual object to move in the virtual environment according to the movement control operation;
a restoring module 1330, configured to restore the holding of the virtual prop by the virtual object to the pre-cast state in response to the virtual object ending the moving.
In an alternative embodiment, as shown in fig. 14, the recovery module 1330 includes:
a determining unit 1331, configured to determine a first throwing path parameter of the virtual item before receiving the movement control operation;
an adjusting unit 1332, configured to adjust the first throwing path parameter based on the movement control operation, so as to obtain a second throwing path parameter;
a recovering unit 1333, configured to recover, with the second throwing path parameter, that the virtual object holds the virtual prop to be in the pre-throwing state.
In an alternative embodiment, the first throwing path parameter includes a first throwing start point;
the adjusting unit 1332 is further configured to adjust the first throwing starting point based on the movement path of the movement control operation, so as to obtain a second throwing starting point in the second throwing path parameter, where the first throwing starting point and the second throwing starting point correspond to the position where the virtual object is located in the virtual environment.
In an alternative embodiment, the first throw path parameter comprises a first throw angle;
the adjusting unit 1332 is further configured to adjust the first throwing angle based on the rotation angle of the movement control operation, so as to obtain a second throwing angle in the second throwing path parameter, where the first throwing angle and the second throwing angle correspond to a facing direction of the virtual object in the virtual environment.
In an optional embodiment, the virtual prop further corresponds to a function trigger duration, where the function trigger duration is used to indicate a maximum duration from when the virtual prop is in the pre-throwing state to when a target function of the virtual prop is triggered;
the control module 1320 is further configured to control the virtual item to cancel the pre-cast state in response to that the duration of the pre-cast state of the virtual item does not reach the function trigger duration and that a movement control operation on the virtual object is received.
In an optional embodiment, the apparatus further comprises:
a determining module 1340, configured to determine the duration of the pre-cast state of the virtual prop before receiving the movement control operation;
a timing module 1350, configured to start timing continuously from the duration of the pre-cast state;
a triggering module 1360, configured to trigger the target function of the virtual item in response to the duration of the pre-cast state reaching the function triggering duration.
In an optional embodiment, the apparatus further comprises:
a timing module 1350, configured to zero the duration of the pre-cast state and restart timing;
a triggering module 1360, configured to trigger the target function of the virtual item in response to the duration of the pre-cast state reaching the function triggering duration.
In an optional embodiment, the apparatus further comprises:
the receiving module 1370 is configured to receive a trigger operation on the prone state control;
the control module 1320 is further configured to control the virtual object to be in the prone state in the virtual environment, where the virtual object holds the virtual prop, and the virtual prop is a throwing-type prop;
the receiving module is further configured to receive a long-press operation on the throwing control;
the control module 1320 is further configured to control the virtual object to pre-throw the virtual prop.
In an optional embodiment, the control module 1320 is further configured to control the virtual object to throw the virtual prop in response to the long press operation ending.
To sum up, the operation device of the virtual prop provided by this embodiment, when the virtual object is in the prone state in the virtual environment and the virtual prop it holds is in the pre-throwing state, automatically cancels the pre-throwing state of the virtual prop in response to the movement control operation, and restores the pre-throwing state of the virtual prop after the movement ends. Thus, in the process of pre-throwing the virtual prop in the prone state, the problem of low human-computer interaction efficiency caused by moving the virtual object through additional interface operations is avoided, the human-computer interaction efficiency is improved, and the operation efficiency of the virtual prop is improved.
It should be noted that: the operation device of the virtual item provided in the above embodiment is only illustrated by dividing each functional module, and in practical application, the function allocation may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to complete all or part of the functions described above. In addition, the operation device of the virtual prop and the operation method embodiment of the virtual prop provided by the above embodiments belong to the same concept, and the specific implementation process thereof is described in detail in the method embodiment and is not described herein again.
Fig. 15 shows a block diagram of a terminal 1500 according to an exemplary embodiment of the present invention. The terminal 1500 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1500 may also be referred to as user equipment, a portable terminal, a laptop terminal, a desktop terminal, or by other names.
In general, terminal 1500 includes: a processor 1501 and memory 1502.
Processor 1501 may include one or more processing cores, such as 4-core processors, 8-core processors, and the like. The processor 1501 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1501 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, processor 1501 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 1502 is used to store at least one instruction for execution by processor 1501 to implement a method of operation of a virtual prop provided by method embodiments herein.
In some embodiments, the terminal 1500 may further include: a peripheral interface 1503 and at least one peripheral. The processor 1501, memory 1502, and peripheral interface 1503 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1503 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1504, touch screen display 1505, camera 1506, audio circuitry 1507, positioning assembly 1508, and power supply 1509.
The peripheral interface 1503 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, memory 1502, and peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The Radio Frequency circuit 1504 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuitry 1504 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1504 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1504 can communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, it also has the ability to capture touch signals on or above its surface. A touch signal may be input to the processor 1501 as a control signal for processing. In this case, the display screen 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1505, disposed on the front panel of the terminal 1500; in other embodiments, there may be at least two display screens 1505, each disposed on a different surface of the terminal 1500 or in a folded design; in still other embodiments, the display screen 1505 may be a flexible display screen disposed on a curved or folded surface of the terminal 1500. The display screen 1505 may even be configured in a non-rectangular irregular pattern, that is, a shaped screen. The display screen 1505 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1506 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1507 may include a microphone and speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1501 for processing or inputting the electric signals to the radio frequency circuit 1504 to realize voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of the terminal 1500. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1507 may also include a headphone jack.
The positioning component 1508 is used to locate the current geographic position of the terminal 1500 for navigation or LBS (Location Based Service). The positioning component 1508 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the GLONASS system of Russia.
Power supply 1509 is used to power the various components in terminal 1500. The power supply 1509 may be alternating current, direct current, disposable or rechargeable. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyro sensor 1512, pressure sensor 1513, fingerprint sensor 1514, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 may detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 1500. For example, the acceleration sensor 1511 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1501 may control the touch screen display 1505 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1511. The acceleration sensor 1511 may also be used for acquisition of motion data of a game or a user.
The gyroscope sensor 1512 can detect the body direction and the rotation angle of the terminal 1500, and the gyroscope sensor 1512 and the acceleration sensor 1511 cooperate to collect the 3D motion of the user on the terminal 1500. The processor 1501 may implement the following functions according to the data collected by the gyro sensor 1512: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 1513 may be disposed on a side bezel of the terminal 1500 and/or at a lower layer of the touch display screen 1505. When the pressure sensor 1513 is disposed on the side bezel of the terminal 1500, a holding signal of the user on the terminal 1500 may be detected, and the processor 1501 performs left/right-hand recognition or a shortcut operation according to the holding signal collected by the pressure sensor 1513. When the pressure sensor 1513 is disposed at a lower layer of the touch display screen 1505, the processor 1501 controls an operability control on the UI according to a pressure operation of the user on the touch display screen 1505. The operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1514 is configured to capture a fingerprint of the user, and the processor 1501 identifies the user based on the fingerprint captured by the fingerprint sensor 1514, or the fingerprint sensor 1514 identifies the user based on the captured fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1501 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 1514 may be disposed on the front, back, or side of the terminal 1500. When a physical key or vendor Logo is provided on the terminal 1500, the fingerprint sensor 1514 may be integrated with the physical key or vendor Logo.
The optical sensor 1515 is used to collect ambient light intensity. In one embodiment, processor 1501 may control the brightness of the display on touch screen 1505 based on the intensity of ambient light collected by optical sensor 1515. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1505 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1505 is turned down. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.
The proximity sensor 1516, also known as a distance sensor, is typically disposed on the front panel of the terminal 1500. The proximity sensor 1516 is used to collect the distance between the user and the front surface of the terminal 1500. In one embodiment, when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually decreases, the processor 1501 controls the touch display screen 1505 to switch from the bright screen state to the dark screen state; when the proximity sensor 1516 detects that the distance gradually increases, the processor 1501 controls the touch display screen 1505 to switch from the dark screen state to the bright screen state.
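The sensor-driven display behaviour described in the two paragraphs above amounts to two simple rules. A hedged sketch (function names, thresholds, and the 0-255 brightness scale are all assumptions for illustration, not part of the patent):

```python
def adjust_brightness(current_brightness, ambient_light, low=50, high=500):
    """Raise display brightness in bright surroundings, lower it in dim ones.

    Thresholds (lux) and step size are illustrative only.
    """
    if ambient_light >= high:
        return min(255, current_brightness + 10)  # bright environment: turn up
    if ambient_light <= low:
        return max(0, current_brightness - 10)    # dim environment: turn down
    return current_brightness


def update_screen_state(previous_distance, current_distance, screen_on):
    """Darken the screen as the user approaches; relight it as they move away."""
    if screen_on and current_distance < previous_distance:
        return False  # distance decreasing: switch to the dark screen state
    if not screen_on and current_distance > previous_distance:
        return True   # distance increasing: switch back to the bright screen state
    return screen_on
```

For example, with the hypothetical thresholds above, an ambient reading of 600 lux nudges a brightness of 100 up to 110, while a proximity reading that falls from 10 cm to 5 cm turns the screen dark.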
Those skilled in the art will appreciate that the configuration shown in fig. 15 does not constitute a limitation of terminal 1500, and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components may be employed.
An embodiment of the present application further provides a computer device, including a memory and a processor, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method of operating a virtual prop described in any one of fig. 5, fig. 7, and fig. 13.
An embodiment of the present application further provides a computer-readable storage medium, where at least one instruction, at least one program, a code set, or an instruction set is stored in the computer-readable storage medium, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the method of operating a virtual prop described in any one of fig. 5, fig. 7, and fig. 13.
The application also provides a computer program product, which when running on a computer, causes the computer to execute the operation method of the virtual prop provided by the above method embodiments.
Those skilled in the art will appreciate that all or some of the steps in the methods of the above embodiments may be implemented by hardware executing program instructions. The program may be stored in a computer-readable storage medium, which may be the computer-readable storage medium contained in the memory of the above embodiments, or a separate computer-readable storage medium not assembled into the terminal. The computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method of operating a virtual prop described in any one of fig. 5, fig. 7, and fig. 13.
Optionally, the computer-readable storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a Solid State Drive (SSD), or an optical disc. The Random Access Memory may include a resistive Random Access Memory (ReRAM) and a Dynamic Random Access Memory (DRAM). The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (12)

1. A method of operating a virtual prop, the method comprising:
receiving a trigger operation on a prone-state control;
controlling a virtual object to be in a prone state in a virtual environment, wherein the virtual object holds a virtual prop;
receiving a long-press operation on a throwing control;
controlling the virtual object to pre-throw the virtual prop, wherein the virtual object holds the virtual prop in a pre-throwing state;
in response to receiving a movement control operation on the virtual object while a duration of the pre-throwing state of the virtual prop has not reached a function trigger duration, controlling the virtual prop to cancel the pre-throwing state, wherein the function trigger duration indicates the maximum duration from when the virtual prop enters the pre-throwing state to when a target function of the virtual prop is triggered;
controlling the virtual object to move in the virtual environment in the prone state according to the movement control operation;
in response to a moving speed of the virtual object being zero and the long-press operation on the throwing control not having ended, restoring the virtual prop held by the virtual object to the pre-throwing state;
and clearing the duration of the pre-throwing state and restarting timing.
2. The method of claim 1, wherein the restoring the virtual prop held by the virtual object to the pre-throwing state comprises:
determining a first throwing path parameter of the virtual prop before the movement control operation is received;
adjusting the first throwing path parameter based on the movement control operation to obtain a second throwing path parameter;
and restoring, with the second throwing path parameter, the virtual prop held by the virtual object to the pre-throwing state.
3. The method of claim 2, wherein the first throwing path parameter comprises a first throwing starting point;
adjusting the first throwing path parameter based on the movement control operation to obtain a second throwing path parameter, including:
adjusting the first throwing starting point based on a movement path of the movement control operation to obtain a second throwing starting point in the second throwing path parameters, wherein the first throwing starting point and the second throwing starting point correspond to the position of the virtual object in the virtual environment.
4. The method of claim 2, wherein the first throwing path parameter comprises a first throwing angle;
adjusting the first throwing path parameter based on the movement control operation to obtain a second throwing path parameter, including:
adjusting the first throwing angle based on a rotation angle of the movement control operation to obtain a second throwing angle in the second throwing path parameter, wherein the first throwing angle and the second throwing angle correspond to a facing direction of the virtual object in the virtual environment.
5. The method of any one of claims 1 to 4, wherein after the clearing the duration of the pre-throwing state and restarting timing, the method further comprises:
triggering the target function of the virtual prop in response to the duration of the pre-throwing state reaching the function trigger duration.
6. The method of claim 1, further comprising:
and controlling the virtual object to throw the virtual prop in response to the end of the long-press operation.
7. An apparatus for operating a virtual prop, the apparatus comprising:
a receiving module, configured to receive a trigger operation on a prone-state control;
a control module, configured to control a virtual object to be in a prone state in a virtual environment, wherein the virtual object holds the virtual prop;
the receiving module is further configured to receive a long-press operation on a throwing control;
the control module is further configured to control the virtual object to pre-throw the virtual prop, wherein the virtual object holds the virtual prop in a pre-throwing state;
the control module is further configured to, in response to receiving a movement control operation on the virtual object while a duration of the pre-throwing state of the virtual prop has not reached a function trigger duration, control the virtual prop to cancel the pre-throwing state, wherein the function trigger duration indicates the maximum duration from when the virtual prop enters the pre-throwing state to when a target function of the virtual prop is triggered;
the control module is further configured to control the virtual object to move in the virtual environment in the prone state according to the movement control operation;
a recovery module, configured to restore the virtual prop held by the virtual object to the pre-throwing state in response to a moving speed of the virtual object being zero and the long-press operation on the throwing control not having ended;
and a timing module, configured to clear the duration of the pre-throwing state and restart timing.
8. The apparatus of claim 7, wherein the recovery module comprises:
a determining unit, configured to determine a first throwing path parameter of the virtual prop before the movement control operation is received;
an adjusting unit, configured to adjust the first throwing path parameter based on the movement control operation to obtain a second throwing path parameter;
and a recovery unit, configured to restore, with the second throwing path parameter, the virtual prop held by the virtual object to the pre-throwing state.
9. The apparatus of claim 8, wherein the first throwing path parameter comprises a first throwing starting point;
the adjusting unit is further configured to adjust the first throwing starting point based on a movement path of the movement control operation, so as to obtain a second throwing starting point in the second throwing path parameter, where the first throwing starting point and the second throwing starting point correspond to a position where the virtual object is located in the virtual environment.
10. The apparatus of claim 8, wherein the first throwing path parameter comprises a first throwing angle;
the adjusting unit is further configured to adjust the first throwing angle based on a rotation angle of the movement control operation, so as to obtain a second throwing angle in the second throwing path parameter, where the first throwing angle and the second throwing angle correspond to a facing direction of the virtual object in the virtual environment.
11. A computer device, comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method of operating a virtual prop according to any one of claims 1 to 6.
12. A computer-readable storage medium, wherein at least one instruction, at least one program, a code set, or an instruction set is stored in the storage medium and is loaded and executed by a processor to implement the method of operating a virtual prop according to any one of claims 1 to 6.
CN202010299027.0A 2020-04-16 2020-04-16 Operation method, device, equipment and storage medium of virtual prop Active CN111475029B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010299027.0A CN111475029B (en) 2020-04-16 2020-04-16 Operation method, device, equipment and storage medium of virtual prop

Publications (2)

Publication Number Publication Date
CN111475029A CN111475029A (en) 2020-07-31
CN111475029B (en) 2021-12-14

Family

ID=71754396

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010299027.0A Active CN111475029B (en) 2020-04-16 2020-04-16 Operation method, device, equipment and storage medium of virtual prop

Country Status (1)

Country Link
CN (1) CN111475029B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111744186B (en) * 2020-08-06 2023-08-11 腾讯科技(深圳)有限公司 Virtual object control method, device, equipment and storage medium
CN111905380B (en) * 2020-08-21 2024-04-30 腾讯科技(深圳)有限公司 Virtual object control method, device, terminal and storage medium
CN112190929B (en) * 2020-11-02 2024-05-28 网易(杭州)网络有限公司 Game object control method and device
CN113885731B (en) * 2021-10-13 2023-09-26 网易(杭州)网络有限公司 Virtual prop control method and device, electronic equipment and storage medium
CN114939275B (en) * 2022-05-24 2024-08-09 北京字跳网络技术有限公司 Method, device, equipment and storage medium for object interaction

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0858627A2 (en) * 1996-08-02 1998-08-19 Koninklijke Philips Electronics N.V. Virtual environment manipulation device modelling and control
CN110427111A (en) * 2019-08-01 2019-11-08 腾讯科技(深圳)有限公司 The operating method of virtual item, device, equipment and storage medium in virtual environment
CN110585731A (en) * 2019-09-30 2019-12-20 腾讯科技(深圳)有限公司 Method, device, terminal and medium for throwing virtual article in virtual environment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101328054B1 (en) * 2011-08-09 2013-11-08 엘지전자 주식회사 Apparatus and method for generating sensory vibration

Also Published As

Publication number Publication date
CN111475029A (en) 2020-07-31

Similar Documents

Publication Publication Date Title
CN110427111B (en) Operation method, device, equipment and storage medium of virtual prop in virtual environment
CN110694261B (en) Method, terminal and storage medium for controlling virtual object to attack
CN108434736B (en) Equipment display method, device, equipment and storage medium in virtual environment battle
CN111249730B (en) Virtual object control method, device, equipment and readable storage medium
CN111282275B (en) Method, device, equipment and storage medium for displaying collision traces in virtual scene
CN110755841B (en) Method, device and equipment for switching props in virtual environment and readable storage medium
CN110585710B (en) Interactive property control method, device, terminal and storage medium
CN111475029B (en) Operation method, device, equipment and storage medium of virtual prop
CN110917619B (en) Interactive property control method, device, terminal and storage medium
CN109126129B (en) Method, device and terminal for picking up virtual article in virtual environment
CN110721468B (en) Interactive property control method, device, terminal and storage medium
JP2022517337A (en) How to control a virtual object to mark a virtual item and its equipment and computer program
CN110465098B (en) Method, device, equipment and medium for controlling virtual object to use virtual prop
CN111228809A (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN111714893A (en) Method, device, terminal and storage medium for controlling virtual object to recover attribute value
CN111659119B (en) Virtual object control method, device, equipment and storage medium
CN110755844B (en) Skill activation method and device, electronic equipment and storage medium
CN111744186A (en) Virtual object control method, device, equipment and storage medium
CN113713382A (en) Virtual prop control method and device, computer equipment and storage medium
CN112138384A (en) Using method, device, terminal and storage medium of virtual throwing prop
WO2021143253A1 (en) Method and apparatus for operating virtual prop in virtual environment, device, and readable medium
CN112933601A (en) Virtual throwing object operation method, device, equipment and medium
CN113713383A (en) Throwing prop control method and device, computer equipment and storage medium
CN111672106A (en) Virtual scene display method and device, computer equipment and storage medium
CN111330277A (en) Virtual object control method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40025585
Country of ref document: HK

GR01 Patent grant