CN113713383A - Throwing prop control method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN113713383A
CN113713383A
Authority
CN
China
Prior art keywords
throwing
virtual
scene
picture
prop
Prior art date
Legal status
Granted
Application number
CN202111060411.6A
Other languages
Chinese (zh)
Other versions
CN113713383B (en)
Inventor
刘智洪
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202111060411.6A
Publication of CN113713383A
Application granted
Publication of CN113713383B
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose a throwing prop control method and apparatus, a computer device, and a storage medium, belonging to the technical field of virtual scenes. The method includes: displaying a virtual scene interface; displaying a first scene picture in the virtual scene interface, where the first scene picture includes at least two throwing props and the at least two throwing props are located within the reachable range of a first virtual object; and, in response to receiving a throwing operation on a throwing prop, displaying a second scene picture in the virtual scene interface, where the second scene picture is a picture of the first virtual object throwing at least one throwing prop. This scheme improves the human-computer interaction efficiency of the user in controlling throwing props while preserving the realism of the virtual object's motions.

Description

Throwing prop control method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of virtual scene technologies, and in particular, to a throwing prop control method and apparatus, a computer device, and a storage medium.
Background
At present, some game applications, such as first-person shooting games, generally provide throwing props, i.e., virtual props of the throwing type.
In the related art, to simulate as realistically as possible the use of a throwing prop by a virtual object in a virtual scene, a series of limb motions is designed, spanning from the virtual object taking out the throwing prop to the virtual object throwing it. Each time the user controls the virtual object to throw a throwing prop, the virtual object performs this series of limb motions once.
However, this series of limb motions, from taking out the throwing prop to throwing it, usually takes a long time, so the time interval between two successive uses of the throwing prop by the virtual object is long, which lowers the human-computer interaction efficiency when the user uses the throwing prop.
Disclosure of Invention
The embodiments of the present application provide a throwing prop control method and apparatus, a computer device, and a storage medium, which can improve the human-computer interaction efficiency when a user uses throwing props. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides a method for controlling a throwing prop, where the method includes:
displaying a virtual scene interface, wherein the virtual scene interface is used for displaying a scene picture of a virtual scene, and the virtual scene comprises a first virtual object;
displaying a first scene picture in the virtual scene interface, wherein the first scene picture comprises at least two throwing props; at least two of the throwing props are located within reach of the first virtual object;
in response to receiving a throwing operation of the throwing prop, displaying a second scene picture in the virtual scene interface, wherein the second scene picture is a picture of the first virtual object throwing at least one throwing prop.
In another aspect, an embodiment of the present application provides a throwing prop control apparatus, where the apparatus includes:
the interface display module is used for displaying a virtual scene interface, the virtual scene interface is used for displaying a scene picture of a virtual scene, and the virtual scene comprises a first virtual object;
the first picture display module is used for displaying a first scene picture in the virtual scene interface, and the first scene picture comprises at least two throwing props; at least two of the throwing props are located within reach of the first virtual object;
and a second picture display module, configured to display a second scene picture in the virtual scene interface in response to receiving a throwing operation on a throwing prop, where the second scene picture is a picture of the first virtual object throwing at least one throwing prop.
In one possible implementation, the first scene picture is a picture in which at least two of the throwing props are suspended within a hand-reachable range of the first virtual object;
the second scene picture is an animation picture that the first virtual object throws at least one throwing prop through a hand.
In a possible implementation manner, the second picture displaying module is configured to,
responding to the received throwing operation of the throwing prop, and acquiring the operation mode of the throwing operation;
and displaying the second scene picture in the virtual scene interface based on the operation mode.
In a possible implementation manner, the second picture displaying module is configured to,
and display, in response to the operation mode being a click operation, a picture of the first virtual object throwing a single throwing prop in the virtual scene interface.
In a possible implementation manner, the second picture displaying module is configured to,
and display, in response to the operation mode being a double-click operation or a sliding operation, a picture of the first virtual object continuously throwing the throwing props in the virtual scene interface.
In a possible implementation manner, the second picture displaying module is configured to,
and display, in response to the operation mode being a long-press operation or a sliding operation, a picture of the first virtual object continuously throwing the throwing props within the operation duration of the throwing operation (i.e., the long-press operation or the sliding operation) in the virtual scene interface.
In one possible implementation, the apparatus further includes:
and a prop adding module, configured to newly add a throwing prop within the reachable range of the first virtual object in response to a first condition being met.
In a possible implementation manner, the prop adding module is configured to newly add a throwing prop within the reachable range of the first virtual object in response to a thrown throwing prop hitting a target object.
In a possible implementation manner, the first screen displaying module is configured to display the first scene screen in the virtual scene interface in response to receiving an operation of releasing a target skill.
In one possible implementation, the target skill has a duration;
the second picture display module is used for responding to the throwing operation of the throwing prop received within the duration time of the target skill, and displaying a second scene picture in the virtual scene interface.
In one possible implementation, the apparatus further includes:
a time increasing module, configured to increase the duration of the target skill in response to a second condition being met.
In one possible implementation, the time increasing module is configured to increase the duration of the target skill in response to the number of target objects hit by the throwing props reaching a number threshold within the duration of the target skill.
In one possible implementation, the apparatus further includes:
and the timing information display module is used for displaying timing information in the virtual scene interface within the duration of the target skill, wherein the timing information is used for indicating the remaining duration of the target skill.
In another aspect, embodiments of the present application provide a computer device comprising a processor and a memory, wherein the memory has at least one computer instruction stored therein, and the at least one computer instruction is loaded and executed by the processor to implement the method for controlling a throwing prop according to the above aspect.
In another aspect, embodiments of the present application provide a computer-readable storage medium having at least one computer instruction stored therein, the at least one computer instruction being loaded and executed by a processor to implement a method of throwing prop control as described in the above aspects.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the throwing prop control method provided in the various alternative implementations of the above aspects.
The technical scheme provided by the embodiments of the present application has at least the following beneficial effects:
Before any throwing prop is thrown, at least two throwing props are placed within the reachable range of the virtual object controlled by the user. When the user triggers a throwing operation, the virtual object can thus throw one or more throwing props through a continuous throwing motion. This reduces the motions performed by the virtual object during continuous throwing control and shortens the interval between consecutive throws while keeping the virtual object's throwing motion natural and close to reality. The realism of the virtual object's motion is therefore preserved while the human-computer interaction efficiency of the user in controlling throws is improved; in turn, the duration of a single match can be reduced, saving the power and data traffic consumed by the terminal.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 2 is a schematic illustration of a display interface of a virtual scene provided by an exemplary embodiment of the present application;
FIG. 3 is a flow chart of a method of controlling a throwing prop provided in an exemplary embodiment of the present application;
FIG. 4 is a schematic view of the presentation of a throwing prop according to the embodiment of FIG. 3;
FIG. 5 is a flow chart of a method of controlling a throwing prop according to an exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of a throw prop trigger according to the embodiment of FIG. 5;
FIG. 7 is a schematic view of a throwing prop being thrown according to the embodiment shown in FIG. 5;
FIG. 8 is a schematic illustration of a remaining duration presentation according to the embodiment shown in FIG. 5;
FIG. 9 is a flowchart illustrating control of a flying-knife weapon according to an exemplary embodiment of the present application;
FIG. 10 is a block diagram of a throwing prop control apparatus shown in an exemplary embodiment of the present application;
FIG. 11 is a block diagram of a computer device provided in an exemplary embodiment of the present application;
fig. 12 is a block diagram illustrating a computer device according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatuses and methods consistent with certain aspects of the present application, as recited in the appended claims.
It is to be understood that "a number of" herein means one or more, and "a plurality of" means two or more. "And/or" describes the association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, both A and B exist, or B exists alone. The character "/" generally indicates an "or" relationship between the associated objects.
The embodiments of the present application provide a throwing prop control method, which keeps the aiming direction of a target virtual prop controllable during continuous throwing. For ease of understanding, several terms referred to in this application are explained below.
1) Virtual scene
A virtual scene is the scene displayed (or provided) when an application runs on a terminal. The virtual scene may be a simulated environment of the real world, a semi-simulated, semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, and a three-dimensional virtual scene; the following embodiments take a three-dimensional virtual scene as an example, but are not limited thereto. Optionally, the virtual scene may also be used for a battle between at least two virtual characters, for a virtual firearm fight between at least two virtual characters, or for a fight between at least two virtual characters using virtual firearms within a target area of the virtual scene that keeps shrinking over time.
A virtual scene is typically generated by an application in a computer device such as a terminal and rendered based on hardware (e.g., a screen) of the terminal. The terminal may be a mobile terminal such as a smartphone, a tablet computer, or an e-book reader; alternatively, the terminal may be a personal computer device such as a notebook computer or a desktop computer.
2) Virtual object
A virtual object refers to a movable object in a virtual scene. The movable object may be at least one of a virtual character, a virtual animal, and a virtual vehicle. Optionally, when the virtual scene is a three-dimensional virtual scene, the virtual object is a three-dimensional model created based on skeletal animation technology. Each virtual object has its own shape, volume, and orientation in the three-dimensional virtual scene and occupies a portion of the space in the three-dimensional virtual scene.
3) Virtual prop
Virtual props refer to props that virtual objects can use in a virtual environment, including: virtual weapons that can injure other virtual objects, such as pistols, rifles, sniper rifles, daggers, knives, swords, and axes; supply props such as bullets; accessories such as quick-loading magazines, scopes, and silencers that can be mounted on designated virtual weapons and provide attribute bonuses as virtual attachments; and defensive props such as shields, armor, and armored vehicles.
In the embodiment of the present application, the virtual prop includes a throwing prop, such as a virtual flying knife, a virtual flying axe, a virtual grenade, a virtual flash bomb, and the like.
4) First person shooting game
A first-person shooting game is a shooting game that the user plays from a first-person perspective; the picture of the virtual environment in the game is a picture of the virtual environment observed from the perspective of a first virtual object. In the game, at least two virtual objects engage in a single-match battle mode in the virtual environment. A virtual object survives in the virtual environment by avoiding damage initiated by other virtual objects and dangers present in the virtual environment (such as a poison circle, a swamp, and the like); when a virtual object's health in the virtual environment reaches zero, its life in the virtual environment ends, and the virtual objects that ultimately survive in the virtual environment are the winners. Optionally, each client may control one or more virtual objects in the virtual environment; a battle takes the moment the first client joins as its start time and the moment the last client exits as its end time. Optionally, the competitive mode of the battle may include a solo mode, a duo mode, or a multi-player squad mode; the battle mode is not limited in the embodiments of the present application.
FIG. 1 illustrates a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application. The implementation environment may include: a first terminal 110, a server 120, and a second terminal 130.
The first terminal 110 has an application 111 supporting a virtual environment installed and running, and the application 111 may be a multiplayer online battle program. When the first terminal runs the application 111, a user interface of the application 111 is displayed on the screen of the first terminal 110. The application 111 may be any one of a military simulation program, a multiplayer online battle arena (MOBA) game, a battle-royale shooting game, and a simulation game (SLG). In this embodiment, the application 111 is illustrated as a first-person shooting (FPS) game. The first terminal 110 is a terminal used by the first user 112, and the first user 112 uses the first terminal 110 to control a first virtual object located in the virtual environment, where the first virtual object may be referred to as the master virtual object of the first user 112. The activities of the first virtual object include, but are not limited to: adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, throwing, and releasing skills. Illustratively, the first virtual object is a first virtual character, such as a simulated character or an animated character.
The second terminal 130 has an application 131 supporting a virtual environment installed and running, and the application 131 may be a multiplayer online battle program. When the second terminal 130 runs the application 131, a user interface of the application 131 is displayed on the screen of the second terminal 130. The client may be any one of a military simulation program, a MOBA game, a battle-royale shooting game, and an SLG game; in this embodiment, the application 131 is illustrated as an FPS game. The second terminal 130 is a terminal used by the second user 132, and the second user 132 uses the second terminal 130 to control a second virtual object located in the virtual environment, where the second virtual object may be referred to as the master virtual character of the second user 132. Illustratively, the second virtual object is a second virtual character, such as a simulated character or an animated character.
Optionally, the first virtual object and the second virtual object are in the same virtual world. Optionally, the first virtual object and the second virtual object may belong to the same camp, the same team, the same organization, a friend relationship, or a temporary communication right. Alternatively, the first virtual object and the second virtual object may belong to different camps, different teams, different organizations, or have a hostile relationship.
Optionally, the applications installed on the first terminal 110 and the second terminal 130 are the same, or the applications installed on the two terminals are the same type of application on different operating system platforms (Android or iOS). The first terminal 110 may generally refer to one of a plurality of terminals, and the second terminal 130 may generally refer to another of the plurality of terminals; this embodiment is illustrated with only the first terminal 110 and the second terminal 130. The device types of the first terminal 110 and the second terminal 130 are the same or different and include at least one of a smartphone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop portable computer, and a desktop computer.
Only two terminals are shown in FIG. 1, but in different embodiments multiple other terminals may access the server 120. Optionally, one or more terminals correspond to a developer; a development and editing platform for the application supporting the virtual environment is installed on such a terminal, where the developer can edit and update the application and transmit the updated application installation package to the server 120 through a wired or wireless network, and the first terminal 110 and the second terminal 130 can download the application installation package from the server 120 to update the application.
The first terminal 110, the second terminal 130, and other terminals are connected to the server 120 through a wireless network or a wired network.
The server 120 includes at least one of a server, a server cluster composed of a plurality of servers, a cloud computing platform, and a virtualization center. The server 120 is used to provide background services for applications that support a three-dimensional virtual environment. Optionally, the server 120 undertakes primary computational work and the terminals undertake secondary computational work; alternatively, the server 120 undertakes the secondary computing work and the terminal undertakes the primary computing work; alternatively, the server 120 and the terminal perform cooperative computing by using a distributed computing architecture.
In one illustrative example, the server 120 includes a memory 121, a processor 122, a user account database 123, a battle service module 124, and a user-oriented input/output interface (I/O interface) 125. The processor 122 is configured to load instructions stored in the server 120 and process data in the user account database 123 and the battle service module 124; the user account database 123 is configured to store data of the user accounts used by the first terminal 110, the second terminal 130, and other terminals, such as the avatar of the user account, the nickname of the user account, the combat power ranking of the user account, and the service area where the user account is located; the battle service module 124 is configured to provide multiple battle rooms for users to battle in, such as 1V1 battles, 3V3 battles, and 5V5 battles; and the user-oriented I/O interface 125 is configured to establish communication with the first terminal 110 and/or the second terminal 130 through a wireless or wired network to exchange data.
The virtual scene may be a three-dimensional virtual scene, or the virtual scene may also be a two-dimensional virtual scene. Taking the example that the virtual scene is a three-dimensional virtual scene, please refer to fig. 2, which shows a schematic view of a display interface of the virtual scene according to an exemplary embodiment of the present application. As shown in fig. 2, the display interface of the virtual scene includes a scene screen 200, and the scene screen 200 includes a currently controlled virtual object 210, an environment screen 220 of the three-dimensional virtual scene, and a virtual object 240. The virtual object 240 may be a virtual object controlled by a user or a virtual object controlled by an application program corresponding to other terminals.
In FIG. 2, the currently controlled virtual object 210 and the virtual object 240 are three-dimensional models in the three-dimensional virtual scene, and the environment picture of the three-dimensional virtual scene displayed in the scene picture 200 consists of objects observed from the perspective of the currently controlled virtual object 210; for example, as shown in FIG. 2, the environment picture 220 of the three-dimensional virtual scene displayed from the perspective of the currently controlled virtual object 210 includes the ground 224, the sky 225, the horizon 223, a hill 221, and a factory building 222.
The currently controlled virtual object 210 may release skills or use virtual props, move, and perform specified actions under the control of the user, and virtual objects in the virtual scene may show different three-dimensional models under the user's control. For example, the screen of the terminal supports touch operations, and the scene picture 200 of the virtual scene includes a virtual control; when the user touches the virtual control, the currently controlled virtual object 210 performs the specified action in the virtual scene and shows the correspondingly changed three-dimensional model.
Fig. 3 shows a flowchart of a method of controlling a throwing prop according to an exemplary embodiment of the present application. The method for controlling the throwing prop can be executed by a computer device, and the computer device can be a terminal or a server, or the computer device can also comprise the terminal and the server. As shown in fig. 3, the method for controlling the throwing prop includes:
step 310, displaying a virtual scene interface, where the virtual scene interface is used to display a scene picture of a virtual scene, and the virtual scene includes a first virtual object.
In this embodiment of the present application, the first virtual object may be a virtual object controlled by a terminal that displays the virtual scene interface. Wherein the first virtual object has the ability to use throwing props in a virtual scene.
Step 320, displaying a first scene picture in the virtual scene interface, wherein the first scene picture comprises at least two throwing props; at least two throwing props are located within reach of the first virtual object.
The reachable range of the first virtual object may be the range within which the first virtual object can interact directly without moving. For example, the reachable range is the range within which the first virtual object can interact directly with its limbs without moving; that is, the first virtual object may interact directly with virtual props within the reachable range through a limb (such as a hand or a foot).
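As an illustration only (the patent does not define the geometry of this range), the reachable range can be modeled as a fixed-radius sphere around the virtual object. The following minimal Python sketch assumes that model; the radius value is an assumption.

```python
import math

REACH_RADIUS = 1.5  # assumed reach of the first virtual object, in scene units

def within_reach(object_pos, prop_pos, radius=REACH_RADIUS):
    """True if a prop can be interacted with directly, without moving."""
    return math.dist(object_pos, prop_pos) <= radius

print(within_reach((0.0, 0.0, 0.0), (0.5, 0.8, 0.2)))  # True: within reach
print(within_reach((0.0, 0.0, 0.0), (3.0, 0.0, 0.0)))  # False: must move first
```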
In this embodiment, the first virtual object may have a plurality of throwing props, and when they are to be used, they may be placed simultaneously in the virtual scene within the reachable range corresponding to the first virtual object.
In one possible implementation of the embodiments of the present application, the user may trigger the placement of at least two throwing props within the reach of the first virtual object by a specified operation.
A throwing prop refers to a virtual prop whose flight trajectory or landing point is determined before it is thrown and which takes effect after being thrown. Throwing props include, without limitation, virtual flying knives, virtual flying axes, virtual grenades, virtual flash bombs, virtual smoke bombs, virtual landmines, virtual supply packs, and the like.
For example, please refer to FIG. 4, which shows a schematic view of the display of throwing props according to an embodiment of the present application. As shown in FIG. 4, when the first virtual object is ready to use throwing props, at least two throwing props 41 may be displayed in the virtual scene interface, and the positions of the at least two throwing props 41 in the virtual scene are within the reach of the first virtual object; for example, in FIG. 4, the at least two throwing props are arranged in a row in front of and close to the first virtual object.
Referring to the interface shown in FIG. 4, when the first virtual object has multiple throwing props available for use, they can be taken out by the first virtual object and placed simultaneously at reachable positions before the user controls the first virtual object to throw, without needing to take out the next throwing prop only after the previous one has been thrown.
Step 330, in response to receiving the throwing operation for throwing the prop, displaying a second scene picture in the virtual scene interface, wherein the second scene picture is a picture of the first virtual object throwing at least one throwing prop.
In the embodiments of the present application, the throwing operation may be triggered when the user decides to throw one or more throwing props; the computer device may then control the first virtual object to throw one or more of the throwing props already within its reachable range. When the first virtual object throws two or more throwing props in succession, it only needs to perform the throwing motion continuously, without performing the motion of taking out a throwing prop. This reduces the motion amplitude of the first virtual object when throwing props in succession, keeps its motions during prop throwing as natural and close to reality as possible, and at the same time shortens as much as possible the interval between two props thrown in succession.
For example, taking the interface shown in FIG. 4 as an example, after receiving the user's throwing operation on a throwing prop, the computer device may control the first virtual object to throw one of the displayed throwing props, at which point the first virtual object performs one throwing motion; then, when the next throwing prop needs to be thrown in succession, the computer device controls the first virtual object to perform the throwing motion again to throw another throwing prop. Because the throwing props were displayed in advance within the reachable range of the first virtual object, between two adjacent throwing motions the first virtual object does not need to perform redundant motions such as taking out another throwing prop. This reduces the motions during the first virtual object's continuous throwing, and, while ensuring that the first virtual object's throwing motion is natural and sufficiently realistic, greatly shortens the time interval between consecutive throws.
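The time saving described above can be sketched in a few lines of Python. This is an illustrative model only, not the patent's implementation; the animation durations and the `PropQueue` name are assumptions.

```python
from dataclasses import dataclass, field

TAKE_OUT_TIME = 0.8   # assumed duration of the "take out" motion, seconds
THROW_TIME = 0.3      # assumed duration of the throwing motion itself

@dataclass
class PropQueue:
    """Throwing props pre-placed within the first virtual object's reach."""
    props: list = field(default_factory=lambda: ["flying_knife"] * 5)

    def throw_next(self) -> float:
        """Throw one prop; only the throw motion is needed, because the
        prop is already within reach (no take-out motion per throw)."""
        if not self.props:
            return 0.0
        self.props.pop()
        return THROW_TIME

# Without pre-placement, each throw costs TAKE_OUT_TIME + THROW_TIME;
# with pre-placement, every consecutive throw costs THROW_TIME only.
queue = PropQueue()
total = sum(queue.throw_next() for _ in range(5))
print(f"5 consecutive throws: {total:.1f}s vs "
      f"{5 * (TAKE_OUT_TIME + THROW_TIME):.1f}s without pre-placement")
```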
In summary, in the throwing prop control scheme provided by the embodiments of the present application, at least two throwing props are placed within the reachable range of the user-controlled virtual object before any throwing prop is thrown. When the user triggers a throwing operation, the virtual object can throw one or more throwing props through a continuous throwing motion. This reduces the motions of the virtual object during continuous throwing control and shortens the interval between consecutive throws while keeping the virtual object's throwing motion natural and close to reality, thereby preserving the realism of the virtual object's motion while improving the human-computer interaction efficiency of the user in controlling throws; in turn, the duration of a single match can be reduced, saving the power and data traffic consumed by the terminal.
FIG. 5 shows a flowchart of a throwing prop control method according to an exemplary embodiment of the present application. The throwing prop control method may be executed by a computer device, where the computer device may be a terminal or a server, or may include both the terminal and the server. As shown in FIG. 5, the throwing prop control method includes:
step 501, displaying a virtual scene interface, where the virtual scene interface is used to display a scene picture of a virtual scene, and the virtual scene includes a first virtual object.
In the embodiment of the application, after a user opens an application (such as an application of a shooting game) corresponding to a virtual scene in a terminal and triggers entry into the virtual scene, a computer device may present a virtual scene interface in the terminal through the application.
In a possible implementation manner, the virtual scene interface may include, in addition to the scene picture of the virtual scene, various operation controls, which may be used to control the virtual scene, for example, to control the first virtual object to act (e.g., throw, move, shoot, interact, etc.) in the virtual scene, to open or close a thumbnail map of the virtual scene, to exit the virtual scene, and so on.
Step 502, displaying a first scene picture in a virtual scene interface, wherein the first scene picture comprises at least two throwing props; at least two throwing props are located within reach of the first virtual object.
In one possible implementation, in response to receiving an operation to release a target skill, a computer device presents a first scene screen in a virtual scene interface.
In the embodiment of the present application, the first scene picture may be triggered to be displayed by a trigger operation of a user. For example, the user may release the target skills to trigger the placement of at least two throwing props within reach of the first virtual object.
The target skill may be a skill of the first virtual object itself, or the target skill may also be a skill of the first virtual object after the first virtual object acquires or equips the target virtual prop (for example, acquires a specific chip prop, or acquires a specific weapon prop, or the like).
For example, please refer to FIG. 6, which shows a schematic diagram of throwing prop triggering according to an embodiment of the present application. As shown in FIG. 6, when the first virtual object has the target skill, a skill release control 62 corresponding to the target skill may be displayed in the virtual scene interface 61. After the user clicks the skill release control 62, the virtual scene interface 61 may switch to a picture displaying multiple throwing props (such as virtual flying knives) within the reachable range near the first virtual object; for such a picture, refer to FIG. 4 described above.
In a possible implementation manner, when the display of the first scene picture is triggered by the target skill, the target skill may have a certain cooling time or charging time; that is, after the user triggers, through the target skill, the placement of at least two throwing props within the reachable range of the first virtual object, a certain period of time (which may be configured in advance by the developer) must pass before the user can trigger the placement of at least two throwing props within the reachable range of the first virtual object again. For example, taking FIG. 6 as an example, after the user triggers the skill release control 62, the computer device places at least two throwing props within the reachable range of the first virtual object and sets the skill release control 62 to an inoperable state (or directly hides the skill release control 62); after a period of time elapses, the skill release control 62 may re-enter the operable state (or be displayed again).
In another possible implementation, in addition to limiting the target skills through cooling time or charging time, the use of the target skills may be limited through props or resources. For example, as also shown in fig. 6, after the user triggers the skill release control 62, the computer device sets the skill release control 62 to an inoperable state (or directly deselects the skill release control 62); thereafter, when the user acquires a particular prop, or after a specified number of resources are collected, the computer device may be triggered to control skill release control 62 to re-enter an operable state, or to re-display skill release control 62.
In an embodiment of the present application, the manner in which at least two throwing objects located within the reach of the first virtual object are thrown may be determined by the manner of operation of the throwing operation. The process of controlling the throwing pattern of the throwing prop through the operation pattern of the throwing operation may refer to the subsequent steps.
Step 503, in response to receiving the throwing operation to the throwing prop, obtaining the operation mode of the throwing operation.
In this embodiment, the throwing motion of the throwing prop may be triggered by throwing operations in different operation modes.
Optionally, the triggering manner of the throwing operation may include a single click, a double click, a sliding operation, a long press, and the like; the embodiments of the present application do not limit the triggering manner of the throwing operation.
Step 504, displaying a second scene picture in the virtual scene interface based on the operation mode; the second scene picture is a picture in which the first virtual object throws at least one throwing prop.
In this embodiment, when the first virtual object throws a throwing prop, the throwing prop may be thrown in the direction currently aimed at by the crosshair.
In the embodiments of the present application, when the throwing operation can be triggered in multiple ways, different ways may trigger different throwing modes.
In a possible implementation manner, the process of displaying the second scene picture in the virtual scene interface based on the operation manner may include:
and displaying a picture that the first virtual object throws a single throwing prop in the virtual scene interface in response to the operation mode being the clicking operation.
In the embodiments of the present application, the user can trigger the throwing of a single throwing prop through a single click operation. For example, the user may trigger the throw of a single throwing prop by clicking a throwing control shown in the virtual scene interface, or by clicking an area of the virtual scene interface where no control is shown. That is, each time the computer device detects that the user clicks the throwing control or a blank area (i.e., an area without controls), it controls the first virtual object to throw one throwing prop within the reachable range, realizing quick single throws of the throwing prop.
For example, please refer to FIG. 7, which shows a schematic diagram of a throwing prop being thrown according to an embodiment of the present application. As shown in FIG. 7, when the user clicks the throwing control or a blank area in the virtual scene interface 71, the first virtual object throws a throwing prop 73 within the reachable range with its hand 72 in the direction aimed at by the current crosshair.
In a possible implementation manner, the process of displaying the second scene picture in the virtual scene interface based on the operation manner may include:
and displaying a picture that the first virtual object throws the throwing prop continuously in the virtual scene interface in response to the operation mode being double-click operation or sliding operation.
In the present embodiment, the multiple throwing props within the reachable range described above may also be thrown continuously. For example, the user may trigger continuous throwing of the throwing props within the reachable range by double-clicking the throwing control displayed in the virtual scene interface, or through a sliding operation (such as a swipe gesture) starting from the position of the throwing control; alternatively, the user may trigger continuous throwing of the throwing props by double-clicking an area of the virtual scene interface where no control is shown, or through a sliding operation starting from such an area.
For example, after detecting that the user double-clicks the throwing control or a blank area (i.e., an area without controls), the computer device controls the first virtual object to throw the multiple throwing props within the reachable range continuously at certain time intervals. For example, when the computer device detects that the user double-clicks the throwing control, it controls the first virtual object to throw the 5 throwing props one by one at small time intervals.
For example, taking FIG. 7 as an example, after the user double-clicks the throwing control or a blank area in the virtual scene interface 71, the first virtual object first throws the throwing prop 73 within the reachable range with its hand 72 in the direction aimed at by the current crosshair, and then, without further user operation, throws the next throwing prop 74, and so on, until no throwing prop remains within the reachable range.
In a possible implementation manner, the process of displaying the second scene picture in the virtual scene interface based on the operation manner may include:
and displaying a picture that the first virtual object continuously throws the throwing prop within the operation duration of the throwing operation (namely, the long-press operation or the sliding operation) in the virtual scene interface in response to the operation mode being the long-press operation or the sliding operation.
In the embodiment of the present application, the number of the throwing objects thrown continuously by the first virtual object may also be controlled by the throwing operation. For example, when the throwing operation is a long press operation or a slide operation, the computer device may control the first virtual object to continuously throw the throwing prop within the reachable range for the duration of the long press operation or the slide operation.
For example, taking FIG. 7 as an example, after the user long-presses the throwing control or a blank area in the virtual scene interface 71, the first virtual object first throws the throwing prop 73 within the reachable range with its hand 72 in the direction aimed at by the current crosshair, and then throws the next throwing prop 74 without further user operation, and so on, until the long-press operation ends or no throwing prop remains within the reachable range.
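A minimal sketch of how steps 503-504 might dispatch on the operation mode (single click, double click or slide, long press). The mode names, throw interval, and return convention are assumptions, not the patent's implementation; mode detection itself is assumed to happen upstream (see the single/double click judgment near the end of this document).

```python
THROW_INTERVAL = 0.15  # assumed interval between consecutive throws, seconds

def handle_throw_operation(mode, props, press_duration=0.0):
    """Return which props are thrown and the times at which they fly out.

    'click' throws one prop; 'double_click' and 'slide' throw all remaining
    props one by one at THROW_INTERVAL; 'long_press' keeps throwing for the
    duration of the press.
    """
    if mode == "click":
        thrown = props[:1]
        return thrown, [0.0] * len(thrown)
    if mode in ("double_click", "slide"):
        return props[:], [i * THROW_INTERVAL for i in range(len(props))]
    if mode == "long_press":
        n = min(len(props), int(press_duration // THROW_INTERVAL) + 1)
        return props[:n], [i * THROW_INTERVAL for i in range(n)]
    return [], []

knives = [f"knife_{i}" for i in range(5)]
print(handle_throw_operation("click", knives))         # one knife at t=0.0
print(handle_throw_operation("double_click", knives))  # all five, spaced out
print(handle_throw_operation("long_press", knives, press_duration=0.4))
```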
In one possible implementation, the first scene picture is a picture in which at least two throwing props are suspended within a hand-reachable range of the first virtual object.
In embodiments of the present application, when the first virtual object is a virtual object having a hand, such as a human-shaped virtual object, the computer device may hover the at least two throwing props within reach of the hand of the first virtual object.
The hand-reachable range is the spatial range that the first virtual object can reach through hand motions without moving. When at least two throwing props are suspended within the hand-reachable range of the first virtual object, the first virtual object can throw the throwing props within the reachable range through hand motions.
In one possible implementation, when the first scene picture is a picture in which at least two throwing props are suspended within a hand-reachable range of the first virtual object, the second scene picture is an animated picture in which the first virtual object throws at least one throwing prop by hand.
When the computer device suspends at least two throwing props within the hand-reachable range of the first virtual object, in response to receiving the throwing operation, the computer device may control the first virtual object to throw a throwing prop within the reachable range using its hand.
Because the throwing props are suspended within the hand-reachable range of the first virtual object, when the first virtual object throws them using its hand, the hand motion of the first virtual object can be reduced as much as possible, so that the interval between two successive throws is shortened while the realism of the first virtual object's motion is preserved.
Step 505, in response to the first condition being met, adding a throwing prop within reach of the first virtual object.
In the embodiments of the present application, to keep the throwing motion of the first virtual object natural enough, the reachable range described above is limited; correspondingly, the number of throwing props displayed simultaneously within the reachable range is also limited. As a result, the user may quickly consume the throwing props within the reachable range, and if the user wants to continue throwing, the user has to wait until the display of the first scene picture can be triggered again, which makes throwing inefficient. In this regard, the embodiments of the present application also provide a scheme for automatically replenishing throwing props within the reachable range; that is, when the user controls the first virtual object to throw the throwing props within the reachable range through throwing operations, whether a throwing prop is newly added within the reachable range can be determined by checking a first condition.
The first condition may be a condition corresponding to the effect produced by the throwing props consumed by the first virtual object, for example, whether a valid target object is hit, or whether the first virtual object moves within a specified range of a thrown, landed throwing prop.
In one possible implementation, in response to a first condition being met, adding a throw prop within reach of a first virtual object, comprising:
in response to the throwing prop hitting the target object after being thrown, a throwing prop is newly added within the reach of the first virtual object.
In the embodiments of the present application, taking whether the throwing prop hits the target object as the condition for deciding whether to replenish throwing props, when the computer device detects that a throwing prop thrown by the first virtual object hits a target object, one or more throwing props are newly added within the reachable range of the first virtual object.
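A sketch of this first-condition check, assuming the scene reports a hit callback for each thrown prop; the class name and the replenishment count of one prop per hit are illustrative assumptions.

```python
class ThrowingState:
    """Tracks throwing props within reach and replenishes them on hits."""

    def __init__(self, initial=5):
        self.in_reach = initial

    def throw(self):
        if self.in_reach > 0:
            self.in_reach -= 1

    def on_prop_resolved(self, hit_target: bool):
        """First condition: the thrown prop hit a target object."""
        if hit_target:
            self.in_reach += 1  # newly add one prop within reach

state = ThrowingState()
state.throw()                 # 4 props remain within reach
state.on_prop_resolved(True)  # the throw hit a target -> replenished
print(state.in_reach)         # 5
```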
The scheme shown in the above embodiment of the present application is described by taking as an example that the first condition includes a thrown throwing prop hitting the target object; optionally, the first condition may also include other conditions, for example, that the target object is eliminated by a throwing prop, and the like.
In one possible implementation, when the target skill has a duration, the presenting, in response to receiving the throwing operation to throw the prop, the second scene screen in the virtual scene interface may include:
in response to receiving a throwing operation to throw the prop within a duration of the target skill, a second scene screen is presented in the virtual scene interface.
In the embodiments of the present application, the time for using the throwing props may be limited; that is, after the user triggers the target skill, the user can, through control operations, trigger the first virtual object to throw the throwing props within the reachable range only within the duration of the target skill. Optionally, when the duration of the target skill ends, the state in which the throwing props within the reachable range can be thrown is exited; correspondingly, the computer device may cancel the throwing props set within the reachable range.
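A sketch of this duration gate, assuming the client tracks the elapsed time since the skill was released; the duration value is an assumption.

```python
class TargetSkill:
    """Keeps throwing props available only for a limited duration."""

    def __init__(self, duration=10.0):
        self.duration = duration

    def can_throw(self, elapsed: float) -> bool:
        """Throwing operations are honored only within the duration."""
        return elapsed < self.duration

skill = TargetSkill(duration=10.0)
print(skill.can_throw(3.0))   # True: the second scene picture is shown
print(skill.can_throw(12.0))  # False: props within reach are cancelled
```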
In one possible implementation, the duration of the target skill is increased in response to the second condition being met.
In this embodiment of the application, the computer device may also adjust the duration of the target skill based on the situation of the first virtual object in the virtual scene, in combination with a second condition.
In a possible implementation manner, the above-mentioned process of increasing the duration of the target skill in response to the second condition being met may include:
increasing the duration of the target skill in response to the number of target objects hit by the throwing props reaching a number threshold within the duration of the target skill.
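A sketch of this second-condition check; the threshold and bonus values are assumptions.

```python
HIT_THRESHOLD = 3      # assumed number threshold of hit target objects
DURATION_BONUS = 5.0   # assumed extra seconds granted when it is reached

def maybe_extend(duration, hits):
    """Second condition: enough targets hit within the skill duration."""
    return duration + DURATION_BONUS if hits >= HIT_THRESHOLD else duration

print(maybe_extend(10.0, 2))  # 10.0: threshold not reached
print(maybe_extend(10.0, 3))  # 15.0: duration extended
```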
In this embodiment of the application, in order to improve the efficiency of the user in controlling the first virtual object to throw the throwing prop continuously, the computer device may further extend the duration of the target skill by combining the number of target objects hit by the throwing prop.
The scheme shown in the above embodiment of the present application is described only by taking as an example that the second condition includes the number of target objects hit by the throwing props reaching the number threshold; optionally, the second condition may also include other conditions, for example, that the number of target objects eliminated by the throwing props reaches a number threshold, and the like.
In one possible implementation, the computer device may further display timing information in the virtual scene interface within the duration of the target skill, the timing information indicating a remaining duration of the target skill.
In this embodiment of the application, when the target skill has a duration, the computer device may display the remaining duration of the target skill in the virtual scene interface, so that the user can know the status of the target skill's duration at any time and better control the first virtual object's throwing.
For example, the computer device may display the remaining duration around the skill release control of the target skill. Please refer to FIG. 8, which shows a schematic diagram of displaying the remaining duration according to an embodiment of the present application. As shown in FIG. 8, a skill release control 81 is shown in the virtual scene interface; after the skill release control 81 is triggered, a timing bar 82 may be shown around the periphery of the skill release control 81, with the length of the timing bar 82 representing the remaining duration. Alternatively, the computer device may display the value of the remaining duration at the center of the skill release control 81.
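For illustration, the timing bar's length can be derived from the remaining fraction of the duration; this is a hypothetical helper, not the patent's code.

```python
def timing_bar_fraction(elapsed: float, duration: float) -> float:
    """Fraction of the timing bar to draw around the skill release control;
    the numeric remaining duration could be shown at the control's center."""
    return max(0.0, (duration - elapsed) / duration)

print(timing_bar_fraction(2.5, 10.0))  # 0.75 of the bar remains
```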
The embodiments of the present application do not limit the display manner of the timing information.
In summary, in the throwing prop control scheme provided by the embodiments of the present application, at least two throwing props are placed within the reachable range of the user-controlled virtual object before any throwing prop is thrown. When the user triggers a throwing operation, the virtual object can throw one or more throwing props through a continuous throwing motion. This reduces the motions of the virtual object during continuous throwing control and shortens the interval between consecutive throws while keeping the virtual object's throwing motion natural and close to reality, thereby preserving the realism of the virtual object's motion while improving the human-computer interaction efficiency of the user in controlling throws; in turn, the duration of a single match can be reduced, saving the power and data traffic consumed by the terminal.
Taking the application of the scheme shown in the above embodiments of the present application to a game scenario as an example, a flying-knife skill weapon can be added to a game through this scheme. For example, the weapon in the scheme is an ultimate-skill weapon; the player needs to activate it before use, and the activation manner may be a time cooldown, that is, the player can use the weapon only after waiting a certain time. After the ultimate-skill weapon is activated, a number of flying knives, for example 5, may be displayed at the hand of the game character controlled by the player, and all 5 flying knives can be thrown before the use of the ultimate skill ends.

Correspondingly, through the scheme shown in the embodiments of the present application, a flying-knife operation manner can be added to the game. For example, the flying knives can be operated by tapping the fire button. With a single tap of the fire button, the game character controlled by the player throws one flying knife precisely in the aimed direction, and if the knife hits the target, one flying knife may be restored immediately. If the player double-taps the fire button quickly, the game character may release all of the remaining flying knives; the flying knives fly out one by one at a certain interval rather than all at once, where the interval may be relatively short. This operation suits scenes with multiple targets, and the flight direction of each flying knife can change as the crosshair moves. For example, if the player moves the game character during the continuous release of the flying knives, the position of the knife-throwing hand also follows the current position of the game character.
The principle for judging single and double clicks can be as follows: first, determine whether the fire button is clicked, where each button has a center point and a click radius R; when the player taps the screen with a finger, the computer device acquires the tapped position and calculates its distance to the center of the fire button, and if the distance is smaller than the radius R, the fire button is considered clicked. When the player presses the fire button, the computer device records the time of the click and the first flying knife flies out; if the player clicks the fire button again within a specified time, it is judged to be a double click, and the remaining flying knives are then thrown according to the setting.
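The click-radius test and the double-click time window described above can be illustrated with the following Python sketch; the coordinates, radius, and time window are hypothetical values, not parameters from the embodiments:

```python
import math
import time

FIRE_KEY_CENTER = (900.0, 500.0)  # hypothetical screen coordinates
FIRE_KEY_RADIUS = 48.0            # click radius R, in pixels
DOUBLE_CLICK_WINDOW = 0.3         # "specified time" for a double click, seconds

_last_click_time = None

def hits_fire_key(x, y):
    """A tap counts as a fire-button click if its distance to the center is < R."""
    dx, dy = x - FIRE_KEY_CENTER[0], y - FIRE_KEY_CENTER[1]
    return math.hypot(dx, dy) < FIRE_KEY_RADIUS

def classify_tap(x, y, now=None):
    """Return 'single', 'double', or None for a screen tap at (x, y)."""
    global _last_click_time
    if not hits_fire_key(x, y):
        return None
    now = time.monotonic() if now is None else now
    if _last_click_time is not None and now - _last_click_time <= DOUBLE_CLICK_WINDOW:
        _last_click_time = None
        return "double"  # second press within the window: release the rest
    _last_click_time = now
    return "single"      # first press: one knife flies out immediately
```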
The flight trajectory of the throwing prop is determined by parameters such as the direction of the initial velocity and the acceleration. During the flight of the flying knife, a ray is cast forward to detect obstacles (such as target objects) in front. When an object is detected, it means that the next frame will touch the obstacle, and a collision point is then obtained; if the object touched is a target object, the damage can be calculated at that point.
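A per-frame update along these lines might look as follows. This is a sketch under the assumption of a hypothetical scene.raycast(origin, direction, max_distance) collision query, which stands in for whatever ray-detection facility the engine actually provides:

```python
def step_projectile(pos, vel, accel, dt, scene):
    """Advance the prop one frame; return (new_pos, new_vel, hit_point_or_None)."""
    speed = sum(v * v for v in vel) ** 0.5
    if speed > 0.0:
        direction = tuple(v / speed for v in vel)
        # Look one frame ahead: if the ray hits within this frame's travel
        # distance, the next frame would touch the obstacle.
        hit = scene.raycast(pos, direction, max_distance=speed * dt)
        if hit is not None:
            # Collision point obtained; if a target object was touched,
            # damage would be calculated here.
            return hit, vel, hit
    # No obstacle ahead: integrate velocity and position.
    new_vel = tuple(v + a * dt for v, a in zip(vel, accel))
    new_pos = tuple(p + v * dt for p, v in zip(pos, new_vel))
    return new_pos, new_vel, None
```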
Taking a game scenario as an example, please refer to fig. 9, which is a flowchart illustrating the control process of a flying-knife weapon (corresponding to the aforementioned throwing prop) according to an exemplary embodiment of the present application. As shown in fig. 9, the flying-knife weapon may be controlled as follows (an illustrative code sketch of this loop is given after the steps):
S901, the player activates the flying-knife skill weapon.

S902, determine whether the player taps to use the flying-knife skill weapon; if yes, proceed to S903; otherwise, return.

S903, 5 flying knives (i.e., the above-mentioned throwing props) are displayed around the game character.

S904, determine whether the player presses the fire button; if so, proceed to S905; otherwise, return to S903.

S905, the game character throws one flying knife, and the number of flying knives around the game character decreases by one.

S906, determine whether the flying knife hits a target object; if so, proceed to S907; otherwise, proceed to S908.

S907, one flying knife is added around the game character.

S908, determine whether the player double-clicks the fire button; if yes, proceed to S909; otherwise, return to S904.

S909, the game character throws out the remaining flying knives one by one.

S910, determine whether the flying-knife ultimate skill has ended; if so, proceed to S911; otherwise, return to S904.

S911, switch back to an ordinary weapon, for example, the weapon that was in use before the flying-knife ultimate-skill weapon was activated.
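The following compact Python sketch traces the same S901-S911 flow; the event tokens and the coin-flip hit test are hypothetical stand-ins used only to exercise each branch:

```python
import random

KNIFE_COUNT = 5

def run_flying_knife_skill(events):
    """Drive the S901-S911 flow from a stream of input events.

    `events` yields hypothetical tokens: 'use_skill', 'fire', 'double_fire',
    'skill_end'. Hit detection is faked with a coin flip purely so the
    replenish branch (S906-S907) is exercised.
    """
    knives = 0
    for event in events:
        if event == "use_skill":          # S902 -> S903
            knives = KNIFE_COUNT
        elif event == "fire" and knives:  # S904 -> S905
            knives -= 1                   # throw one knife
            if random.random() < 0.5:     # S906: did it hit a target?
                knives += 1               # S907: restore one knife
        elif event == "double_fire":      # S908 -> S909
            knives = 0                    # remaining knives fly out one by one
        elif event == "skill_end":        # S910 -> S911
            return "ordinary_weapon"      # switch back
    return "ordinary_weapon"

print(run_flying_knife_skill(["use_skill", "fire", "double_fire", "skill_end"]))
```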
Fig. 10 shows a block diagram of a throwing prop control apparatus provided in an exemplary embodiment of the present application. The throwing prop control apparatus may be used in a computer device to perform all or part of the steps of the methods shown in fig. 3 or fig. 5. As shown in fig. 10, the throwing prop control apparatus includes:
an interface display module 1001, configured to display a virtual scene interface, where the virtual scene interface is used to display a scene picture of a virtual scene, and the virtual scene includes a first virtual object;
a first picture display module 1002, configured to display a first scene picture in the virtual scene interface, where the first scene picture includes at least two throwing props, and the at least two throwing props are located within the reachable range of the first virtual object;
a second picture display module 1003, configured to display, in response to receiving a throwing operation on the throwing prop, a second scene picture in the virtual scene interface, where the second scene picture is a picture in which the first virtual object throws at least one throwing prop.
In one possible implementation, the first scene picture is a picture in which the at least two throwing props are suspended within a hand-reachable range of the first virtual object;
the second scene picture is an animation picture in which the first virtual object throws at least one throwing prop with a hand.
In a possible implementation manner, the second picture display module 1003 is configured to:
in response to receiving the throwing operation on the throwing prop, acquire the operation mode of the throwing operation;
and display the second scene picture in the virtual scene interface based on the operation mode.
In a possible implementation manner, the second picture display module 1003 is configured to, in response to the operation mode being a single-click operation, display, in the virtual scene interface, a picture in which the first virtual object throws a single throwing prop.
In a possible implementation manner, the second picture display module 1003 is configured to, in response to the operation mode being a double-click operation or a sliding operation, display, in the virtual scene interface, a picture in which the first virtual object throws the throwing props continuously.
In a possible implementation manner, the second picture display module 1003 is configured to, in response to the operation mode being a long-press operation or a sliding operation, display, in the virtual scene interface, a picture in which the first virtual object throws the throwing props continuously within the operation duration of the throwing operation.
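By way of illustration, the dispatch performed by the second picture display module 1003 can be sketched as follows; the mode strings and rendering callbacks are hypothetical:

```python
def present_second_scene(operation_mode, show_single_throw, show_burst_throw):
    """Choose which throwing picture to display for a given operation mode.

    The two callbacks are hypothetical hooks into the rendering layer:
    show_single_throw() displays a single throw, and show_burst_throw()
    displays continuous throws. The embodiments list the slide gesture under
    both the double-click and the long-press branch, so it is mapped to the
    continuous picture here.
    """
    if operation_mode == "single_click":
        show_single_throw()                 # one prop, thrown once
    elif operation_mode in ("double_click", "long_press", "slide"):
        show_burst_throw()                  # props thrown continuously
    else:
        raise ValueError(f"unknown operation mode: {operation_mode}")

# Example wiring with print stubs:
present_second_scene("single_click",
                     lambda: print("single throw picture"),
                     lambda: print("continuous throw picture"))
```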
In one possible implementation, the apparatus further includes:
a prop adding module, configured to add the throwing prop within the reachable range of the first virtual object in response to a first condition being met.
In a possible implementation manner, the prop adding module is configured to add the throwing prop within the reachable range of the first virtual object in response to the throwing prop hitting a target object after being thrown.
In a possible implementation manner, the first picture display module 1002 is configured to, in response to receiving an operation of releasing a target skill, display the first scene picture in the virtual scene interface.
In one possible implementation, the target skill has a duration;
the second picture display module 1003 is configured to, in response to receiving a throwing operation on the throwing prop within the duration of the target skill, display the second scene picture in the virtual scene interface.
In one possible implementation, the apparatus further includes:
a time increasing module, configured to increase the duration of the target skill in response to a second condition being met.
In one possible implementation, the time increasing module is configured to increase the duration of the target skill in response to the number of target objects hit by the throwing prop reaching a number threshold within the duration of the target skill.
In one possible implementation, the apparatus further includes:
a timing information display module, configured to display timing information in the virtual scene interface within the duration of the target skill, where the timing information indicates the remaining duration of the target skill.
To sum up, in the throwing prop control scheme provided by the embodiments of the present application, before a throwing prop is thrown, at least two throwing props are placed within the reachable range of the virtual object controlled by the user. When the user triggers a throwing operation, the virtual object can throw one or more throwing props through continuous throwing actions, which reduces the actions of the virtual object in the control process of continuously throwing the throwing props. On the premise that the throwing actions of the virtual object remain sufficiently natural and realistic, the time interval of continuous throwing is shortened, so that the human-computer interaction efficiency of the user in controlling the throwing is improved while the authenticity of the virtual object's actions is ensured, which in turn can reduce the duration of a single match and save the power and data traffic consumed by the terminal.
Fig. 11 shows a block diagram of a computer device 1100 provided in an exemplary embodiment of the present application. The computer device 1100 may be a portable mobile terminal, such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The computer device 1100 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
Generally, the computer device 1100 includes: a processor 1101 and a memory 1102.
Processor 1101 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1101 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1101 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1101 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering the content that the display screen needs to display. In some embodiments, the processor 1101 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1102 may include one or more computer-readable storage media, which may be non-transitory. Memory 1102 may also include high-speed random access memory as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 1102 is used to store at least one computer instruction, which is executed by processor 1101 to implement the throwing prop control method provided by the method embodiments of the present application.
In some embodiments, the computer device 1100 may also optionally include: a peripheral interface 1103 and at least one peripheral. The processor 1101, memory 1102 and peripheral interface 1103 may be connected by a bus or signal lines. Various peripheral devices may be connected to the peripheral interface 1103 by buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1104, display screen 1105, camera assembly 1106, audio circuitry 1107, positioning assembly 1108, and power supply 1109.
The peripheral interface 1103 may be used to connect at least one peripheral associated with I/O (Input/Output) to the processor 1101 and the memory 1102. In some embodiments, the processor 1101, memory 1102, and peripheral interface 1103 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1101, the memory 1102 and the peripheral device interface 1103 may be implemented on separate chips or circuit boards, which is not limited by this embodiment.
The Radio Frequency circuit 1104 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1104 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1104 converts an electric signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electric signal. Optionally, the radio frequency circuit 1104 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1104 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1104 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1105 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1105 is a touch display screen, the display screen 1105 also has the ability to capture touch signals on or over the surface of the display screen 1105. The touch signal may be input to the processor 1101 as a control signal for processing. At this point, the display screen 1105 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1105, disposed on the front panel of the computer device 1100; in other embodiments, there may be at least two display screens 1105, each disposed on a different surface of the computer device 1100 or in a folded design; in still other embodiments, the display screen 1105 may be a flexible display disposed on a curved surface or a folded surface of the computer device 1100. The display screen 1105 may even be arranged in a non-rectangular irregular pattern, i.e., an irregularly shaped screen. The display screen 1105 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
Camera assembly 1106 is used to capture images or video. Optionally, camera assembly 1106 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1106 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1107 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1101 for processing or inputting the electric signals to the radio frequency circuit 1104 to achieve voice communication. The microphones may be multiple and placed at different locations on the computer device 1100 for stereo sound acquisition or noise reduction purposes. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1101 or the radio frequency circuit 1104 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1107 may also include a headphone jack.
The positioning component 1108 is used to locate the current geographic location of the computer device 1100 for navigation or LBS (Location Based Service). The positioning component 1108 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1109 is used to provide power to the various components within the computer device 1100. The power supply 1109 may be alternating current, direct current, disposable or rechargeable. When the power supply 1109 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the computer device 1100 also includes one or more sensors 1110. The one or more sensors 1110 include, but are not limited to: acceleration sensor 1111, gyro sensor 1112, pressure sensor 1113, fingerprint sensor 1114, optical sensor 1115, and proximity sensor 1116.
The acceleration sensor 1111 can detect the magnitude of acceleration in three coordinate axes of a coordinate system established with the computer apparatus 1100. For example, the acceleration sensor 1111 may be configured to detect components of the gravitational acceleration in three coordinate axes. The processor 1101 may control the display screen 1105 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1111. The acceleration sensor 1111 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1112 may detect a body direction and a rotation angle of the computer device 1100, and the gyro sensor 1112 may cooperate with the acceleration sensor 1111 to acquire a 3D motion of the user on the computer device 1100. From the data collected by gyroscope sensor 1112, processor 1101 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensors 1113 may be disposed on the side bezel of the computer device 1100 and/or underneath the display screen 1105. When the pressure sensor 1113 is disposed on the side frame of the computer device 1100, the holding signal of the user to the computer device 1100 can be detected, and the processor 1101 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1113. When the pressure sensor 1113 is disposed at the lower layer of the display screen 1105, the processor 1101 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 1105. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1114 is configured to collect a fingerprint of the user, and the processor 1101 identifies the user according to the fingerprint collected by the fingerprint sensor 1114, or the fingerprint sensor 1114 identifies the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the user is authorized by the processor 1101 to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying for and changing settings, etc. The fingerprint sensor 1114 may be disposed on the front, back, or side of the computer device 1100. When a physical key or vendor Logo is provided on the computer device 1100, the fingerprint sensor 1114 may be integrated with the physical key or vendor Logo.
Optical sensor 1115 is used to collect ambient light intensity. In one embodiment, the processor 1101 may control the display brightness of the display screen 1105 based on the ambient light intensity collected by the optical sensor 1115. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1105 is increased; when the ambient light intensity is low, the display brightness of the display screen 1105 is reduced. In another embodiment, processor 1101 may also dynamically adjust the shooting parameters of camera assembly 1106 based on the ambient light intensity collected by optical sensor 1115.
The proximity sensor 1116, also referred to as a distance sensor, is typically disposed on the front panel of the computer device 1100. The proximity sensor 1116 is used to capture the distance between the user and the front face of the computer device 1100. In one embodiment, when the proximity sensor 1116 detects that the distance between the user and the front face of the computer device 1100 gradually decreases, the processor 1101 controls the display screen 1105 to switch from the screen-on state to the screen-off state; when the proximity sensor 1116 detects that the distance between the user and the front face of the computer device 1100 gradually increases, the processor 1101 controls the display screen 1105 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration illustrated in FIG. 11 does not constitute a limitation of the computer device 1100, and may include more or fewer components than those illustrated, or may combine certain components, or may employ a different arrangement of components.
Fig. 12 shows a block diagram of a computer device 1200 according to an exemplary embodiment of the present application. The computer device may be implemented as the computer device in the above-mentioned aspects of the present application. The computer device 1200 includes a CPU (Central Processing Unit) 1201, a system memory 1204 including a RAM (Random Access Memory) 1202 and a ROM (Read-Only Memory) 1203, and a system bus 1205 connecting the system memory 1204 and the CPU 1201. The computer device 1200 also includes a basic input/output system (I/O system) 1206, which facilitates the transfer of information between devices within the computer, and a mass storage device 1207 for storing an operating system 1213, application programs 1214, and other program modules 1215.
The basic input/output system 1206 includes a display 1208 for displaying information and an input device 1209, such as a mouse or keyboard, for a user to input information. The display 1208 and the input device 1209 are both connected to the central processing unit 1201 through an input/output controller 1210 coupled to the system bus 1205. The basic input/output system 1206 may also include the input/output controller 1210 for receiving and processing input from a number of other devices, such as a keyboard, mouse, or electronic stylus. Similarly, the input/output controller 1210 also provides output to a display screen, a printer, or another type of output device.
The mass storage device 1207 is connected to the central processing unit 1201 through a mass storage controller (not shown) connected to the system bus 1205. The mass storage device 1207 and its associated computer-readable media provide non-volatile storage for the computer device 1200. That is, the mass storage device 1207 may include a computer-readable medium (not shown) such as a hard disk or a Compact disk-Only Memory (CD-ROM) drive.
Without loss of generality, the computer-readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash Memory or other solid state Memory technology, CD-ROM, Digital Versatile Disks (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. Of course, those skilled in the art will appreciate that the computer storage media is not limited to the foregoing. The system memory 1204 and mass storage device 1207 described above may be collectively referred to as memory.
According to various embodiments of the present disclosure, the computer device 1200 may also operate through a remote computer connected via a network, such as the Internet. That is, the computer device 1200 may connect to the network 1212 through a network interface unit 1211 coupled to the system bus 1205, or may connect to other types of networks or remote computer systems (not shown) using the network interface unit 1211.
The memory further stores at least one computer instruction, and the central processing unit 1201 implements all or part of the steps of the throwing prop control method shown in the various embodiments described above by executing the at least one computer instruction.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, such as a memory including at least one computer instruction, where the at least one computer instruction is executable by a processor to perform all or part of the steps of the method shown in any of the embodiments of fig. 3 or fig. 5 described above. For example, the non-transitory computer-readable storage medium may be a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product or a computer program is also provided, which comprises computer instructions, which are stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform all or part of the steps of the method shown in any of the embodiments of fig. 3 or fig. 5.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (17)

1. A method of controlling a throwing prop, the method comprising:
displaying a virtual scene interface, wherein the virtual scene interface is used for displaying a scene picture of a virtual scene, and the virtual scene comprises a first virtual object;
displaying a first scene picture in the virtual scene interface, wherein the first scene picture comprises at least two throwing props, and the at least two throwing props are located within a reachable range of the first virtual object;
in response to receiving a throwing operation of the throwing prop, displaying a second scene picture in the virtual scene interface, wherein the second scene picture is a picture of the first virtual object throwing at least one throwing prop.
2. The method of claim 1,
the first scene picture is a picture in which the at least two throwing props are suspended within a hand-reachable range of the first virtual object;
the second scene picture is an animation picture in which the first virtual object throws at least one throwing prop with a hand.
3. The method of claim 1, wherein said displaying a second scene picture in the virtual scene interface in response to receiving a throwing operation of the throwing prop comprises:
in response to receiving the throwing operation of the throwing prop, acquiring an operation mode of the throwing operation;
and displaying the second scene picture in the virtual scene interface based on the operation mode.
4. The method according to claim 3, wherein the displaying the second scene picture in the virtual scene interface based on the operation mode comprises:
in response to the operation mode being a single-click operation, displaying, in the virtual scene interface, a picture of the first virtual object throwing a single throwing prop.
5. The method according to claim 3, wherein the displaying the second scene picture in the virtual scene interface based on the operation mode comprises:
in response to the operation mode being a double-click operation or a sliding operation, displaying, in the virtual scene interface, a picture of the first virtual object throwing the throwing props continuously.
6. The method according to claim 3, wherein the displaying the second scene picture in the virtual scene interface based on the operation mode comprises:
in response to the operation mode being a long-press operation or a sliding operation, displaying, in the virtual scene interface, a picture of the first virtual object throwing the throwing props continuously within the operation duration of the throwing operation.
7. The method of claim 1, further comprising:
in response to a first condition being met, adding the throwing prop within the reachable range of the first virtual object.
8. The method of claim 7, wherein said adding the throwing prop within reach of the first virtual object in response to a first condition being met comprises:
in response to the throwing prop hitting a target object after being thrown, adding the throwing prop within the reachable range of the first virtual object.
9. The method of claim 1, wherein said displaying a first scene picture in the virtual scene interface comprises:
in response to receiving an operation of releasing a target skill, displaying the first scene picture in the virtual scene interface.
10. The method of claim 9, wherein the target skill has a duration;
the displaying a second scene picture in the virtual scene interface in response to receiving a throwing operation of the throwing prop includes:
in response to receiving the throwing operation of the throwing prop within the duration of the target skill, displaying the second scene picture in the virtual scene interface.
11. The method of claim 10, further comprising:
in response to a second condition being met, increasing a duration of the target skill.
12. The method of claim 11, wherein increasing the duration of the target skill in response to a second condition being met comprises:
increasing the duration of the target skill in response to the number of target objects hit by the throwing prop reaching a number threshold within the duration of the target skill.
13. The method of claim 10, further comprising:
displaying timing information in the virtual scene interface within the duration of the target skill, wherein the timing information is used for indicating the remaining duration of the target skill.
14. A throwing prop control apparatus, the apparatus comprising:
the interface display module is used for displaying a virtual scene interface, the virtual scene interface is used for displaying a scene picture of a virtual scene, and the virtual scene comprises a first virtual object;
the first picture display module is used for displaying a first scene picture in the virtual scene interface, wherein the first scene picture comprises at least two throwing props, and the at least two throwing props are located within a reachable range of the first virtual object;
and the second picture display module is used for displaying, in response to receiving a throwing operation of the throwing prop, a second scene picture in the virtual scene interface, wherein the second scene picture is a picture in which the first virtual object throws at least one throwing prop.
15. A computer device comprising a processor and a memory having stored therein at least one computer instruction loaded and executed by the processor to implement a method of throwing prop control according to any of claims 1 to 13.
16. A computer readable storage medium having stored therein at least one computer instruction, which is loaded and executed by a processor to implement a method of throwing prop control according to any one of claims 1 to 13.
17. A computer program product, characterized in that it comprises computer instructions which are read and executed by a processor of a computer device, causing the computer device to perform the method of throwing prop control according to any one of claims 1 to 13.
CN202111060411.6A 2021-09-10 2021-09-10 Throwing prop control method, throwing prop control device, computer equipment and storage medium Active CN113713383B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111060411.6A CN113713383B (en) 2021-09-10 2021-09-10 Throwing prop control method, throwing prop control device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111060411.6A CN113713383B (en) 2021-09-10 2021-09-10 Throwing prop control method, throwing prop control device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113713383A true CN113713383A (en) 2021-11-30
CN113713383B CN113713383B (en) 2023-06-27

Family

ID=78683233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111060411.6A Active CN113713383B (en) 2021-09-10 2021-09-10 Throwing prop control method, throwing prop control device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113713383B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0990458A2 (en) * 1998-09-28 2000-04-05 Konami Co., Ltd. Video game machine, method for switching viewpoint on gamescreen of video game, and computer-readable recording medium containing game-screen-viewpoint switching program
JP2018089120A (en) * 2016-12-02 2018-06-14 株式会社コナミデジタルエンタテインメント Game control device, game system and program
CN110427111A (en) * 2019-08-01 2019-11-08 腾讯科技(深圳)有限公司 The operating method of virtual item, device, equipment and storage medium in virtual environment
CN112121414A (en) * 2020-09-29 2020-12-25 腾讯科技(深圳)有限公司 Tracking method and device in virtual scene, electronic equipment and storage medium
CN112138384A (en) * 2020-10-23 2020-12-29 腾讯科技(深圳)有限公司 Using method, device, terminal and storage medium of virtual throwing prop
CN113069772A (en) * 2021-03-31 2021-07-06 网易(杭州)网络有限公司 Method and device for assembling virtual props in game and electronic equipment

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023179292A1 (en) * 2022-03-21 2023-09-28 北京字跳网络技术有限公司 Virtual prop driving method and apparatus, electronic device and readable storage medium
WO2024037150A1 (en) * 2022-08-19 2024-02-22 腾讯科技(深圳)有限公司 Human-computer interaction method and apparatus based on virtual world, and device, medium and product
WO2024098984A1 (en) * 2022-11-07 2024-05-16 腾讯科技(深圳)有限公司 Virtual-prop control method and apparatus, and device and storage medium

Also Published As

Publication number Publication date
CN113713383B (en) 2023-06-27

Similar Documents

Publication Publication Date Title
CN110694261B (en) Method, terminal and storage medium for controlling virtual object to attack
CN108434736B (en) Equipment display method, device, equipment and storage medium in virtual environment battle
WO2021143259A1 (en) Virtual object control method and apparatus, device, and readable storage medium
CN110585710B (en) Interactive property control method, device, terminal and storage medium
CN110917619B (en) Interactive property control method, device, terminal and storage medium
WO2021184806A1 (en) Interactive prop display method and apparatus, and terminal and storage medium
CN110465098B (en) Method, device, equipment and medium for controlling virtual object to use virtual prop
CN112076467B (en) Method, device, terminal and medium for controlling virtual object to use virtual prop
WO2021203856A1 (en) Data synchronization method and apparatus, terminal, server, and storage medium
CN111282275A (en) Method, device, equipment and storage medium for displaying collision traces in virtual scene
CN111475029B (en) Operation method, device, equipment and storage medium of virtual prop
CN110917623B (en) Interactive information display method, device, terminal and storage medium
CN113713383B (en) Throwing prop control method, throwing prop control device, computer equipment and storage medium
CN112870715B (en) Virtual item putting method, device, terminal and storage medium
CN111714893A (en) Method, device, terminal and storage medium for controlling virtual object to recover attribute value
CN113289331B (en) Display method and device of virtual prop, electronic equipment and storage medium
CN113713382B (en) Virtual prop control method and device, computer equipment and storage medium
CN112138384A (en) Using method, device, terminal and storage medium of virtual throwing prop
CN110755844B (en) Skill activation method and device, electronic equipment and storage medium
CN111744184A (en) Control display method in virtual scene, computer equipment and storage medium
CN111744186A (en) Virtual object control method, device, equipment and storage medium
CN111659117A (en) Virtual object display method and device, computer equipment and storage medium
WO2021143253A1 (en) Method and apparatus for operating virtual prop in virtual environment, device, and readable medium
CN112316421A (en) Equipment method, device, terminal and storage medium of virtual prop
CN111921190A (en) Method, device, terminal and storage medium for equipping props of virtual objects

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant