CN115624753A - Virtual gun interaction method and device, storage medium and electronic equipment - Google Patents

Virtual gun interaction method and device, storage medium and electronic equipment

Info

Publication number
CN115624753A
CN115624753A (application CN202211274167.8A)
Authority
CN
China
Prior art keywords: bullet, control, initial, touch operation, virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211274167.8A
Other languages
Chinese (zh)
Inventor
江阳晨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202211274167.8A
Publication of CN115624753A
Priority to PCT/CN2023/083854
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/57: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/822: Strategy games; Role-playing games
    • A63F13/837: Shooting of targets
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80: Features of games using an electronically generated display having two or more dimensions, specially adapted for executing a specific type of game
    • A63F2300/807: Role playing or strategy games
    • A63F2300/8076: Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the disclosure provides a virtual firearm interaction method, a virtual firearm interaction device, a storage medium, and an electronic device. The method comprises the following steps: acquiring the number of initial bullet props of the virtual firearm; when the number of initial bullet props is greater than a first preset value, merging a first number of the initial bullet props into a first bullet prop in response to a first touch operation acting on the bullet changing control; and controlling the virtual firearm to interact with a virtual object in the virtual scene through the first bullet prop in response to a use operation for the first bullet prop. The technical scheme of the embodiment of the disclosure can improve the efficiency and flexibility of virtual firearm interaction.

Description

Virtual gun interaction method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of virtual interaction technologies, and in particular, to a virtual firearm interaction method, a virtual firearm interaction apparatus, a computer-readable storage medium, and an electronic device.
Background
A virtual character may be included in the virtual scene, and the user may control the virtual character to shoot with a virtual firearm to interact with other virtual objects in the virtual scene. The result of the interaction includes a change in an attribute, such as a life value attribute.
In the current shooting interaction scheme, the virtual firearm shoots at and interacts with a virtual object using the existing bullets, and the effect achieved by each interaction is exactly the same.
With the above scheme, the user can only interact through the virtual firearm based on fixed bullets. A user with skilled shooting technique must carry out many interactions and accumulate their effects before a better result is achieved, while a user with poor shooting technique easily wastes bullets and cannot obtain more interactions for practice or other purposes.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the disclosed embodiments is to provide a virtual firearm interaction method, a virtual firearm interaction apparatus, a computer-readable storage medium, and an electronic device, so as to improve the interaction efficiency and the fault tolerance of virtual firearm interaction.
A first aspect of the embodiments of the present disclosure provides a virtual firearm interaction method, in which a terminal device provides a graphical user interface, at least part of which displays a virtual scene and a bullet changing control, the virtual scene including a virtual object. The method includes: acquiring the number of initial bullet props of the virtual firearm; when the number of the initial bullet props is greater than a first preset value, merging a first number of the initial bullet props into a first bullet prop in response to a first touch operation acting on the bullet changing control, wherein the first number is less than or equal to the number of the initial bullet props, and the attack force of the first bullet prop is higher than the attack force of the initial bullet prop; and controlling the virtual firearm to interact with the virtual object in the virtual scene through the first bullet prop in response to a use operation for the first bullet prop.
According to a second aspect of the embodiments of the present disclosure, a virtual firearm interaction apparatus is provided, where a terminal device provides a graphical user interface, at least part of which displays a virtual scene and a bullet changing control, the virtual scene including a virtual object. The apparatus includes: an initial quantity acquisition module, configured to acquire the number of initial bullet props of a virtual firearm; a prop synthesis module, configured to merge a first number of the initial bullet props into a first bullet prop in response to a first touch operation acting on the bullet changing control when the number of the initial bullet props is greater than a first preset value, wherein the first number is less than or equal to the number of the initial bullet props, and the attack force of the first bullet prop is higher than the attack force of the initial bullet prop; and a shooting interaction module, configured to control the virtual firearm to interact with the virtual object in the virtual scene through the first bullet prop in response to a use operation for the first bullet prop.
According to a third aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements a virtual firearm interaction method as the first aspect in the above embodiments.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including: one or more processors; a storage device to store one or more programs that, when executed by one or more processors, cause the one or more processors to implement the virtual firearm interaction method of the first aspect as in the above embodiments.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
in the technical solutions provided in some embodiments of the present disclosure, the number of initial bullet props of a virtual firearm may be obtained; when the number of initial bullet props is greater than a first preset value, a first number of the initial bullet props are merged into a first bullet prop in response to a first touch operation acting on the bullet changing control, and the virtual firearm is controlled to interact with a virtual object in the virtual scene through the first bullet prop in response to a use operation for the first bullet prop. By implementing the embodiment of the disclosure, on one hand, when the number of bullet props is greater than the first preset value, the bullet props can be merged to enhance a preset shooting attribute of the virtual firearm, so that a better interaction effect can be achieved without a large number of interaction operations, which improves the interaction efficiency; on the other hand, the interaction mode of the virtual firearm can be defined by the user, which avoids interaction in a single fixed mode and improves the flexibility and freedom of the interaction process.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 schematically illustrates a schematic diagram of an exemplary terminal device to which a virtual firearm interaction method and a virtual firearm interaction apparatus according to an embodiment of the disclosure may be applied;
FIG. 2 schematically illustrates a flow diagram of a virtual firearm interaction method according to one embodiment of the present disclosure;
FIG. 3 schematically illustrates an interface diagram of a first quantity selection control comprising a plurality of quantity nodes, according to one embodiment of the present disclosure;
FIG. 4 schematically illustrates an interface diagram for highlighting an adjustable area in a preset manner, according to one embodiment of the present disclosure;
FIG. 5 schematically illustrates an interface diagram showing a composite control and the toggle display of the bullet changing control as a cancel control according to one embodiment of the present disclosure;
FIG. 6 schematically illustrates an interface diagram showing a split control and the toggle display of the bullet changing control as a cancel control according to one embodiment of the present disclosure;
FIG. 7 schematically illustrates an interface diagram showing the visualization of a numerical value of a duration in accordance with one embodiment of the present disclosure;
FIG. 8 schematically illustrates a schematic diagram of an application scenario in an exemplary embodiment of the present disclosure;
FIG. 9 schematically shows a block diagram of a virtual firearm interaction device in an embodiment in accordance with the present disclosure;
FIG. 10 schematically illustrates a structural diagram of a computer system suitable for use in implementing a terminal device of an embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram of an exemplary terminal device to which a virtual firearm interaction method and a virtual firearm interaction apparatus according to an embodiment of the disclosure may be applied.
As shown in fig. 1, the terminal devices may include one or more of the terminal devices 101, 102, 103. The terminal devices 101, 102, 103 may be various electronic devices having a display screen, including but not limited to desktop computers, portable computers, smart phones, tablet computers, and the like.
The virtual gun interaction method in the embodiment of the present disclosure may be executed on a local terminal device or a server. When the virtual firearm interaction method runs on a server, the method can be implemented and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and a client device.
In an optional embodiment, various cloud applications may be run under the cloud interaction system, for example, cloud games. Taking a cloud game as an example, a cloud game refers to a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program and the body that presents the game picture are separated; the storage and running of the virtual firearm interaction method are completed on a cloud game server, while the client device is used for receiving and sending data and presenting the game picture. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a palm computer, whereas the cloud game server that performs the information processing is in the cloud. When a game is played, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game picture, and returns the data to the client device through a network; finally, the data are decoded by the client device and the game picture is output.
In an optional implementation manner, taking a game as an example, the local terminal device stores the game program and is used for presenting the game screen. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the terminal device. The local terminal device may provide the graphical user interface to the player in a variety of ways; for example, it may be rendered for display on a display screen of the terminal, or provided to the player by holographic projection. By way of example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game screen, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
It should be noted that the computer readable storage medium shown in the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. The computer-readable storage medium may be included in the terminal device described in the above embodiments; or may exist separately without being assembled into the terminal device. The computer-readable storage medium carries one or more programs which, when executed by the terminal device, cause the terminal device to implement the method in the embodiments described below. For example, the terminal device may implement the steps shown in fig. 2, and the like.
In an exemplary embodiment of the present disclosure, the virtual scene may be a digital scene outlined by an intelligent terminal device such as a computer, a mobile phone, a tablet computer, and the like through a digital communication technology, and the digital scene may be on a display screen of the intelligent terminal device or projected onto other display devices. The virtual scene may include buildings or structures such as houses, buildings, gardens, bridges, pools, etc., and may further include natural landscapes such as mountains, rivers, lakes, etc., and any virtual objects such as weapons, tools, creatures, etc., which are not particularly limited in this exemplary embodiment. Virtual characters are three-dimensional models created based on animated skeletal techniques. Each virtual character has its own shape and volume in the three-dimensional virtual environment, occupying a portion of the space in the three-dimensional virtual environment.
The virtual scene picture presented by the graphical user interface may be, for example, a game picture captured by a virtual camera provided in the virtual scene. The virtual camera may capture the virtual scene from a first perspective, a second perspective, or another perspective, for example a top view, which observes the virtual environment from overhead. When a top-down perspective is employed, the virtual camera may be positioned above the virtual scene. Depending on the height of the virtual camera, when the virtual camera is low, the obtained picture may be only a part of the virtual scene, as in a Multiplayer Online Battle Arena (MOBA) game; when the virtual camera is high, the captured picture may be a full view of the virtual scene.
The present example embodiment provides a virtual firearm interaction method. As shown in fig. 2, a graphical user interface is provided by a terminal device, where the graphical user interface at least partially displays a virtual scene and a bullet changing control, the virtual scene includes a virtual object, and the method includes:
Step S210, obtaining the number of initial bullet props of the virtual firearm;
Step S220, when the number of the initial bullet props is greater than a first preset value, merging a first number of the initial bullet props into a first bullet prop in response to a first touch operation acting on the bullet changing control; wherein the first number is less than or equal to the number of the initial bullet props, and the attack force of the first bullet prop is higher than the attack force of the initial bullet prop;
Step S230, controlling the virtual firearm to interact with a virtual object in the virtual scene through the first bullet prop in response to a use operation for the first bullet prop.
By implementing the virtual firearm interaction method shown in fig. 2, the number of initial bullet props of the virtual firearm can be obtained; when that number is greater than a first preset value, a first number of the initial bullet props are merged into a first bullet prop in response to a first touch operation acting on the bullet changing control, and the virtual firearm is controlled to interact with a virtual object in the virtual scene through the first bullet prop in response to a use operation for the first bullet prop. By implementing the embodiment of the disclosure, on one hand, when the number of bullet props is greater than the first preset value, the bullet props can be merged to enhance a preset shooting attribute of the virtual firearm, so that a better interaction effect can be achieved without a large number of interaction operations, which improves the interaction efficiency; on the other hand, the interaction mode of the virtual firearm can be defined by the user, which avoids interaction in a single fixed mode and improves the flexibility and freedom of the interaction process.
Next, the above-described steps of the present exemplary embodiment will be described in more detail.
In step S210, the number of initial bullet props of the virtual firearm is obtained;
in an exemplary embodiment of the present disclosure, the virtual firearm is a virtual prop, which can be used by a user through a terminal device to control a virtual character in a virtual scene. A bullet prop is a prop that is associated with a virtual firearm, for example, the number of bullet props limits the number of times a virtual firearm can be fired, one type of bullet prop matches a particular virtual firearm.
It is understood that, similar to virtual firearms and bullets, virtual longbows and arrows, as well as other combinations of virtual props used in a similar way, may of course also be applied to embodiments of the present disclosure. The disclosed embodiments are not limited herein.
It should be noted that the number of initial bullet props of a virtual firearm may refer to the number of bullet props that have been fitted into the virtual firearm or its clip, or may be the number of bullet props that are present in a virtual warehouse or virtual backpack. The disclosed embodiments are not limited herein.
In step S220, when the number of initial bullet props is greater than a first preset value, a first number of the initial bullet props are merged into a first bullet prop in response to a first touch operation applied to the bullet changing control;
in an exemplary embodiment of the present disclosure, the first preset value may be a value configured according to actual needs. For example, it may be 1 or 4. When the first preset value is 1, when the number of sub-ballistic tools is 2 or more than 2, synthesis may be performed.
The bullet changing control may be configured to update the number of bullets loaded in the virtual firearm to its maximum capacity in response to a single-tap touch operation, and correspondingly reduce the number of bullet props in the virtual warehouse or virtual backpack. In response to the first touch operation acting on the bullet changing control, a first number of the initial bullet props are merged into a first bullet prop, where the first number is less than or equal to the number of the initial bullet props and the attack force of the first bullet prop is higher than the attack force of the initial bullet prop.
For example, the first touch operation acting on the bullet changing control may be a long-press touch operation on the bullet changing control. The first number may be determined according to the duration of the long-press touch operation; for example, the first number may be 1 when the duration is 1 second and 2 when the duration is 2 seconds. After the merging is completed, the number of initial bullet props is reduced by the first number and the first bullet prop is added.
In one exemplary embodiment of the present disclosure, the first number of initial bullet props are merged into the first bullet prop when the duration reaches a preset time.
In an exemplary embodiment of the present disclosure, the first bullet prop has a higher attack force than the initial bullet prop. For example, two initial bullet props are merged into one first bullet prop. When the virtual firearm interacts with the virtual object based on an initial bullet prop, it reduces the life value attribute of the virtual object by 20; when the virtual firearm interacts with the virtual object based on the first bullet prop, it reduces the life value attribute of the virtual object by 40.
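As a minimal sketch of this merging step, the following Python assumes a simple data model in which each initial bullet prop carries an attack force of 20 and the merged prop's attack force scales with the number of props consumed; all names and values are illustrative, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class BulletProp:
    attack: int            # damage dealt per hit
    merged_count: int = 1  # how many initial props this prop represents

INITIAL_ATTACK = 20        # assumed attack force of one initial bullet prop

def first_number_from_duration(duration_s: float) -> int:
    """Map the long-press duration to the first number (1 s -> 1, 2 s -> 2, ...)."""
    return max(1, int(duration_s))

def merge_initial_props(inventory: list, first_number: int) -> BulletProp:
    """Consume `first_number` initial props and return one stronger first bullet prop."""
    if first_number > len(inventory):
        raise ValueError("not enough initial bullet props to merge")
    del inventory[:first_number]                      # the initial props are reduced
    return BulletProp(attack=INITIAL_ATTACK * first_number,
                      merged_count=first_number)      # higher attack force

# usage: a 2-second long press merges two props into one first prop with attack 40
inventory = [BulletProp(INITIAL_ATTACK) for _ in range(4)]
first_prop = merge_initial_props(inventory, first_number_from_duration(2.0))
print(first_prop.attack, len(inventory))   # 40 2
```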
In step S230, in response to the use operation for the first bullet prop, the virtual firearm is controlled to interact with the virtual object in the virtual scene through the first bullet prop.
In an exemplary embodiment of the present disclosure, the virtual firearm is controlled to interact with a virtual object in the virtual scene through the first bullet prop in response to the use operation directed to the first bullet prop. During the interaction, a hit judgment algorithm may be adopted: a ray is generated according to the shooting angle and direction of the virtual firearm, the intersection point of the ray with virtual objects in the virtual scene is calculated, and the hit virtual object is determined according to the intersection point. Alternatively, another hit algorithm may determine the virtual object that was hit, and the virtual firearm interacts with that virtual object in the virtual scene through the first bullet prop. For example, the value of the movement speed attribute or the value of the life attribute of the virtual object may be reduced; this may be configured according to actual needs, and the embodiment of the present disclosure does not limit the interaction effect here.
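A minimal sketch of such a ray-based hit judgment, assuming spherical hit volumes and purely illustrative names; a real engine would usually delegate this to its physics or raycast API.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the (normalized) ray to the first intersection, or None."""
    oc = [c - o for o, c in zip(origin, center)]
    t = sum(d * v for d, v in zip(direction, oc))          # projection of center onto ray
    closest = [o + d * t for o, d in zip(origin, direction)]
    dist2 = sum((c - p) ** 2 for c, p in zip(center, closest))
    if t < 0 or dist2 > radius ** 2:
        return None
    return t - math.sqrt(radius ** 2 - dist2)

def judge_hit(muzzle, aim_dir, virtual_objects):
    """Pick the nearest virtual object whose hit sphere is intersected by the shot ray."""
    hits = []
    for obj in virtual_objects:
        t = ray_sphere_hit(muzzle, aim_dir, obj["center"], obj["radius"])
        if t is not None:
            hits.append((t, obj))
    return min(hits, key=lambda h: h[0], default=(None, None))[1]

# usage: the hit object's life value attribute is reduced by the bullet prop's attack force
target = judge_hit((0.0, 0.0, 0.0), (1.0, 0.0, 0.0),
                   [{"center": (10.0, 0.5, 0.0), "radius": 1.0, "life": 100}])
if target is not None:
    target["life"] -= 40    # e.g. a first bullet prop merged from two initial props
```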
In an exemplary embodiment of the disclosure, when the number of initial bullet props is greater than a first preset value, a first quantity selection control is displayed on the graphical user interface in response to a first touch operation applied to the bullet changing control; a target node is determined among the quantity nodes in response to a user operation; and a first number of the initial bullet props are merged into the first bullet prop according to the first number corresponding to the target node. This may include the following steps S310 to S330:
Step S310, when the number of the initial bullet props is greater than a first preset value, displaying a first quantity selection control on the graphical user interface in response to a first touch operation acting on the bullet changing control; wherein the first quantity selection control comprises a plurality of quantity nodes;
Step S320, determining a target node among the plurality of quantity nodes in response to a user operation;
Step S330, merging a first number of the initial bullet props into a first bullet prop according to the first number corresponding to the target node.
In an exemplary embodiment of the disclosure, when the number of the initial bullet props is greater than the first preset value, the first quantity selection control is displayed on the graphical user interface in response to the first touch operation applied to the bullet changing control. The first quantity selection control may be a bar control on which a plurality of quantity nodes are arranged in sequence. For example, the quantity nodes may correspond to the positive integers 1, 2, 3, and so on, and the largest positive integer among the quantity nodes may indicate the number of initial bullet props. For example, when the number of initial bullet props is 4, the first quantity selection control includes 4 quantity nodes.
The first touch operation acting on the bullet changing control may be a continuous left-right sliding touch operation after the bullet changing control is pressed, so that different quantity nodes are selected by sliding left and right over the first quantity selection control. A movable cursor may also be displayed on the graphical user interface, and the cursor is moved over the first quantity selection control in response to the left-right sliding touch operation so as to select different quantity nodes.
For example, as shown in fig. 3, the graphical user interface includes a bullet changing control 310, and a first quantity node 321, a second quantity node 322, a third quantity node 323, and a fourth quantity node 324, the four quantity nodes being sequentially arranged on the first quantity selection control.
Through the above steps S310 to S330, when the number of the initial bullet props is greater than the first preset value, a first quantity selection control is displayed on the graphical user interface in response to a first touch operation acting on the bullet changing control, a target node is determined among the quantity nodes in response to a user operation, and a first number of the initial bullet props are merged into the first bullet prop according to the first number corresponding to the target node.
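A sketch of how a horizontal slide after pressing the bullet changing control could map to one of the quantity nodes; the coordinates and node spacing are assumed purely for illustration.

```python
def select_quantity_node(slide_x: float, control_left_x: float,
                         node_spacing: float, node_count: int) -> int:
    """Map the finger's x position to a 1-based quantity node on the bar control."""
    offset = slide_x - control_left_x
    index = round(offset / node_spacing) + 1       # nearest node to the finger
    return max(1, min(node_count, index))          # clamp to the nodes that exist

# usage: 4 nodes spaced 80 px apart starting at x=100; sliding to x=250 picks node 3
print(select_quantity_node(slide_x=250, control_left_x=100,
                           node_spacing=80, node_count=4))   # 3
```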
In an exemplary embodiment of the disclosure, a composite control may be displayed in response to a first touch operation on the bullet changing control, and the first quantity selection control may be displayed on the graphical user interface in response to a touch operation on the composite control. This may include the following steps S410 to S420:
Step S410, displaying a composite control in response to a first touch operation acting on the bullet changing control;
Step S420, displaying the first quantity selection control on the graphical user interface in response to the touch operation applied to the composite control.
In an exemplary embodiment of the disclosure, the composite control may be displayed in response to a first touch operation acting on the bullet changing control. Specifically, after receiving the first touch operation acting on the bullet changing control, the composite control may be displayed in the graphical user interface. The composite control can be used to receive a touch operation so as to display the first quantity selection control in the graphical user interface.
It should be noted that the present disclosure does not limit the specific form of the composite control.
Further, the composite control may be displayed in response to the first touch operation applied to the bullet changing control only when the initial bullet prop is of the target bullet type or only when the virtual firearm is of the target firearm type.
In an exemplary embodiment of the present disclosure, a target node may be determined among the plurality of quantity nodes in response to a sliding operation acting on the composite control. Specifically, the composite control may include a plurality of quantity nodes, and the target node may be determined among the plurality of quantity nodes so as to determine the specific numerical value of the first number.
It should be noted that the present disclosure does not limit the specific manner of determining the target node among the plurality of quantity nodes.
Through the above steps S410 to S420, the composite control may be displayed in response to the first touch operation applied to the bullet changing control, and the first quantity selection control may be displayed on the graphical user interface in response to the touch operation applied to the composite control.
In an exemplary embodiment of the present disclosure, a plurality of quantity nodes corresponding to the maximum loading number of bullet props of the virtual firearm may be displayed in the first quantity selection control, and the adjustable region may be highlighted in a preset manner. This may include the following steps S510 to S520:
Step S510, displaying, in the first quantity selection control, a plurality of quantity nodes corresponding to the maximum loading number of bullet props according to the maximum loading number of bullet props of the virtual firearm;
Step S520, highlighting the adjustable region in a preset manner.
In an exemplary embodiment of the present disclosure, a plurality of quantity nodes corresponding to the maximum loading number of bullet props of the virtual firearm may be displayed in the first quantity selection control. The quantity nodes include adjustable nodes whose number equals the number of initial bullet props, and the adjustable nodes correspond to an adjustable region.
Specifically, when the first quantity selection control is displayed, the adjustable region may be highlighted in a preset manner.
For example, as shown in fig. 4, the maximum loading number of bullet props is 4, but the number of initial bullet props currently available for merging is 3. The first quantity node 401, the second quantity node 402, and the third quantity node 403 are displayed in a light color, and the fourth quantity node 404 is displayed in a dark color, thereby highlighting the adjustable area in a preset manner.
Through the above steps S510 to S520, a plurality of quantity nodes corresponding to the maximum loading number of bullet props of the virtual firearm can be displayed in the first quantity selection control, and the adjustable region is highlighted in a preset manner. With embodiments of the present disclosure, the number of props available for merging and the maximum number can be displayed graphically on the first quantity selection control, so the user can intuitively follow the merging process and the interaction efficiency is improved.
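A sketch of the node display in steps S510 to S520, assuming a light/dark style flag per node stands in for the preset highlighting manner; names are illustrative only.

```python
def layout_quantity_nodes(max_loading: int, mergeable_count: int) -> list:
    """One display record per quantity node; nodes inside the adjustable region
    (up to the number of initial props available for merging) are shown light,
    the remaining nodes up to the maximum loading number are shown dark."""
    nodes = []
    for value in range(1, max_loading + 1):
        adjustable = value <= mergeable_count
        nodes.append({"value": value,
                      "adjustable": adjustable,
                      "style": "light" if adjustable else "dark"})
    return nodes

# usage matching fig. 4: maximum loading number 4, only 3 props available for merging
for node in layout_quantity_nodes(max_loading=4, mergeable_count=3):
    print(node)   # nodes 1-3 are light (adjustable), node 4 is dark
```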
In an exemplary embodiment of the disclosure, a candidate target node may be determined among the plurality of quantity nodes in response to a sliding operation acting on the composite control, the duration of the sliding operation on the graphical user interface after the candidate target node is determined may be obtained, and the candidate target node may be determined as the target node when the duration is greater than or equal to the numerical value corresponding to the candidate target node. This may include the following steps S610 to S630:
Step S610, determining a candidate target node among the plurality of quantity nodes in response to a sliding operation acting on the composite control; wherein the plurality of quantity nodes comprise the candidate target node;
Step S620, obtaining the duration of the sliding operation on the graphical user interface after the candidate target node is determined;
Step S630, when the duration is greater than or equal to the value corresponding to the candidate target node, determining the candidate target node as the target node.
In an exemplary embodiment of the present disclosure, in determining the target node, a candidate target node is first determined among the plurality of quantity nodes, and the candidate target node is then determined as the target node according to the duration of the sliding operation on the graphical user interface after the candidate target node is determined. For example, the second node from left to right in the first quantity selection control may be determined as a candidate target node, the duration of the sliding operation on the graphical user interface after the candidate target node is determined is obtained, and when the sliding operation remains unchanged for two seconds, the candidate target node is further determined as the target node.
Through the above steps S610 to S630, a candidate target node may be determined among the plurality of quantity nodes in response to the sliding operation acting on the composite control, the duration of the sliding operation on the graphical user interface after the candidate target node is determined is obtained, and when the duration is greater than or equal to the value corresponding to the candidate target node, the candidate target node is determined as the target node.
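A sketch of the confirmation rule in steps S610 to S630, under the simplifying assumption (consistent with the two-second example above) that the candidate node becomes the target node once the slide has dwelt on it for at least as many seconds as the node's value.

```python
import time

class NodeConfirmer:
    """Confirm a candidate quantity node once the slide has dwelt on it long enough."""

    def __init__(self):
        self.candidate = None
        self.candidate_since = None

    def on_slide(self, node_value, now=None):
        """Call with the node currently under the finger; returns the confirmed
        target node, or None while confirmation is still pending."""
        now = time.monotonic() if now is None else now
        if node_value != self.candidate:       # the finger moved to a different node
            self.candidate = node_value
            self.candidate_since = now
            return None
        held = now - self.candidate_since      # dwell time on the candidate node
        return self.candidate if held >= node_value else None

# usage: dwelling on the node whose value is 2 for two seconds confirms it
confirmer = NodeConfirmer()
confirmer.on_slide(2, now=0.0)         # becomes the candidate target node
print(confirmer.on_slide(2, now=2.0))  # 2 -- confirmed as the target node
```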
In an exemplary embodiment of the disclosure, when the duration of the first touch operation acting on the bullet changing control is longer than a preset duration, the composite control is displayed and the bullet changing control is switched and displayed as a cancel control. The cancel control is used to cancel the merging process in response to a touch operation acting on the cancel control.
For example, when the first touch operation acting on the bullet changing control is the long-press touch operation and the duration of the long-press touch operation is determined to be longer than a preset duration, for example 2 seconds, the composite control is displayed and the bullet changing control is switched and displayed as the cancel control. The composite control may be displayed at a position close to the cancel control.
For example, when the duration of the first touch operation acting on the bullet changing control is longer than the preset duration, the cancel control 501 is displayed and the composite control 502 is generated and displayed at the same time; the merging process may then be cancelled in response to a touch operation acting on the cancel control 501.
In an exemplary embodiment of the present disclosure, the initial bullet props in the virtual firearm may be merged into the first bullet prop when the number of initial bullet props in the virtual firearm equals the maximum loading number of bullet props of the virtual firearm; in this case, the number of initial bullet props in the virtual firearm is the first number. Specifically, when the number of initial bullet props in the virtual firearm reaches the maximum loading number of bullet props of the virtual firearm, all the initial bullet props in the virtual firearm can be merged into the first bullet prop.
In an exemplary embodiment of the present disclosure, the attack force of the first bullet prop is associated with the first number. Specifically, the attack force of the first bullet prop may be in direct proportion to the first number.
In an exemplary embodiment of the disclosure, in response to a first touch operation applied to the bullet changing control, a first number of the initial bullet props are merged according to a preset merging manner to generate a plurality of first bullet props. The preset merging manner indicates how many initial bullet props are needed to merge into one first bullet prop. For example, if the preset merging manner is that every two initial bullet props are merged into one first bullet prop, then 12 initial bullet props can be merged into 6 first bullet props.
It should be noted that the specific form of the preset merging manner is not particularly limited in the present disclosure.
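A minimal sketch of such a preset merging manner, assuming it simply states how many initial props make up one first prop (two in the example above); names are illustrative.

```python
def merge_by_preset(initial_count: int, props_per_first: int = 2) -> tuple:
    """Return (first bullet props generated, initial props left over)."""
    return initial_count // props_per_first, initial_count % props_per_first

# usage: merging two-by-two, 12 initial bullet props give 6 first bullet props
print(merge_by_preset(12))      # (6, 0)
print(merge_by_preset(13, 2))   # (6, 1) -- the odd prop stays un-merged
```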
In an exemplary embodiment of the present disclosure, a first number of the initial bullet props may be merged into a plurality of first candidate bullet props in response to a first touch operation applied to the bullet changing control, and the plurality of first candidate bullet props may be merged into first bullet props in response to a third touch operation applied to the bullet changing control. This may include the following steps S710 to S720:
Step S710, merging a first number of the initial bullet props into a plurality of first candidate bullet props in response to a first touch operation acting on the bullet changing control;
Step S720, merging the plurality of first candidate bullet props into first bullet props in response to a third touch operation applied to the bullet changing control.
In an exemplary embodiment of the present disclosure, a first number of the initial bullet props may be merged into a plurality of first candidate bullet props in response to a first touch operation applied to the bullet changing control, and the plurality of first candidate bullet props may be merged into first bullet props in response to a third touch operation applied to the bullet changing control. Specifically, a first number of the initial bullet props may first be merged into a plurality of first candidate bullet props, and the plurality of first candidate bullet props may then be merged into first bullet props.
For example, if there are 12 initial bullet props, the 12 initial bullet props may be merged into 6 first candidate bullet props, and the 6 first candidate bullet props may then be merged into 3 first bullet props.
It should be noted that the specific manner of merging the first number of initial bullet props into the plurality of first candidate bullet props and merging the plurality of first candidate bullet props into the first bullet props is not particularly limited in this disclosure.
Through the above steps S710 to S720, the first number of initial bullet props can be merged into a plurality of first candidate bullet props in response to the first touch operation applied to the bullet changing control, and the plurality of first candidate bullet props can be merged into first bullet props in response to the third touch operation applied to the bullet changing control.
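A sketch of the two-stage merge in steps S710 to S720, assuming each stage pairs props two-by-two, so 12 initial props become 6 candidates and then 3 first props as in the example above.

```python
def pairwise_merge(count: int) -> int:
    """One merge stage: every two props of the lower tier form one prop of the next tier."""
    return count // 2

initial = 12
candidates = pairwise_merge(initial)        # first candidate bullet props: 6
first_props = pairwise_merge(candidates)    # first bullet props: 3
print(candidates, first_props)              # 6 3
```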
In an exemplary embodiment of the present disclosure, merging a first number of initial bullet props into a first bullet prop takes a corresponding synthesis time, and the synthesis time is proportional to the first number. Specifically, the larger the value of the first number, the longer the synthesis time required to merge the first number of initial bullet props into the first bullet prop; the smaller the value of the first number, the shorter the synthesis time.
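The proportionality can be read as a simple linear rule; a sketch, assuming an illustrative per-prop synthesis time of 0.5 seconds.

```python
TIME_PER_PROP_S = 0.5   # assumed synthesis time contributed by each merged prop

def synthesis_time(first_number: int) -> float:
    """Synthesis time grows in direct proportion to the first number."""
    return TIME_PER_PROP_S * first_number

print(synthesis_time(2), synthesis_time(4))   # 1.0 2.0 -- more props, longer merge
```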
In an exemplary embodiment of the present disclosure, when the number of the initial bullet props is smaller than a second preset value, the initial bullet prop is disassembled into a second number of second bullet props in response to a second touch operation applied to the bullet changing control, and the virtual firearm is controlled to interact with a virtual object in the virtual scene through the second bullet props in response to a use operation for the second bullet props. This may include the following steps S810 to S820:
Step S810, when the number of the initial bullet props is smaller than a second preset value, disassembling the initial bullet prop into a second number of second bullet props in response to a second touch operation acting on the bullet changing control; wherein the second number is less than or equal to the maximum loading number of bullet props of the virtual firearm, and the attack force of the second bullet prop is lower than the attack force of the initial bullet prop;
in an exemplary embodiment of the present disclosure, the second preset value may be a value configured according to actual needs. For example, the second preset value may be 2, that is, the number of the initial bullet props is 1, and at this time, one initial bullet prop may be detached.
The bullet changing control may be configured to update the number of bullets loaded in the virtual firearm to its maximum capacity in response to a single-tap touch operation, and correspondingly reduce the number of bullet props in the virtual warehouse or virtual backpack. In response to a second touch operation acting on the bullet changing control, the initial bullet prop is disassembled into a second number of second bullet props, where the second number is less than or equal to the maximum loading number of bullet props of the virtual firearm, and the attack force of the second bullet prop is lower than the attack force of the initial bullet prop.
for example, the second touch operation acting on the pop-up control is a long-press touch operation in response to the pop-up control. For example, the second number may be determined according to the duration of the long-press touch operation; the second number may be 1 when the duration is 1 second and 2 when the duration is 2 seconds. And after the disassembly is finished, deleting the initial bullet prop and adding a second bullet prop.
In an exemplary embodiment of the disclosure, the initial bullet prop is split to generate a preset number of second bullet props when the duration reaches a preset duration.
In an exemplary embodiment of the present disclosure, the attack force of the second bullet prop is lower than the attack force of the initial bullet prop. For example, an initial bullet prop splits into two second bullet props. When the virtual firearm interacts with the virtual object based on the initial bullet prop, it reduces the life value attribute of the virtual object by 40; when the virtual firearm interacts with the virtual object based on a second bullet prop, it reduces the life value attribute of the virtual object by 20.
And step S820, in response to the operation of using the second bullet prop, controlling the virtual firearm to interact with the virtual object in the virtual scene through the second bullet prop.
In an exemplary embodiment of the present disclosure, the virtual firearm is controlled to interact with a virtual object in the virtual scene through the second bullet prop in response to the use operation directed to the second bullet prop. During the interaction, a hit judgment algorithm may be adopted: a ray is generated according to the shooting angle and direction of the virtual firearm, the intersection point of the ray with virtual objects in the virtual scene is calculated, and the hit virtual object is determined according to the intersection point. Alternatively, another hit algorithm may determine the virtual object that was hit, and the virtual firearm interacts with that virtual object in the virtual scene through the second bullet prop. For example, the value of the movement speed attribute or the value of the life attribute of the virtual object may be reduced; this may be configured according to actual needs, and the embodiment of the present disclosure does not limit the interaction effect here.
Through the above steps S810 to S820, when the number of the initial bullet props is smaller than the second preset value, the initial bullet prop is disassembled into a second number of second bullet props in response to the second touch operation applied to the bullet changing control, and the virtual firearm is controlled to interact with the virtual object in the virtual scene through the second bullet props in response to the use operation for the second bullet props.
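A minimal sketch of this splitting path, mirroring the earlier merge sketch; here an initial prop's attack force is divided among the second props generated from it, and all names and numbers are illustrative assumptions rather than the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class BulletProp:
    attack: int   # damage dealt per hit

def split_initial_prop(prop: BulletProp, second_number: int,
                       max_loading: int) -> list:
    """Disassemble one initial bullet prop into `second_number` weaker second props."""
    if not 1 <= second_number <= max_loading:
        raise ValueError("second number must not exceed the maximum loading number")
    return [BulletProp(attack=prop.attack // second_number)
            for _ in range(second_number)]

# usage: one initial prop (attack 40) split in two gives two second props of attack 20
second_props = split_initial_prop(BulletProp(attack=40), second_number=2, max_loading=4)
print([p.attack for p in second_props])   # [20, 20]
```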
In an exemplary embodiment of the disclosure, when the number of the initial bullet props is smaller than a second preset value, a second quantity selection control is displayed on the graphical user interface in response to a second touch operation applied to the bullet changing control; a target node is determined among the quantity nodes in response to a user operation; and the initial bullet prop is disassembled into a second number of second bullet props according to the second number corresponding to the target node. This may include the following steps S910 to S930:
Step S910, when the number of the initial bullet props is smaller than a second preset value, displaying a second quantity selection control on the graphical user interface in response to a second touch operation acting on the bullet changing control; wherein the second quantity selection control comprises a plurality of quantity nodes;
Step S920, determining a target node among the plurality of quantity nodes in response to a user operation;
Step S930, disassembling the initial bullet prop into a second number of second bullet props according to the second number corresponding to the target node.
In an exemplary embodiment of the disclosure, when the number of the initial bullet props is smaller than the second preset value, the second quantity selection control is displayed on the graphical user interface in response to the second touch operation acting on the bullet changing control. The second quantity selection control may be a bar control on which a plurality of quantity nodes are arranged in sequence. For example, the quantity nodes may correspond to the positive integers 1, 2, 3, and so on, and the largest positive integer among the quantity nodes may indicate the number of second bullet props to be generated by the split.
The second touch operation acting on the bullet changing control may be a continuous left-right sliding touch operation after the bullet changing control is pressed, so that different quantity nodes are selected by sliding left and right over the second quantity selection control. A movable cursor may also be displayed on the graphical user interface, and the cursor is moved over the second quantity selection control in response to the left-right sliding touch operation so as to select different quantity nodes.
Through the above steps S910 to S930, when the number of the initial bullet props is smaller than the second preset value, the second quantity selection control is displayed on the graphical user interface in response to the second touch operation performed on the bullet changing control, a target node is determined among the plurality of quantity nodes in response to a user operation, and the initial bullet prop is disassembled into a second number of second bullet props according to the second number corresponding to the target node.
In an exemplary embodiment of the disclosure, a split control may be displayed in response to a second touch operation applied to the bullet changing control, and the second quantity selection control may be displayed on the graphical user interface in response to a touch operation applied to the split control. This may include the following steps S1010 to S1020:
Step S1010, displaying a split control in response to a second touch operation acting on the bullet changing control;
Step S1020, displaying the second quantity selection control on the graphical user interface in response to the touch operation applied to the split control.
In an exemplary embodiment of the disclosure, the split control may be displayed in response to a second touch operation acting on the bullet changing control. Specifically, after receiving the second touch operation acting on the bullet changing control, the split control may be displayed in the graphical user interface. The split control can be used to receive a touch operation so as to display the second quantity selection control in the graphical user interface.
It should be noted that the specific form of the split control is not particularly limited in the present disclosure.
Further, the split control can be displayed in response to a second touch operation applied to the bullet changing control only when the initial bullet prop is of the target bullet type or only when the virtual firearm is of the target firearm type.
In an exemplary embodiment of the disclosure, the second quantity selection control may be displayed on the graphical user interface in response to a touch operation acting on the split control. Specifically, the split control may include a plurality of quantity nodes, and the target node may be determined among the plurality of quantity nodes so as to determine the specific numerical value of the second number.
It should be noted that the present disclosure does not limit the specific manner of determining the target node among the plurality of quantity nodes.
Through the above steps S1010 to S1020, the split control may be displayed in response to the second touch operation applied to the bullet changing control, and the second quantity selection control may be displayed on the graphical user interface in response to the touch operation applied to the split control.
In an exemplary embodiment of the disclosure, a candidate target node may be determined among the plurality of quantity nodes in response to a swiping operation applied to the split control, the duration of the swiping operation on the graphical user interface after the candidate target node is determined may be obtained, and the candidate target node may be determined as the target node when the duration is greater than or equal to the numerical value corresponding to the candidate target node. This may include the following steps S1110 to S1130:
step S1110, determining a candidate target node from a plurality of nodes in response to the swiping operation acting on the split control;
step S1120, obtaining a duration of the swiping operation on the graphical user interface after the candidate target node is determined;
in step S1130, when the duration is greater than or equal to the value corresponding to the candidate target node, the candidate target node is determined as the target node.
In an exemplary embodiment of the present disclosure, in determining the target node, a candidate target node is first determined among the plurality of quantity nodes, and the candidate target node is then determined as the target node according to the duration of the swiping operation on the graphical user interface after the candidate target node is determined. For example, the second node from left to right in the second quantity selection control may be determined as the candidate target node, the duration of the swiping operation on the graphical user interface after the candidate target node is determined may be obtained, and when the swiping operation is kept unchanged for two seconds, the candidate target node is further determined as the target node.
Through the above steps S1110 to S1130, a candidate target node may be determined among the plurality of quantity nodes in response to the swiping operation acting on the split control, the duration of the swiping operation on the graphical user interface after the candidate target node is determined may be obtained, and when the duration is greater than or equal to the numerical value corresponding to the candidate target node, the candidate target node is determined as the target node.
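A rough sketch of steps S1110 to S1130 follows; the node layout, per-node confirmation times, and function names are illustrative assumptions only:

```python
# Hypothetical sketch of steps S1110-S1130: a swipe over the split control
# selects a candidate quantity node, and keeping the swipe in place for a
# duration at least equal to the value associated with that node confirms it
# as the target node.

def pick_candidate(node_count: int, swipe_x: float, node_width: float) -> int:
    """Map the horizontal swipe position to the index of a candidate node."""
    index = int(swipe_x // node_width)
    return max(0, min(index, node_count - 1))

def confirm_target(candidate: int, hold_seconds: float,
                   node_thresholds: list[float]) -> int | None:
    """Confirm the candidate once the hold duration reaches the node's value."""
    return candidate if hold_seconds >= node_thresholds[candidate] else None

node_thresholds = [1.0, 2.0, 2.0, 3.0]   # assumed per-node confirmation times (s)
candidate = pick_candidate(node_count=4, swipe_x=95.0, node_width=60.0)  # index 1
print(confirm_target(candidate, hold_seconds=2.3, node_thresholds=node_thresholds))  # 1
print(confirm_target(candidate, hold_seconds=0.8, node_thresholds=node_thresholds))  # None
```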
In an exemplary embodiment of the disclosure, in response to the duration of the second touch operation acting on the bullet changing control being longer than a preset duration, the split control is displayed and the bullet changing control is switched and displayed as a cancel control. The cancel control is used for cancelling the splitting process in response to a touch operation acting on the cancel control.
For example, when the second touch operation acting on the bullet changing control is a long-press touch operation, and the duration of the long-press touch operation is determined to be longer than a preset duration, for example, 2 seconds, the split control is displayed, and the bullet changing control is switched and displayed as a cancel control. The split control may be displayed near the cancel control.
For example, the cancel control 601 is displayed when the duration of the second touch operation acting on the bullet changing control is longer than the preset duration, the split control 602 is displayed at the same time, and a further touch operation acting on the cancel control 601 can cancel the splitting process.
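As a rough illustration of this long-press behaviour (the 2-second threshold, class name, and method names below are assumptions, not taken from the disclosure):

```python
# Hypothetical sketch: once the press on the bullet changing control exceeds a
# preset duration, the split control is shown and the bullet changing control
# is swapped for a cancel control; tapping the cancel control aborts the split.

PRESET_DURATION = 2.0  # assumed long-press threshold, in seconds

class ReloadButton:
    def __init__(self) -> None:
        self.mode = "reload"          # "reload" or "cancel"
        self.split_control_visible = False
        self.split_in_progress = False

    def on_press_held(self, held_seconds: float) -> None:
        if held_seconds > PRESET_DURATION and self.mode == "reload":
            self.split_control_visible = True
            self.split_in_progress = True
            self.mode = "cancel"      # plays the role of control 601 in the example

    def on_tap(self) -> None:
        if self.mode == "cancel":
            self.split_in_progress = False
            self.split_control_visible = False
            self.mode = "reload"

button = ReloadButton()
button.on_press_held(2.5)
button.on_tap()                      # cancels the splitting process
print(button.mode)                   # "reload"
```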
In an exemplary embodiment of the present disclosure, a node schematic graph may be displayed at a position corresponding to the candidate target node, and the value of the duration may be visually displayed according to the value of the duration and the value corresponding to the candidate target node, in combination with the node schematic graph. The step of visually displaying the numerical value of the duration time by combining the node schematic graph may include the following steps S1210 to S1220:
step S1210, displaying a node schematic graph at a position corresponding to the candidate target node;
step S1220, according to the value of the duration and the value corresponding to the candidate target node, and in combination with the node schematic graph, visually displaying the value of the duration.
In an exemplary embodiment of the present disclosure, a node schematic graph may be displayed at a position corresponding to the candidate target node. For example, the node schematic may be circular or annular.
Specifically, the numerical value of the duration may be visually displayed in combination with the node schematic graph according to the numerical value of the duration and the numerical value corresponding to the candidate target node.
For example, the filling speed of the circle or the display speed of the ring may be calculated according to the value of the duration and the value corresponding to the candidate target node.
For example, as shown in fig. 7, the control 702 may be a composite control or a split control, an annular progress bar is provided at the third quantity node (the candidate target node), and the progress of the progress bar is calculated from the value of the duration and the value corresponding to the candidate target node. A completion symbol, such as a check mark, may also be displayed at the position of the third quantity node when the display progress of the annular progress bar reaches one hundred percent. While the progress is advancing and after it completes, the combining process or the splitting process can be cancelled through the cancel control 701.
Through the above steps S1210 to S1220, a node schematic graph may be displayed at the position corresponding to the candidate target node, and the value of the duration may be visually displayed in combination with the node schematic graph according to the value of the duration and the value corresponding to the candidate target node. According to the embodiment of the disclosure, the time progress of determining the target node during the combining or splitting process can be visually displayed, so that the progress information is conveyed intuitively and the interaction efficiency and the intuitiveness of information transmission are improved.
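A minimal numeric sketch of the ring-progress idea, assuming the fill ratio is simply elapsed hold time over the node's value (all names and the check-mark rendering are illustrative assumptions):

```python
# Hypothetical sketch of the annular progress bar in fig. 7: the fill of the
# ring is the ratio between the elapsed hold duration and the value associated
# with the candidate node; at 100% a completion symbol is shown.

def ring_progress(elapsed_seconds: float, node_value_seconds: float) -> float:
    """Fraction of the ring to fill, clamped to [0.0, 1.0]."""
    if node_value_seconds <= 0:
        return 1.0
    return min(elapsed_seconds / node_value_seconds, 1.0)

def ring_symbol(progress: float) -> str:
    """Return a completion symbol once the ring is fully displayed."""
    return "\u2713" if progress >= 1.0 else f"{progress:.0%}"

print(ring_symbol(ring_progress(1.0, 2.0)))  # "50%"
print(ring_symbol(ring_progress(2.4, 2.0)))  # "✓"
```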
In an exemplary embodiment of the present disclosure, the force of attack of the second bullet prop is correlated to a second quantity. In particular, the force of attack of the second bullet prop may be inversely related to the second quantity.
In an exemplary embodiment of the present disclosure, the disassembling of the initial bullet prop into a second number of second bullet props corresponds to a disassembling time, the second number being proportional to the disassembling time. Specifically, the larger the numerical value corresponding to the second number is, the longer the splitting time required for splitting the initial bullet prop into the second bullet prop is; the smaller the value corresponding to the second number, the shorter the disassembly time required to disassemble the initial bullet prop into a second bullet prop.
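The two relationships above can be made concrete with a small numeric sketch; the base attack value and the time-per-bullet constant below are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical sketch: split time grows with the second number, while the
# per-bullet attack force of the second bullet props shrinks as the second
# number grows. Constants are assumptions chosen only for illustration.

BASE_ATTACK = 120.0          # assumed attack force of one initial bullet prop
TIME_PER_BULLET = 0.25       # assumed seconds of split time per resulting bullet

def split_time(second_number: int) -> float:
    """Split time proportional to the second number."""
    return TIME_PER_BULLET * second_number

def second_bullet_attack(second_number: int) -> float:
    """Attack force inversely related to the second number."""
    return BASE_ATTACK / second_number

for n in (2, 4, 8):
    print(n, split_time(n), second_bullet_attack(n))
```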
An application scenario of the present embodiment is provided below. A schematic diagram of a graphical user interface is shown in fig. 8. The interface includes a game scene and a virtual object located in the game scene, which are not shown in fig. 8. The player controls the movement of the virtual object in the game scene. The interface in FIG. 8 shows a variety of controls that are displayed over a game scene.
Here, 800 is a backpack control. A touch operation acting on this control triggers display of a backpack interface in the graphical user interface for viewing the item props in the backpack, and the item props may include, but are not limited to, props such as medicines, weapons, accessories, and protective gear. In an alternative embodiment, the edge of the backpack control is provided with an edge line for indicating the amount of items in the backpack. The touch operation acting on the backpack control can be a click, a double click, a long press, a slide, or the like.
802 is a movement control, and the shaded filled circle is the joystick in the movement control. In an optional embodiment, the movement control is configured with a movement response area, where the movement response area may be the same size as the movement control, that is, the area in the graphical user interface corresponding to the movement control is the movement response area. In an optional embodiment, the movement response area may be larger than the movement control, that is, the movement control is disposed in the movement response area; for example, the graphical user interface may be divided into left and right areas from the middle, with the left area serving as the movement response area of the movement control. A movement control instruction is triggered through a touch operation acting on the movement response area of the movement control, and the movement direction of the virtual object is controlled according to the movement control instruction. The touch operation acting on the movement response area of the movement control can be a click, a double click, a long press, a slide, or the like.
When the touch operation triggering the movement control instruction is detected to meet a preset condition, a "run" instruction is triggered, and the virtual object is controlled to run automatically and continuously in the game scene. As shown in fig. 8, in an optional embodiment, the preset condition may be that the touch operation acting on the joystick area slides in a predetermined direction by a predetermined distance or into a predetermined area (e.g., the position indicated by the arrow above the joystick). For example, when a touch operation drags the joystick upward, a sprint control 806 appears; when the sprint control is triggered, the virtual object enters a sprint mode. Meanwhile, the 830 control is also a sprint control, and when the 830 control is clicked, the virtual object enters a fast running mode. In an optional embodiment, the preset condition may be that the touch operation acting on the joystick area lasts for a preset duration, or reaches a preset pressure, or the like.
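A minimal sketch of such a sprint trigger follows; the drag-distance and hold-duration thresholds are assumptions, not values from the disclosure:

```python
# Hypothetical sketch: dragging the joystick upward past a preset distance, or
# holding past a preset duration, switches the virtual object into the
# continuous-run mode described above.

SPRINT_DRAG_DISTANCE = 80.0   # assumed pixels of upward drag
SPRINT_HOLD_SECONDS = 1.5     # assumed alternative duration condition

def should_sprint(drag_dy: float, hold_seconds: float) -> bool:
    """Return True when either preset condition for the run instruction is met."""
    dragged_far_enough = -drag_dy >= SPRINT_DRAG_DISTANCE  # screen y grows downward
    held_long_enough = hold_seconds >= SPRINT_HOLD_SECONDS
    return dragged_far_enough or held_long_enough

print(should_sprint(drag_dy=-90.0, hold_seconds=0.2))  # True (dragged up far enough)
print(should_sprint(drag_dy=-10.0, hold_seconds=0.4))  # False
```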
804 and 814 are attack controls. An attack instruction is triggered through a touch operation acting on the attack response area of an attack control, and the virtual character is controlled to execute an attack operation in the game scene according to the attack instruction. For example, when the virtual object is equipped with a long-range weapon, the attack behavior triggered by the attack control is a shooting behavior, that is, the left hand can perform a shooting operation by triggering the 804 control, and the right hand can perform a shooting operation by triggering the 814 control; when the virtual character is equipped with a melee weapon, the attack behavior triggered by the attack control is a slashing behavior or the like. In an alternative embodiment, only one of attack control 804 and attack control 814 is displayed. The display position of the attack control can be adjusted according to a position setting instruction, where the position setting instruction can be triggered on a setting interface or triggered during a game match. The touch operation acting on the attack control can be a click, a double click, a long press, a slide, or the like.
808 and 810 are gesture controls. A posture adjusting instruction is triggered by a touch operation on a gesture control to adjust the posture of the virtual object in the game scene. Triggering the 808 control can control the weapon to be held at the left arm position of the virtual object and control the virtual object to stand and lean out to the left; triggering the 810 control can control the weapon to be held at the right arm position of the virtual object and control the virtual object to stand and lean out to the right. 808 and 810 are used for controlling the posture of the virtual object when the virtual object is behind obstacles such as a wall or a large tree. The touch operation acting on the gesture controls can be a click, a double click, a long press, a slide, or the like.
812 is a weapon slot, which can be triggered to switch the weapon used by the player. The "burst" label on the left indicates the current firing mode, in which multiple rounds are fired in succession. In addition to the "burst" mode, it is also possible to switch to a "single fire" mode, in which a single bullet is fired. The shaded bar at the bottom shows the amount of bullets in the weapon. When shooting, the bullet quantity decreases continuously, and when the bullet quantity reaches zero, the weapon is automatically reloaded so that the bullet quantity returns to the maximum. In addition, the 816 control is a bullet changing control; triggering the bullet changing control adds bullets to the currently used weapon, and after reloading, the bullet quantity is at its maximum. The touch operation acting on the weapon slot can be a click, a double click, a long press, a slide, or the like.
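A small sketch of this ammunition behaviour follows; the magazine size and class name are assumptions made only for illustration:

```python
# Hypothetical sketch: firing decrements the bullet quantity, and when it
# reaches zero the weapon is automatically reloaded back to its maximum,
# mirroring the weapon slot behaviour described above.

class WeaponSlot:
    def __init__(self, max_bullets: int = 30) -> None:
        self.max_bullets = max_bullets
        self.bullets = max_bullets

    def fire(self) -> None:
        self.bullets -= 1
        if self.bullets <= 0:
            self.reload()          # automatic reload when the magazine is empty

    def reload(self) -> None:
        self.bullets = self.max_bullets

slot = WeaponSlot(max_bullets=3)
for _ in range(4):
    slot.fire()
print(slot.bullets)  # 2 (reloaded after the third shot, then fired once more)
```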
818 and 820 are action controls, and the virtual object is controlled to execute corresponding actions in the game scene through touch operations acting on the action controls. For example, triggering 818 can control the virtual object to squat, and triggering 820 can control the virtual object to lie prone. The 822 control can control the virtual object to perform a jump action. The touch operation acting on the action controls can be a click, a double click, a long press, a slide, or the like.
The 824 control is used for controlling the current game scene to enter an open mirror mode. When a first touch operation acting on the 824 control is detected, the game scene is controlled to enter the open mirror mode; in the open mirror mode, a sighting scope is displayed in the game scene, the game scene seen by the virtual object through the scope is also displayed, and the shooting accuracy can be improved. When a second touch operation acting on the 824 control is detected, the game scene is controlled to exit the open mirror mode. In an optional embodiment, the first touch operation and the second touch operation are independent touch operations; for example, clicking the 824 control controls the current game scene to enter the open mirror mode, clicking the 824 control again exits the open mirror mode, and the game scene shot by the virtual camera following the virtual object is displayed again. In an optional embodiment, the first touch operation and the second touch operation may be the start operation and the end operation of one touch operation; for example, when a press on the 824 control is detected, the game scene is controlled to enter the open mirror mode, the game scene is kept in the open mirror mode while the press continues, and when the touch point triggering the touch operation is detected to leave the touch detection area, the game scene is controlled to exit the open mirror mode.
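The two interaction styles can be contrasted in a short sketch; the class name, event strings, and mode flag are assumptions, not taken from the disclosure:

```python
# Hypothetical sketch of the two open-mirror interaction styles described
# above: a click-to-toggle variant and a press-and-hold variant.

class ScopeController:
    def __init__(self, hold_mode: bool = False) -> None:
        self.hold_mode = hold_mode
        self.open_mirror = False

    def on_event(self, event: str) -> None:
        if self.hold_mode:
            # Press-and-hold: the scope stays open only while the finger is down.
            if event == "press":
                self.open_mirror = True
            elif event == "release":
                self.open_mirror = False
        else:
            # Click-to-toggle: each click flips the mode.
            if event == "click":
                self.open_mirror = not self.open_mirror

toggle = ScopeController(hold_mode=False)
toggle.on_event("click")
toggle.on_event("click")
print(toggle.open_mirror)  # False (entered and then exited the open mirror mode)
```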
The 826 control is used to control the shooting parameters of the virtual camera presenting the game scene, so as to adjust the view of the game scene displayed in the graphical user interface. In an alternative embodiment, when a touch operation is detected on the 826 control, the shooting parameters of the virtual camera are adjusted according to the touch operation, where the shooting parameters include, but are not limited to, the position, orientation, and angle of view of the virtual camera. The touch operation on the 826 control can be a click, a double click, a long press, a slide, or the like.
The 828 control is a marking control, which marks a virtual item, a virtual object, or the like in the game scene through a touch operation acting on the marking control. The 832 control is a setting control; clicking the 832 control displays a setting menu for configuring the basic functions of the current game. The touch operation acting on the 828 control can be a click, a double click, a long press, a slide, or the like.
The 834 and 836 controls are message controls, where 834 can be used to view system notifications, and 836 can be used to view messages sent by teammates or to send messages to teammates. 838 is a minimap that shows the position of the player-controlled virtual object and may also show the positions of some other players' virtual objects.
On the basis of the above application scenario, the number of initial bullet props of the virtual firearm can be obtained; when the number of the initial bullet props is larger than a first preset value, a first number of initial bullet props are combined and generated into a first bullet prop in response to a first touch operation acting on the bullet changing control; and in response to a use operation directed to the first bullet prop, the virtual firearm is controlled to interact with a virtual object in the virtual scene through the first bullet prop, so as to execute the virtual firearm interaction method.
On the basis of the application scenario described above, the player may trigger execution of the virtual firearm interaction method in this embodiment through the bullet-changing control 816.
In the technical solutions provided in some embodiments of the present disclosure, the number of initial bullet props of a virtual firearm may be obtained; when the number of initial bullet props is greater than a first preset value, a first number of initial bullet props are combined and generated into a first bullet prop in response to a first touch operation acting on the bullet changing control; and in response to a use operation directed to the first bullet prop, the virtual firearm is controlled to interact with a virtual object in the virtual scene through the first bullet prop. By implementing the embodiments of the present disclosure, on one hand, when the number of bullet props is greater than the first preset value, the bullet props can be combined to enhance a preset shooting attribute of the virtual firearm, so that a better interaction effect can be achieved without a large number of interaction operations, and the interaction efficiency is improved; on the other hand, the interaction mode of the virtual firearm can be defined by the user, interaction in a single fixed mode is avoided, and the flexibility and freedom of the interaction process are improved.
Further, in the present exemplary embodiment, a virtual firearm interaction device 900 is also provided. A graphical user interface is provided through a terminal device; the graphical user interface at least partially displays a virtual scene and a bullet changing control, and the virtual scene includes a virtual object. Referring to fig. 9, the virtual firearm interaction device 900 may include:
an initial number obtaining module 910, configured to obtain the number of initial bullet props of the virtual firearm; a prop synthesizing module 920, configured to, when the number of the initial bullet props is greater than a first preset value, combine and generate a first number of initial bullet props into a first bullet prop in response to a first touch operation acting on the bullet changing control, where the first number is less than or equal to the number of the initial bullet props, and the attack force of the first bullet prop is higher than the attack force of the initial bullet prop; and a shooting interaction module 930, configured to control the virtual firearm to interact with a virtual object in the virtual scene through the first bullet prop in response to a use operation directed to the first bullet prop.
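A minimal sketch of how the three modules of device 900 could fit together is shown below; the class and method names are invented for illustration, as the disclosure only names the modules and their responsibilities:

```python
# Hypothetical sketch of device 900: obtain the initial bullet quantity
# (module 910), merge props on the first touch operation (module 920), and
# drive the shooting interaction with the merged prop (module 930).

class VirtualFirearmInteractionDevice:
    def __init__(self, initial_bullets: int, first_preset: int) -> None:
        self.initial_bullets = initial_bullets
        self.first_preset = first_preset

    def obtain_initial_quantity(self) -> int:                 # module 910
        return self.initial_bullets

    def merge_on_first_touch(self, first_number: int) -> dict:  # module 920
        if self.initial_bullets <= self.first_preset:
            raise ValueError("merge requires more bullets than the first preset value")
        first_number = min(first_number, self.initial_bullets)
        self.initial_bullets -= first_number
        return {"type": "first_bullet_prop", "merged_from": first_number}

    def shoot(self, bullet_prop: dict) -> str:                 # module 930
        return f"fired {bullet_prop['type']} built from {bullet_prop['merged_from']} bullets"

device = VirtualFirearmInteractionDevice(initial_bullets=30, first_preset=10)
prop = device.merge_on_first_touch(first_number=30)
print(device.shoot(prop))
```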
In an exemplary embodiment of the present disclosure, the apparatus further comprises: a second preset value comparison unit, configured to, when the number of the initial bullet props is smaller than a second preset value, split the initial bullet props into a second number of second bullet props in response to a second touch operation acting on the bullet changing control; wherein the second number is less than or equal to the maximum loading number of bullet props of the virtual firearm, and the attack force of the second bullet prop is lower than the attack force of the initial bullet prop; and a second use operation response unit, configured to control the virtual firearm to interact with a virtual object in the virtual scene through the second bullet prop in response to a use operation directed to the second bullet prop.
In an exemplary embodiment of the present disclosure, for the step of combining and generating the first number of initial bullet props into a first bullet prop in response to a first touch operation acting on the bullet changing control, the apparatus further includes: a first preset value comparison unit, configured to, when the number of the initial bullet props is larger than a first preset value, display a first quantity selection control on the graphical user interface in response to a first touch operation acting on the bullet changing control, wherein the first quantity selection control comprises a plurality of quantity nodes; a first user operation response unit, configured to determine a target node among the plurality of quantity nodes in response to a user operation; and a first number reference unit, configured to combine and generate a first number of initial bullet props into a first bullet prop according to the first number corresponding to the target node.
In an exemplary embodiment of the disclosure, for the step of displaying a first quantity selection control on the graphical user interface in response to a first touch operation acting on the bullet changing control, the apparatus further includes: a synthetic control display unit, configured to display the synthetic control in response to the first touch operation acting on the bullet changing control; and a synthetic touch response unit, configured to display the first quantity selection control on the graphical user interface in response to a touch operation acting on the synthetic control.
In an exemplary embodiment of the present disclosure, in response to a user operation, the step of determining a target node among the plurality of number nodes, the apparatus further includes: and the first sliding operation response unit is used for responding to the sliding operation acted on the composite control and determining a target node in the plurality of nodes.
In an exemplary embodiment of the disclosure, the step of displaying the first number of selection controls on the graphical user interface, the apparatus further comprises: a loading quantity reference unit for displaying a plurality of quantity nodes corresponding to the maximum loading quantity of the bullet properties in a first quantity selection control according to the maximum loading quantity of the bullet properties of the virtual firearm; wherein the plurality of number nodes comprises the same number of adjustable nodes as the initial bullet prop, and the adjustable nodes correspond to an adjustable area; and the highlighting unit is used for highlighting the adjustable area in a preset mode.
In an exemplary embodiment of the disclosure, the step of determining a target node among a plurality of number nodes in response to a stroking operation acting on the composite control, the apparatus further includes: a first candidate target node determining unit, configured to determine a candidate target node among the plurality of nodes in response to a stroking operation applied to the composite control; the plurality of quantity nodes comprise candidate target nodes; the first duration obtaining unit is used for obtaining the duration of the swiping operation on the graphical user interface after the candidate target node is determined; and the first duration comparison unit is used for determining the candidate target node as the target node when the duration is greater than or equal to the value corresponding to the candidate target node.
In an exemplary embodiment of the present disclosure, for the step of displaying the composite control in response to the first touch operation acting on the bullet changing control, the apparatus further includes: a first duration comparison unit, configured to display the composite control and switch and display the bullet changing control as a cancel control in response to the duration of the first touch operation acting on the bullet changing control being longer than the preset duration; the cancel control is used for cancelling the merging process in response to a touch operation acting on the cancel control.
In an exemplary embodiment of the present disclosure, for the step of combining and generating a first number of initial bullet props into a first bullet prop in response to a first touch operation acting on the bullet changing control, the apparatus further comprises: a maximum loading number synthesizing unit, configured to combine and generate the initial bullet props in the virtual firearm into a first bullet prop when the number of the initial bullet props in the virtual firearm is equal to the maximum loading number of bullet props of the virtual firearm; wherein the number of initial bullet props in the virtual firearm is the first number.
In an exemplary embodiment of the present disclosure, the force of attack of the first bullet prop is associated with a first quantity.
In an exemplary embodiment of the present disclosure, for the step of combining and generating a first number of initial bullet props into a first bullet prop in response to a first touch operation acting on the bullet changing control, the apparatus further comprises: a preset merging unit, configured to combine a first number of initial bullet props into a plurality of first bullet props according to a preset combination mode in response to a first touch operation acting on the bullet changing control; the preset combination mode is used for indicating the number of initial bullet props required to be combined to obtain one first bullet prop.
In an exemplary embodiment of the present disclosure, for combining and generating a first number of initial bullet props into a first bullet prop in response to a first touch operation acting on the bullet changing control, the apparatus comprises: a first candidate bullet prop merging unit, configured to combine a first number of initial bullet props into a plurality of first candidate bullet props in response to a first touch operation acting on the bullet changing control; and a first bullet prop resynthesis unit, configured to combine and generate the plurality of first candidate bullet props into the first bullet prop in response to a third touch operation acting on the bullet changing control.
In an exemplary embodiment of the present disclosure, combining and generating a first number of initial bullet props into a first bullet prop corresponds to a synthesis time, and the first number is proportional to the synthesis time.
In an exemplary embodiment of the present disclosure, for the step of splitting the initial bullet props into a second number of second bullet props in response to a second touch operation acting on the bullet changing control when the number of the initial bullet props is smaller than a second preset value, the apparatus further includes: a second preset value comparison unit, configured to, when the number of the initial bullet props is smaller than the second preset value, display a second quantity selection control on the graphical user interface in response to a second touch operation acting on the bullet changing control, wherein the second quantity selection control comprises a plurality of quantity nodes; a user operation response unit, configured to determine a target node among the plurality of quantity nodes in response to a user operation; and an initial bullet prop splitting unit, configured to split the initial bullet props into a second number of second bullet props according to the second number corresponding to the target node.
In an exemplary embodiment of the disclosure, for the step of displaying a second quantity selection control on the graphical user interface in response to a second touch operation acting on the bullet changing control, the apparatus further includes: a split control display unit, configured to display the split control in response to the second touch operation acting on the bullet changing control; and a split control touch unit, configured to display the second quantity selection control on the graphical user interface in response to a touch operation acting on the split control.
In an exemplary embodiment of the present disclosure, for the step of determining a target node among the plurality of quantity nodes in response to a user operation, the apparatus further includes: a second sliding operation response unit, configured to determine a target node among the plurality of quantity nodes in response to a sliding operation acting on the split control.
In an exemplary embodiment of the present disclosure, in response to a swiping operation acting on the split control, the step of determining a target node among the plurality of number nodes, the apparatus further includes: the second candidate target node determining unit is used for responding to the sliding operation acting on the split control and determining a candidate target node in the plurality of nodes; the second duration obtaining unit is used for obtaining the duration of the swiping operation on the graphical user interface after the candidate target node is determined; and the second duration comparison unit is used for determining the candidate target node as the target node when the duration is greater than or equal to the value corresponding to the candidate target node.
In an exemplary embodiment of the present disclosure, the step of displaying the split control in response to the second touch operation acting on the bullet changing control includes: a second duration comparison unit, configured to display the split control and switch and display the bullet changing control as a cancel control in response to the duration of the second touch operation acting on the bullet changing control being longer than the preset duration; the cancel control is used for cancelling the splitting process in response to a touch operation acting on the cancel control.
In an exemplary embodiment of the present disclosure, the apparatus further comprises: the node schematic graph display unit is used for displaying a node schematic graph at a position corresponding to the candidate target node; and the visual display unit is used for visually displaying the numerical value of the duration time by combining the node schematic graph according to the numerical value of the duration time and the numerical value corresponding to the candidate target node.
In an exemplary embodiment of the present disclosure, the force of attack of the second bullet prop is correlated to a second quantity.
In an exemplary embodiment of the present disclosure, the disassembling of the initial bullet prop into a second number of second bullet props corresponds to a disassembling time, the second number being proportional to the disassembling time.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the virtual firearm interaction method is also provided.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 1000 according to such an embodiment of the present disclosure is described below with reference to fig. 10. The electronic device 1000 shown in fig. 10 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 10, the electronic device 1000 is embodied in the form of a general purpose computing device. The components of the electronic device 1000 may include, but are not limited to: the at least one processing unit 1010, the at least one memory unit 1020, a bus 1030 connecting different system components (including the memory unit 1020 and the processing unit 1010), and a display unit 1040.
Where the storage unit stores program code that may be executed by the processing unit 1010 to cause the processing unit 1010 to perform the steps according to various exemplary embodiments of the present disclosure described in the "exemplary methods" section above in this specification. For example, the processing unit 1010 may execute step S210 as shown in fig. 2, obtaining the number of initial bullet props of the virtual firearm; step S220, combining the initial bullet props of a first number into a first bullet prop in response to a first touch operation acting on the bullet changing control piece, wherein the number of the initial bullet props is larger than a first preset value; wherein the first number is less than or equal to the number of the initial bullet prop, the attack force of the first bullet prop being higher than the attack force of the initial bullet prop; and step S230, in response to the operation of using the first bullet prop, controlling the virtual firearm to interact with the virtual object in the virtual scene through the first bullet prop.
As another example, the electronic device may implement the various steps shown in FIG. 2.
The memory unit 1020 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 1021 and/or a cache memory unit 1022, and may further include a read-only memory unit (ROM) 1023.
Storage unit 1020 may also include a program/utility 1024 having a set (at least one) of program modules 1025, such program modules 1025 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 1030 may be any one or more of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, and a local bus using any of a variety of bus architectures.
The electronic device 1000 may also communicate with one or more external devices 1070 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1000, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 1000 to communicate with one or more other computing devices. Such communication may occur through input/output (I/O) interfaces 1050. Also, the electronic device 1000 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 1060. As shown, the network adapter 1060 communicates with the other modules of the electronic device 1000 over the bus 1030. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 1000, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the above-mentioned "exemplary methods" section of this specification, when the program product is run on the terminal device.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (24)

1. A virtual firearm interaction method is characterized in that a graphical user interface is provided through a terminal device, the graphical user interface at least partially displays a virtual scene and a bullet changing control, the virtual scene comprises virtual objects, and the method comprises the following steps:
acquiring the number of initial bullet props of the virtual firearm;
the number of the initial bullet props is larger than a first preset value, and the initial bullet props of the first number are combined and generated into a first bullet prop in response to a first touch operation acting on the bullet changing control; wherein the first number is less than or equal to the number of the initial bullet prop, the first bullet prop having a higher force of attack than the initial bullet prop;
controlling the virtual firearm to interact with a virtual object in a virtual scene through the first bullet prop in response to a use operation directed to the first bullet prop.
2. The method of claim 1, further comprising:
the number of the initial bullet props is smaller than a second preset value, and the initial bullet props are disassembled into a second number of second bullet props in response to second touch operation acting on the bullet changing control; wherein the second number is less than or equal to the maximum loaded number of bullet props of the virtual firearm, the attack force of the second bullet prop being lower than the attack force of the initial bullet prop;
controlling the virtual firearm to interact with a virtual object in a virtual scene through the second bullet prop in response to a use operation directed to the second bullet prop.
3. The method of claim 1, wherein the number of the initial bullet props is greater than a first preset value, and wherein the step of combining and generating the first number of initial bullet props into the first bullet prop in response to the first touch operation acting on the bullet changing control comprises:
the number of the initial bullet props is larger than a first preset value, and a first quantity selection control is displayed on the graphical user interface in response to a first touch operation acting on the bullet changing control; wherein the first quantity selection control comprises a plurality of quantity nodes;
determining a target node among the plurality of quantity nodes in response to a user operation;
and combining and generating the first number of initial bullet properties into a first bullet property according to the first number corresponding to the target node.
4. The method of claim 3, wherein the step of displaying the first quantity selection control on the graphical user interface in response to the first touch operation acting on the bullet changing control comprises:
responding to a first touch operation acting on the bullet changing control, and displaying a synthesized control;
and responding to the touch operation acting on the synthesis control, and displaying the first quantity selection control on the graphical user interface.
5. The method of claim 3, wherein the step of determining a target node among the plurality of quantity nodes in response to a user action comprises:
determining a target node among the plurality of quantity nodes in response to a stroking operation acting on the composite control.
6. The method of claim 3, wherein the step of displaying the first quantity selection control on the graphical user interface comprises:
displaying a plurality of quantity nodes corresponding to the maximum loading quantity of the bullet props in the first quantity selection control according to the maximum loading quantity of the bullet props of the virtual firearm; wherein the plurality of quantity nodes include adjustable nodes whose number is the same as the number of the initial bullet props, the adjustable nodes corresponding to an adjustable region;
and highlighting the adjustable area in a preset mode.
7. The method of claim 5, wherein the step of determining a target node among the plurality of nodes in response to a stroking operation on the composite control comprises:
determining a candidate target node among the plurality of quantity nodes in response to a stroking operation acting on the composite control; wherein the plurality of quantity nodes comprise candidate target nodes;
obtaining the duration of the stroking operation on the graphical user interface after the candidate target node is determined;
and when the duration is greater than or equal to the value corresponding to the candidate target node, determining the candidate target node as the target node.
8. The method of claim 4, wherein the step of displaying the synthesized control in response to the first touch operation acting on the bullet changing control comprises:
responding that the duration of a first touch operation acting on the bullet changing control is longer than a preset duration, displaying a synthesized control and switching and displaying the bullet changing control as a cancel control;
the cancellation control is used for responding to touch operation acting on the cancellation control to cancel the merging process.
9. The method of claim 1, wherein combining and generating the first number of initial bullet props into the first bullet prop in response to the first touch operation acting on the bullet changing control comprises:
combining and generating initial bullet properties in the virtual firearm as first bullet properties when the number of initial bullet properties in the virtual firearm is equal to the maximum loaded number of bullet properties for the virtual firearm;
wherein the number of initial bullet props in the virtual firearm is a first number.
10. The method of claim 1, wherein the force of attack of the first bullet prop is associated with the first quantity.
11. The method of claim 1, wherein combining and generating the first number of initial bullet props into the first bullet prop in response to the first touch operation acting on the bullet changing control comprises:
responding to a first touch operation acted on the bullet changing control, and combining a first number of initial bullet properties according to a preset combination mode to generate a plurality of first bullet properties;
the preset combination mode is used for indicating the number of initial bullet props required by combination generation to obtain one first bullet prop.
12. The method of claim 1, wherein combining and generating the first number of initial bullet props into the first bullet prop in response to the first touch operation acting on the bullet changing control comprises:
in response to a first touch operation acting on the bullet changing control, combining a first number of initial bullet props to generate a plurality of first candidate bullet props;
and combining the plurality of first candidate bullet props into the first bullet prop in response to a third touch operation acting on the bullet changing control.
13. The method of claim 1, wherein combining and generating the first number of initial bullet props into the first bullet prop corresponds to a synthesis time, and wherein the first number is proportional to the synthesis time.
14. The method of claim 2, wherein the number of the initial bullet props is less than a second preset value, and the step of disassembling the initial bullet props into a second number of second bullet props in response to a second touch operation acting on the bullet changing control comprises:
the number of the initial bullet props is smaller than the second preset value, and a second quantity selection control is displayed on the graphical user interface in response to the second touch operation acting on the bullet changing control; wherein the second quantity selection control comprises a plurality of quantity nodes;
determining a target node among the plurality of quantity nodes in response to a user operation;
and splitting the initial bullet prop into a second number of second bullet props according to the second number corresponding to the target node.
15. The method of claim 14, wherein the step of displaying the second quantity selection control on the graphical user interface in response to the second touch operation acting on the bullet changing control comprises:
responding to a second touch operation acting on the bullet changing control, and displaying a splitting control;
and responding to the touch operation acting on the split control, and displaying the second quantity selection control on the graphical user interface.
16. The method of claim 14, wherein the step of determining a target node among the plurality of quantity nodes in response to a user action comprises:
determining a target node among the plurality of quantity nodes in response to a swiping operation acting on the split control.
17. The method of claim 16, wherein the step of determining a target node among the plurality of nodes in response to a stroking operation applied to the split control comprises:
determining a candidate target node among the plurality of quantity nodes in response to a swiping operation acting on the split control;
obtaining the duration of the stroke operation on the graphical user interface after the candidate target node is determined;
and when the duration is greater than or equal to the value corresponding to the candidate target node, determining the candidate target node as the target node.
18. The method of claim 15, wherein the step of displaying the split control in response to the second touch operation acting on the bullet changing control comprises:
responding that the duration of a second touch operation acting on the bullet changing control is longer than a preset duration, displaying a splitting control and switching and displaying the bullet changing control as a cancelling control;
the cancellation control is used for responding to touch operation acting on the cancellation control to cancel the splitting process.
19. The method according to claim 12 or 17, further comprising:
displaying a node schematic graph at a position corresponding to the candidate target node;
and according to the numerical value of the duration and the numerical value corresponding to the candidate target node, combining the node schematic graph, and visually displaying the numerical value of the duration.
20. The method of claim 2, wherein the force of attack of the second bullet prop is correlated to the second quantity.
21. The method of claim 2, wherein said disassembling the initial bullet prop into a second number of second bullet props corresponds to a disassembly time, said second number being proportional to said disassembly time.
22. A virtual firearm interaction device is characterized in that a graphical user interface is provided through a terminal device, the graphical user interface at least partially displays a virtual scene and a bullet changing control, the virtual scene comprises virtual objects, and the device comprises:
the initial number acquisition module is used for acquiring the number of the initial bullet props of the virtual firearm;
the prop synthesis module is used for responding to a first touch operation acting on the bullet changing control, and combining and generating a first number of initial bullet props into a first bullet prop, wherein the number of the initial bullet props is larger than a first preset value; wherein the first number is less than or equal to the number of the initial bullet props, the first bullet prop having a higher force of attack than the initial bullet prop;
and the shooting interaction module is used for responding to the use operation aiming at the first bullet prop and controlling the virtual firearm to interact with a virtual object in a virtual scene through the first bullet prop.
23. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a virtual firearm interaction method according to any one of claims 1 to 21.
24. An electronic device, comprising:
one or more processors;
a storage device to store one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the virtual firearm interaction method of any of claims 1-21.
CN202211274167.8A 2022-10-18 2022-10-18 Virtual gun interaction method and device, storage medium and electronic equipment Pending CN115624753A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211274167.8A CN115624753A (en) 2022-10-18 2022-10-18 Virtual gun interaction method and device, storage medium and electronic equipment
PCT/CN2023/083854 WO2024082552A1 (en) 2022-10-18 2023-03-24 Interaction method and apparatus for virtual gun, and storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211274167.8A CN115624753A (en) 2022-10-18 2022-10-18 Virtual gun interaction method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN115624753A (en) 2023-01-20

Family

ID=84906569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211274167.8A Pending CN115624753A (en) 2022-10-18 2022-10-18 Virtual gun interaction method and device, storage medium and electronic equipment

Country Status (2)

Country Link
CN (1) CN115624753A (en)
WO (1) WO2024082552A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024082552A1 (en) * 2022-10-18 2024-04-25 网易(杭州)网络有限公司 Interaction method and apparatus for virtual gun, and storage medium and electronic device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000317135A (en) * 1999-05-10 2000-11-21 Namco Ltd Gun game device and method for determining its shot position
CN112752945A (en) * 2018-07-02 2021-05-04 梦境沉浸股份有限公司 Firearm simulation arrangement for virtual reality systems
CN110681152B (en) * 2019-10-17 2022-07-22 腾讯科技(深圳)有限公司 Virtual object control method, device, terminal and storage medium
CN110947176B (en) * 2019-11-29 2022-03-25 腾讯科技(深圳)有限公司 Virtual object control method, bullet number recording method, device, and medium
CN112973115A (en) * 2021-03-05 2021-06-18 网易(杭州)网络有限公司 Method and device for processing shooting in game and electronic equipment
CN112870709B (en) * 2021-03-22 2023-03-17 腾讯科技(深圳)有限公司 Display method and device of virtual prop, electronic equipment and storage medium
CN115624753A (en) * 2022-10-18 2023-01-20 网易(杭州)网络有限公司 Virtual gun interaction method and device, storage medium and electronic equipment


Also Published As

Publication number Publication date
WO2024082552A1 (en) 2024-04-25

Similar Documents

Publication Publication Date Title
EP3939681A1 (en) Virtual object control method and apparatus, device, and storage medium
JP2022522699A (en) Virtual object control methods, devices, terminals and programs
JP2021515601A (en) Information processing methods, devices, storage media and electronic devices
CN112076473B (en) Control method and device of virtual prop, electronic equipment and storage medium
CN113398601B (en) Information transmission method, information transmission device, computer-readable medium, and apparatus
JP7477640B2 (en) Virtual environment screen display method, device, and computer program
CN111888766B (en) Information processing method and device in game, electronic equipment and storage medium
WO2022227958A1 (en) Virtual carrier display method and apparatus, device, and storage medium
CN112138385B (en) Virtual shooting prop aiming method and device, electronic equipment and storage medium
US20230330537A1 (en) Virtual object control method and apparatus, terminal and storage medium
CN113975807A (en) Method and device for generating information in game, electronic equipment and readable storage medium
CN115624753A (en) Virtual gun interaction method and device, storage medium and electronic equipment
CN113198177A (en) Game control display method and device, computer storage medium and electronic equipment
WO2024001191A1 (en) Operation method and apparatus in game, nonvolatile storage medium, and electronic apparatus
WO2023201985A1 (en) Virtual equipment skin changing method and apparatus, storage medium and electronic device
CN115645923A (en) Game interaction method and device, terminal equipment and computer-readable storage medium
CN112843718B (en) Equipment switching method, device, medium and electronic equipment
CN113499582B (en) Method and device for checking virtual articles in game
JP7423137B2 (en) Operation presentation method, device, terminal and computer program
JP2024509072A (en) Virtual object interaction methods, devices, storage media and electronic devices
CN113941152A (en) Virtual object control method and device, electronic equipment and storage medium
CN111905380A (en) Virtual object control method, device, terminal and storage medium
US20230264103A1 (en) Virtual accessory using method, related apparatus, device, and storage medium
CN113663326B (en) Aiming method and device for game skills
WO2024032176A1 (en) Virtual item processing method and apparatus, electronic device, storage medium, and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination