CN111760284A - Virtual item control method, device, equipment and storage medium - Google Patents

Virtual item control method, device, equipment and storage medium

Info

Publication number
CN111760284A
CN111760284A (application CN202010808039.1A)
Authority
CN
China
Prior art keywords
virtual
prop
virtual prop
area
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010808039.1A
Other languages
Chinese (zh)
Inventor
徐育通
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010808039.1A priority Critical patent/CN111760284A/en
Publication of CN111760284A publication Critical patent/CN111760284A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/837: Shooting of targets
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076: Shooting

Abstract

The application discloses a virtual prop control method, apparatus, device, and storage medium, belonging to the field of computer technology. In the embodiments of the application, on the one hand, a novel throwing-type virtual prop is introduced. Unlike virtual props in the related art, which bounce off a virtual object after being thrown, the novel throwing-type virtual prop has adhesion capability: when thrown, it does not rebound after hitting any virtual object, but adheres to the object's surface. No rebound occurs that would make the motion track deviate from the user's expectation, so the deformation position of the virtual prop matches what the user intended; the user can control the virtual prop accurately, the control precision is improved, and the control effect is good. On the other hand, the virtual prop deforms after adhering and displays a first target special effect; the deformation and the first target special effect visually indicate the type of the virtual prop and the effect produced after adhesion, giving a better display effect.

Description

Virtual item control method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for controlling a virtual prop.
Background
With the development of computer technology and the diversification of terminal functions, more and more games can be played on terminals. Shooting games are among the more popular: the terminal can display a virtual scene in its interface and display a virtual character in that scene, and the virtual character can use virtual props to fight against other virtual characters.
At present, taking throwing-type virtual props as an example, these props are typically controlled as follows: the terminal controls the virtual character to throw the virtual prop according to a throwing instruction, and whenever the virtual prop collides with any virtual object in the virtual scene, the prop is controlled to rebound, until a certain time after the throw, when the prop is controlled to explode.
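The conventional behavior described above (bounce on each collision, explode after a fixed time) can be sketched as follows; the restitution factor, fuse time, and function names are illustrative assumptions, not taken from this application.

```python
FUSE_TIME = 3.0    # seconds after the throw until the prop explodes (assumed)
RESTITUTION = 0.6  # fraction of speed kept after each bounce (assumed)

def on_collision(velocity, surface_normal, elapsed):
    """Conventional prop: reflect the velocity about the surface normal on
    each collision, and explode once the fuse time has elapsed."""
    if elapsed >= FUSE_TIME:
        return None, "explode"
    # Reflection formula: v' = v - 2 (v . n) n, then damped by restitution.
    dot = sum(v * n for v, n in zip(velocity, surface_normal))
    bounced = tuple(RESTITUTION * (v - 2.0 * dot * n)
                    for v, n in zip(velocity, surface_normal))
    return bounced, "bounce"
```

For instance, a prop falling onto level ground (normal `(0, 1, 0)`) has its vertical velocity flipped and damped, which is exactly the rebound the Background identifies as the source of imprecision.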
Because the virtual prop rebounds after touching a virtual object, its motion track after rebounding deviates from the user's expectation, and the user cannot accurately predict the range of damage the prop will cause. The control precision of the virtual prop is therefore low, the result does not match the user's expectation, and the control effect is poor.
Disclosure of Invention
The embodiments of the application provide a method, an apparatus, a device, and a storage medium for controlling a virtual prop, which can improve the control precision and control effect of the virtual prop and provide a better display effect. The technical solutions are as follows:
in one aspect, a method for controlling a virtual item is provided, where the method includes:
responding to a throwing instruction of the virtual prop, and controlling the virtual prop to move in a virtual scene according to a target motion track;
responding to collision between the virtual prop and any virtual object on the target motion track in the virtual scene, controlling the virtual prop to be adhered to the surface of the virtual object, and deforming at an adhered position;
and displaying a first target special effect in a first area corresponding to the adhesion position.
In one possible implementation manner, the controlling the virtual prop to adhere to the surface of the virtual object in response to the virtual prop colliding with any virtual object on the target motion track in the virtual scene includes:
in response to the target motion trail and the surface of any virtual object in the virtual scene having an intersection point, controlling the virtual prop to be adhered to the surface of the virtual object at the intersection point.
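A minimal sketch of this intersection test: the target motion track is sampled as straight segments, and each segment is tested against a locally planar surface patch. The plane representation (point plus normal) and the function name are assumptions; a real engine would use its own collision query.

```python
def adhesion_point(p0, p1, plane_point, plane_normal, eps=1e-9):
    """Return the intersection of trajectory segment p0 -> p1 with a planar
    surface (the adhesion point), or None if the segment does not reach it."""
    direction = tuple(b - a for a, b in zip(p0, p1))
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < eps:
        return None  # segment runs parallel to the surface
    # Solve (p0 + t * direction - plane_point) . plane_normal = 0 for t.
    t = sum((q - a) * n for a, q, n in zip(p0, plane_point, plane_normal)) / denom
    if not 0.0 <= t <= 1.0:
        return None  # the surface is not crossed within this segment
    return tuple(a + t * d for a, d in zip(p0, direction))
```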
In one aspect, a virtual prop control apparatus is provided, the apparatus including:
the control module is used for responding to a throwing instruction of the virtual prop and controlling the virtual prop to move in a virtual scene according to a target motion track;
the control module is further used for responding to the collision between the virtual prop and any virtual object on the target motion track in the virtual scene, controlling the virtual prop to be adhered to the surface of the virtual object and deform at the adhered position;
and the display module is used for displaying the first target special effect in the first area corresponding to the adhesion position.
In one possible implementation manner, the control module is configured to, in response to a collision of the virtual prop with any virtual object on the target motion trajectory in the virtual scene, control the virtual prop to be attached to the surface of the virtual object at a collision position according to the collision position of the virtual prop with the surface of the virtual object.
In one possible implementation manner, the control module is configured to, in response to that there is an intersection between the target motion trajectory and a surface of any virtual object in the virtual scene, control the virtual prop to be attached to the surface of the virtual object at the intersection.
In one possible implementation, the control module includes a determination unit and a control unit;
the determining unit is used for determining the deformation direction of the virtual prop according to the surface of the virtual object and the adhesion position;
the control unit is used for controlling the virtual prop to deform on the adhesion position along the deformation direction.
In a possible implementation manner, the determining unit is configured to obtain a normal direction at the adhesion position on the surface of the virtual object, and determine the normal direction as a deformation direction of the virtual prop.
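The determining unit above takes the surface normal at the adhesion position as the deformation direction. A sketch for the simple case of a spherical virtual object (an assumption; a real engine would return the per-face normal of the collided mesh):

```python
import math

def deformation_direction(adhesion_pos, sphere_center):
    """For a spherical surface, the outward normal at the adhesion position
    is the normalized vector from the sphere's center to that position."""
    v = tuple(p - c for p, c in zip(adhesion_pos, sphere_center))
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)
```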
In one possible implementation, the display module includes an acquisition unit and a display unit;
the acquisition unit is used for acquiring a first area corresponding to the adhesion position according to the adhesion position;
the display unit is used for displaying a first target special effect in the first area.
In one possible implementation manner, the acquiring unit is configured to acquire, as the first area corresponding to the adhesion position, an area with the adhesion position as a center and a radius as a target radius.
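With the first area modeled as a ball around the adhesion position, as above, a membership test reduces to a squared-distance comparison; the function name and radius value are illustrative assumptions.

```python
def in_first_area(point, adhesion_pos, target_radius):
    """True if the point lies within the first area, i.e. within the target
    radius of the adhesion position (squared distances avoid a sqrt)."""
    d2 = sum((p - a) ** 2 for p, a in zip(point, adhesion_pos))
    return d2 <= target_radius ** 2
```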
In one possible implementation, the apparatus further includes:
the first acquisition module is used for acquiring a second area corresponding to the adhesion position according to the adhesion position;
the control module is further used for responding to that any virtual character is located in the second area when the virtual prop deforms, and controlling the virtual life value of the virtual character to be reduced.
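The second-area damage described above can be sketched as a one-shot pass over the characters at the moment of deformation; the field names, radius, and damage value are illustrative assumptions.

```python
def apply_deformation_damage(characters, adhesion_pos, second_radius, damage):
    """Reduce the life value ('hp') of each character located in the second
    area (a ball around the adhesion position) when the prop deforms."""
    for character in characters:
        d2 = sum((p - a) ** 2 for p, a in zip(character["pos"], adhesion_pos))
        if d2 <= second_radius ** 2:
            character["hp"] = max(0, character["hp"] - damage)
    return characters
```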
In one possible implementation, the apparatus further includes:
the second acquisition module is used for acquiring a third area corresponding to the adhesion position according to the adhesion position;
the control module is further used for, in response to any virtual character being located in the third area, controlling the virtual life value of the virtual character to decrease continuously.
In a possible implementation, the first region and the third region are the same; or the third region includes the first region and is larger than the first region; or the third region is located within the first region and is smaller than the first region.
In one possible implementation manner, the control module is further configured to control the virtual life value of the currently controlled virtual character to continuously decrease in response to the currently controlled virtual character being located in the third area;
the display module is further used for displaying a second target special effect in the graphical user interface, where the second target special effect is used to prompt that the virtual life value of the currently controlled virtual character is decreasing.
In one possible implementation, the control module is further configured to, in response to the virtual object moving in the virtual scene, control the third area to move as the virtual object moves.
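One way to make the third area follow a moving virtual object, sketched under the assumption that the offset from the object's position to the adhesion position is recorded at adhesion time:

```python
def third_area_center(object_pos, adhesion_offset):
    """Center of the third area after the object has moved: the object's
    current position plus the offset recorded when the prop adhered."""
    return tuple(o + d for o, d in zip(object_pos, adhesion_offset))
```

Re-deriving the center each frame keeps the damaging region glued to the object, which is the behavior the claim describes.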
In a possible implementation manner, the display duration of the first target special effect is a target duration; after the target duration, the function of causing damage to virtual characters located in the third area disappears.
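The continuous life reduction in the third area, and its disappearance once the target duration has passed, can be sketched as a per-tick update; the damage-per-second and duration values are illustrative assumptions.

```python
def tick_third_area(hp, in_third_area, elapsed, target_duration, dps, dt):
    """Continuously reduce a character's life value while it stays inside the
    third area; the damaging effect disappears after the target duration."""
    if elapsed >= target_duration or not in_third_area:
        return hp
    return max(0, hp - dps * dt)
```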
In one aspect, an electronic device is provided, and the electronic device includes one or more processors and one or more memories, where at least one program code is stored in the one or more memories, and the at least one program code is loaded and executed by the one or more processors to implement various optional implementations of the above-described virtual prop control method.
In one aspect, a computer-readable storage medium is provided, in which at least one program code is stored, and the at least one program code is loaded and executed by a processor to implement various optional implementations of the above virtual prop control method.
In one aspect, a computer program product or computer program is provided that includes one or more program codes stored in a computer-readable storage medium. One or more processors of the electronic device can read the one or more program codes from the computer-readable storage medium, and the one or more processors execute the one or more program codes, so that the electronic device can execute the virtual prop control method of any one of the above possible embodiments.
In the embodiments of the application, on the one hand, a novel throwing-type virtual prop is introduced. Unlike virtual props in the related art, which bounce off a virtual object after being thrown, the novel throwing-type virtual prop has adhesion capability: when thrown, it does not rebound after hitting any virtual object, but adheres to the object's surface. No rebound occurs that would make the motion track deviate from the user's expectation, so the deformation position of the virtual prop matches what the user intended; the user can control the virtual prop accurately, the control precision is improved, and the control effect is good. On the other hand, the virtual prop deforms after adhering and displays a first target special effect; the deformation and the first target special effect visually indicate the type of the virtual prop and the effect produced after adhesion, giving a better display effect.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of a virtual item control method provided in an embodiment of the present application;
fig. 2 is a flowchart of a method for controlling a virtual item according to an embodiment of the present application;
fig. 3 is a flowchart of a method for controlling a virtual item according to an embodiment of the present application;
FIG. 4 is a schematic illustration of an introduction interface of a virtual item provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
FIG. 7 is a schematic diagram of a radiation detection method provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
FIG. 9 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
FIG. 10 is a terminal interface schematic provided by an embodiment of the present application;
fig. 11 is a schematic diagram of a terminal interface and a schematic diagram of obtaining a damaged area according to an embodiment of the present application;
fig. 12 is a flowchart of a method for controlling a virtual item according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a virtual item control device according to an embodiment of the present application;
fig. 14 is a block diagram of a terminal according to an embodiment of the present disclosure;
fig. 15 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The terms "first," "second," and the like in this application are used to distinguish between identical or similar items that have substantially the same function; it should be understood that "first," "second," and "nth" imply no logical or temporal dependency and no limitation on number or order of execution. It will be further understood that, although the following description uses the terms first, second, etc. to describe various elements, these elements should not be limited by these terms; the terms are only used to distinguish one element from another. For example, a first image can be referred to as a second image and, similarly, a second image can be referred to as a first image without departing from the scope of the various described examples. The first image and the second image can both be images and, in some cases, can be separate and distinct images.
The term "at least one" is used herein to mean one or more, and the term "plurality" is used herein to mean two or more, e.g., a plurality of packets means two or more packets.
It is to be understood that the terminology used in the description of the various described examples herein is for the purpose of describing particular examples only and is not intended to be limiting. As used in the description of the various described examples and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. The term "and/or" describes an association between associated objects and indicates that three relationships can exist; for example, "A and/or B" can mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" in this application generally indicates that the objects before and after it are in an "or" relationship.
It should also be understood that, in the embodiments of the present application, the size of the serial number of each process does not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
It should also be understood that determining B from a does not mean determining B from a alone, but can also determine B from a and/or other information.
It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also understood that the term "if" may be interpreted to mean "when," "upon," "in response to determining," or "in response to detecting." Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" may be interpreted to mean "upon determining," "in response to determining," "upon detecting [the stated condition or event]," or "in response to detecting [the stated condition or event]," depending on the context.
The following is a description of terms involved in the present application.
Virtual scene: a scene displayed (or provided) by an application program when the application program runs on a terminal. The virtual scene may be a simulation of the real world, a semi-simulated, semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene; the dimension of the virtual scene is not limited in the embodiments of the present application. For example, the virtual scene may include sky, land, and ocean; the land may include environmental elements such as deserts and cities; and the user may control the virtual character to move in the virtual scene.
Virtual object: refers to an object in a virtual scene, i.e., an imaginary object used to simulate a real object or creature, such as the characters, animals, plants, oil drums, walls, and rocks displayed in a virtual scene. Virtual objects include inanimate objects and virtual characters: an inanimate virtual object may be, for example, a virtual building, a virtual vehicle, or a virtual prop, while a virtual character is an object with a life attribute, such as a virtual person or a virtual animal.
Optionally, the virtual objects include movable virtual objects and non-movable virtual objects. Such as movable virtual vehicles, movable virtual characters, immovable virtual buildings, etc.
Virtual character: refers to an object used to simulate a person or animal in a virtual scene. The virtual character can be a virtual person, a virtual animal, an anime character, etc., such as the people and animals displayed in the virtual scene. The virtual character may be an avatar in the virtual scene used to represent the user. A virtual scene can comprise a plurality of virtual characters, each of which has its own shape and volume in the virtual scene and occupies part of the space in the virtual scene.
Alternatively, the virtual character may be a player character controlled through operations on the client, an artificial intelligence (AI) character set up in the virtual-scene battle through training, or a non-player character (NPC) set up for interaction in the virtual scene. Alternatively, the virtual character may be a character competing in the virtual scene. Optionally, the number of virtual characters participating in the interaction in the virtual scene may be preset or dynamically determined according to the number of clients participating in the interaction.
Taking a shooting game as an example, the user may control the virtual character to free-fall, glide, or open a parachute to descend in the sky of the virtual scene; to run, jump, crawl, or move in a crouch on land; or to swim, float, or dive in the ocean. Of course, the user may also control the virtual character to move through the virtual scene in a virtual vehicle, such as a virtual car, a virtual aircraft, or a virtual yacht; the above scenes are merely examples and are not limiting. The user can also control the virtual character to interact with other virtual characters, for example by fighting with virtual props. The virtual props may be of many kinds, for example throwing-type virtual props such as sticky incendiaries, grenades, cluster mines, smoke bombs, bombs, incendiary bottles, or sticky grenades ("sticky mines" for short), or shooting-type virtual props such as machine guns, pistols, and rifles; the type of virtual prop is not specifically limited in this application.
The following describes an embodiment of the present application.
Fig. 1 is a schematic diagram of an implementation environment of a virtual item control method provided in an embodiment of the present application, and referring to fig. 1, the implementation environment includes: a first terminal 120, a server 140, and a second terminal 160.
The first terminal 120 has installed and running on it an application program supporting a virtual scene. The application program may be any one of a first-person shooter (FPS) game, a third-person shooter game, a multiplayer online battle arena (MOBA) game, a virtual reality application, a three-dimensional map program, a military simulation program, or a multiplayer gunfight survival game. The first terminal 120 may be a terminal used by a first user, and the first user uses the first terminal 120 to operate a first virtual character located in the virtual scene to carry out activities including, but not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the first virtual character is a virtual person, such as a simulated person or an anime character; the first virtual character may also be a first virtual animal, such as a simulated monkey or another animal.
The first terminal 120 and the second terminal 160 are connected to the server 140 through a wireless network or a wired network.
The server 140 may include at least one of a server, a plurality of servers, a cloud computing platform, or a virtualization center. The server 140 is used to provide background services for applications that support virtual scenarios. Alternatively, the server 140 may undertake primary computational tasks and the first and second terminals 120, 160 may undertake secondary computational tasks; alternatively, the server 140 undertakes the secondary computing work and the first terminal 120 and the second terminal 160 undertakes the primary computing work; alternatively, the server 140, the first terminal 120, and the second terminal 160 perform cooperative computing by using a distributed computing architecture.
The second terminal 160 is installed and operated with an application program supporting a virtual scene. The application program can be any one of an FPS, a third person named shooting game, an MOBA, a virtual reality application program, a three-dimensional map program, a military simulation program or a multi-person gunfight survival game. The second terminal 160 may be a terminal used by a second user, who uses the second terminal 160 to operate a second virtual character located in the virtual scene for activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the second avatar is a second virtual character, such as a simulated persona or an animated persona. Illustratively, the second virtual character may be a second virtual animal, such as a simulated monkey or other animal.
Optionally, the first virtual character controlled by the first terminal 120 and the second virtual character controlled by the second terminal 160 are in the same virtual scene, and the first virtual character may interact with the second virtual character in the virtual scene. In some embodiments, the first virtual character and the second virtual character may be in a hostile relationship, for example, the first virtual character and the second virtual character may belong to different teams and organizations, and the hostile virtual characters may interact with each other in a battle manner on land in a manner of shooting each other.
In other embodiments, the first avatar and the second avatar may be in a teammate relationship, for example, the first avatar and the second avatar may belong to the same team, the same organization, have a friend relationship, or have temporary communication rights.
Alternatively, the applications installed on the first terminal 120 and the second terminal 160 are the same, or are the same type of application on different operating system platforms. The first terminal 120 may generally refer to one of a plurality of terminals, and the second terminal 160 may generally refer to another; this embodiment is only illustrated with the first terminal 120 and the second terminal 160. The device types of the first terminal 120 and the second terminal 160 are the same or different and include at least one of: a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop portable computer, or a desktop computer. For example, the first terminal 120 and the second terminal 160 may be smartphones or other handheld portable gaming devices. The following embodiments are illustrated with a terminal comprising a smartphone.
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
Fig. 2 is a flowchart of a method for controlling a virtual item, provided in an embodiment of the present application, where the method is applied to an electronic device, the electronic device is a terminal or a server, and referring to fig. 2, taking the application of the method to the terminal as an example, the method includes the following steps.
201. And the terminal responds to the throwing instruction of the virtual prop and controls the virtual prop to move in the virtual scene according to the target motion track.
The virtual prop is a virtual object capable of interacting with a virtual character, and the virtual character can control the virtual prop to compete with other virtual characters. For example, the virtual items include attack-type virtual items, which can cause damage to the virtual character. For example, a grenade, a paste burning agent, and the like belong to attack virtual props, and can cause damage to virtual characters within a certain range during explosion or burning. The virtual prop can also comprise an auxiliary virtual prop, and the auxiliary virtual prop is used for assisting the virtual character to complete certain actions or achieve certain purposes. Such as a smoke bomb, a grapple, etc., can assist the virtual character in obscuring the figure or in assisting the virtual character in moving, etc.
Virtual props may include throwing-type, shooting-type, and application-type virtual props. A throwing-type virtual prop can be thrown by the virtual character and takes effect after being thrown. A shooting-type virtual prop can be used by a virtual character to damage other virtual characters by firing projectiles. Application-type virtual props have various modes of use: for example, a medicine-type virtual prop can be used to restore the virtual life value of a virtual character, while virtual props such as knives and pans can be used by the virtual character to attack other virtual characters directly.
In this embodiment of the application, a novel throwing-type virtual prop is provided. The user can perform a throwing operation on the terminal; on receiving the throwing instruction triggered by the throwing operation, the terminal controls the virtual character to throw the virtual prop, after which the virtual prop moves along a certain motion track.
The target motion track of the virtual prop can be related to the viewing-angle direction at the time of the throwing operation: during the throwing operation, the user can adjust the viewing direction of the virtual scene through a viewing-angle adjustment operation and thereby adjust the target motion track of the virtual prop. The viewing direction serves as the initial motion direction of the virtual prop; once the initial motion direction is determined, the target motion track can be obtained from the initial motion direction and the initial speed.
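The relationship described above (initial direction from the viewing angle; trajectory from direction plus initial speed) can be sketched with a simple ballistic model; the gravity value and the flat-coordinate assumption are illustrative.

```python
GRAVITY = 9.8  # assumed in-game gravitational acceleration

def trajectory_point(origin, view_dir, speed, t):
    """Position of the thrown prop at time t: the viewing direction is the
    initial motion direction; gravity bends the track downward over time."""
    x0, y0, z0 = origin
    dx, dy, dz = view_dir  # assumed to be a unit vector
    return (x0 + speed * dx * t,
            y0 + speed * dy * t - 0.5 * GRAVITY * t * t,
            z0 + speed * dz * t)
```

Sampling this function at successive times yields the segment endpoints that a collision test can check against the scene.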
202. In response to the virtual prop colliding with any virtual object on the target motion trajectory in the virtual scene, the terminal controls the virtual prop to adhere to the surface of the virtual object and to deform at the adhesion position.
The virtual prop does not need to hit a virtual character directly to cause damage. It deforms after being thrown, damages surrounding virtual characters while deforming, and after deformation forms an area that continuously damages virtual characters within it. An attack that causes damage over a region in this way may be referred to as a range attack. Consequently, the user does not need to aim precisely at a virtual object; as long as the virtual object is within the damage range of the virtual prop, it is affected, which simplifies user operation and reduces operation difficulty.
After being thrown, if a virtual object exists on the target motion trajectory, the virtual prop collides with that virtual object. Unlike virtual props in the related art, which bounce off when colliding with a virtual object after being thrown, the novel throwing-type virtual prop has adhesive capability: it does not bounce after colliding with any virtual object, but adheres to the surface of the virtual object and deforms at the adhesion position.
Therefore, a rebound cannot cause the motion trajectory to deviate from the one the user expects, and the deformation position of the virtual prop matches the user's expectation. The user can control the virtual prop accurately, the precision of controlling the virtual prop is improved, and the control effect is good. Through the deformation, a user can learn the specific position of the virtual prop and dodge the deformation position, while the user who threw the prop can see where it landed, judge whether the position matches expectations, and use that information as an accurate reference for the next engagement.
203. And the terminal displays the first target special effect in a first area corresponding to the adhesion position.
After deforming, the virtual prop burns within a certain range around it, and a virtual character within that range is continuously damaged. The terminal can display a first target special effect in a first area corresponding to the adhesion position to indicate the effect of the virtual prop burning within a certain range.
In the embodiment of the present application, on the one hand, a novel throwing-type virtual prop is introduced. Unlike virtual props in the related art, which bounce off a virtual object after being thrown, the novel throwing-type virtual prop has adhesive capability: it does not bounce after colliding with any virtual object, but adheres to the surface of that object. A rebound therefore cannot cause the trajectory to deviate from the user's expected motion trajectory, the deformation position of the virtual prop matches the user's expectation, the user can control the virtual prop accurately, the precision of controlling the virtual prop is improved, and the control effect is good. On the other hand, the virtual prop deforms after adhering and a first target special effect is displayed; the deformation and the first target special effect visually indicate the type of the virtual prop and the effect produced after adhesion, giving a better display effect.
Fig. 3 is a flowchart of a method for controlling a virtual item provided in an embodiment of the present application, and referring to fig. 3, the method includes the following steps.
301. And the terminal displays the view picture of the currently controlled virtual character and the throwing control of the virtual prop.
The view picture is a picture simulating observation of the virtual scene from the perspective of the virtual character, or from the perspective of a virtual camera near the virtual character. The throwing control of the virtual prop is used to detect a throwing operation on the virtual prop. To throw the virtual prop, the user performs a touch operation on the throwing control, which triggers a throwing instruction for the virtual prop.
The terminal can display the visual field picture and the throwing control in the user graphical interface, so that a user can decide how to throw the virtual prop to hurt other virtual characters according to the condition of the virtual characters or virtual objects in the displayed visual field picture.
Optionally, the terminal may also display other information or controls in the user graphical interface. For example, the terminal can display information about the terminal itself (e.g., network connection status, power level, etc.). The terminal may also display information related to the currently controlled virtual character, for example attribute information (e.g., virtual life value, equipped virtual props, etc.). The terminal may also display a thumbnail of the virtual scene, e.g., a thumbnail of the local or global virtual scene, also referred to as a minimap. The terminal can further display controls for the virtual character, such as a control for controlling the movement of the virtual character (e.g., a control for making it move automatically), a control for switching the controlled virtual prop, a backpack control of the virtual character, and the like. Of course, the terminal may also display other information, which is not listed here, and the display content of the user graphical interface is not specifically limited in this embodiment of the present application.
For the virtual prop, the virtual prop may include multiple obtaining modes, two modes are provided below, and the embodiment of the present application may obtain the virtual prop in any mode.
The first method is as follows: the acquisition mode is picked up.
In the first mode, the terminal may display a plurality of virtual items in the virtual scene, for example, shooting-type virtual items or throwing-type virtual items. When the user sees the virtual prop, the user can control the virtual character to pick up the virtual prop through the picking-up operation.
Specifically, the virtual prop may be displayed on the ground or on a virtual object in the virtual scene. When the distance between the currently controlled virtual character and the virtual prop is smaller than a target threshold, a pick-up option of the virtual prop is displayed in the virtual scene, and when a trigger operation on the pick-up option is detected, the terminal controls the virtual character to pick up the virtual prop. Optionally, after the pick-up is completed, the terminal may further display the virtual prop on a target portion of the virtual character in the virtual scene, to indicate that the virtual character is equipped with the virtual prop.
The target portion may be a hand, a shoulder, a back, a waist, etc., and the target portion is not limited in the embodiments of the present application. The target threshold may be set by a person skilled in the art according to requirements, and is not limited in this embodiment of the application.
The second method comprises the following steps: and (4) calling acquisition mode.
In the second mode, the terminal may display a calling control in the virtual scene. When the user wants to call the virtual prop, the user performs a trigger operation on the calling control; the terminal receives the trigger operation, generates a creation instruction, and creates the virtual prop in response to the creation instruction. The calling control is used to call the virtual prop into the virtual scene, and may take the form of a button displayed floating over the virtual scene.
In one possible implementation, the virtual props may be classified, so that when viewing the information of the virtual props, their types can serve as an index. Because the virtual prop in this application causes damage, it can be classified as combat equipment; virtual props belonging to the same combat-equipment type also include grenades, sticky mines, and the like, which are not listed here. Optionally, the virtual prop may also be classified as a type of blasting mine causing explosive damage.
For example, fig. 4 is a schematic diagram of an introduction interface of a virtual prop provided in this embodiment. As shown in fig. 4, the user may select a type (e.g., combat equipment) 401 of virtual props, and the terminal displays the virtual props 402 of that type. If the user wants to know specific information about some virtual prop 402, the user can select the displayed virtual prop 402, and the terminal displays information 403 about the selected virtual prop 402. The virtual prop 402 may be displayed on the left in fig. 4, where the user can observe its shape, and the information 403 of virtual prop 402 may be displayed on the right. For example, this virtual prop may be referred to as a fire-sticking agent or a fire-sticking bottle. The fire-sticking agent can stick to the surface of any object or any character. After equipping the fire-sticking agent on the combat-equipment interface, the player can use it in the game.
302. And the terminal responds to the touch operation of the throwing control of the virtual prop and controls the virtual prop to move in a virtual scene according to the target motion track.
After the virtual character is equipped with the virtual prop, the user can select the virtual prop and, while the virtual character is holding it, perform a throwing operation, thereby controlling the virtual character to throw the virtual prop in the virtual scene. Specifically, when the user performs a throwing operation and the terminal detects it, the terminal responds to the throwing operation on the virtual prop by acquiring the target motion trajectory of the virtual prop based on the view-angle direction of the current virtual scene, and then controls the virtual prop to move in the virtual scene according to the target motion trajectory.
For the throwing operation, the user can press and hold the throwing control, adjust the initial motion direction of the virtual prop through the view-angle adjustment operation, and, once the throw is decided, release the long press on the throwing control.
The throwing operation can trigger a throwing instruction that includes information about the throwing operation. Upon receiving the throwing instruction, the terminal responds to it by acquiring the target motion trajectory of the virtual prop according to the initial motion direction and the initial speed carried in the throwing instruction. The view-angle direction can serve as the initial motion direction of the virtual prop; the initial speed can be a preset value, or the terminal can determine the duration or pressing strength of the throwing operation and derive the initial speed accordingly.
Optionally, the initial motion direction may be embodied as a throwing angle: the angle between the view-angle direction of the current virtual scene and the horizontal direction in the virtual scene may be used as the throwing angle, and the target motion trajectory is obtained from the throwing angle and the force information of the virtual prop. For example, the force information may be a vertically downward gravity alone, or a vertically downward gravity together with an air resistance opposite to the motion direction of the virtual prop. The embodiment of the present application does not specifically limit the force information of the virtual prop. In one possible implementation, the target motion trajectory is a parabola.
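As a minimal sketch, assuming a vertically downward gravity as the only force, the parabolic trajectory described above can be computed from the throwing angle and initial speed. The function name, units, and gravity constant below are illustrative assumptions, not part of the embodiment:

```python
import math

GRAVITY = -9.8  # assumed downward acceleration in virtual-scene units


def trajectory_position(start, speed, angle_deg, t):
    """Point on the parabolic trajectory at time t.

    `start` is the (x, y) throw position, `speed` the initial speed,
    and `angle_deg` the angle between the view-angle direction and the
    horizontal direction, as described in the embodiment.
    """
    angle = math.radians(angle_deg)
    x = start[0] + speed * math.cos(angle) * t
    y = start[1] + speed * math.sin(angle) * t + 0.5 * GRAVITY * t * t
    return (x, y)
```

A horizontal throw at speed 10 travels 10 units forward and falls 4.9 units in the first second, matching uniform motion plus gravitational drop.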
In one possible implementation, when the user presses the throwing control, the terminal may determine and display a candidate motion trajectory according to the current view-angle direction, so that the user can judge from the candidate motion trajectory whether the current throwing direction matches expectations and, if not, adjust it.
In one specific example, the virtual prop may be a throwable fire carrier: after deforming, it emits flame, and the flame burns the area around it. The user can select the virtual prop, press and hold the fire button, and release it when the throwing angle has been adjusted to a suitable value; the terminal then throws the virtual prop at the current throwing angle, and the virtual prop flies through the air along the target motion trajectory corresponding to that angle.
The following describes the throwing process of the virtual prop with a specific example. As shown in fig. 5, the user may select a virtual prop 501 owned by the virtual character, and the terminal controls the virtual character to hold it. The terminal may display a throwing control 502 in the user graphical interface; there may be one throwing control 502, or two or more, which is not limited in this embodiment of the present application. When the user presses the throwing control 502 while the virtual character is holding the virtual prop, the terminal may display a candidate motion trajectory 503 according to the current view-angle direction. If the user wants to adjust it, the user can perform a view-angle adjustment operation, and the terminal adjusts the candidate motion trajectory accordingly; once satisfied, the user can stop the touch operation on the throwing control 502, thereby controlling the virtual character to throw the virtual prop. The candidate motion trajectory at the moment the touch operation stops is the target motion trajectory. Taking the virtual prop as a fire-sticking agent as an example, the process is as follows: by default the user wields the main firearm weapon; the user taps the throwing-weapon equipment bar to switch to the fire-sticking agent (i.e., virtual prop 501), then presses and holds the fire key (i.e., throwing control 502) to enter the pre-throw state, which displays a pre-throw line (i.e., candidate motion trajectory 503).
For the target motion trajectory (i.e., the pre-throw line), as shown in fig. 6, the acquisition process may obtain the magnitude and direction of the initial velocity (i.e., the initial speed and the initial motion direction), and optionally the acceleration. A parabolic trajectory can be calculated from these parameters; a waypoint position 601 is then taken at fixed intervals, and the waypoint position list is assigned to the special-effect line, forming a parabolic line 602, which is the target motion trajectory. It should be noted that fig. 6 shows the waypoint positions only for illustration; they are not displayed on the terminal interface.
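The waypoint-sampling step above can be sketched as follows. The fixed time step, point count, and gravity value are illustrative assumptions; the returned list corresponds to the waypoint position list that fig. 6 describes being assigned to the special-effect line:

```python
import math


def sample_waypoints(start, speed, angle_deg, gravity=-9.8, dt=0.1, steps=20):
    """Sample the parabolic trajectory at fixed time intervals.

    Returns a list of (x, y) waypoint positions, one per time step,
    which would be handed to the special-effect line to draw the
    pre-throw line.
    """
    angle = math.radians(angle_deg)
    vx = speed * math.cos(angle)
    vy = speed * math.sin(angle)
    points = []
    for i in range(steps + 1):
        t = i * dt
        points.append((start[0] + vx * t,
                       start[1] + vy * t + 0.5 * gravity * t * t))
    return points
```

Connecting consecutive waypoints with line segments approximates the parabola; a smaller `dt` gives a smoother pre-throw line at the cost of more points.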
It should be noted that when the virtual character controls another virtual prop, or controls no virtual prop, the control may be of another type: before the user selects the throwing-type virtual prop, it may be a trigger control of another virtual prop or of another action. For example, if the user has selected a firearm-type virtual prop, the control may be referred to as a fire control; if the user has not selected any virtual prop, the control may be a punch control.
In a possible implementation manner, the terminal may display the control according to the state of the currently controlled virtual item and the display style corresponding to the state. For example, when the control is a firing control, the display style of the control is as follows: the button center shows a bullet. If the control is a throwing control, the display style of the control is as follows: the throwing type virtual prop is displayed in the center of the button. If the control is a punch control, the display style of the control is as follows: the center of the button displays a fist.
In one possible implementation, in this step 302, the terminal may further display the target motion trajectory of the virtual prop, so that the user can observe the motion of the thrown virtual prop more intuitively and clearly.
It should be noted that step 301 and step 302 describe a process of controlling the virtual prop to move along the target motion trajectory in the virtual scene in response to a throwing instruction for the virtual prop, illustrated here by displaying a throwing control and controlling the throw according to a touch operation on that control. The throwing instruction may also be triggered in other ways, for example by a gesture operation or a gyroscope operation. Specifically, the terminal responds to a target gesture by controlling the virtual prop to move along the target motion trajectory in the virtual scene; or the terminal determines the gyroscope direction from the gyroscope's rotation data, determines the terminal's attitude from that direction, and performs the step of controlling the movement of the virtual prop in response to the terminal being in a target attitude.
303. And the terminal responds to the collision of the virtual prop with any virtual object on the target motion track in the virtual scene, and controls the virtual prop to be adhered to the surface of the virtual object.
After the terminal controls the virtual character to throw the virtual prop, the virtual prop moves along the target motion trajectory. If a virtual object exists on the target motion trajectory (it may be a virtual character or an inanimate virtual object), the virtual prop collides with it. Because the virtual prop is adhesive, unlike virtual props in the related art that bounce off objects they encounter, it adheres to the surface of the virtual object it collides with.
Therefore, the virtual prop cannot bounce, the motion track thrown by the user is the actual motion track of the virtual prop, and the deformation position of the virtual prop is a certain point position on the target motion track. Specifically, if the virtual object is touched, the deformation position is the collision position of the target motion track and the virtual object; if the virtual prop does not collide with other virtual objects until the virtual prop reaches the end position of the target motion trajectory, but collides with a virtual object (e.g., a virtual object such as the ground, a virtual building, etc.) at the end point, the deformation position is the position of the end point.
Therefore, the actual motion trajectory of the virtual prop is the motion trajectory the user expects, and when throwing, the user can know precisely where the virtual prop may deform. The user can thus control the virtual prop accurately, the precision of controlling the virtual prop is improved, and the control effect is good; the user also does not need to consider the possibility of a rebound changing the motion trajectory, which reduces the complexity and difficulty of operation.
In one possible implementation, when the virtual prop collides with the virtual object, the adhesion position may be the position where the collision occurs. Specifically, in response to the virtual prop colliding with any virtual object on the target motion trajectory in the virtual scene, the terminal controls the virtual prop to adhere to the surface of the virtual object at the collision position, according to the collision position between the virtual prop and the surface. The position of the virtual prop therefore does not change between collision and adhesion, and the two remain synchronized, which both conveys the adhesiveness of the virtual prop well and allows the adhesion position to be controlled more precisely; the control over the virtual prop is more accurate, the control effect is good, the displayed behavior is more realistic, and the display effect is better.
In one possible implementation, the collision position may be determined by the target motion trajectory and the surface of the virtual object: the collision position is the intersection point between them, and the terminal may control the virtual prop to adhere to the surface of the virtual object in response to the target motion trajectory having an intersection point with the surface of any virtual object in the virtual scene.
In a specific possible embodiment, the collision process may be implemented by a ray detection method, and in the process that the virtual prop moves according to the target motion trajectory, the terminal may start from the current position of the virtual prop, emit a ray in a tangential direction of the current position on the target motion trajectory, determine whether the ray passes through the surface of the virtual object, and if so, may use an intersection point of the ray and the virtual object as the adhesion position.
Optionally, the length of the ray is a target length, which may be set by relevant technicians as required. For example, the ray length may be a small value, so that the ray can determine whether the target motion trajectory collides with the virtual object; if the length is set too large, the ray may pass through the virtual object even though the target motion trajectory does not, which is not limited in the embodiment of the present application. For example, as shown in fig. 7, a virtual prop 701 moves along a target motion trajectory 702. Assuming the current position during the movement is A, the terminal may emit a ray AB starting at A; if the ray AB detects a virtual object, the virtual prop will collide with that virtual object in the next frame, the collision point (i.e., the collision position, subsequently used as the adhesion position) can be obtained, and the virtual prop adheres to the collision point and deforms.
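A minimal sketch of such a ray test, assuming the object's surface is locally approximated by a plane, is shown below. The function and its parameters are illustrative, not the embodiment's actual engine code; the ray origin is the prop's current position and the direction is the trajectory tangent, with `max_len` playing the role of the target length:

```python
def ray_plane_hit(origin, direction, plane_point, plane_normal, max_len):
    """Intersect a short ray with a surface plane.

    Returns the intersection point (used as the adhesion position),
    or None if the ray misses within its target length.  `direction`
    is assumed to be a unit vector, so `max_len` is a distance.
    """
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:      # ray runs parallel to the surface
        return None
    t = sum((p - o) * n
            for p, o, n in zip(plane_point, origin, plane_normal)) / denom
    if t < 0 or t > max_len:   # hit is behind the ray or beyond its length
        return None
    return tuple(o + d * t for o, d in zip(origin, direction))
```

With a short `max_len`, a miss simply means no collision this frame; the test is repeated from the new position on the next frame, as in fig. 7.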
304. And the terminal controls the virtual prop to deform at the adhesion position.
The virtual prop deforms after adhering to the surface of the virtual object. In one possible implementation, the virtual prop may deform at the moment of adhesion, or immediately afterwards, so as to simulate deformation triggered by the collision between the virtual prop and the virtual object. For example, taking the virtual prop as a fire-sticking agent, the deformation may take the form of an explosion: after colliding with the virtual object, the fire-sticking agent adheres to its surface and explodes.
In one possible implementation, the virtual prop deforms in a certain deformation direction, so as to realistically simulate the prop bursting or exploding on collision. Specifically, the terminal may determine the deformation direction of the virtual prop according to the surface of the virtual object and the adhesion position, and control the virtual prop to deform along that direction at the adhesion position.
The deformation direction can be set by relevant technicians as required. In one possible implementation, the deformation direction is the normal direction of the surface of the virtual object; the terminal then determines the deformation direction as follows: the terminal acquires the normal direction of the surface of the virtual object at the adhesion position and takes that normal direction as the deformation direction of the virtual prop.
In this implementation, surfaces of different virtual objects in the virtual scene are oriented differently, so the corresponding deformation directions differ. If the virtual object is the ground in the virtual scene, which is generally horizontal, the normal direction is vertical and so is the determined deformation direction. If the virtual object is a wall in the virtual scene, which is generally vertical, its normal direction is horizontal and so is the determined deformation direction. If the virtual object is some other object such as a building, the normal direction of its surface can likewise be determined as the deformation direction.
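Assuming the surface near the adhesion position is represented as a triangle (a common mesh representation, not something the embodiment specifies), the surface normal, and hence the deformation direction described above, could be computed as:

```python
import math


def surface_normal(a, b, c):
    """Unit normal of the surface triangle containing the adhesion
    position, via the cross product of two triangle edges.

    A horizontal ground triangle yields a vertical normal; a vertical
    wall triangle yields a horizontal normal, matching the text.
    """
    u = tuple(bi - ai for ai, bi in zip(a, b))
    v = tuple(ci - ai for ai, ci in zip(a, c))
    n = (u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    length = math.sqrt(sum(x * x for x in n))
    return tuple(x / length for x in n)
```

The vertex order determines which side the normal points toward; engines typically use a consistent winding so the normal faces outward from the surface.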
In a specific possible embodiment, when the virtual prop deforms along the deformation direction, a deformation region can be formed, that is, the terminal can determine the deformation region according to the deformation direction, or the terminal determines the deformation region of the virtual prop according to the surface of the virtual object and the adhesion position, and controls the virtual prop to deform in the deformation region.
It will be appreciated that the deformation region is located in the deformation direction. For example, as shown in fig. 8, if the virtual object is the ground 801 in the virtual scene, and the ground 801 is generally horizontal, the normal direction is vertical, and the determined deformation region 802 is located in the vertical direction of the ground 801. As shown in fig. 9, if the virtual object is a wall 901 in a virtual scene, and the wall 901 is generally vertical, the normal direction of the wall 901 is a horizontal direction, and the determined deformation region 902 is in the horizontal direction of the wall 901.
In one possible implementation, the virtual prop can damage the virtual characters in the surrounding area when deformed. For example, if the deformation is that the virtual prop explodes, the surrounding area can be affected during explosion. The terminal can determine the area affected by the deformation, and determine whether to execute the corresponding function according to whether the area has the virtual role.
Specifically, the terminal acquires a second area corresponding to the adhesion position according to the adhesion position, and controls the virtual life value of the virtual character to be reduced in response to any virtual character being located in the second area when the virtual prop deforms.
As for the second area, a virtual character within it is damaged when the virtual prop deforms. In one possible implementation, the second area may be an area centered on the adhesion position. Its shape may be a circle or a sphere, or of course a square or another shape; its size may be a first target size, which can be set by relevant technicians according to requirements. The size and shape of the second area are not limited in this application. In a specific possible embodiment, the terminal may take as the second area the area centered on the adhesion position with a radius equal to a first radius, which may likewise be set by relevant technicians as required.
As for the process of reducing the virtual life value, a virtual character affected by the deformation of the virtual prop loses a target virtual life value. Optionally, the target virtual life value may be set by relevant technicians according to requirements; that is, the terminal controls the virtual life value of the virtual character to decrease by a preset target virtual life value. For example, the deformation of the virtual prop may cause a loss of 40 points of virtual life value. Suppose the second area includes two virtual characters: one, originally with a virtual life value of 200, is affected by the deformation and its virtual life value becomes 160; the other, originally with a virtual life value of 40, has its virtual life value reduced to 0 and is eliminated.
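The fixed-damage case above can be sketched as follows. The damage value of 40 matches the example in the text, while the second-area radius, the spherical containment test, and the character data layout are illustrative assumptions:

```python
import math

DEFORM_DAMAGE = 40        # preset target virtual life value from the example
SECOND_AREA_RADIUS = 5.0  # assumed value of the first radius


def apply_deformation_damage(adhesion_pos, characters):
    """Reduce the virtual life value of every character inside the
    second area (a sphere of the first radius centered on the
    adhesion position); a character whose life value reaches 0 is
    marked eliminated."""
    for ch in characters:
        dist = math.dist(adhesion_pos, ch["position"])
        if dist <= SECOND_AREA_RADIUS:
            ch["life"] = max(0, ch["life"] - DEFORM_DAMAGE)
            ch["eliminated"] = ch["life"] == 0
    return characters
```

Running this on the two characters from the example (life 200 and life 40, both inside the area) reproduces the stated outcome: 160 for the first, and 0 with elimination for the second.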
Alternatively, the target virtual life value may be determined according to the distance between the virtual character and the position of the attachment (i.e., the center point where the deformation occurs). That is, the terminal may determine the target virtual life value of the virtual character according to the distance between the virtual character and the attachment position, and control the virtual life value of the virtual character to decrease the target virtual life value.
Alternatively, the target virtual life value may be inversely related to the distance. That is, the larger the distance, the smaller the target virtual life value. The smaller the distance, the larger the target virtual life value.
For example, suppose the second area includes two virtual characters, each with a virtual life value of 200. One is closer to the adhesion position and its loss of virtual life value is determined to be 50; the other is farther away and its loss is determined to be 30. The virtual life values of the two virtual characters then become 150 and 170, respectively.
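One simple way to make the loss inversely related to distance is linear interpolation between a maximum loss at the adhesion position and a minimum loss at the edge of the area. The constants below are illustrative assumptions, chosen only so that two sample distances yield the losses of 50 and 30 from the example above:

```python
MAX_DAMAGE = 60   # assumed loss at zero distance
MIN_DAMAGE = 20   # assumed loss at the edge of the area
RADIUS = 6.0      # assumed area radius


def falloff_damage(distance):
    """Target virtual life value lost, inversely related to distance:
    the larger the distance from the adhesion position, the smaller
    the loss (linear interpolation is one simple choice)."""
    distance = min(max(distance, 0.0), RADIUS)
    frac = distance / RADIUS
    return round(MAX_DAMAGE - (MAX_DAMAGE - MIN_DAMAGE) * frac)
```

A character at distance 1.5 loses 50 points and a character at distance 4.5 loses 30, so two characters starting at 200 end at 150 and 170, matching the example.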
In one possible implementation, the target virtual life value may be less than the maximum virtual life value of the virtual character. For example, the target virtual life value may be lower than the damage caused by a grenade, so that if a virtual character's virtual life value is at its maximum when the virtual prop adheres and deforms, the virtual life value is not reduced to zero; that is, the virtual character is not immediately eliminated.
For example, as shown in fig. 10, the virtual prop collides with a wall 1001 in the virtual scene, does not bounce, but adheres to the wall 1001 and explodes immediately, causing a certain amount of damage to players within the explosion range (i.e., the second area); the terminal displays an explosion effect 1002. It should be noted that the explosion damage of the virtual prop (for example, the fire-sticking agent) is lower than that of a grenade, so a player with full health (that is, with the virtual life value at its maximum) is not eliminated by the adhesion explosion.
305. And the terminal acquires a first area corresponding to the adhesion position according to the adhesion position.
After the virtual prop deforms, the virtual prop can burn in a range, and the terminal can display a first target special effect in a first area to reflect the burning effect in the range.
As for the first area, it may be an area centered on the adhesion position. Its shape may be a circle or a sphere, or of course a square or another shape; its size may be a second target size, which can be set by persons skilled in the art according to requirements. The size and shape of the first area are not limited in this application.
In one possible implementation manner, the terminal acquires an area with the adhesion position as the center and the radius as the target radius as the first area corresponding to the adhesion position. The target radius may be set by a person skilled in the art according to requirements, and is not limited in the embodiment of the present application.
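The first area described above (all points within a target radius of the adhesion position) can be sketched as a simple containment check. The function name and the default radius are assumed placeholders:

```python
import math

def in_first_area(adhesion_pos, point, target_radius=3.0):
    """Sketch: the first area is the set of points within `target_radius`
    of the adhesion position (a circle in 2D, a sphere in 3D); the radius
    is an assumed value set as required."""
    return math.dist(adhesion_pos, point) <= target_radius
```

Because the check works in any dimension `math.dist` supports, the same sketch covers both the circular and spherical shapes the description allows.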
306. And the terminal displays the first target special effect in the first area.
After the terminal determines the first area, it can display a first target special effect in the first area. The first target special effect visually indicates the type of the virtual prop and the effect produced by its deformation, giving a better display effect. Through the first target special effect, the user can clearly see that the virtual prop burns within a surrounding area after deforming and may thus affect virtual characters in that area. For example, the burning can cause sustained damage to a virtual character. Alternatively, the first target special effect may serve only to indicate that the virtual prop has deformed, with the burning producing no other effect on virtual characters. The following description takes as an example the case in which the virtual prop continuously damages virtual characters within a certain area after deforming; see step 307 and step 308 for details.
In one possible implementation, the first target effect may be a flame effect, and may also be other types of effects, such as a continuous flash effect, a continuous explosion effect, a smoke or toxic effect, and the like. The embodiments of the present application do not limit this.
It should be noted that step 305 and step 306 together constitute the process of displaying the first target special effect in the first area corresponding to the adhesion position. In this process, after determining the first area according to the adhesion position, the terminal can display the first target special effect in that area. Of course, the terminal may also skip the step of acquiring the first area and directly display the first target special effect centered on the adhesion position. The embodiment of the present application does not specifically limit which manner is used.
In a possible implementation manner, the area where the virtual prop burns to damage the virtual character may be the first area, and the virtual character located in the first area is continuously damaged. The terminal may control the virtual life value of any virtual character to be continuously decreased in response to the virtual character being located in the first area.
In another possible implementation manner, the area in which the virtual prop burns to damage the virtual character is a third area different from the first area, and the terminal determines the third area for continuous damage determination. See steps 307 and 308 below.
307. And the terminal acquires a third area corresponding to the adhesion position according to the adhesion position.
The third area is an injury area after the virtual prop is deformed, and the virtual character in the third area is continuously injured.
Alternatively, the range of the third region may be the same as the range of the first region, that is, the first region and the third region are the same.
Optionally, the first region and the third region may also be different, for example, in a specific possible embodiment, the third region includes the first region, and the third region is larger than the first region. For another example, in another specific possible embodiment, the third area is located within the first area, and the third area is smaller than the first area.
The size and shape of the third region are not limited in the embodiments of the present application, similarly to the first region and the second region. In a specific possible embodiment, the terminal may acquire an area with a radius of the second radius as a third area corresponding to the adhering position, with the adhering position as a center. The second radius may be set by a person skilled in the art according to requirements, for example, the second radius may be the same as, greater than, or less than the target radius of the first area, which is not limited in the embodiments of the present application.
308. And the terminal responds to any virtual character in the third area and controls the virtual life value of the virtual character to continuously reduce.
After the terminal determines the third area, if there is a virtual character in the third area, the virtual character will be continuously damaged, and the terminal may control the virtual life value of the virtual character to be continuously reduced.
In one possible implementation manner, the terminal may control the virtual life value of the virtual character to decrease the target life value every preset time. For example, the terminal may control the virtual life value of the virtual character to be decreased by 20 virtual life values every second.
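The periodic reduction described in step 308 (e.g., 20 virtual life per second for characters in the third area) can be sketched as a per-tick update. The names and the membership callback are illustrative assumptions:

```python
def burn_tick(characters, in_third_area, damage_per_tick=20):
    """Sketch of step 308: once per preset interval (e.g. every second),
    every virtual character inside the third area loses `damage_per_tick`
    virtual life. `in_third_area` is an assumed predicate on positions."""
    for ch in characters:
        if in_third_area(ch["pos"]):
            ch["hp"] -= damage_per_tick
    return characters
```

Calling `burn_tick` once per second from the game loop yields the continuous decrease described above; characters outside the third area are untouched.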
In one possible implementation, when the virtual character located in the third area is continuously injured, the user controlling the virtual character can be prompted to be injured currently through the second target special effect. Specifically, the terminal may control the virtual life value of the currently controlled virtual character to continuously decrease in response to that the currently controlled virtual character is located in the third area, and display a second target special effect in the user graphical interface, where the second target special effect is used to prompt that the virtual life value of the currently controlled virtual character is decreasing. For example, the second target effect presents a light red color to the user graphical interface, and of course, the second target effect may also be another form of effect, which is not limited in this embodiment of the application.
For example, as shown in (a) in fig. 11, after the virtual item is deformed, a flame effect 1101 can be displayed around the virtual item, and if the currently controlled virtual character is in an injury area 1102 (i.e., a third area) of the virtual item, a graphical user interface displayed by the terminal appears reddish, and a virtual life value 1103 of the virtual character is reduced. As shown in fig. 11 (b), the damaged region 1102 is a circular region having a radius R with the adhering position as the center, and R is a positive number. In an implementation where the first region and the second region are also circular regions, the first region and the second region are obtained in the same manner as the injury region 1102, except that R may be the same or different.
The virtual objects may be static virtual objects or dynamic virtual objects. For example, some virtual objects (e.g., virtual buildings, etc.) cannot move, i.e., are static virtual objects. For another example, a virtual character or a virtual vehicle can move in a virtual scene, and if the virtual character or the virtual vehicle is currently moving, the virtual character or the virtual vehicle is a dynamic virtual object; if not moving, then it is a static virtual object.
In a possible implementation manner, considering that the virtual object may be a dynamic virtual object: the virtual prop adheres to the surface of the virtual object, so if the virtual object moves, it carries the virtual prop with it, and the injury area of the virtual prop moves along accordingly. Specifically, in response to the virtual object moving in the virtual scene, the terminal controls the third area to move with the virtual object. Thus a virtual character entering the third area is continuously injured even though the third area itself may move. This follow-the-object behavior of the injury area embodies the adhesion of the virtual prop well; the display better matches the real situation, and the display effect is better.
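One way to realize the follow-the-object behavior just described is to anchor the zone's center to the host object rather than storing a fixed point. This is a minimal sketch under that assumption; the class and field names are hypothetical:

```python
import math

class BurnZone:
    """Sketch: the third area is anchored to the virtual object the prop
    adhered to, so moving the object moves the zone automatically."""

    def __init__(self, host_object, radius):
        self.host = host_object  # object whose surface the prop stuck to
        self.radius = radius

    @property
    def center(self):
        # the zone's center always tracks the host object's current position
        return self.host["pos"]

    def contains(self, point):
        return math.dist(self.center, point) <= self.radius
```

Because `center` is derived from the host on every query, no explicit "move the zone" step is needed when the virtual object moves.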
In a possible implementation manner, the display duration of the first target special effect is a target duration, and after the target duration elapses, the virtual prop no longer damages virtual characters located in the third area. The burning state of the virtual prop is not maintained indefinitely: a target duration can be set, within which the virtual prop takes effect and after which it has no effect. The target duration may be set by a skilled person as required; for example, it may be 10 seconds. The embodiments of the present application do not limit this.
As shown in fig. 12, the terminal may perform step 1201 of controlling a virtual object to equip the sticky incendiary (i.e., the burning agent, namely the virtual prop), and step 1202 of determining whether the firing key (i.e., the throwing control) is pressed. If so, the terminal enters a pre-throw state 1203; the pre-throw state is the state in which the throwing control is pressed but not yet released, during which the candidate motion trajectory is being adjusted and the prop has not been thrown. If not, the terminal continues to detect. After entering the pre-throw state 1203, the terminal may perform step 1204 of determining whether the control is released; if not, the pre-throw state is maintained. If so, the terminal can perform step 1205 of throwing the sticky incendiary projectile. The terminal may continue with step 1206 of determining whether an object is encountered; if the projectile does not encounter an object, it remains in flight, and if it does, the terminal performs step 1207, in which the projectile sticks to the surface of the object and explodes (i.e., deforms), creating a burning zone (i.e., a third area). The terminal may continue with step 1208 of determining whether the object is moving; if not, the virtual prop keeps burning in this burning zone, whereas if the object moves, the burning zone follows the movement (step 1209). The terminal may perform step 1210 of determining whether a target (i.e., a virtual character) enters the burning zone; if not, the detection continues. If so, the terminal may perform step 1211 of deducting damage from the target (i.e., decreasing its virtual life value). The terminal may continue with step 1212 of determining whether the burning zone has ended; if so, the terminal performs step 1213, in which the flame effect (i.e., the first target special effect) disappears. If not, the terminal may continue to detect.
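The flow of fig. 12 can be summarized as a small state machine. This is a simplified, illustrative sketch: the state names and event strings are assumptions, and the detection loops of the figure are collapsed into the rule that unrecognized events leave the state unchanged:

```python
def sticky_fire_lifecycle(events):
    """Sketch of the Fig. 12 flow: equipped -> pre-throw -> thrown ->
    burning -> expired. Returns the sequence of states visited."""
    transitions = {
        ("equipped", "press_fire"): "pre_throw",    # step 1202
        ("pre_throw", "release"): "thrown",         # steps 1204-1205
        ("thrown", "hit_object"): "burning",        # steps 1206-1207
        ("burning", "duration_over"): "expired",    # steps 1212-1213
    }
    state = "equipped"
    visited = [state]
    for ev in events:
        # events with no transition (e.g. still detecting) keep the state
        state = transitions.get((state, ev), state)
        visited.append(state)
    return visited
```

For instance, the full press/release/hit/expire sequence walks through all five states, while an out-of-order event (releasing before pressing) leaves the prop equipped, mirroring the figure's "continue to detect" branches.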
In the embodiment of the application, on the one hand, a novel throwing-type virtual prop is introduced. Unlike virtual props in the related art, which bounce after colliding with a virtual object when thrown, this novel throwing-type virtual prop has adhesion capability: when thrown, it does not bounce off any virtual object it hits but adheres to the object's surface. The situation in which a bounce causes the prop to deviate from the user's expected motion trajectory therefore does not occur; the deformation position of the virtual prop matches the user's expectation, the user can control the virtual prop accurately, the precision of controlling the virtual prop is improved, and the control effect on the virtual prop is good. On the other hand, the virtual prop deforms after adhering and displays a first target special effect; the deformation and the first target special effect visually indicate the type of the virtual prop and the effect produced after adhesion, giving a better display effect.
All the above optional technical solutions can be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
Fig. 13 is a schematic structural diagram of a virtual item control device provided in an embodiment of the present application, and referring to fig. 13, the device includes:
the control module 1301 is configured to, in response to a throwing instruction for the virtual prop, control the virtual prop to move in a virtual scene according to a target motion trajectory;
the control module 1301 is further configured to control the virtual prop to adhere to a surface of the virtual object and deform at an adhesion position in response to the virtual prop colliding with any virtual object on the target motion trajectory in the virtual scene;
a display module 1302, configured to display a first target special effect in a first area corresponding to the adhesion position.
In one possible implementation manner, the control module 1301 is configured to, in response to a collision of the virtual prop with any virtual object on the target motion trajectory in the virtual scene, control the virtual prop to adhere to the surface of the virtual object at a collision position according to the collision position of the virtual prop with the surface of the virtual object.
In one possible implementation, the control module 1301 is configured to, in response to that there is an intersection between the target motion trajectory and a surface of any virtual object in the virtual scene, control the virtual prop to be attached to the surface of the virtual object at the intersection.
In one possible implementation, the control module 1301 includes a determination unit and a control unit;
the determining unit is used for determining the deformation direction of the virtual prop according to the surface of the virtual object and the adhesion position;
the control unit is used for controlling the virtual prop to deform along the deformation direction on the adhesion position.
In a possible implementation manner, the determining unit is configured to obtain a normal direction at the adhesion position on the surface of the virtual object, and determine the normal direction as the deformation direction of the virtual prop.
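The determining unit's behavior (taking the surface normal at the adhesion position as the deformation direction) can be sketched as a normalization of the given normal vector. The function name is hypothetical, and the normal is assumed to be supplied by the scene's collision query:

```python
import math

def deformation_direction(surface_normal):
    """Sketch: the deformation direction is the unit normal of the virtual
    object's surface at the adhesion position."""
    length = math.sqrt(sum(c * c for c in surface_normal))
    # assumes a non-degenerate normal; a real engine would guard length == 0
    return tuple(c / length for c in surface_normal)
```

Deforming along this unit vector makes the prop flatten against the surface regardless of the surface's orientation.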
In one possible implementation, the display module 1302 includes an acquisition unit and a display unit;
the acquisition unit is used for acquiring a first area corresponding to the adhesion position according to the adhesion position;
the display unit is used for displaying a first target special effect in the first area.
In one possible implementation manner, the acquiring unit is configured to acquire, as the first area corresponding to the adhesion position, an area with the adhesion position as a center and a radius as a target radius.
In one possible implementation, the apparatus further includes:
the first acquisition module is used for acquiring a second area corresponding to the adhesion position according to the adhesion position;
the control module 1301 is further configured to control a virtual life value of the virtual character to decrease in response to that any virtual character is located in the second area when the virtual prop is deformed.
In one possible implementation, the apparatus further includes:
the second acquisition module is used for acquiring a third area corresponding to the adhesion position according to the adhesion position;
the control module 1301 is further configured to control the virtual life value of any virtual character to continuously decrease in response to the virtual character being located in the third area.
In a possible implementation, the first region and the third region are the same, or the third region includes the first region and the third region is larger than the first region; or, the third area is located in the first area, and the third area is smaller than the first area.
In one possible implementation manner, the control module 1301 is further configured to control the virtual life value of the currently controlled virtual character to continuously decrease in response to the currently controlled virtual character being located in the third area;
the display module 1302 is further configured to display a second target special effect in the user graphical interface, where the second target special effect is used to prompt that the virtual life value of the currently controlled virtual character is decreasing.
In one possible implementation, the control module 1301 is further configured to perform controlling the third area to move as the virtual object moves in response to the virtual object moving in the virtual scene.
In a possible implementation manner, the display duration of the first target special effect is a target duration, and after the target duration elapses, the virtual prop no longer damages virtual characters located in the third area.
The device provided by the embodiment of the application, on the one hand, introduces a novel throwing-type virtual prop. Unlike virtual props in the related art, which bounce after colliding with a virtual object when thrown, this novel throwing-type virtual prop has adhesion capability: when thrown, it does not bounce off any virtual object it hits but adheres to the object's surface. The situation in which a bounce causes the prop to deviate from the user's expected motion trajectory therefore does not occur; the deformation position of the virtual prop matches the user's expectation, the user can control the virtual prop accurately, the precision of controlling the virtual prop is improved, and the control effect on the virtual prop is good. On the other hand, the virtual prop deforms after adhering and displays a first target special effect; the deformation and the first target special effect visually indicate the type of the virtual prop and the effect produced after adhesion, giving a better display effect.
It should be noted that: when the virtual item control device provided in the above embodiment controls a virtual item, the division of each function module is merely used for illustration, and in practical applications, the function allocation can be completed by different function modules as needed, that is, the internal structure of the virtual item control device is divided into different function modules to complete all or part of the above-described functions. In addition, the virtual item control device and the virtual item control method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments and are not described herein again.
The electronic device in the above method embodiment can be implemented as a terminal. For example, fig. 14 is a block diagram of a terminal according to an embodiment of the present application. The terminal 1400 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. Terminal 1400 can also be referred to as user equipment, a portable terminal, a laptop terminal, a desktop terminal, or by other names.
In general, terminal 1400 includes: a processor 1401, and a memory 1402.
Processor 1401 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 1401 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). Processor 1401 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1401 may be integrated with a GPU (Graphics Processing Unit) that is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 1401 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1402 may include one or more computer-readable storage media, which may be non-transitory. Memory 1402 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1402 is used to store at least one instruction for execution by processor 1401 to implement the virtual prop control method provided by method embodiments herein.
In some embodiments, terminal 1400 may further optionally include: a peripheral device interface 1403 and at least one peripheral device. The processor 1401, the memory 1402, and the peripheral device interface 1403 may be connected by buses or signal lines. Each peripheral device may be connected to the peripheral device interface 1403 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1404, a display 1405, a camera assembly 1406, audio circuitry 1407, a positioning assembly 1408, and a power supply 1409.
The peripheral device interface 1403 can be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1401 and the memory 1402. In some embodiments, the processor 1401, memory 1402, and peripheral interface 1403 are integrated on the same chip or circuit board; in some other embodiments, any one or both of the processor 1401, the memory 1402, and the peripheral device interface 1403 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1404 is used for receiving and transmitting RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1404 communicates with communication networks and other communication devices via electromagnetic signals: it converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1404 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1404 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1404 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1405 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1405 is a touch display screen, the display screen 1405 also has the ability to capture touch signals at or above the surface of the display screen 1405. The touch signal may be input to the processor 1401 for processing as a control signal. At this point, the display 1405 may also be used to provide virtual buttons and/or virtual keyboards, also referred to as soft buttons and/or soft keyboards. In some embodiments, the display 1405 may be one, disposed on the front panel of the terminal 1400; in other embodiments, display 1405 may be at least two, respectively disposed on different surfaces of terminal 1400 or in a folded design; in other embodiments, display 1405 may be a flexible display disposed on a curved surface or on a folded surface of terminal 1400. Even further, the display 1405 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display 1405 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-emitting diode), and the like.
The camera assembly 1406 is used to capture images or video. Optionally, camera assembly 1406 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1406 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1407 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1401 for processing or inputting the electric signals to the radio frequency circuit 1404 to realize voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of terminal 1400. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is then used to convert electrical signals from the processor 1401 or the radio frequency circuit 1404 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1407 may also include a headphone jack.
The positioning component 1408 is used to locate the current geographic position of the terminal 1400 for navigation or LBS (Location Based Service). The positioning component 1408 may be a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, or the Galileo system of the European Union.
Power supply 1409 is used to power the various components of terminal 1400. The power source 1409 may be alternating current, direct current, disposable or rechargeable. When the power source 1409 comprises a rechargeable battery, the rechargeable battery can be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1400 also includes one or more sensors 1410. The one or more sensors 1410 include, but are not limited to: acceleration sensor 1411, gyroscope sensor 1412, pressure sensor 1413, fingerprint sensor 1414, optical sensor 1415, and proximity sensor 1416.
The acceleration sensor 1411 may detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the terminal 1400. For example, the acceleration sensor 1411 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1401 can control the display 1405 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1411. The acceleration sensor 1411 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 1412 may detect a body direction and a rotation angle of the terminal 1400, and the gyro sensor 1412 and the acceleration sensor 1411 may cooperate to collect a 3D motion of the user on the terminal 1400. The processor 1401 can realize the following functions according to the data collected by the gyro sensor 1412: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 1413 may be disposed on the side frames of terminal 1400 and/or underlying display 1405. When the pressure sensor 1413 is disposed on the side frame of the terminal 1400, the user's holding signal of the terminal 1400 can be detected, and the processor 1401 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1413. When the pressure sensor 1413 is disposed at the lower layer of the display screen 1405, the processor 1401 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 1405. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1414 is used for collecting a fingerprint of a user, and the processor 1401 identifies the user according to the fingerprint collected by the fingerprint sensor 1414, or the fingerprint sensor 1414 identifies the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, processor 1401 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying for, and changing settings, etc. Fingerprint sensor 1414 may be disposed on the front, back, or sides of terminal 1400. When a physical button or vendor Logo is provided on terminal 1400, fingerprint sensor 1414 may be integrated with the physical button or vendor Logo.
The optical sensor 1415 is used to collect ambient light intensity. In one embodiment, processor 1401 may control the display brightness of display 1405 based on the ambient light intensity collected by optical sensor 1415. Specifically, when the ambient light intensity is high, the display luminance of the display screen 1405 is increased; when the ambient light intensity is low, the display brightness of the display screen 1405 is reduced. In another embodiment, the processor 1401 can also dynamically adjust the shooting parameters of the camera assembly 1406 according to the intensity of the ambient light collected by the optical sensor 1415.
The proximity sensor 1416, also known as a distance sensor, is typically disposed on the front panel of terminal 1400. The proximity sensor 1416 is used to collect the distance between the user and the front surface of the terminal 1400. In one embodiment, when the proximity sensor 1416 detects that the distance between the user and the front face of terminal 1400 gradually decreases, the processor 1401 controls the display 1405 to switch from the bright-screen state to the dark-screen state; when the proximity sensor 1416 detects that the distance between the user and the front face of terminal 1400 gradually increases, the processor 1401 controls the display 1405 to switch from the dark-screen state to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 14 is not intended to be limiting with respect to terminal 1400 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
The electronic device in the above method embodiments can also be implemented as a server. For example, fig. 15 is a schematic structural diagram of a server provided in this embodiment of the present application. The server 1500 may vary considerably in configuration and performance, and can include one or more processors (CPUs) 1501 and one or more memories 1502, where at least one program code is stored in the memory 1502 and is loaded and executed by the processor 1501 to implement the virtual prop control method provided in each of the above method embodiments. Of course, the server can also have components such as a wired or wireless network interface and an input/output interface to facilitate input and output, and the server can further include other components for implementing device functions, which are not described herein again.
In an exemplary embodiment, a computer-readable storage medium, such as a memory including at least one program code, is also provided; the at least one program code is executable by a processor to perform the virtual prop control method in the above embodiments. For example, the computer-readable storage medium can be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product or a computer program is also provided that includes one or more program codes stored in a computer-readable storage medium. One or more processors of the electronic device can read the one or more program codes from the computer-readable storage medium and execute them, so that the electronic device performs the virtual prop control method.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and the sequence numbers should not constitute any limitation on the implementation of the embodiments of the present application.
It should be understood that determining B from A does not mean determining B from A alone; B can also be determined from A and/or other information.
Those skilled in the art will appreciate that all or part of the steps for implementing the above embodiments can be completed by hardware, or by a program instructing the relevant hardware; the program can be stored in a computer-readable storage medium, and the above-mentioned storage medium can be a read-only memory, a magnetic disk, an optical disk, or the like.
The above description covers only optional embodiments of the present application and is not intended to limit the present application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (15)

1. A virtual item control method is characterized by comprising the following steps:
responding to a throwing instruction of the virtual prop, and controlling the virtual prop to move in a virtual scene according to a target motion track;
responding to collision between the virtual prop and any virtual object on the target motion track in the virtual scene, controlling the virtual prop to be adhered to the surface of the virtual object, and deforming at an adhered position;
and displaying a first target special effect in a first area corresponding to the adhesion position.
2. The method of claim 1, wherein the controlling the virtual prop to adhere to the surface of the virtual object in response to the virtual prop colliding with any virtual object on the target motion trajectory in the virtual scene comprises:
responding to the collision of the virtual prop with any virtual object on the target motion track in the virtual scene, and controlling, according to a collision position of the virtual prop on the surface of the virtual object, the virtual prop to adhere to the surface of the virtual object at the collision position.
3. The method of claim 1, wherein said deforming at the adhesion site comprises:
determining the deformation direction of the virtual prop according to the surface of the virtual object and the adhesion position;
and controlling the virtual prop to deform along the deformation direction on the adhesion position.
4. The method of claim 3, wherein determining the direction of deformation of the virtual prop from the surface of the virtual object and the position of the adhesion comprises:
and acquiring the normal direction of the adhesion position on the surface of the virtual object, and determining the normal direction as the deformation direction of the virtual prop.
5. The method of claim 1, wherein displaying a first target effect in a first area corresponding to the attachment location comprises:
acquiring a first area corresponding to the adhesion position according to the adhesion position;
and displaying a first target special effect in the first area.
6. The method of claim 5, wherein obtaining the first area corresponding to the adhesion position according to the adhesion position comprises:
and acquiring an area with the adhering position as the center and the radius as the target radius as a first area corresponding to the adhering position.
7. The method of claim 1, further comprising:
acquiring a second area corresponding to the adhesion position according to the adhesion position;
and controlling the virtual life value of the virtual character to be reduced in response to any virtual character being located in the second area when the virtual prop is deformed.
8. The method of claim 1, wherein after the deformation at the adhesion site, the method further comprises:
acquiring a third area corresponding to the adhesion position according to the adhesion position;
and controlling the virtual life value of the virtual character to continuously reduce in response to any virtual character being located in the third area.
9. The method of claim 8, wherein the first area and the third area are the same; or the third area comprises the first area and the third area is larger than the first area; or the third area is located within the first area and the third area is smaller than the first area.
10. The method of claim 8, further comprising:
controlling the virtual life value of the currently controlled virtual character to continuously decrease in response to the currently controlled virtual character being located within the third area;
and displaying a second target special effect in the user graphical interface, wherein the second target special effect is used for prompting that the virtual life value of the currently controlled virtual character is reduced.
11. The method of claim 8, further comprising:
in response to the virtual object moving in the virtual scene, controlling the third area to move as the virtual object moves.
12. The method according to claim 1 or 9, wherein a display duration of the first target special effect is a target duration, and the function of damaging virtual characters in the third area disappears after the target duration elapses.
13. A virtual prop control apparatus, the apparatus comprising:
the control module is used for responding to a throwing instruction of the virtual prop and controlling the virtual prop to move in a virtual scene according to a target motion track;
the control module is further used for responding to the collision between the virtual prop and any virtual object on the target motion track in the virtual scene, controlling the virtual prop to be adhered to the surface of the virtual object and deform at the adhered position;
and the display module is used for displaying the first target special effect in the first area corresponding to the adhesion position.
14. An electronic device, comprising one or more processors and one or more memories having stored therein at least one program code, the at least one program code being loaded and executed by the one or more processors to implement the virtual item control method of any one of claims 1 to 12.
15. A computer-readable storage medium, wherein at least one program code is stored therein, which is loaded and executed by a processor to implement the virtual prop control method of any one of claims 1 to 12.
CN202010808039.1A 2020-08-12 2020-08-12 Virtual item control method, device, equipment and storage medium Pending CN111760284A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010808039.1A CN111760284A (en) 2020-08-12 2020-08-12 Virtual item control method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN111760284A 2020-10-13

Family

ID=72729581

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010808039.1A Pending CN111760284A (en) 2020-08-12 2020-08-12 Virtual item control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111760284A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110538459A (en) * 2019-09-05 2019-12-06 腾讯科技(深圳)有限公司 Method, apparatus, device and medium for throwing virtual explosives in virtual environment
CN110585712A (en) * 2019-09-20 2019-12-20 腾讯科技(深圳)有限公司 Method, device, terminal and medium for throwing virtual explosives in virtual environment
CN110721468A (en) * 2019-09-30 2020-01-24 腾讯科技(深圳)有限公司 Interactive property control method, device, terminal and storage medium
CN111282275A (en) * 2020-03-06 2020-06-16 腾讯科技(深圳)有限公司 Method, device, equipment and storage medium for displaying collision traces in virtual scene

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SG: "'Call of Duty Online' tactical equipment tips: sticky grenade", pages 1, Retrieved from the Internet <URL:http://codol.17173.com/news/02272015/103428826.shtml> *
堡垒王炮儿: "Fortnite: an introduction to the love-hate sticky grenade!", Retrieved from the Internet <URL:https://www.bilibili.com/video/BV1RW411m7gm/?spm_id_from=333.337.search-card.all.click&vd_source=fc01b8139073eb2c2757c1c0340924c5> *
第一游: "Overview of the sticky grenade's effects in Call of Duty Mobile", pages 1, Retrieved from the Internet <URL:https://www.diyiyou.com/smzhsy/gl/269729.html> *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112138384A (en) * 2020-10-23 2020-12-29 腾讯科技(深圳)有限公司 Using method, device, terminal and storage medium of virtual throwing prop
CN112138384B (en) * 2020-10-23 2022-06-07 腾讯科技(深圳)有限公司 Using method, device, terminal and storage medium of virtual throwing prop
CN112221141A (en) * 2020-11-05 2021-01-15 腾讯科技(深圳)有限公司 Method and device for controlling virtual object to use virtual prop
CN112221141B (en) * 2020-11-05 2022-08-23 腾讯科技(深圳)有限公司 Method and device for controlling virtual object to use virtual prop
CN113559518A (en) * 2021-07-30 2021-10-29 网易(杭州)网络有限公司 Interaction detection method and device of virtual model, electronic equipment and storage medium
CN113713377A (en) * 2021-09-01 2021-11-30 网易(杭州)网络有限公司 Projection game control method, projection game control device, electronic device, and storage medium
WO2023179292A1 (en) * 2022-03-21 2023-09-28 北京字跳网络技术有限公司 Virtual prop driving method and apparatus, electronic device and readable storage medium

Similar Documents

Publication Publication Date Title
CN111282275B (en) Method, device, equipment and storage medium for displaying collision traces in virtual scene
CN110448891B (en) Method, device and storage medium for controlling virtual object to operate remote virtual prop
CN110721468B (en) Interactive property control method, device, terminal and storage medium
CN110917619B (en) Interactive property control method, device, terminal and storage medium
CN111589150B (en) Control method and device of virtual prop, electronic equipment and storage medium
CN111475573B (en) Data synchronization method and device, electronic equipment and storage medium
CN110538459A (en) Method, apparatus, device and medium for throwing virtual explosives in virtual environment
CN112221141B (en) Method and device for controlling virtual object to use virtual prop
CN111760284A (en) Virtual item control method, device, equipment and storage medium
CN111228809A (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN111589149B (en) Using method, device, equipment and storage medium of virtual prop
CN112057857B (en) Interactive property processing method, device, terminal and storage medium
CN112870715B (en) Virtual item putting method, device, terminal and storage medium
CN111389005B (en) Virtual object control method, device, equipment and storage medium
CN112138384A (en) Using method, device, terminal and storage medium of virtual throwing prop
CN111265857A (en) Trajectory control method, device, equipment and storage medium in virtual scene
CN112933601A (en) Virtual throwing object operation method, device, equipment and medium
CN111298441A (en) Using method, device, equipment and storage medium of virtual prop
CN111475029A (en) Operation method, device, equipment and storage medium of virtual prop
CN112402964B (en) Using method, device, equipment and storage medium of virtual prop
CN112717410B (en) Virtual object control method and device, computer equipment and storage medium
CN112704875B (en) Virtual item control method, device, equipment and storage medium
CN111659122B (en) Virtual resource display method and device, electronic equipment and storage medium
CN111589137B (en) Control method, device, equipment and medium of virtual role
CN111111181A (en) Method, device and equipment for setting props in virtual environment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40031417

Country of ref document: HK