CN112755518A - Interactive property control method and device, computer equipment and storage medium

Interactive property control method and device, computer equipment and storage medium

Info

Publication number
CN112755518A
CN112755518A (application number CN202110164952.7A)
Authority
CN
China
Prior art keywords: interactive prop, prop, interactive, virtual, triggering
Prior art date
Legal status
Granted
Application number
CN202110164952.7A
Other languages
Chinese (zh)
Other versions
CN112755518B (en)
Inventor
刘智洪 (Liu Zhihong)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202110164952.7A
Publication of CN112755518A
Application granted
Publication of CN112755518B
Legal status: Active
Anticipated expiration


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/822: Strategy games; Role-playing games
    • A63F13/837: Shooting of targets
    • A63F13/847: Cooperative playing, e.g. requiring coordinated actions from several players to achieve a common goal

Abstract

The application discloses an interactive prop control method and apparatus, a computer device, and a storage medium, belonging to the field of computer technologies. By setting a trigger area for a first interactive prop, the first interactive prop can be triggered, after it has been thrown, by attacking its trigger area, so that the user can flexibly control the triggering timing of the first interactive prop. This reduces the difficulty of applying the first interactive prop, improves human-computer interaction efficiency when the user interacts using the first interactive prop, and also makes the first interactive prop more interesting.

Description

Interactive property control method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for controlling an interactive property, a computer device, and a storage medium.
Background
With the development of computer technology, an increasing number of online games have emerged, among which shooting games have gradually become an extremely important category. Various interactive props, such as virtual bombs, are provided in shooting games; a user can control a virtual object to throw a virtual bomb to attack virtual items such as virtual vehicles and virtual buildings in a virtual scene. Normally, once thrown, a virtual bomb detonates automatically only after a countdown ends, and the user cannot flexibly control the detonation timing. As a result, interactive props such as virtual bombs have a low hit rate and poor destructive power against movable virtual items such as virtual vehicles; that is, attacking with a virtual bomb is difficult, the success rate is low, and the human-computer interaction efficiency when the user interacts using the virtual bomb is low.
Disclosure of Invention
The embodiments of the application provide an interactive prop control method, an interactive prop control apparatus, a computer device, and a storage medium, which can reduce the difficulty of applying a first interactive prop and improve human-computer interaction efficiency when a user applies the first interactive prop. The technical solutions are as follows:
in one aspect, an interactive prop control method is provided, and the method includes:
in response to a throwing operation on a first interactive prop, displaying the first interactive prop at a target position in a virtual scene;
displaying a trigger area corresponding to the first interactive prop at the target position of the virtual scene;
and triggering the first interactive prop in response to detecting that the triggering area is attacked.
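To make the claimed flow concrete, the following is a minimal C++ sketch of the three steps above. All type and function names (InteractiveProp, onThrowOperation, and so on) are illustrative assumptions, not identifiers from the patent or from any particular game engine.

```cpp
// Illustrative sketch of the three claimed steps; all names are hypothetical.
struct Vec3 { float x, y, z; };

struct InteractiveProp {
    Vec3 position;          // target position in the virtual scene
    float triggerRadius;    // radius of the displayed trigger area
    bool triggered = false;
};

// Step 1: on a throwing operation, place the prop at the resolved target position.
InteractiveProp onThrowOperation(Vec3 targetPosition, float triggerRadius) {
    InteractiveProp prop{targetPosition, triggerRadius};
    // A rendering call (omitted) would display the prop at targetPosition.
    return prop;
}

// Step 2: display the trigger area centered on the same target position.
void displayTriggerArea(const InteractiveProp& prop) {
    // e.g., draw a highlighted sphere outline of radius prop.triggerRadius
    // around prop.position (rendering call omitted).
}

// Step 3: trigger the prop when an attack on the trigger area is detected.
void onAttackDetected(InteractiveProp& prop, Vec3 attackPoint) {
    float dx = attackPoint.x - prop.position.x;
    float dy = attackPoint.y - prop.position.y;
    float dz = attackPoint.z - prop.position.z;
    float distSq = dx * dx + dy * dy + dz * dz;
    if (!prop.triggered && distSq <= prop.triggerRadius * prop.triggerRadius) {
        prop.triggered = true;  // detonate / apply the prop's action effect
    }
}
```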
In one aspect, an interactive prop control apparatus is provided, the apparatus comprising:
the first display module is used for responding to the throwing operation of the first interactive prop and displaying the first interactive prop at a target position in a virtual scene;
the second display module is used for displaying a trigger area corresponding to the first interactive prop at the target position of the virtual scene;
and the triggering module is used for triggering the first interactive prop in response to the detection that the triggering area is attacked.
In one possible implementation, the triggering module is configured to:
controlling a second interactive prop to launch a sub prop to the trigger area of the first interactive prop;
and triggering the first interactive prop in response to detecting that any one of the sub props hits the trigger area.
In one possible implementation, the triggering module is configured to:
and triggering the first interactive prop in response to detecting that a third interactive prop is triggered and the action area of the third interactive prop intersects with the trigger area of the first interactive prop.
In one possible implementation, the apparatus further includes:
and the third display module is used for responding to the triggering of the first interactive prop and displaying the action effect corresponding to the first interactive prop in the virtual scene.
In one possible implementation, the third display module is configured to:
responding to a hit of a sub-prop emitted by a second interactive prop to the trigger area, and determining a first action effect corresponding to the first interactive prop based on a first distance between the hit of the sub-prop in the trigger area and a central point of the trigger area;
and displaying the first action effect in the virtual scene.
In one possible implementation, the third display module is configured to:
responding to the attack of the trigger area by a third interactive prop, and determining a second action effect corresponding to the first interactive prop based on the shortest distance between the edge of the action area of the third interactive prop and the central point of the trigger area;
and displaying the second action effect in the virtual scene.
In one possible implementation, the first display module is configured to:
responding to the throwing operation of the first interactive prop, and controlling the first interactive prop to move in the virtual scene according to a first motion track;
and responding to the first interactive prop colliding with a virtual item in the virtual scene during its motion, displaying the first interactive prop at the position where it collides with the virtual item.
In one possible implementation, the apparatus further includes:
and the control module is used for controlling the first interactive prop to move in the virtual scene according to a second motion track in response to the first interactive prop colliding with a virtual object in the virtual scene in the motion process, wherein the motion direction of the second motion track is opposite to the motion direction of the first motion track.
In one possible implementation, the apparatus further includes:
and the fourth display module is used for displaying a prop pickup control in response to the first interactive prop being in an un-triggered state, the prop pickup control being used for recovering the first interactive prop.
In one possible implementation, the fourth display module is configured to:
responding to the first interactive prop in an un-triggered state, and acquiring a second distance between a first virtual object in the virtual scene and the first interactive prop, wherein the first virtual object is a virtual object for throwing the first interactive prop;
and in response to the second distance being less than or equal to a distance threshold, displaying the prop pickup control.
In one possible implementation, the second display module is configured to perform at least one of:
highlighting the outline of the trigger area at the target location;
displaying the trigger area in a reference color at the target location.
In one possible implementation, the triggering module is configured to:
displaying a trigger countdown corresponding to the first interactive prop;
and in response to the triggering countdown not ending and the detection that the triggering area is attacked, executing the step of triggering the first interactive prop.
In one aspect, a computer device is provided and includes one or more processors and one or more memories, where at least one computer program is stored in the one or more memories, and loaded and executed by the one or more processors to implement the operations performed by the interactive prop control method.
In one aspect, a computer-readable storage medium is provided, in which at least one computer program is stored, and the at least one computer program is loaded and executed by a processor to implement the operations performed by the interactive prop control method.
In one aspect, a computer program product is provided that includes at least one computer program stored in a computer readable storage medium. The processor of the computer device reads the at least one computer program from the computer-readable storage medium, and executes the at least one computer program, so that the computer device realizes the operations performed by the interactive prop control method.
In the technical solutions provided by the embodiments of the application, a trigger area is set for the first interactive prop, and after the first interactive prop has been thrown, it is triggered by attacking its trigger area. The user can thus flexibly control the triggering timing of the first interactive prop, which reduces the difficulty of applying it, improves human-computer interaction efficiency when the user interacts using the first interactive prop, and also makes the first interactive prop more interesting.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; those skilled in the art can derive other drawings from these drawings without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of an interactive prop control method according to an embodiment of the present application;
fig. 2 is a flowchart of an interactive prop control method according to an embodiment of the present application;
FIG. 3 is a schematic view of a display of a first interactive prop according to an embodiment of the present application;
fig. 4 is a flowchart of an interactive prop control method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a detection line provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of a trigger zone provided in an embodiment of the present application;
fig. 7 is a schematic diagram of attacking a trigger area according to an embodiment of the present application;
fig. 8 is a flowchart of an interactive prop triggering method according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a prop pickup control according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an interactive prop control apparatus according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the following will describe embodiments of the present application in further detail with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in this application are used for distinguishing between similar items and items that have substantially the same function or similar functionality, and it should be understood that "first," "second," and "nth" do not have any logical or temporal dependency or limitation on the number or order of execution.
In order to facilitate understanding of the technical processes of the embodiments of the present application, some terms referred to in the embodiments of the present application are explained below:
virtual scene: is a virtual scene that is displayed (or provided) by an application program when the application program runs on a terminal. The virtual scene may be a simulation environment of a real world, a semi-simulation semi-fictional virtual environment, or a pure fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, and a three-dimensional virtual scene, which is not limited in this application. For example, a virtual scene may include sky, land, ocean, etc., the land may include environmental elements such as deserts, cities, etc., and a user may control a virtual object to move in the virtual scene.
Virtual object: a character that can move in a virtual scene; the movable character can be a virtual person, a virtual animal, an animation character, and the like. The virtual object may be an avatar that represents the user in the virtual scene. The virtual scene may include a plurality of virtual objects, each having its own shape and volume and occupying a portion of the space in the virtual scene. Optionally, the virtual object is a character controlled through operations on the client, an Artificial Intelligence (AI) obtained by training and set in virtual-environment battles, or a Non-Player Character (NPC) set in virtual-scene battles. Optionally, the virtual object is a virtual character competing in the virtual scene. Optionally, the number of virtual objects in a battle may be preset or dynamically determined according to the number of clients joining the battle, which is not limited in the embodiments of the present application. In one possible implementation, the user can control the virtual object to move in the virtual scene, for example, to run, jump, or crawl, and can also control the virtual object to interact with other virtual objects using skills, interactive props, and the like provided by the application program.
Virtual item: an object in a virtual scene other than a virtual object, for example, a virtual vehicle such as a virtual car, a virtual tank, a virtual aircraft, or a virtual yacht, or various types of virtual buildings, virtual plants, virtual decorations, and the like.
Interactive prop: a prop that can interact with virtual objects or virtual items in a virtual scene. In some embodiments, an interactive prop can affect the attribute values of virtual objects and virtual items; for example, shooting games provide throwing-type interactive props such as virtual bombs and virtual grenades, and shooting-type interactive props such as virtual firearms and virtual crossbows, which can damage the attacked virtual objects and virtual items. In some embodiments, an interactive prop can assist the virtual object in achieving a certain purpose; for example, a smoke cartridge can help the virtual object conceal itself. It should be noted that the type of the interactive prop is not limited in the embodiments of the present application.
Fig. 1 is a schematic diagram of an implementation environment of an interactive prop control method provided in an embodiment of the present application, and referring to fig. 1, the implementation environment includes: a first terminal 110, a server 140 and a second terminal 160.
The first terminal 110 has installed and runs an application program supporting the display of a virtual scene. Illustratively, the application is any one of a virtual reality application, a three-dimensional map program, a military simulation program, a Role-Playing Game (RPG), a Multiplayer Online Battle Arena (MOBA) game, and a multiplayer gunfight survival game. The first terminal 110 is a terminal used by a first user, who uses it to operate a first virtual object located in the virtual scene to perform activities including, but not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated character or an animated character.
The first terminal 110 is connected to the server 140 through a wireless network or a wired network.
The server 140 includes at least one of a single server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 140 is used to provide background services for applications supporting virtual scene display. Alternatively, the server 140 undertakes the primary computing work and the first terminal 110 and the second terminal 160 undertake the secondary computing work; or the server 140 undertakes the secondary computing work and the first terminal 110 and the second terminal 160 undertake the primary computing work; or the server 140, the first terminal 110, and the second terminal 160 compute cooperatively using a distributed computing architecture.
The second terminal 160 has installed and runs an application program supporting the display of a virtual scene. Illustratively, the application is any one of a virtual reality application, a three-dimensional map program, a military simulation program, a Role-Playing Game (RPG), a Multiplayer Online Battle Arena (MOBA) game, and a multiplayer gunfight survival game. The second terminal 160 is a terminal used by a second user, who uses it to operate a second virtual object located in the virtual scene to perform activities including, but not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated character or an animated character.
The second terminal 160 is connected to the server 140 through a wireless network or a wired network.
Optionally, the first virtual object controlled by the first terminal 110 and the second virtual object controlled by the second terminal 160 are in the same virtual scene, and the first virtual object may interact with the second virtual object in the virtual scene. In some embodiments, the first virtual object and the second virtual object are in a hostile relationship, for example, the first virtual object and the second virtual object belong to different groups, and different skills or interactive props can be applied between the virtual objects in the hostile relationship to attack each other, so as to perform a battle-type interaction, and display the performance effect triggered by the skills in the first terminal 110 and the second terminal 160. In other embodiments, the first virtual object and the second virtual object are in a teammate relationship, for example, the first virtual object and the second virtual object belong to the same group, have a friend relationship or have temporary communication rights.
Alternatively, the applications installed on the first terminal 110 and the second terminal 160 are the same, or are the same type of application on different operating system platforms. The first terminal 110 and the second terminal 160 may each generally refer to one of a plurality of terminals; this embodiment is illustrated with only these two. The device types of the first terminal 110 and the second terminal 160 are the same or different and include at least one of: a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, and a desktop computer. For example, the first terminal 110 and the second terminal 160 may be smartphones or other handheld portable gaming devices. The following embodiments are illustrated with the terminal being a smartphone.
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
The interactive prop control method provided by the embodiments of the application can be applied to various types of application programs so that a user can flexibly control the triggering timing of an interactive prop. For example, the scheme can be applied to a shooting game in which the provided interactive props include a virtual bomb: after a first virtual object controlled by the user throws the virtual bomb, the virtual bomb can be detonated by attacking its trigger area, for example by shooting the trigger area with a virtual firearm, so that the detonation timing of the virtual bomb is flexibly controlled. Applying the technical solutions of the embodiments of the application can make the interactive props in an application program more interesting, reduce the difficulty of applying them, and improve human-computer interaction efficiency.
Fig. 2 is a flowchart of an interactive prop control method according to an embodiment of the present application. The method can be applied to any terminal in the implementation environment above; in the embodiments of the application, the interactive prop control method is described with a terminal as the execution subject. Referring to fig. 2, in one possible implementation, the embodiment includes the following steps:
201. the terminal responds to the throwing operation of the first interactive prop and displays the first interactive prop at a target position in the virtual scene.
The first interactive prop is a throwing prop, such as a virtual bomb, a virtual torpedo, and the like.
In one possible implementation, the user controls the first virtual object through the terminal to move in the virtual scene, for example, to interact with virtual items, other virtual objects, and the like using interactive props. In response to the user selecting the first interactive prop and performing a throwing operation on it, the terminal controls the first virtual object to throw the first interactive prop in the virtual scene, determines the position of the first interactive prop in the virtual scene as the target position, and displays the first interactive prop at that target position.
202. And the terminal displays a trigger area corresponding to the first interactive prop at the target position of the virtual scene.
In this embodiment of the present application, the position of the trigger area is determined based on the target position, that is, the position of the trigger area is determined based on the display position of the first interactive prop in the virtual scene. Fig. 3 is a schematic diagram of a display manner of a first interactive prop provided in an embodiment of the present application, referring to fig. 3, where the first interactive prop hits a target position 301 in a virtual scene, and a terminal displays a trigger area 302 corresponding to the first interactive prop at the target position.
In one possible implementation, the first interactive prop refers to a type of interactive prop, for example, throwing-type props including virtual bombs, virtual torpedoes, and the like. Within this type, different interactive props may correspond to trigger areas of different sizes, and the correspondence between interactive props and trigger-area sizes is set by developers, which is not limited in the embodiments of the present application.
It should be noted that steps 201 and 202 may be executed simultaneously, that is, the trigger area is displayed at the same time as the first interactive prop is displayed at the target position; alternatively, step 201 is executed first and then step 202, which is not limited in the embodiments of the present application.
203. And the terminal triggers the first interactive prop in response to detecting that the triggering area is attacked.
In one possible implementation, the first virtual object currently controlled by the user, or another virtual object in the virtual scene, can launch an attack on the trigger area. Illustratively, attacking the trigger area includes shooting at it with an interactive prop such as a virtual firearm; or, when a thrown interactive prop such as a virtual bomb or a virtual torpedo is triggered and its action area intersects the trigger area, that is, external explosion damage enters the trigger area, the trigger area is determined to be attacked. It should be noted that the above is only an exemplary description of possible implementations, and the embodiments of the present application do not limit the specific method used to attack the trigger area.
In the technical solutions provided by the embodiments of the application, a trigger area is set for the first interactive prop, and after the first interactive prop has been thrown, it is triggered by attacking its trigger area. The user can thus flexibly control the triggering timing of the first interactive prop, which reduces the difficulty of applying it, improves human-computer interaction efficiency when the user interacts using the first interactive prop, and also makes the first interactive prop more interesting.
The above embodiment is a brief introduction to the interactive prop control method provided in the present application, and the method is described below with reference to fig. 4. Fig. 4 is a flowchart of an interactive prop control method provided in an embodiment of the present application, where the method is applied in the implementation environment shown in fig. 1, and referring to fig. 4, in a possible implementation manner, the embodiment includes the following steps:
401. and the terminal responds to the throwing operation of the first interactive prop and controls the first interactive prop to move in the virtual scene according to the first motion track.
In the embodiments of the application, the terminal has installed and runs an application program supporting virtual scene display; for example, the application is a shooting game in which virtual objects controlled by various users can compete in the virtual scene. In one possible implementation, after the user enters a round of competition, the terminal displays the operation interface corresponding to the round. Illustratively, the operation interface includes the virtual scene, virtual items, the first virtual object currently controlled by the user, virtual objects controlled by other users, interactive props, and the like, and the user can control the first virtual object to use any interactive prop to interact with other virtual objects and virtual items. In one possible implementation, on detecting the user's throwing operation on the first interactive prop, the terminal controls the first virtual object to throw the first interactive prop, and the thrown first interactive prop moves in the virtual scene according to the first motion track. The first motion track is determined based on the user's throwing direction, throwing force, and the like, which are not limited in the embodiments of the present application.
402. The terminal responds to the fact that the first interactive prop collides with a virtual object in the virtual scene in the motion process, determines the collision position of the first interactive prop and the virtual object as the target position, displays the first interactive prop at the target position, and displays a trigger area corresponding to the first interactive prop at the target position.
In one possible implementation, when the terminal controls the first interactive prop to move in the virtual scene, a detection line is added ahead of the first interactive prop based on the first motion track, as shown in fig. 5. Fig. 5 is a schematic diagram of a detection line provided in an embodiment of the present application: the terminal projects a detection line 502 ahead of the first interactive prop 501. The detection line is used to determine the position of the first interactive prop in the next frame and to perform collision detection, that is, to detect whether the first interactive prop collides with a virtual item or another virtual object in the virtual scene.
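One way such a detection line is commonly realized is as a short ray cast from the prop's current position to its projected next-frame position. The sketch below is an assumption about how this could look in C++; raycast() is a toy stand-in for an engine line trace, and every name here is hypothetical rather than taken from the patent.

```cpp
// Hypothetical swept-collision check: cast a short detection line from the
// prop's current position to where the trajectory puts it next frame.
struct Vec3 {
    float x, y, z;
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};

struct Hit { Vec3 point; bool isItem; };  // isItem: vehicle/building vs. character

// Toy stand-in for the engine's line trace: here it only checks the segment
// against a ground plane at y = 0, enough to exercise the logic.
bool raycast(Vec3 from, Vec3 to, Hit* outHit) {
    if (from.y > 0.0f && to.y <= 0.0f) {
        float t = from.y / (from.y - to.y);     // fraction along the segment
        *outHit = { from + (to - from) * t, /*isItem=*/true };
        return true;
    }
    return false;
}

// Advance one frame along the first motion track, checking the segment
// between the current and next-frame positions for collisions.
bool stepWithDetectionLine(Vec3& position, Vec3 velocity, float dt, Hit* outHit) {
    Vec3 next = position + velocity * dt;   // next-frame position
    if (raycast(position, next, outHit)) {
        return true;                        // collision detected this frame
    }
    position = next;                        // no hit: keep flying
    return false;
}
```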
In one possible implementation, if the first interactive prop collides with a virtual item, the collision position is determined as the target position and the first interactive prop adheres to the target position, that is, to the surface of the virtual item. As shown in fig. 3, when the first interactive prop collides with the virtual vehicle, it adheres to the surface of the virtual vehicle. In one possible implementation, the first interactive prop rebounds if it hits a virtual object. That is, in response to the first interactive prop colliding with a virtual object in the virtual scene during its motion, the terminal controls the first interactive prop to move in the virtual scene according to a second motion track whose motion direction is opposite to that of the first motion track. While the first interactive prop moves along the second motion track, the terminal continues collision detection on it until it collides with some virtual item, at which point it stops moving and adheres to the surface of the collided virtual item. Of course, in some embodiments, after the first interactive prop collides with a virtual object, it can also adhere to the surface of the collided virtual object, which is not limited in this application.
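Continuing the sketch above (and reusing its Vec3 and Hit types), the stick-or-rebound response described in this step could, under the same assumptions, look like this:

```cpp
// Hypothetical collision response: stick to virtual items (vehicles, buildings),
// rebound off virtual objects (characters) by reversing the motion direction,
// i.e., switch to the "second motion track".
struct PropState {
    Vec3 position;
    Vec3 velocity;
    bool attached = false;  // true once adhered to an item's surface
};

void onPropCollision(PropState& prop, const Hit& hit) {
    if (hit.isItem) {
        prop.position = hit.point;            // target position = collision position
        prop.velocity = {0.0f, 0.0f, 0.0f};
        prop.attached = true;                 // adhere to the item's surface
    } else {
        // Second motion track: direction opposite to the first motion track.
        prop.velocity = prop.velocity * -1.0f;
    }
}
```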
In the embodiments of the application, a trigger area is generated after the first interactive prop finishes moving, that is, after it collides with a virtual item. For example, the trigger area is a spherical area centered on the target position with radius EarlyDetectRadius, where EarlyDetectRadius is greater than 0 and its specific value is set by developers. Fig. 6 is a schematic diagram of a trigger area provided in an embodiment of the present application; as shown in fig. 6, the first interactive prop corresponds to a spherical detection area 601. In one possible implementation, the terminal highlights the trigger area of the first interactive prop so that the user can determine its range and thus trigger the first interactive prop. Illustratively, the terminal displays the trigger area in a reference color at the target position; or it highlights the outline of the trigger area at the target position, displays the outline in a reference color, and so on, which is not limited in the embodiments of the present application. In some embodiments, the trigger area is visible only to the target user who threw the first interactive prop, and other users cannot see it; that is, the trigger area is not displayed on the terminals of the other users participating in the round. Alternatively, the trigger area is visible to the target user who threw the first interactive prop and to users in the same group as the target user, and invisible to users in a different group. Based on this display mechanism, the first interactive prop can be prevented from being preemptively triggered by a hostile user, and interference with the operation of the target user who threw it is avoided. It should be noted that the embodiments of the present application do not limit the display conditions of the trigger area on each terminal.
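A hypothetical C++ sketch of the team-scoped visibility rule just described; the field names and the teamVisible switch are assumptions, not identifiers from the patent.

```cpp
// Hypothetical visibility rule for the trigger area: visible to the thrower
// (and optionally the thrower's group), hidden from hostile players.
struct Player { int id; int teamId; };

struct TriggerArea {
    float centerX, centerY, centerZ;   // the target position
    float earlyDetectRadius;           // EarlyDetectRadius > 0, set by developers
    int ownerId;                       // player who threw the prop
    int ownerTeamId;
    bool teamVisible;                  // config: teammates can also see the area
};

bool isTriggerAreaVisibleTo(const TriggerArea& area, const Player& viewer) {
    if (viewer.id == area.ownerId) return true;      // thrower always sees it
    if (area.teamVisible && viewer.teamId == area.ownerTeamId) return true;
    return false;                                    // hostile players never do
}
```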
It should be noted that steps 401 and 402 are one possible implementation of displaying the first interactive prop at a target position in the virtual scene in response to a throwing operation on it and displaying the corresponding trigger area at that target position. The above embodiments take as an example only the case where the trigger area is displayed after the first interactive prop collides with a virtual item. In some embodiments, while the thrown first interactive prop moves along its motion track, the terminal can also display the corresponding trigger area in real time, the trigger area moving along with the first interactive prop; that is, even before the first interactive prop has landed or collided with a virtual item, its trigger area may be attacked to trigger it, which improves the interest and flexibility of applying the first interactive prop. Taking the first interactive prop as a virtual bomb as an example, this triggering mode achieves the effect of detonating the virtual bomb while it is moving through the air, enriching the virtual bomb's attack effects.
403. And the terminal displays the trigger countdown corresponding to the first interactive prop.
The duration of the trigger countdown is set by a developer, and the embodiment of the present application does not limit this.
In one possible implementation, the trigger countdown starts after the first interactive prop finishes moving, that is, after it collides with a virtual item, and the terminal displays the countdown in the operation interface. Of course, the terminal may instead start the trigger countdown on detecting the user's throwing operation on the first interactive prop, which is not limited in the embodiments of the present application. In the embodiments of the application, while the trigger countdown is displayed, the terminal continues to execute step 404; that is, the user can attack the trigger area during the countdown. In one possible implementation, after the trigger countdown ends, the terminal automatically triggers the first interactive prop; alternatively, after the trigger countdown ends, the terminal no longer displays the trigger area, and the first interactive prop can no longer be triggered.
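The countdown behavior above, both the auto-trigger variant and the disarm variant, could be sketched as follows. The per-frame update and all names are assumptions about one way a terminal might implement it.

```cpp
// Hypothetical trigger-countdown update, run once per frame. While the
// countdown runs, attacks on the trigger area may detonate the prop early;
// when it expires, the prop either auto-triggers or becomes inert, depending
// on a developer-set policy (both behaviors appear in the description).
enum class ExpiryPolicy { AutoTrigger, Disarm };

struct Countdown {
    float remaining;       // seconds left; duration set by developers
    ExpiryPolicy policy;
};

// Returns true if the prop should be triggered automatically this frame.
// armedOut reports whether the trigger area still accepts attacks.
bool updateCountdown(Countdown& cd, float dt, bool& armedOut) {
    cd.remaining -= dt;
    if (cd.remaining > 0.0f) {
        armedOut = true;        // countdown running: area can be attacked
        return false;
    }
    armedOut = false;           // countdown over: area no longer displayed
    return cd.policy == ExpiryPolicy::AutoTrigger;
}
```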
404. And the terminal responds to the situation that the trigger countdown is not finished and detects that the trigger area is attacked, and triggers the first interactive prop.
In one possible implementation, attacking the trigger zone includes any of the following implementations:
in the first implementation mode, the terminal controls the second interactive prop to transmit the sub prop to the trigger area of the first interactive prop, and in response to detecting that any of the sub props hits the trigger area, the terminal determines that the trigger area is attacked and triggers the first interactive prop. The secondary prop is arranged in the second interactive prop and is matched with the second interactive prop for use. For example, the second interactive prop is a virtual firearm and the prop is a virtual bullet. Fig. 7 is a schematic diagram of attacking a trigger area according to an embodiment of the present application, and referring to fig. 7, a terminal controls a second interactive prop, that is, a virtual firearm 701 shoots a trigger area 702, and launches a sub prop to the trigger area 702.
In a second implementation, the terminal triggers the first interactive prop in response to detecting that a third interactive prop is triggered and that the action area of the third interactive prop intersects the trigger area of the first interactive prop. The third interactive prop can be a virtual bomb, a virtual grenade, or the like; it corresponds to an action area and, once triggered, damages the virtual objects, virtual items, and the like within that area. The third interactive prop may be thrown by the first virtual object currently controlled by the terminal or by another virtual object in the virtual scene, which is not limited in the embodiments of the present application.
It should be noted that steps 403 and 404 implement triggering the first interactive prop in response to detecting that the trigger area is attacked; the embodiments of the application do not limit the specific method used to attack the trigger area. In some embodiments, step 403 is optional; that is, no trigger countdown is set for the first interactive prop, which remains displayed at the target position until the trigger area is detected to be attacked, whereupon the first interactive prop is triggered.
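A minimal C++ sketch of the two attack-detection modes. Modeling the trigger area as a sphere matches fig. 6; treating the third prop's action area as a sphere as well is an assumption, as are all names.

```cpp
// Hypothetical detection of the two attack modes on the trigger area:
// (1) a sub prop (e.g., bullet) hit point falls inside the trigger sphere;
// (2) a third prop's action area (also modeled as a sphere) intersects it.
#include <cmath>

struct Sphere { float x, y, z, radius; };

static float distToPoint(const Sphere& s, float px, float py, float pz) {
    float dx = s.x - px, dy = s.y - py, dz = s.z - pz;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Mode 1: the bullet's impact point lies inside the trigger sphere.
bool hitByBullet(const Sphere& trigger, float hx, float hy, float hz) {
    return distToPoint(trigger, hx, hy, hz) <= trigger.radius;
}

// Mode 2: the third prop's action sphere overlaps the trigger sphere.
bool hitByExplosion(const Sphere& trigger, const Sphere& actionArea) {
    return distToPoint(trigger, actionArea.x, actionArea.y, actionArea.z)
           <= trigger.radius + actionArea.radius;
}
```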
In the embodiments of the application, the user can trigger the first interactive prop in multiple ways, which effectively enriches the gameplay around the first interactive prop in the application program, increases the interest of using it, and lets the user flexibly control its triggering timing according to the user's own attack strategy.
405. And the terminal responds to the triggering of the first interactive prop and displays an action effect corresponding to the first interactive prop in the virtual scene.
The action effect corresponding to the first interactive prop can be an explosion effect, a burning effect, a smoke-release effect, or the like; the embodiments of the present application do not limit the action effect of the first interactive prop.
In one possible implementation, the action effect displayed by the terminal differs with the way the first interactive prop is triggered; for example, the action effect shown when the first interactive prop is automatically triggered after the countdown differs from that shown when it is triggered by attacking the trigger area. The action effects corresponding to different trigger modes are configured by developers and are not limited in the embodiments of the present application. In some embodiments, when the first interactive prop is triggered by attacking the trigger area, the action effect differs with the attacked position within the trigger area. Taking launching sub props at the trigger area with a second interactive prop as an example: in response to the first interactive prop being triggered, that is, in response to a sub prop launched by the second interactive prop hitting the trigger area, the terminal determines the first action effect corresponding to the first interactive prop based on the first distance between the sub prop's hit point in the trigger area and the center point of the trigger area, and displays that action effect in the virtual scene. Taking attacking the trigger area with a third interactive prop as an example: in response to the first interactive prop being triggered, that is, when the third interactive prop is triggered and the damage it produces enters the trigger area, the terminal determines the second action effect corresponding to the first interactive prop based on the shortest distance between the edge of the third interactive prop's action area and the center point of the trigger area, and displays the second action effect in the virtual scene. In one possible implementation, the terminal stores a correspondence between the attack distance and the action effect of the first interactive prop, the attack distance being the first distance or the shortest distance; in response to the trigger area being attacked, the terminal obtains the attack distance, queries the correspondence based on it, and determines the corresponding action effect. It should be noted that the above description of determining the action effect of the first interactive prop is only exemplary, and the embodiments of the present application do not limit the specific method used.
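The stored correspondence between attack distance and action effect could be modeled, purely as an assumption, as a table of distance bands queried with either the first distance (bullet hit) or the shortest distance (explosion):

```cpp
// Hypothetical distance-to-effect lookup; the band structure and effect
// names are illustrative, not taken from the patent.
#include <string>
#include <vector>

struct EffectBand {
    float maxDistance;      // band applies while attackDistance <= maxDistance
    std::string effect;     // e.g., "large_explosion", "small_explosion"
};

std::string lookupActionEffect(const std::vector<EffectBand>& table,
                               float attackDistance) {
    for (const EffectBand& band : table) {   // bands sorted by maxDistance
        if (attackDistance <= band.maxDistance) return band.effect;
    }
    return table.empty() ? "" : table.back().effect;  // fall back to outermost
}
```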
In this embodiment, after the first interactive prop is triggered, damage is produced in a target action area, that is, on the virtual objects, virtual items, and the like located in the target action area, where the target action area is the area centered on the target position with radius DamageRadius, DamageRadius being a value greater than 0. In one possible implementation, the damage values the first interactive prop produces in the target action area differ with the trigger mode; for example, triggering the first interactive prop early by attacking the trigger area can produce a larger damage value. Illustratively, when the first interactive prop is automatically triggered after the trigger countdown ends, the damage value in the target action area is a first damage value; when it is triggered by attacking the trigger area, the damage value in the target action area is a second damage value. The first damage value and the second damage value are set by developers and are not limited in the embodiments of the present application.
In some embodiments, for the trigger mode in which the first interactive prop is triggered by attacking the trigger area, hitting different positions within the trigger area produces different attack strengths, that is, different damage values for the first interactive prop. Illustratively, the terminal stores a damage weight MultiDamage, whose value is set by developers and which is used to adjust the magnitude of the damage value produced in the target action area. In one possible implementation, the damage value produced within the target action area is expressed as the following formula (1):
Damage * [MultiDamage - (MultiDamage - 1) / EarlyDetectRadius * X]    (1)
where Damage denotes the center-point damage value, set by developers, and X denotes the distance between the attacked position in the trigger area and the center point of the trigger area, that is, the first distance or the shortest distance.
In one possible implementation, the damage value produced by the first interactive prop in the target action area has an attenuation effect; for example, the damage value attenuates from the center of the target action area to its edge, changing linearly from center to edge, so that a virtual object or virtual item at the center of the action area suffers a larger damage value and one at the edge suffers a smaller one. In the embodiments of the application, the terminal may store a target configuration parameter AttenuationToTank indicating whether the damage value in the target action area attenuates: a value of true indicates that it attenuates, and a value of false indicates that it does not. In some embodiments, the target configuration parameter can be configured per type of virtual object and virtual item; for example, only the parameter corresponding to virtual vehicles is set to true, while those corresponding to other virtual objects and virtual items are set to false. In one possible implementation, the damage value at the center point of the target action area is expressed by formula (1) above, and the damage value at the edge of the target action area is expressed by the following formula (2):
MinDamageValue * [MultiDamage - (MultiDamage - 1) / EarlyDetectRadius * X]    (2)
where MinDamageValue denotes the edge damage value, set by developers, and X denotes the distance between the attacked position in the trigger area and the center point of the trigger area, that is, the first distance or the shortest distance.
It should be noted that the above description of the method for determining the damage value of the first interactive prop is only an exemplary description of one possible implementation manner, and the embodiment of the present application does not limit which method is specifically used to determine the damage value.
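For concreteness, formulas (1) and (2) can be transcribed directly into C++. The linear interpolation between the center and edge values below is an assumption consistent with the stated linear center-to-edge change, not a formula given in the text, and all function names are hypothetical.

```cpp
// Transcription of formulas (1) and (2) under the names used in the
// description: Damage, MinDamageValue, MultiDamage, EarlyDetectRadius,
// DamageRadius, AttenuationToTank.
float scaledDamage(float base, float multiDamage,
                   float earlyDetectRadius, float x) {
    // base * [MultiDamage - (MultiDamage - 1) / EarlyDetectRadius * X]
    return base * (multiDamage - (multiDamage - 1.0f) / earlyDetectRadius * x);
}

// x: distance from the attacked position to the trigger-area center point
// (the first distance or the shortest distance); r: distance from the damaged
// target to the target position, 0 <= r <= damageRadius.
float damageAt(float damage, float minDamageValue, float multiDamage,
               float earlyDetectRadius, float x,
               float r, float damageRadius, bool attenuationToTank) {
    float centerDamage =
        scaledDamage(damage, multiDamage, earlyDetectRadius, x);         // (1)
    if (!attenuationToTank) return centerDamage;  // no falloff configured
    float edgeDamage =
        scaledDamage(minDamageValue, multiDamage, earlyDetectRadius, x); // (2)
    float t = r / damageRadius;                   // 0 at center, 1 at edge
    return centerDamage + (edgeDamage - centerDamage) * t;  // linear falloff
}
```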
In the technical solutions provided by the embodiments of the application, a trigger area is set for the first interactive prop, and after the first interactive prop has been thrown, it is triggered by attacking its trigger area; that is, the user can flexibly control the triggering timing of the first interactive prop, which reduces the difficulty of applying it, improves human-computer interaction efficiency when the user interacts using the first interactive prop, and also makes the first interactive prop more interesting.
Fig. 8 is a flowchart of an interactive prop triggering method according to an embodiment of the present application; the above process is described below with reference to fig. 8. As shown in fig. 8, taking the application running on the terminal as a shooting game and the first interactive prop as a virtual bomb as an example, in one possible implementation, after the user enters a round of competition, the terminal first performs step 801 of acquiring the virtual bomb, for example based on a prop pickup operation by the user; the method for acquiring the virtual bomb is not limited in the embodiments of the present application. The terminal determines whether the user throws the virtual bomb and, if so, continues with step 802, in which the terminal controls the virtual bomb to move in the virtual scene along the first motion track. During the movement, the terminal determines whether the virtual bomb collides with a virtual object; if so, the virtual bomb rebounds, and if not, step 803 is performed. In step 803, in response to the virtual bomb colliding with a virtual item, the terminal adheres the virtual bomb to the item's surface. The terminal then judges whether the virtual bomb's trigger countdown has ended; if so, step 804 is executed, and if not, step 805. In step 804 the terminal detonates the virtual bomb; in step 805 the terminal acquires an attack on the virtual bomb's trigger area and, in response to the trigger area being attacked, continues with step 804. After the virtual bomb is detonated, the terminal judges whether its target action area contains an attack target, which may be a virtual object, a virtual item, or the like, and executes step 806 to determine the damage value for the attack target. In the embodiments of the application, the first interactive prop can be triggered early based on user operations, which gives it a good attack effect against movable virtual items such as virtual vehicles in the virtual scene: the user can trigger the first interactive prop in time before the virtual item moves away, which improves the prop's hit rate and attack effect, reduces the difficulty of attacking with it, and improves human-computer interaction efficiency.
The above embodiments mainly introduced the process of setting a trigger area for the first interactive prop and triggering it by attacking that area, letting the user flexibly control its triggering timing. In the embodiments of the application, the user can also withdraw the first interactive prop while it has not been triggered. Illustratively, in response to the first interactive prop being in an un-triggered state, the terminal displays a prop pickup control used to recover the first interactive prop. In one possible implementation, only the first virtual object that threw the first interactive prop can retrieve it; other virtual objects in the virtual scene cannot pick it up. For example, when the first virtual object approaches the first interactive prop, it can trigger the terminal to display the prop pickup control, that is, to enable the pickup function for the first interactive prop. In one possible implementation, in response to the first interactive prop being in an un-triggered state, the terminal obtains the second distance between the first virtual object in the virtual scene and the first interactive prop, the first virtual object being the virtual object that threw it; in response to the second distance being less than or equal to a distance threshold, the terminal displays the prop pickup control. The distance threshold is set by developers and is not limited in the embodiments of the present application. Fig. 9 is a schematic diagram of a prop pickup control provided in an embodiment of the present application; referring to fig. 9, when the first virtual object approaches the first interactive prop, the terminal displays the prop pickup control 901 on the operation interface. In the embodiments of the application, this recovery mechanism for the first interactive prop can effectively improve its usage: for example, if the user fails to throw the first interactive prop to an ideal position, it can be withdrawn in time, avoiding waste of the prop.
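A hypothetical sketch of the conditions under which the prop pickup control is displayed, combining the un-triggered state, the thrower-only restriction, and the second-distance threshold described above; all names are illustrative assumptions.

```cpp
// Hypothetical check for showing the prop pickup control.
#include <cmath>

struct Position { float x, y, z; };

bool shouldShowPickupControl(bool propTriggered,
                             int viewerId, int throwerId,
                             Position viewer, Position prop,
                             float distanceThreshold) {
    if (propTriggered) return false;            // only in the un-triggered state
    if (viewerId != throwerId) return false;    // only the throwing virtual object
    float dx = viewer.x - prop.x, dy = viewer.y - prop.y, dz = viewer.z - prop.z;
    float secondDistance = std::sqrt(dx * dx + dy * dy + dz * dz);
    return secondDistance <= distanceThreshold; // second distance within threshold
}
```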
All of the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and details are not described herein again.
Fig. 10 is a schematic structural diagram of an interactive prop control device according to an embodiment of the present application. Referring to Fig. 10, the device includes:
a first display module 1001, configured to display a first interactive prop at a target position in a virtual scene in response to a throwing operation on the first interactive prop;
a second display module 1002, configured to display a trigger area corresponding to the first interactive prop at the target position of the virtual scene;
a triggering module 1003, configured to trigger the first interactive prop in response to detecting that the trigger area is attacked.
In one possible implementation, the triggering module 1003 is configured to:
controlling a second interactive prop to launch sub-props toward the trigger area of the first interactive prop;
and triggering the first interactive prop in response to detecting that any one of the sub-props hits the trigger area.
In one possible implementation, the triggering module 1003 is configured to:
and triggering the first interactive prop in response to detecting that a third interactive prop is triggered and that the action area of the third interactive prop intersects with the trigger area of the first interactive prop.
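The two trigger paths above amount to two geometric tests. The sketch below shows one plausible form of each, assuming for simplicity that both the trigger area and the third interactive prop's action area are modeled as spheres; the function names and the sphere model are assumptions of this sketch, not details from the patent.

```python
import math

def sub_prop_hits_trigger_area(hit_point, trigger_center, trigger_radius) -> bool:
    # Path 1: a sub-prop launched by the second interactive prop lands
    # inside the trigger area.
    return math.dist(hit_point, trigger_center) <= trigger_radius

def action_area_intersects_trigger_area(action_center, action_radius,
                                        trigger_center, trigger_radius) -> bool:
    # Path 2: the action area of a triggered third interactive prop
    # overlaps the trigger area (sphere-sphere intersection test).
    return math.dist(action_center, trigger_center) <= action_radius + trigger_radius
```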
In one possible implementation, the apparatus further includes:
and a third display module, configured to display, in response to the first interactive prop being triggered, an action effect corresponding to the first interactive prop in the virtual scene.
In one possible implementation, the third display module is configured to:
in response to a sub-prop launched by a second interactive prop hitting the trigger area, determining a first action effect corresponding to the first interactive prop based on a first distance between the hit position of the sub-prop in the trigger area and the center point of the trigger area;
and displaying the first action effect in the virtual scene.
In one possible implementation, the third display module is configured to:
in response to the trigger area being attacked by a third interactive prop, determining a second action effect corresponding to the first interactive prop based on the shortest distance between the edge of the action area of the third interactive prop and the center point of the trigger area;
and displaying the second action effect in the virtual scene.
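The description above ties the strength of the action effect to a distance (the hit-to-center distance for a sub-prop, or the shortest edge-to-center distance for a third prop's action area) but does not fix the mapping between distance and effect. A linear falloff is one plausible choice, shown below as an assumption; the function name and the base_effect parameter are likewise hypothetical.

```python
def action_effect(base_effect: float, distance: float, trigger_radius: float) -> float:
    # The smaller the distance to the center point of the trigger area,
    # the stronger the resulting action effect (linear falloff here).
    ratio = max(0.0, 1.0 - distance / trigger_radius)
    return base_effect * ratio
```

The same mapping can serve both cases: pass the sub-prop's hit distance to obtain the first action effect, or the shortest edge-to-center distance of the third prop's action area to obtain the second action effect.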
In one possible implementation, the first display module 1001 is configured to:
in response to a throwing operation on the first interactive prop, controlling the first interactive prop to move in the virtual scene along a first motion trajectory;
and in response to the first interactive prop colliding with a virtual object in the virtual scene during its movement, displaying the first interactive prop at the position where it collides with the virtual object.
In one possible implementation, the apparatus further includes:
and a control module, configured to control, in response to the first interactive prop colliding with a virtual object in the virtual scene during its movement, the first interactive prop to move in the virtual scene along a second motion trajectory, wherein the motion direction of the second motion trajectory is opposite to that of the first motion trajectory.
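A second motion trajectory whose direction is opposite to the first can be produced by negating the prop's velocity at the moment of collision. In the sketch below, the damping factor is an assumption; the patent only requires the motion direction to be reversed.

```python
def rebound_velocity(velocity: tuple, damping: float = 0.5) -> tuple:
    # Reverse the motion direction to start the second motion trajectory;
    # damping < 1.0 (assumed) makes the rebound slower than the incoming flight.
    vx, vy, vz = velocity
    return (-vx * damping, -vy * damping, -vz * damping)
```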
In one possible implementation, the apparatus further includes:
and a fourth display module, configured to display a prop pickup control in response to the first interactive prop being in an untriggered state, wherein the prop pickup control is used to recover the first interactive prop.
In one possible implementation, the fourth display module is configured to:
in response to the first interactive prop being in an untriggered state, acquiring a second distance between a first virtual object in the virtual scene and the first interactive prop, wherein the first virtual object is the virtual object that threw the first interactive prop;
and in response to the second distance being less than or equal to a distance threshold, displaying the prop pickup control.
In one possible implementation, the second display module 1002 is configured to perform at least one of the following:
highlighting the outline of the trigger area at the target position;
and displaying the trigger area in a reference color at the target position.
In one possible implementation, the triggering module 1003 is configured to:
displaying a trigger countdown corresponding to the first interactive prop;
and in response to the trigger countdown not having ended and an attack on the trigger area being detected, performing the step of triggering the first interactive prop.
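This condition combines two checks: the trigger countdown must still be running, and an attack on the trigger area must be detected. A minimal sketch of that gate, with hypothetical names:

```python
def should_trigger_early(countdown_remaining_s: float,
                         trigger_area_attacked: bool) -> bool:
    # Early triggering applies only while the displayed countdown
    # has not yet ended.
    return countdown_remaining_s > 0.0 and trigger_area_attacked
```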
With the device provided in this embodiment of the present application, a trigger area is set for the first interactive prop, and after the first interactive prop has been thrown, the first interactive prop is triggered by attacking its trigger area. In other words, the user can flexibly control the trigger timing of the first interactive prop, which reduces the difficulty of applying the first interactive prop, improves human-computer interaction efficiency when the user interacts by using the first interactive prop, and also makes the first interactive prop more interesting to use.
It should be noted that, when the interactive prop control device provided in the above embodiment controls an interactive prop, the division into the above functional modules is merely used as an example for description. In practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the interactive prop control device and the interactive prop control method provided in the above embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, and details are not described herein again.
Fig. 11 is a schematic structural diagram of a terminal according to an embodiment of the present application. The terminal 1100 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1100 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
In general, terminal 1100 includes: one or more processors 1101 and one or more memories 1102.
The processor 1101 may include one or more processing cores, for example, a 4-core processor or an 8-core processor. The processor 1101 may be implemented in at least one of the following hardware forms: a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 1101 may also include a main processor and a coprocessor. The main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1101 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1101 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1102 may include one or more computer-readable storage media, which may be non-transitory. The memory 1102 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1102 is used to store at least one computer program, and the at least one computer program is executed by the processor 1101 to implement the interactive prop control method provided in the method embodiments of this application.
In some embodiments, the terminal 1100 may further include: a peripheral interface 1103 and at least one peripheral. The processor 1101, memory 1102 and peripheral interface 1103 may be connected by a bus or signal lines. Various peripheral devices may be connected to the peripheral interface 1103 by buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1104, display screen 1105, camera assembly 1106, audio circuitry 1107, positioning assembly 1108, and power supply 1109.
The peripheral interface 1103 may be used to connect at least one peripheral associated with I/O (Input/Output) to the processor 1101 and the memory 1102. In some embodiments, the processor 1101, memory 1102, and peripheral interface 1103 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1101, the memory 1102 and the peripheral device interface 1103 may be implemented on separate chips or circuit boards, which is not limited by this embodiment.
The radio frequency circuit 1104 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1104 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1104 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1104 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like. The radio frequency circuit 1104 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: a metropolitan area network, various generations of mobile communication networks (2G, 3G, 4G, and 5G), a wireless local area network, and/or a WiFi (Wireless Fidelity) network. In some embodiments, the radio frequency circuit 1104 may further include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 1105 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1105 is a touch display screen, the display screen 1105 also has the ability to collect touch signals on or above its surface. The touch signal may be input to the processor 1101 as a control signal for processing. In this case, the display screen 1105 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1105, provided on the front panel of the terminal 1100; in other embodiments, there may be at least two display screens 1105, respectively disposed on different surfaces of the terminal 1100 or in a folded design; in still other embodiments, the display screen 1105 may be a flexible display screen disposed on a curved surface or a folded surface of the terminal 1100. Further, the display screen 1105 may even be arranged in a non-rectangular irregular pattern, that is, an irregularly-shaped screen. The display screen 1105 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
Camera assembly 1106 is used to capture images or video. Optionally, camera assembly 1106 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1106 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1107 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert the sound waves into electrical signals, and input them to the processor 1101 for processing, or input them to the radio frequency circuit 1104 for voice communication. For stereo collection or noise reduction purposes, there may be multiple microphones, disposed at different locations of the terminal 1100. The microphone may also be an array microphone or an omnidirectional microphone. The speaker is used to convert electrical signals from the processor 1101 or the radio frequency circuit 1104 into sound waves. The speaker may be a traditional thin-film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert electrical signals not only into sound waves audible to humans but also into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 1107 may also include a headphone jack.
The positioning component 1108 is used to locate the current geographic position of the terminal 1100 to implement navigation or LBS (Location Based Service). The positioning component 1108 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1109 is used to supply power to the components in the terminal 1100. The power supply 1109 may be an alternating current power supply, a direct current power supply, a disposable battery, or a rechargeable battery. When the power supply 1109 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast-charging technology.
In some embodiments, terminal 1100 can also include one or more sensors 1110. The one or more sensors 1110 include, but are not limited to: acceleration sensor 1111, gyro sensor 1112, pressure sensor 1113, fingerprint sensor 1114, optical sensor 1115, and proximity sensor 1116.
Acceleration sensor 1111 may detect acceleration levels in three coordinate axes of a coordinate system established with terminal 1100. For example, the acceleration sensor 1111 may be configured to detect components of the gravitational acceleration in three coordinate axes. The processor 1101 may control the display screen 1105 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1111. The acceleration sensor 1111 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1112 may detect a body direction and a rotation angle of the terminal 1100, and the gyro sensor 1112 may cooperate with the acceleration sensor 1111 to acquire a 3D motion of the user with respect to the terminal 1100. From the data collected by gyroscope sensor 1112, processor 1101 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 1113 may be disposed on a side frame of the terminal 1100 and/or at a lower layer of the display screen 1105. When the pressure sensor 1113 is disposed on the side frame of the terminal 1100, it can detect a holding signal of the user on the terminal 1100, and the processor 1101 performs left/right-hand recognition or quick operations according to the holding signal collected by the pressure sensor 1113. When the pressure sensor 1113 is disposed at the lower layer of the display screen 1105, the processor 1101 controls operability controls on the UI according to the user's pressure operations on the display screen 1105. The operability controls include at least one of a button control, a scroll-bar control, an icon control, and a menu control.
The fingerprint sensor 1114 is used to collect the user's fingerprint, and the processor 1101 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1114, or the fingerprint sensor 1114 identifies the user's identity according to the collected fingerprint. When the user's identity is identified as a trusted identity, the processor 1101 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1114 may be disposed on the front, back, or side of the terminal 1100. When a physical button or a manufacturer's logo is provided on the terminal 1100, the fingerprint sensor 1114 may be integrated with the physical button or the manufacturer's logo.
Optical sensor 1115 is used to collect ambient light intensity. In one embodiment, the processor 1101 may control the display brightness of the display screen 1105 based on the ambient light intensity collected by the optical sensor 1115. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1105 is increased; when the ambient light intensity is low, the display brightness of the display screen 1105 is reduced. In another embodiment, processor 1101 may also dynamically adjust the shooting parameters of camera assembly 1106 based on the ambient light intensity collected by optical sensor 1115.
The proximity sensor 1116, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 1100. The proximity sensor 1116 is used to collect the distance between the user and the front face of the terminal 1100. In one embodiment, when the proximity sensor 1116 detects that the distance between the user and the front face of the terminal 1100 gradually decreases, the processor 1101 controls the display screen 1105 to switch from the screen-on state to the screen-off state; when the proximity sensor 1116 detects that the distance between the user and the front face of the terminal 1100 gradually increases, the processor 1101 controls the display screen 1105 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 11 does not constitute a limitation of terminal 1100, and may include more or fewer components than those shown, or may combine certain components, or may employ a different arrangement of components.
Fig. 12 is a schematic structural diagram of a server 1200 according to an embodiment of the present application. The server 1200 may vary greatly depending on its configuration or performance, and may include one or more processors (CPUs) 1201 and one or more memories 1202, where the one or more memories 1202 store at least one computer program that is loaded and executed by the one or more processors 1201 to implement the methods provided by the foregoing method embodiments. Of course, the server 1200 may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for input and output, and the server 1200 may also include other components for implementing device functions, which are not described herein again.
In an exemplary embodiment, a computer-readable storage medium, such as a memory including at least one computer program, is also provided, where the at least one computer program is executable by a processor to perform the interactive prop control method in the above embodiments. For example, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product is also provided, the computer program product comprising at least one computer program, the at least one computer program being stored in a computer readable storage medium. The processor of the computer device reads the at least one computer program from the computer-readable storage medium, and executes the at least one computer program, so that the computer device realizes the operations performed by the interactive prop control method.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. An interactive prop control method, comprising:
in response to a throwing operation on a first interactive prop, displaying the first interactive prop at a target position in a virtual scene;
displaying a trigger area corresponding to the first interactive prop at the target position of the virtual scene;
triggering the first interactive prop in response to detecting that the trigger area is attacked.
2. The method of claim 1, wherein the triggering the first interactive prop in response to detecting that the trigger area is attacked comprises:
controlling a second interactive prop to launch sub-props toward the trigger area of the first interactive prop;
triggering the first interactive prop in response to detecting that any one of the sub-props hits the trigger area.
3. The method of claim 1, wherein the triggering the first interactive prop in response to detecting that the trigger area is attacked comprises:
triggering the first interactive prop in response to detecting that a third interactive prop is triggered and that an action area of the third interactive prop intersects with the trigger area of the first interactive prop.
4. The method of claim 1, wherein after the triggering the first interactive prop in response to detecting that the trigger area is attacked, the method further comprises:
in response to the first interactive prop being triggered, displaying an action effect corresponding to the first interactive prop in the virtual scene.
5. The method of claim 4, wherein the displaying, in response to the first interactive prop being triggered, an action effect corresponding to the first interactive prop in the virtual scene comprises:
in response to a sub-prop launched by a second interactive prop hitting the trigger area, determining a first action effect corresponding to the first interactive prop based on a first distance between the hit position of the sub-prop in the trigger area and a center point of the trigger area;
displaying the first action effect in the virtual scene.
6. The method of claim 4, wherein the displaying, in response to the first interactive prop being triggered, an action effect corresponding to the first interactive prop in the virtual scene comprises:
in response to the trigger area being attacked by a third interactive prop, determining a second action effect corresponding to the first interactive prop based on a shortest distance between an edge of an action area of the third interactive prop and a center point of the trigger area;
displaying the second action effect in the virtual scene.
7. The method of claim 1, wherein the displaying the first interactive prop at a target position in a virtual scene in response to a throwing operation on the first interactive prop comprises:
controlling, in response to the throwing operation on the first interactive prop, the first interactive prop to move in the virtual scene along a first motion trajectory;
in response to the first interactive prop colliding with a virtual object in the virtual scene during the movement, displaying the first interactive prop at a position where the first interactive prop collides with the virtual object.
8. The method of claim 7, wherein after the controlling, in response to the throwing operation on the first interactive prop, the first interactive prop to move in the virtual scene along the first motion trajectory, the method further comprises:
in response to the first interactive prop colliding with a virtual object in the virtual scene during the movement, controlling the first interactive prop to move in the virtual scene along a second motion trajectory, wherein a motion direction of the second motion trajectory is opposite to a motion direction of the first motion trajectory.
9. The method of claim 1, wherein after the displaying the first interactive prop at the target position in the virtual scene in response to the throwing operation on the first interactive prop, the method further comprises:
in response to the first interactive prop not having been triggered, displaying a prop pickup control, wherein the prop pickup control is used to recover the first interactive prop.
10. The method of claim 9, wherein the displaying a prop pickup control in response to the first interactive prop being in an untriggered state comprises:
in response to the first interactive prop being in the untriggered state, acquiring a second distance between a first virtual object in the virtual scene and the first interactive prop, wherein the first virtual object is a virtual object that throws the first interactive prop;
in response to the second distance being less than or equal to a distance threshold, displaying the prop pickup control.
11. The method of claim 1, wherein the displaying a trigger area corresponding to the first interactive prop at the target position of the virtual scene comprises at least one of the following:
highlighting an outline of the trigger area at the target position;
displaying the trigger area in a reference color at the target position.
12. The method of claim 1, wherein the triggering the first interactive prop in response to detecting that the trigger area is attacked comprises:
displaying a trigger countdown corresponding to the first interactive prop;
and in response to the trigger countdown not having ended and an attack on the trigger area being detected, performing the step of triggering the first interactive prop.
13. An interactive prop control apparatus, the apparatus comprising:
a first display module, configured to display a first interactive prop at a target position in a virtual scene in response to a throwing operation on the first interactive prop;
a second display module, configured to display a trigger area corresponding to the first interactive prop at the target position of the virtual scene;
and a triggering module, configured to trigger the first interactive prop in response to detecting that the trigger area is attacked.
14. A computer device, comprising one or more processors and one or more memories, wherein the one or more memories store at least one computer program, and the at least one computer program is loaded and executed by the one or more processors to implement the operations performed in the interactive prop control method according to any one of claims 1 to 12.
15. A computer-readable storage medium, wherein at least one computer program is stored in the computer-readable storage medium, and the at least one computer program is loaded and executed by a processor to implement the operations performed in the interactive prop control method according to any one of claims 1 to 12.
CN202110164952.7A 2021-02-05 2021-02-05 Interactive property control method and device, computer equipment and storage medium Active CN112755518B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110164952.7A CN112755518B (en) 2021-02-05 2021-02-05 Interactive property control method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112755518A true CN112755518A (en) 2021-05-07
CN112755518B CN112755518B (en) 2022-11-08

Family

ID=75705196

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110164952.7A Active CN112755518B (en) 2021-02-05 2021-02-05 Interactive property control method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112755518B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113490063A (en) * 2021-08-26 2021-10-08 上海盛付通电子支付服务有限公司 Method, device, medium and program product for live broadcast interaction

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
黑桐谷歌: bilibili video, https://www.bilibili.com/video/BV1is41127jz?p=2&vd_source=76d3264acb028cc08fccd0a145e89a77, 23 October 2015 *

Also Published As

Publication number Publication date
CN112755518B (en) 2022-11-08

Similar Documents

Publication Publication Date Title
CN111265869B (en) Virtual object detection method, device, terminal and storage medium
CN110427111B (en) Operation method, device, equipment and storage medium of virtual prop in virtual environment
CN110448891B (en) Method, device and storage medium for controlling virtual object to operate remote virtual prop
CN111282275B (en) Method, device, equipment and storage medium for displaying collision traces in virtual scene
CN110917619B (en) Interactive property control method, device, terminal and storage medium
CN110755841B (en) Method, device and equipment for switching props in virtual environment and readable storage medium
CN110721468B (en) Interactive property control method, device, terminal and storage medium
CN111408133B (en) Interactive property display method, device, terminal and storage medium
CN110585710B (en) Interactive property control method, device, terminal and storage medium
CN111589150B (en) Control method and device of virtual prop, electronic equipment and storage medium
CN111475573B (en) Data synchronization method and device, electronic equipment and storage medium
CN110538459A (en) Method, apparatus, device and medium for throwing virtual explosives in virtual environment
CN111035924A (en) Prop control method, device, equipment and storage medium in virtual scene
CN112221141B (en) Method and device for controlling virtual object to use virtual prop
CN111228809A (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN111202975B (en) Method, device and equipment for controlling foresight in virtual scene and storage medium
CN111475029B (en) Operation method, device, equipment and storage medium of virtual prop
CN111714893A (en) Method, device, terminal and storage medium for controlling virtual object to recover attribute value
CN111389005B (en) Virtual object control method, device, equipment and storage medium
CN111760284A (en) Virtual item control method, device, equipment and storage medium
CN111265857A (en) Trajectory control method, device, equipment and storage medium in virtual scene
CN112138384A (en) Using method, device, terminal and storage medium of virtual throwing prop
CN113289331A (en) Display method and device of virtual prop, electronic equipment and storage medium
CN111672106A (en) Virtual scene display method and device, computer equipment and storage medium
CN112076476A (en) Virtual object control method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40043486

Country of ref document: HK

GR01 Patent grant
GR01 Patent grant